GET /1/crawlers/{id}/{logId}/download

This endpoint downloads the log file for a crawl run. The crawler records detailed logs for each run, which you can use to monitor and debug its activity: troubleshooting crawl issues, verifying site coverage, and tracking crawler performance over time.

Path parameters

| Name  | Type   | Required | Description     |
|-------|--------|----------|-----------------|
| id    | String | Yes      | Crawler ID.     |
| logId | String | Yes      | Crawler log ID. |
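As a minimal sketch of calling this endpoint directly, the request below substitutes both path parameters into the URL and attaches a Basic-auth header. The host `crawler.example.com`, the credential names, and the Basic-auth scheme are assumptions for illustration; check your provider's documentation for the actual host and authentication method.

```python
import base64
import urllib.request

BASE_URL = "https://crawler.example.com"  # hypothetical host


def build_log_download_request(crawler_id: str, log_id: str,
                               user: str, api_key: str) -> urllib.request.Request:
    """Build a GET request for /1/crawlers/{id}/{logId}/download."""
    # Both path parameters are required; they are interpolated into the path.
    url = f"{BASE_URL}/1/crawlers/{crawler_id}/{log_id}/download"
    # Basic auth is an assumption here; swap in whatever your API expects.
    token = base64.b64encode(f"{user}:{api_key}".encode()).decode()
    return urllib.request.Request(url, headers={"Authorization": f"Basic {token}"})


req = build_log_download_request("my-crawler", "log-123", "user-id", "api-key")
print(req.full_url)
# To actually fetch the file:
#   with urllib.request.urlopen(req) as resp:
#       log_bytes = resp.read()
```

Because the response is a file download, read the body as raw bytes rather than decoding it as JSON.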

How to start integrating

  1. Add an HTTP Task to your workflow definition.
  2. Search for the API you want to integrate with and click its name.
    • This loads the API reference documentation and prepares the HTTP request settings.
  3. Click Test request to run your request against the API and see its response.