GET /1/crawlers/{id}/crawl_runs

The Crawler Logs feature allows you to monitor and debug your crawler’s activity by recording detailed logs for each crawl run. Logs are useful for troubleshooting crawl issues, verifying site coverage, and monitoring crawler performance over time.
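A request to this endpoint might look like the following TypeScript sketch. The base URL, crawler ID, and Basic-auth credentials are placeholders, not values from this page; substitute the server URL and credentials for your own account.

```typescript
// Minimal sketch (Node.js 18+): fetch the crawl runs for one crawler.
// BASE_URL, CRAWLER_ID, USER_ID, and API_KEY are all placeholders.
const BASE_URL = "https://crawler.example.com"; // assumed server URL
const CRAWLER_ID = "my-crawler-id";             // the {id} path parameter

async function getCrawlRuns(): Promise<unknown> {
  const res = await fetch(`${BASE_URL}/1/crawlers/${CRAWLER_ID}/crawl_runs`, {
    headers: {
      // The auth scheme is an assumption; check your API credentials setup.
      Authorization:
        "Basic " + Buffer.from("USER_ID:API_KEY").toString("base64"),
    },
  });
  if (!res.ok) throw new Error(`Request failed: ${res.status}`);
  return res.json();
}
```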

Path parameters

Name  Type    Required  Description
id    String  Yes       Crawler ID.

Query parameters

Name    Type     Required  Description
order   String   No        Sort order of the results. Valid values: "DESC", "ASC".
limit   Integer  No        Maximum number of results to return. Default: 10.
until   String   No        Date 'until' filter: only return crawl runs up to this date.
status  String   No        Filter by run status. Valid values: "DONE", "FAILED", "SKIPPED".
from    String   No        Date 'from' filter: only return crawl runs from this date onward.
offset  Integer  No        Offset into the query results, used for pagination.
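Continuing the sketch above, the documented query parameters can be appended as a query string. The parameter names, valid values, and default limit come from the table; the date format for from and until is not specified here, so ISO 8601 dates are an assumption.

```typescript
// Build the query string from the documented parameters
// (reusing BASE_URL and CRAWLER_ID from the earlier sketch).
const params = new URLSearchParams({
  order: "DESC",       // "ASC" or "DESC"
  limit: "10",         // maximum results per page (default: 10)
  offset: "0",         // pagination offset
  status: "FAILED",    // "DONE", "FAILED", or "SKIPPED"
  from: "2024-01-01",  // date 'from' filter (format assumed)
  until: "2024-01-31", // date 'until' filter (format assumed)
});
const url = `${BASE_URL}/1/crawlers/${CRAWLER_ID}/crawl_runs?${params}`;
```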

How to start integrating

  1. Add an HTTP Task to your workflow definition.
  2. Search for the API you want to integrate with and click its name.
    • This loads the API reference documentation and prepares the HTTP request settings.
  3. Click Test request to send your request to the API and see its response.