GET /1/crawlers/{id}/crawl_runs
The Crawler Logs feature allows you to monitor and debug your crawler’s activity by recording detailed logs for each crawl run. Logs are useful for troubleshooting crawl issues, verifying site coverage, and monitoring crawler performance over time.
Servers
- https://crawler.algolia.com/api
Path parameters
| Name | Type | Required | Description |
|---|---|---|---|
| id | String | Yes | Crawler ID. |
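The sketch below shows one way to call this endpoint with `fetch`, substituting the `id` path parameter into the URL. The credential names and the basic-authentication scheme are assumptions for illustration; confirm the exact scheme and values for your account.

```ts
// A hedged sketch; the credential values below are placeholders, not real values.
const CRAWLER_USER_ID = "YOUR_CRAWLER_USER_ID";
const CRAWLER_API_KEY = "YOUR_CRAWLER_API_KEY";

async function listCrawlRuns(crawlerId: string): Promise<unknown> {
  // The Crawler ID is the {id} path parameter documented above.
  const url = `https://crawler.algolia.com/api/1/crawlers/${crawlerId}/crawl_runs`;

  const response = await fetch(url, {
    method: "GET",
    headers: {
      // Assumed HTTP basic authentication; check your account settings for the exact scheme.
      Authorization: "Basic " + btoa(`${CRAWLER_USER_ID}:${CRAWLER_API_KEY}`),
    },
  });

  if (!response.ok) {
    throw new Error(`Crawl runs request failed with status ${response.status}`);
  }
  return response.json();
}
```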
Query parameters
| Name | Type | Required | Description |
|---|---|---|---|
| order | String | No | Sort order of the results: `ASC` or `DESC`. |
| limit | Integer | No | Maximum number of results to return. Default: 10. |
| until | String | No | Only return crawl runs up to this date. |
| status | String | No | Filter by run status: `DONE`, `SKIPPED`, or `FAILED`. |
| from | String | No | Only return crawl runs from this date onward. |
| offset | Integer | No | Number of results to skip. |
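As an example, the query parameters above can be combined to page through failed runs within a date range. This is a sketch using `URLSearchParams`; the ISO 8601 date format is an assumption, so adjust it to whatever the API expects.

```ts
// Build the query string from the documented parameters.
const params = new URLSearchParams({
  order: "DESC",                 // ASC or DESC
  status: "FAILED",              // DONE, SKIPPED, or FAILED
  limit: "20",                   // defaults to 10 when omitted
  offset: "0",                   // number of results to skip
  from: "2024-01-01T00:00:00Z",  // assumed ISO 8601 date filters
  until: "2024-02-01T00:00:00Z",
});

// Append the query string to the endpoint and reuse the authenticated GET request shown above.
const url = `https://crawler.algolia.com/api/1/crawlers/YOUR_CRAWLER_ID/crawl_runs?${params.toString()}`;
```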
How to start integrating
- Add an HTTP Task to your workflow definition.
- Search for the API you want to integrate with and click its name.
- This loads the API reference documentation and prepares the HTTP request settings.
- Click Test request to send a test request to the API and see its response.