GET /1/crawlers/{id}/{logId}/download
This endpoint downloads a crawler log file. Logs record detailed activity for each crawl run and are useful for troubleshooting crawl issues, verifying site coverage, and monitoring crawler performance over time.
Servers
- https://crawler.algolia.com/api
Path parameters
| Name | Type | Required | Description |
|---|---|---|---|
| `id` | String | Yes | Crawler ID. |
| `logId` | String | Yes | Crawler log ID. |
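The path parameters above can be assembled into a download request. The sketch below is a minimal example using only the Python standard library; it assumes the Crawler API accepts HTTP Basic authentication with a user ID and API key (check your Algolia credentials), and the crawler and log IDs are placeholders.

```python
import base64
from urllib import request

BASE_URL = "https://crawler.algolia.com/api"

def build_download_request(crawler_id: str, log_id: str,
                           user_id: str, api_key: str) -> request.Request:
    """Build a GET request for /1/crawlers/{id}/{logId}/download."""
    url = f"{BASE_URL}/1/crawlers/{crawler_id}/{log_id}/download"
    # Basic auth header (assumption: the Crawler API uses Basic authentication)
    token = base64.b64encode(f"{user_id}:{api_key}".encode()).decode()
    return request.Request(url, headers={"Authorization": f"Basic {token}"})

# Placeholder IDs and credentials for illustration only
req = build_download_request("my-crawler-id", "my-log-id", "USER_ID", "API_KEY")
print(req.full_url)
# Send with: request.urlopen(req) — the response body is the log file
```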
How to start integrating
- Add an HTTP Task to your workflow definition.
- Search for the API you want to integrate with and click its name.
- This loads the API reference documentation and prepares the HTTP request settings.
- Click Test request to run your request against the API and see its response.