PATCH /sites/{site_id}/robots_txt
Update the robots.txt configuration for various user agents.
This endpoint requires an Enterprise workspace.
Required scope | site_config:write
Servers
- https://api.webflow.com/v2
Path parameters
Name | Type | Required | Description
---|---|---|---
site_id | String | Yes | Unique identifier for a Site
Request headers
Name | Type | Required | Description
---|---|---|---
Content-Type | String | Yes | The media type of the request body. Default value: "application/json"
Request body fields
Name | Type | Required | Description
---|---|---|---
rules[] | Array | No | List of rules for user agents.
rules[].disallows[] | Array | No | List of paths disallowed for this user agent.
rules[].allows[] | Array | No | List of paths allowed for this user agent.
rules[].userAgent | String | Yes | The user agent the rules apply to.
sitemap | String | No | URL to the sitemap.
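A minimal sketch of a PATCH request built from the fields above, using only the Python standard library. The site ID, token, paths, and sitemap URL are placeholder assumptions; the `Authorization: Bearer` header follows the common bearer-token pattern and should be checked against your workspace's authentication setup.

```python
import json
import urllib.request

# Hypothetical values -- replace with your own site ID and API token.
SITE_ID = "580e63e98c9a982ac9b8b741"
API_TOKEN = "your-api-token"

# Request body per the fields above; userAgent is required for each rule.
body = {
    "rules": [
        {
            "userAgent": "*",
            "allows": ["/blog"],        # hypothetical allowed path
            "disallows": ["/admin"],    # hypothetical disallowed path
        }
    ],
    "sitemap": "https://example.com/sitemap.xml",
}

req = urllib.request.Request(
    url=f"https://api.webflow.com/v2/sites/{SITE_ID}/robots_txt",
    data=json.dumps(body).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {API_TOKEN}",
        "Content-Type": "application/json",
    },
    method="PATCH",
)

# Uncomment with real credentials to send the request:
# with urllib.request.urlopen(req) as resp:
#     print(resp.status, resp.read().decode())
```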
How to start integrating
- Add an HTTP Task to your workflow definition.
- Search for the API you want to integrate with and click its name.
- This loads the API reference documentation and prepares the HTTP request settings.
- Click Test request to run your request against the API and see its response.