POST /accounts/{account_id}/ai/run/@cf/meta/llama-guard-3-8b

Path parameters

Name Type Required Description
account_id String Yes The identifier of the Cloudflare account under which the model is run.
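
As an illustration, the sketch below builds the full request URL by substituting an account ID into the path. The base URL and the placeholder ACCOUNT_ID are assumptions for this sketch, not values taken from this reference.

  # Minimal sketch: building the full URL for this operation.
  # BASE_URL and ACCOUNT_ID are assumptions for illustration only.
  BASE_URL = "https://api.cloudflare.com/client/v4"
  ACCOUNT_ID = "0123456789abcdef0123456789abcdef"  # hypothetical account_id

  endpoint = f"{BASE_URL}/accounts/{ACCOUNT_ID}/ai/run/@cf/meta/llama-guard-3-8b"
  print(endpoint)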

Request headers

Name Type Required Description
Content-Type String Yes The media type of the request body.

Default value: "application/json"
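
The sketch below shows the headers such a request would typically carry. Content-Type comes from the table above; the Authorization bearer token is an assumption, since authentication is not documented in this section.

  # Sketch of the request headers. API_TOKEN is a hypothetical placeholder;
  # the Authorization header is assumed and is not part of this reference.
  API_TOKEN = "YOUR_API_TOKEN"

  headers = {
      "Content-Type": "application/json",      # default value from the table
      "Authorization": f"Bearer {API_TOKEN}",  # assumed bearer-token auth
  }
  print(headers)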

Query parameters

Name Type Required Description
queueRequest String No
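
This reference does not describe what queueRequest does; the snippet below only illustrates how the optional query parameter could be attached to the request URL, with "true" as a purely hypothetical value.

  # Illustration only: attaching the optional queueRequest query parameter.
  from urllib.parse import urlencode

  endpoint = ("https://api.cloudflare.com/client/v4/accounts/<account_id>"
              "/ai/run/@cf/meta/llama-guard-3-8b")
  url_with_query = endpoint + "?" + urlencode({"queueRequest": "true"})
  print(url_with_query)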

Request body fields

Name Type Required Description
messages[] Array Yes

An array of message objects representing the conversation history.

messages[].content String Yes

The content of the message as a string.

messages[].role String Yes

The role of the message sender. Roles must alternate between 'user' and 'assistant'.

Possible values:

  • "assistant"
  • "user"

temperature Number No

Controls the randomness of the output; higher values produce more random results.

Default value: 0.6

max_tokens Integer No

The maximum number of tokens to generate in the response.

Default value: 256

response_format Object No

Dictates the output format of the generated response.

response_format.type String No

Set to json_object to process and output generated text as JSON.
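
Putting the body fields together, here is a hedged end-to-end sketch that sends a complete request with the Python requests library. The account ID, API token, and message contents are placeholders; temperature and max_tokens use the documented defaults, and response_format requests JSON output as described above.

  # Minimal end-to-end sketch using the third-party "requests" library.
  # ACCOUNT_ID and API_TOKEN are hypothetical placeholders; the auth scheme
  # is an assumption and is not documented in this section.
  import requests

  ACCOUNT_ID = "0123456789abcdef0123456789abcdef"
  API_TOKEN = "YOUR_API_TOKEN"

  url = (
      "https://api.cloudflare.com/client/v4"
      f"/accounts/{ACCOUNT_ID}/ai/run/@cf/meta/llama-guard-3-8b"
  )

  payload = {
      # Conversation history; roles must alternate between 'user' and 'assistant'.
      "messages": [
          {"role": "user", "content": "How do I reset my password?"},
          {"role": "assistant", "content": "Go to the login page and click 'Forgot password'."},
      ],
      "temperature": 0.6,   # documented default
      "max_tokens": 256,    # documented default
      # Ask for the generated text to be processed and returned as JSON.
      "response_format": {"type": "json_object"},
  }

  response = requests.post(
      url,
      headers={
          "Content-Type": "application/json",
          "Authorization": f"Bearer {API_TOKEN}",  # assumed auth scheme
      },
      json=payload,
  )
  print(response.status_code)
  print(response.json())

This section does not document the response schema, so inspect the returned body rather than assuming a particular shape.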

How to start integrating

  1. Add an HTTP Task to your workflow definition.
  2. Search for the API you want to integrate with and click its name.
    • This loads the API reference documentation and prepares the HTTP request settings.
  3. Click Test request to send a test request to the API and see its response.