Evaluate Prompt
POST
/evaluate_prompt/predict
Evaluate an AI Prompt
Request Body
messages[role]
Required array<string>
Role of each message (e.g. "user" or "assistant").
messages[content]
Required array<string>
Text content of each message.
max_tokens
integer
Maximum number of output tokens (capped at 400).
Default: 300
Format: "int32"
temperature
number
How creative the response should be; between 0 and 2, where lower values are less creative.
Format: "float"
model_kind
string
Which model provider should be used.
Default: "openai"
Value in: "openai" | "anthropic"
| Status code | Description |
| --- | --- |
| 201 | Evaluate an AI Prompt |
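A minimal sketch of calling this endpoint from Python. The base URL is an assumption (replace it with your deployment's host); the payload fields mirror the request body documented above.

```python
import json

# Hypothetical base URL -- substitute your actual API host.
BASE_URL = "https://api.example.com"

payload = {
    "messages": [
        {"role": "user", "content": "Summarize the benefits of unit testing."}
    ],
    "max_tokens": 300,       # default; capped at 400
    "temperature": 0.7,      # between 0 and 2; lower is less creative
    "model_kind": "openai",  # or "anthropic"
}

# Send with any HTTP client, for example:
#   import requests
#   resp = requests.post(f"{BASE_URL}/evaluate_prompt/predict", json=payload)
#   resp.raise_for_status()  # a 201 status indicates success

print(json.dumps(payload, indent=2))
```

The request above uses the documented defaults; only `messages` is required, so the other fields may be omitted to accept the server-side defaults.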