LLM Node (type = "llm")

Calls the underlying AI service with the configured model, prompts, and parameters.

Schema

  • See: LlmNode.json
  • Common fields:
    • data.inputsValues: holds modelName, temperature, systemPrompt, prompt, etc.
    • data.outputs: usually result: string or a structured JSON string

Execution is handled by LlmNodeExecutor.
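Each entry in data.inputsValues is either a constant (a literal value stored on the node) or a ref (a pointer at an upstream node's output, as a [nodeId, outputKey] pair). A minimal sketch of how an executor might resolve both kinds, assuming hypothetical names (InputValue, resolveInput, Context) that are not the real LlmNodeExecutor API:

```typescript
// Hypothetical sketch of inputsValues resolution; names are assumptions.
type InputValue =
  | { type: "constant"; content: unknown }
  | { type: "ref"; content: [nodeId: string, outputKey: string] };

// Outputs already produced by upstream nodes, keyed by node id.
type Context = Record<string, Record<string, unknown>>;

function resolveInput(value: InputValue, ctx: Context): unknown {
  if (value.type === "constant") {
    // Literal value stored directly on the node.
    return value.content;
  }
  // "ref" points at [sourceNodeId, outputKey] of an upstream node.
  const [nodeId, outputKey] = value.content;
  return ctx[nodeId]?.[outputKey];
}

// Example: the prompt refs the Start node's "query" output.
const ctx: Context = { start_0: { query: "Summarize this text." } };
const prompt = resolveInput(
  { type: "ref", content: ["start_0", "query"] },
  ctx
);
console.log(prompt); // "Summarize this text."
```

The same resolution applies uniformly to every key in inputsValues, so modelName or temperature could just as well be refs to upstream outputs.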

Example

Minimal LLM node:

{
  "id": "llm_0",
  "type": "llm",
  "data": {
    "inputsValues": {
      "modelName": { "type": "constant", "content": "gpt-4.1-mini" },
      "temperature": { "type": "constant", "content": 0.2 },
      "systemPrompt": { "type": "constant", "content": "You are an assistant that only outputs JSON." },
      "prompt": { "type": "ref", "content": ["start_0", "query"] }
    }
  }
}