Installing models

You can add models by first linking a supported service (Fal, OpenRouter, Groq or Gemini), then adding models as needed from the URL input field at the bottom of the model select dropdown on the Agent and Create nodes. Once OpenRouter or Fal is linked, you can install any model their service provides, including many free and experimental models as well as models from providers such as OpenAI, Anthropic and DeepSeek. Models typically trade off between speed, cost and features.

Using a model

The Agent and Create nodes will use the default Runchat model unless you specify something else. Use the dropdown in the node settings to select an installed model to use for the request.