LLM Settings

e.g., http://localhost:11434/v1 or http://host.docker.internal:11434/v1/ for an Ollama or Infomaniak URI; leave empty to use OpenAI.
Leave empty if your local LLM server does not require authentication.
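As a rough illustration (assuming the application talks to an OpenAI-compatible endpoint, as the Ollama example above suggests), the two fields map onto the base_url and api_key of a standard client; the "ollama" key here is a placeholder, since Ollama ignores the value but the client library requires one:

```python
from openai import OpenAI

# Minimal sketch, assuming an OpenAI-compatible server such as Ollama.
# base_url and api_key mirror the two settings fields described above.
client = OpenAI(
    base_url="http://localhost:11434/v1",  # leave unset to default to OpenAI
    api_key="ollama",                      # placeholder when no auth is needed
)
```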
Save your changes before querying the model list.
✅ Models successfully loaded
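For reference, "query the model list" corresponds to the GET /v1/models endpoint that OpenAI-compatible servers expose; a sketch of that call with the same assumed client:

```python
from openai import OpenAI

# Illustrative only: the application uses the saved settings for this call.
# models.list() wraps GET /v1/models on the configured server.
client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")
for model in client.models.list():
    print(model.id)
```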
Provide a valid URL for SQL guidelines here. By default, the system uses its internal guidelines.
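A hypothetical sketch of the fallback behavior described above (the names and the fetch logic are illustrative, not the system's actual implementation):

```python
import requests

# Stand-in for the guidelines shipped with the system.
INTERNAL_GUIDELINES = "...built-in SQL guidelines..."

def load_sql_guidelines(url: str | None) -> str:
    """Fetch guidelines from `url` if one is configured, else use the defaults."""
    if not url:
        return INTERNAL_GUIDELINES
    response = requests.get(url, timeout=10)
    response.raise_for_status()
    return response.text
```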