Changelog
llmModule 25.06.0
New features
- Added option to use LLM models from a local installation of Ollama (#4):
  - New classes LocalLlmApi, OllamaModel, and OllamaModelManager provide support for model configuration and response parsing using locally hosted models.
  - The llm_api_ui/server, llm_prompt_config_ui/server, and llm_generate_prompt_ui/server functions now detect and support Ollama-based backends.
- New classes
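For context on what an "Ollama-based backend" means, a locally hosted Ollama model is reached over Ollama's HTTP API, which by default listens on port 11434. The sketch below builds the JSON body for Ollama's /api/generate endpoint; it is illustrative only and does not use the new classes above, and the model name "llama3" is an assumption.

```python
import json

# Default endpoint of a local Ollama installation (assumption: standard port).
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_generate_payload(model: str, prompt: str, stream: bool = False) -> str:
    """Return the JSON body Ollama's /api/generate endpoint expects
    for a single, non-streaming generation request."""
    return json.dumps({"model": model, "prompt": prompt, "stream": stream})


# Example: a request body for a hypothetical local "llama3" model.
payload = build_generate_payload("llama3", "Say hello in one word.")
print(payload)
```

Sending this payload as a POST request to OLLAMA_URL would return the model's response; the new classes wrap this kind of exchange behind a configuration and parsing layer.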