All functions

as_table(<LlmResponse>)
Extract and format LLM response as a table
as_table()
Generic extractor for LlmResponse outputs
get_llm_models(<LocalLlmApi>)
Retrieve Available LLM Models
get_llm_models()
Generic for retrieving available models from an LlmApi object
get_llm_models(<RemoteLlmApi>)
Retrieve Available LLM Models
has_internet()
Check for an Internet Connection
llm_generate_prompt_server()
LLM Prompt Generator Server Module
llm_generate_prompt_ui()
LLM Prompt Generator UI Module
new_LlmPromptConfig()
Create and Manage LLM Prompt Settings
new_LlmResponse()
Create and Structure LLM Response Object
new_LocalLlmApi()
Create and Validate Local LLM API Credentials
new_OllamaModel()
Create an Ollama Model Object
new_OllamaModelManager()
Ollama Model Manager
new_RemoteLlmApi()
Create and Validate Remote LLM API Credentials
print(<LlmPromptConfig>)
Print method for LlmPromptConfig
print(<LlmResponse>)
Print method for LlmResponse
print(<LocalLlmApi>)
Print method for LocalLlmApi
print(<OllamaModel>)
Print method for OllamaModel
print(<OllamaModelManager>)
Print method for OllamaModelManager
print(<RemoteLlmApi>)
Print method for RemoteLlmApi
send_prompt(<LocalLlmApi>)
Send a prompt to a local LLM API (e.g., Ollama)
send_prompt()
Generic LLM prompt sender. Dispatches to the appropriate remote or local LLM API method based on the class of the `api` argument.
send_prompt(<RemoteLlmApi>)
Send a prompt to a remote LLM API (e.g., OpenAI, DeepSeek). Sends the prompt to the remote API and returns the response in a structured format.
startApplication()
Start Application
update(<OllamaModelManager>)
Update the list of locally available models
update()
Generic update function
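
The `send_prompt()` generic dispatches on the class of its `api` argument, so remote and local backends share one calling convention. A hedged sketch of how the pieces in this index might fit together; only the function names come from the reference, and the argument lists are elided because the signatures are not shown here:

```r
# Sketch only — argument lists (...) are placeholders, not real signatures.
if (has_internet()) {
  api <- new_RemoteLlmApi(...)       # create and validate remote credentials
  models <- get_llm_models(api)      # dispatches to the RemoteLlmApi method
  response <- send_prompt(api, ...)  # dispatches on class(api); returns an LlmResponse
  as_table(response)                 # format the LlmResponse as a table
}
```

For a local backend, `new_LocalLlmApi()` would take the place of `new_RemoteLlmApi()`, and the same `send_prompt()` call would dispatch to the LocalLlmApi method.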