Ported 2026-02-23: the implementation now delegates to ObjAiMcpOllama. Prefer the MCP layer in new code:

```python
from ObjAI import ObjAI

ai = ObjAI(model="mcp:ollama:mistral")
response = ai.prompt("You are helpful.", "What is Python?")
```
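Other local models can be addressed through the same layer. A minimal sketch, assuming the `mcp:ollama:<name>` convention applies to any model already pulled into Ollama:

```python
from ObjAI import ObjAI

# Assumes "llama3.2" has been pulled locally (e.g. `ollama pull llama3.2`);
# the "mcp:ollama:" prefix selects the same provider as above.
ai = ObjAI(model="mcp:ollama:llama3.2")
print(ai.prompt("You are helpful.", "Summarise what Ollama does in one sentence."))
```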
The legacy ObjPrompt interface still works for existing code:

```python
from ObjAILlmOllama import ObjPrompt

p = ObjPrompt(DB=0, model="mistral")
response = p.query_prompt(role="You are helpful.", prompt="What is Python?")

# Switch model at runtime
p.set_model("llama3.2")
```
| Method | Description |
|---|---|
| `__init__(DB, model)` | Initialises the provider; defaults to `mistral` |
| `set_model(model)` | Switch the active model |
| `query_prompt(role, prompt, image)` | One-shot text prompt; returns `str` |
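The `image` argument is not exercised in the example above. A minimal sketch of how it might be used, assuming it accepts a local file path and that a vision-capable model such as `llava` has been pulled:

```python
from ObjAILlmOllama import ObjPrompt

# Hypothetical multimodal call: assumes `image` takes a local file path
# and that a vision-capable model (e.g. "llava") is available locally.
p = ObjPrompt(DB=0, model="llava")
caption = p.query_prompt(
    role="You describe images concisely.",
    prompt="Describe this picture in one sentence.",
    image="photo.jpg",
)
print(caption)
```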
Build the compiled extension in place (with an annotation report):

```bash
cythonize -3 -a -i ObjAILlmOllama.py
```
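After building, the import path stays the same; a quick sanity check, assuming the in-place build succeeded:

```python
# The compiled extension (.so/.pyd) takes precedence over the .py source.
import ObjAILlmOllama
print(ObjAILlmOllama.__file__)
```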