Add the following code to a Python file named ollama.py (the file you will run in the final step)
from agno.agent import Agent
from agno.models.ollama.chat import Ollama

agent = Agent(
    # Main model that writes the final response
    model=Ollama(id="llama3.2:latest"),
    # Separate model used for the reasoning step
    reasoning_model=Ollama(id="deepseek-r1:14b", max_tokens=4096),
)

agent.print_response(
    "Solve the trolley problem. Evaluate multiple ethical frameworks. "
    "Include an ASCII diagram of your solution.",
    stream=True,
)
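Here the reasoning model works through the problem first and the main model writes the final answer. If you would rather capture the output than stream it to the terminal, a minimal variation is sketched below; it assumes agent.run() returns a response object exposing a content attribute (Agno's RunResponse), so adjust if your version differs.

# Hypothetical non-streaming variation: run the agent and inspect the result.
# Assumes agent.run() returns a response object with a .content attribute.
response = agent.run(
    "Solve the trolley problem. Evaluate multiple ethical frameworks. "
    "Include an ASCII diagram of your solution."
)
print(response.content)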
Set up your virtual environment
uv venv --python 3.12
source .venv/bin/activate
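The activation command above assumes a Unix-like shell. On Windows (not covered by the page, so treat this as an assumption about your setup), activate the environment created by uv with:

.venv\Scripts\activate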
Install dependencies
uv pip install -U agno ollama
Install Ollama and pull the model
ollama pull llama3.2:latest
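The agent above also references deepseek-r1:14b as its reasoning model, so pull that model as well (the name is taken from the code in the first step):

ollama pull deepseek-r1:14b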
Run Agent
python ollama.py