```python
from agno.agent import Agent, RunOutput  # noqa
from agno.models.ollama import Ollama
from ollama import Client as OllamaClient

agent = Agent(
    model=Ollama(id="llama3.1:8b", client=OllamaClient()),
    markdown=True,
)

# Print the response in the terminal
agent.print_response("Share a 2 sentence horror story")
```
Set up your virtual environment
```shell
uv venv --python 3.12
source .venv/bin/activate
```
Install Ollama and pull the model
```shell
ollama pull llama3.1:8b
```
Install dependencies
```shell
uv pip install -U ollama agno
```
Run Agent
```shell
python cookbook/11_models/ollama/set_client.py
```