Why MCP + Ollama is awesome (and how)
In this video I explore the power of MCP further, this time connected to Ollama, an open-source tool for pulling, building, and running large language models (LLMs) on your own machine.
0:00 Intro
0:50 Self-hosted LLM
1:24 Ollama
2:51 The experiment
3:09 Installing Ollama
4:33 Running Ollama
5:17 Choosing your model
6:29 Pulling your model
6:54 Running your model
7:34 MCP agents and Ollama
9:14 Running Ollama LLM Agent
12:31 Changing LLM model parameters
14:46 Outro
MCP Client+Server: https://github.com/Marcus-Forte/learning-mcp.git
Raspberry Pi Blinky Server: https://github.com/Marcus-Forte/rpi-grpc-led
Ollama: https://ollama.com/
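The pull-and-run workflow from the video boils down to a couple of Ollama CLI commands. A minimal sketch (the model name "llama3.2" is an example, not necessarily the one chosen in the video; the script checks for the `ollama` binary first so it degrades gracefully where Ollama is not installed):

```shell
#!/bin/sh
# Example model to fetch from the Ollama registry (hypothetical choice).
MODEL="llama3.2"

if command -v ollama >/dev/null 2>&1; then
  # Download the model weights, then start an interactive chat session.
  ollama pull "$MODEL"
  ollama run "$MODEL"
else
  # Ollama is not on PATH; point at the install page instead.
  echo "ollama not installed; see https://ollama.com/"
fi
```

Once a model is running, Ollama also exposes a local HTTP API (port 11434 by default) that clients such as the MCP agent in the linked repo can talk to.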