You can easily run an AI/LLM locally with Ollama
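As a minimal sketch of what "locally" means in practice: once Ollama is installed and a model has been pulled (the example below assumes a model named "llama3" and Ollama's default local port, 11434), you can send prompts to its local HTTP API from a few lines of Python using only the standard library:

```python
import json
import urllib.request

# Assumes the Ollama server is running locally on its default port (11434)
# and that a model -- "llama3" here, swap in whichever one you pulled -- is available.
OLLAMA_URL = "http://localhost:11434/api/generate"

def ask(prompt: str, model: str = "llama3") -> str:
    """Send a single prompt to the local Ollama server and return its reply."""
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # request one complete JSON response instead of a stream
    }).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    print(ask("In one sentence, why run an LLM locally?"))
```

Nothing leaves your machine: the request goes to localhost, and the model runs entirely on your own hardware.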