
Running an Open Language Model Locally

As part of bbv’s Focus Day 2024 (see LinkedIn post), I started using open large language models (LLMs) locally. I had already tested (and liked!) GitHub Copilot in 2022, but it has some drawbacks for me: it requires a monthly subscription fee and, more critically for me, it transmits content to the cloud 1. I was therefore happy to learn about a free alternative: Ollama. This page is about getting Ollama running locally on a Windows computer using only free tools.
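
As a preview of where this page is heading, here is a minimal sketch of talking to a locally running Ollama instance from Python. It assumes Ollama is already installed and serving on its default port 11434, and that a model has been pulled beforehand (the `llama3` tag below is only an example; any pulled model works):

```python
import json
import urllib.request

# Ollama's local REST API listens on http://localhost:11434 by default.
# Assumes a model was pulled first, e.g. `ollama pull llama3` (example tag).
payload = {
    "model": "llama3",
    "prompt": "Why is the sky blue?",
    "stream": False,  # return one JSON object instead of a token stream
}

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    answer = json.loads(resp.read())

print(answer["response"])
```

Because everything runs against `localhost`, the prompt and the model's answer never leave the machine, which is exactly the property Copilot's cloud round-trip lacks.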