AI · Using the Continue VSCode Extension and Local LLMs for Improved Coding
"Welcome back to another post on local LLMs. In this post, we'll look at setting up a fully local…"
Suresh Vinasiththamby · Jan 30, 2025 · 4 min read
AI · Using Ollama with a Web-Based GUI
"When I first started using local LLMs with Ollama, I quickly realised it relies on a command-line interface to interact…"
Suresh Vinasiththamby · Jan 30, 2025 · 5 min read
AI · Running Large Language Models (LLM) on Your Own Machine Using Ollama
"I'm going to start by saying I'm totally new to LLMs and running them locally, so I'm…"
Suresh Vinasiththamby · Jan 30, 2025 · 7 min read