A tutorial thread showed how to route Claude Code through Ollama, choose a local coding model, and point Claude at a local base URL for private work. Use it if you want agent-style coding on your own machine without cloud API spend.

This is not a new Claude Code product launch. It is a creator walkthrough showing that Claude Code can be routed through a local backend by swapping out Anthropic's endpoint and running an open coding model on your own machine. The thread positions that as “free” and “fully private,” with Ollama handling the local model serving on both Mac and Windows.
The important technical move is in the base URL step: instead of sending requests to Anthropic’s servers, Claude Code is pointed at a local endpoint. That makes the story less about Claude itself changing and more about a practical local-agent workflow creative coders can reproduce.
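That redirect typically comes down to environment variables. A minimal sketch, assuming Ollama is serving on its default port 11434 and that the client reads `ANTHROPIC_BASE_URL` and `ANTHROPIC_AUTH_TOKEN` for a custom endpoint (the thread's exact variable names are not reproduced here, and the token value is a placeholder):

```shell
# Point Claude Code at the local Ollama server instead of Anthropic's API.
# 11434 is Ollama's default port (assumes an unchanged default install).
export ANTHROPIC_BASE_URL="http://localhost:11434"
# A dummy token keeps the client from asking for a real API key (placeholder value).
export ANTHROPIC_AUTH_TOKEN="ollama"
```

With those set, any Claude Code session launched from the same shell sends its requests to the local endpoint rather than Anthropic's servers.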
Setup starts with installing Ollama, which runs quietly in the background; the verification clip shows the service answering on localhost after install. From there, the creator pulls a coding model sized to the machine: qwen3-coder:30b for higher-end hardware, or smaller options like qwen2.5-coder:7b and gemma:2b when memory is tighter.
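The verify-and-pull sequence maps onto Ollama's standard CLI. A sketch using the model tags named in the thread (port 11434 is Ollama's default; pick one model to match your hardware rather than pulling all three):

```shell
# Verify the Ollama service is live on its default localhost port.
curl http://localhost:11434        # a running server replies with a short status message

# Pull a coding model sized to your machine:
ollama pull qwen3-coder:30b        # higher-end hardware
ollama pull qwen2.5-coder:7b       # mid-range memory
ollama pull gemma:2b               # tighter memory budgets

# Confirm which models are available locally:
ollama list
```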
Once the base URL is redirected, the launch example starts Claude Code inside a project folder with the selected local model. The payoff in the demo is agent-style behavior on local files: a prompt like “make a hello world website” leads the tool to inspect files, modify code, and complete the task without a cloud call in the loop.
🚨BREAKING: You can now run Claude Code for FREE. Here's how to run Claude Code locally (100% free & fully private):

STEP 1: Select Your Local “Brain” (Ollama)
First you need a local engine that can run AI models and handle tool or function calls. Here we will use Ollama, so download it from ollama.com. Once it’s installed, Ollama runs quietly in the background on both Mac and Windows.

Step 3: Connect Claude to Your Computer
This step is super important. Normally, Claude talks to Anthropic’s servers, but now you’ll make it talk to your computer instead. First, let Claude know where your computer is by setting the base URL.

Step 4: Start Claude and Try It Out
Now you can use Claude Code for real work. Go to any project folder on your computer and start Claude with the model you picked. For example:
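The thread's exact launch command is not preserved above. A plausible invocation, assuming the base URL redirect from the earlier step and the thread's qwen3-coder:30b model tag (the project path is illustrative):

```shell
# From inside a project folder, start Claude Code against the local model.
cd ~/my-project                                       # any project directory
export ANTHROPIC_BASE_URL="http://localhost:11434"    # route requests to Ollama
claude --model qwen3-coder:30b                        # pass the pulled Ollama model tag
```

From there, prompts like “make a hello world website” run entirely against local files and the local model.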