
Use Your Own LLM at Home

9 min read Advanced Last updated March 05, 2026

What This Guide Is About

You can run an AI model on your own computer and connect it to your ClawHosters instance. Your AI agent then uses your local hardware for inference instead of a cloud provider like OpenAI or Anthropic. The result: no per-token API costs and full privacy.

The trick is ZeroTier. It creates a secure, encrypted tunnel between your computer and your ClawHosters instance. Think of it like a private cable connecting the two machines, even though they might be on different continents. No port forwarding, no fiddling with your router, no exposing anything to the public internet.

This guide walks you through the entire process from scratch. You do not need any experience with servers, networking, or AI models. Just follow the steps.

What You Need

Before you start, make sure you have these three things:

  1. A ClawHosters instance. Any tier works (Budget, Balanced, or Pro). If you do not have one yet, create one first.
  2. A computer that can run a local LLM. At least 8 GB of RAM is recommended. 16 GB or more is better. A modern laptop or desktop from the last few years will do fine. Mac, Windows, or Linux all work.
  3. An internet connection. Both your computer and your ClawHosters instance need to be online for this to work.

That is it. Everything else in this guide is free.

Step 1: Create a ZeroTier Account

ZeroTier is a free service that creates virtual private networks. You need an account to set up the tunnel.

  1. Go to zerotier.com and click Sign Up
  2. Create a free account with your email
  3. Log in to ZeroTier Central
  4. Click Create A Network
  5. A new network appears with a 16-character Network ID (something like a1b2c3d4e5f60a8b)
  6. Write down or copy this Network ID. You will need it in the next steps.

The free plan supports up to 25 devices. That is more than enough for connecting your computer to your ClawHosters instance.
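A quick way to catch copy-paste mistakes before going further: a ZeroTier Network ID is always exactly 16 hexadecimal characters. This sketch checks that format, using the placeholder ID from the step above (substitute your own):

```shell
# Placeholder Network ID from the step above; substitute your own.
network_id="a1b2c3d4e5f60a8b"

# A valid ZeroTier Network ID is exactly 16 lowercase hex characters.
if echo "$network_id" | grep -Eq '^[0-9a-f]{16}$'; then
  echo "looks like a valid Network ID"
else
  echo "check the ID again: expected 16 hex characters"
fi
```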

Step 2: Install ZeroTier on Your Computer

Download the ZeroTier client for your operating system and join the network you just created.

On Mac:

  1. Download ZeroTier from zerotier.com/download
  2. Open the downloaded file and install it
  3. ZeroTier appears as a small icon in your menu bar (top right)
  4. Click the icon and select Join New Network...
  5. Paste your 16-character Network ID and click Join

On Windows:

  1. Download ZeroTier from zerotier.com/download
  2. Run the installer
  3. ZeroTier appears as a small icon in your system tray (bottom right)
  4. Right-click the icon and select Join New Network...
  5. Paste your 16-character Network ID and click Join

On Linux:

  1. Open a terminal
  2. Install ZeroTier: curl -s https://install.zerotier.com | sudo bash
  3. Join the network: sudo zerotier-cli join YOUR_NETWORK_ID

Authorize your machine:

After joining, go back to ZeroTier Central, open your network, and scroll down to the Members section. You will see your computer listed. Check the Auth checkbox next to it.

Once authorized, your computer gets a ZeroTier IP address (something like 10.147.x.x). Note this IP address. You will need it later when configuring BYOK.
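On Linux you can also read the assigned address from the command line instead of ZeroTier Central. This sketch parses a sample line of `sudo zerotier-cli listnetworks` output (the values here are made up); the assigned address is the last column, with a prefix length to strip:

```shell
# Made-up sample line from `sudo zerotier-cli listnetworks`;
# the last column is the assigned ZeroTier address (with prefix length).
sample='200 listnetworks a1b2c3d4e5f60a8b my-net aa:bb:cc:dd:ee:ff OK PRIVATE ztabcdef01 10.147.17.42/24'

# Take the last column and strip the /24 prefix to get the bare IP.
zt_ip=$(echo "$sample" | awk '{print $NF}' | cut -d/ -f1)
echo "$zt_ip"
```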

Step 3: Install a Local LLM

Now you need software that can run AI models on your computer. There are two popular options.

Option A: Ollama (Recommended)

Ollama is the simplest way to run local models. It works from the command line and is very easy to set up.

  1. Go to ollama.com and download the installer for your system
  2. Run the installer
  3. Open a terminal (Terminal on Mac, Command Prompt or PowerShell on Windows)
  4. Download a model. A good starting point: ollama pull llama3.2. If you want a smaller, faster model: ollama pull mistral

The download takes a few minutes depending on your internet speed. The model files are several gigabytes.

Option B: LM Studio

LM Studio gives you a graphical interface. Good if you prefer clicking over typing commands.

  1. Go to lmstudio.ai and download the app
  2. Open LM Studio
  3. Use the search bar to find a model (try "Llama 3.2" or "Mistral")
  4. Click the download button next to the model you want
  5. Wait for the download to finish

Step 4: Start Your Local LLM Server

Your local LLM needs to run as a server so your ClawHosters instance can send requests to it.

If you use Ollama:

Open a terminal and run:

OLLAMA_HOST=0.0.0.0 ollama serve

The 0.0.0.0 part is important. It tells Ollama to accept connections from other devices (like your ClawHosters instance over ZeroTier), not just from your own computer. Without this, Ollama only listens on localhost and your instance cannot reach it.

On Mac and Windows, Ollama usually starts automatically in the background after installation. If you get an "address already in use" error, quit the Ollama app first (click the Ollama icon in your menu bar or system tray and select Quit), then run the command above.

On Windows, set the environment variable first, then start the server. In Command Prompt: set OLLAMA_HOST=0.0.0.0, then ollama serve. In PowerShell: $env:OLLAMA_HOST="0.0.0.0", then ollama serve.

If you use LM Studio:

  1. Open LM Studio
  2. Click the Local Server tab on the left sidebar
  3. Select the model you downloaded
  4. Click Start Server
  5. LM Studio starts a server on port 1234 by default

Verify it works locally:

Before connecting your instance, check that the server responds. Open a new terminal and run:

For Ollama: curl http://localhost:11434/v1/models

For LM Studio: curl http://localhost:1234/v1/models

If you see a JSON response listing your model, the server is running correctly.
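If you are unsure what a healthy response looks like, this sketch shows the shape of the JSON a /v1/models endpoint typically returns (the body here is a made-up example) and how to pull out the model IDs with standard tools:

```shell
# Made-up example of a /v1/models response; real output depends on
# which models you have installed.
response='{"object":"list","data":[{"id":"llama3.2","object":"model"},{"id":"mistral","object":"model"}]}'

# Extract the "id" fields; if this prints nothing, the server is not
# returning a model list.
echo "$response" | grep -o '"id":"[^"]*"' | cut -d'"' -f4
```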

Step 5: Connect Your ClawHosters Instance

Now connect your ClawHosters instance to the same ZeroTier network.

  1. Go to your instance dashboard on ClawHosters
  2. Open the Access tab
  3. Find the ZeroTier section
  4. Enter your 16-character Network ID
  5. Click Join

Your instance sends a join request to ZeroTier. Now go to ZeroTier Central, open your network, and authorize the new member (your ClawHosters instance) by checking the Auth checkbox.

Back in the ClawHosters dashboard, click Refresh to check the status. Once authorized, it shows Connected along with the ZeroTier IP address of your instance.

For more details on this step, see the ZeroTier VPN documentation.

Step 6: Configure BYOK

Now tell your ClawHosters instance to use your local LLM instead of a cloud provider.

  1. Go to your instance dashboard on ClawHosters
  2. Open the Addons tab
  3. Switch to BYOK mode (Bring Your Own Key)
  4. Set the Endpoint to your local LLM server:

For Ollama: http://YOUR_ZEROTIER_IP:11434/v1

For LM Studio: http://YOUR_ZEROTIER_IP:1234/v1

Replace YOUR_ZEROTIER_IP with the ZeroTier IP address of your computer (the one you noted in Step 2, something like 10.147.x.x).
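Putting that together, this sketch builds the endpoint string from a ZeroTier IP. The IP here is a placeholder; the ports are Ollama's and LM Studio's defaults:

```shell
# Placeholder ZeroTier IP from Step 2; substitute your own.
zt_ip="10.147.17.42"

# Default ports: 11434 for Ollama, 1234 for LM Studio.
ollama_endpoint="http://${zt_ip}:11434/v1"
lmstudio_endpoint="http://${zt_ip}:1234/v1"

echo "$ollama_endpoint"
echo "$lmstudio_endpoint"
```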

  5. Set the Model name to the model you downloaded:

    • For Ollama with Llama: llama3.2
    • For Ollama with Mistral: mistral
    • For LM Studio: check the model name in the Local Server tab
  6. Leave the API Key field empty. Local servers do not require one.
  7. Click Save

BYOK mode itself is free. There are no extra charges for using your own model.

Step 7: Test It

Time to see if everything works.

  1. Send a message to your AI agent through whatever channel you have set up (Telegram, WhatsApp, Discord, or the web chat)
  2. Wait for a response. Local models can be slower than cloud APIs, especially on less powerful hardware. Give it a moment.
  3. Check the Logs tab in your instance dashboard to see the request being processed

If you get a response back, congratulations. Your ClawHosters instance is now using your own local LLM through ZeroTier. No cloud API costs, full privacy, all running on your hardware.

Troubleshooting

No response from your AI agent:

  • Check that your local LLM server is still running (the terminal window should show activity)
  • Check the Logs tab in your ClawHosters dashboard for error messages
  • Make sure your computer is still connected to the ZeroTier network

"Connection refused" or timeout errors:

  • If using Ollama, make sure you started it with OLLAMA_HOST=0.0.0.0. Without this, Ollama rejects connections from other devices.
  • Double-check the ZeroTier IP address in your BYOK configuration. It should be your computer's ZeroTier IP, not your regular local IP.
  • Make sure both your computer and your instance show as "Connected" in ZeroTier Central

"Model not found" errors:

  • Verify you downloaded the model. Run ollama list to see installed models.
  • Check that the model name in your BYOK configuration matches exactly. For example, llama3.2 not Llama-3.2.
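One gotcha worth knowing: `ollama list` shows names with a tag suffix (such as `:latest`), while the bare name is what you usually enter in the Model field (Ollama resolves it to `:latest` itself). This sketch strips the tag from a made-up `ollama list` line:

```shell
# Made-up line from `ollama list`; the NAME column carries a :tag suffix.
sample='llama3.2:latest  a80c4f17acd5  2.0 GB  2 days ago'

# Take the first column and drop the tag to get the bare model name.
model=$(echo "$sample" | awk '{print $1}' | cut -d: -f1)
echo "$model"
```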

Slow responses:

  • Local models depend on your hardware. If responses are very slow, try a smaller model (e.g. mistral instead of a large 70B model)
  • Close other resource-heavy applications on your computer
  • Make sure your computer is not running on battery saver mode

LM Studio server not reachable:

  • In LM Studio, go to the Local Server tab and check that "Accept connections from other devices" or "Serve on 0.0.0.0" is enabled
  • Make sure the server is actually started (green indicator)

ZeroTier shows "Pending" or "Offline":

  • Go to ZeroTier Central and check that both members are authorized
  • On your computer, try leaving and re-joining the network
  • On your ClawHosters instance, click Disconnect and then re-join

Good to Know

ZeroTier is free for up to 25 devices. You only need 2 devices for this setup (your computer and your instance), so the free tier is plenty.

All traffic is encrypted. Data between your computer and your ClawHosters instance travels through an encrypted tunnel. Nobody can see your AI conversations in transit.

Your computer must be on. Since the LLM runs on your machine, it needs to be powered on and connected to the internet. If you shut down your computer or close your laptop, your AI agent stops responding until you start it again.

No extra cost for BYOK. Using your own model through BYOK does not cost any Claws. Your only cost is your ClawHosters instance subscription.

You can switch back anytime. If you want to go back to a cloud LLM provider, just change the BYOK settings back in the Addons tab. Your instance keeps working either way.

For more details on ZeroTier networking (disconnecting, firewall details, peer-to-peer behavior), see the full ZeroTier VPN documentation.

Related Documentation