
How to run powerful AI agents for free

After burning through my weekly token quota a few times recently, I decided to check out the free tier Nvidia offers for open-source models. The list includes GLM 5.1 (the "Chinese Opus" I mentioned before), DeepSeek, the Gemma family, and many more.

The setup is pretty fast: you open a free account, generate an API Key, and hook it up to Opencode or Claude Code.

What's the catch?

The response time is significantly slower than the paid frontier APIs. For interactive coding, that might mean more coffee breaks than you'd like, but it's an ideal solution for agents that operate asynchronously in the background.


How to set everything up?

First, go to build.nvidia.com, select a model, and click Get API Key. Their standard API endpoint is: https://integrate.api.nvidia.com/v1
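Before wiring up an agent, it's worth checking that the key actually works with a plain HTTP request. The snippet below is a sketch assuming the endpoint follows the usual OpenAI-compatible chat completions route; the guard around `curl` is only there so it's safe to paste with the placeholder key still in place.

```shell
# Placeholder: paste the key you generated on build.nvidia.com
export NVIDIA_API_KEY="YOUR_API_KEY"
NVIDIA_BASE_URL="https://integrate.api.nvidia.com/v1"

# Minimal request body (model ID as listed on build.nvidia.com)
REQUEST_BODY='{"model": "zai-org/glm-5", "messages": [{"role": "user", "content": "Say hello"}]}'

# Only call out once a real key has been supplied
if [ "${NVIDIA_API_KEY}" != "YOUR_API_KEY" ]; then
  curl -s "${NVIDIA_BASE_URL}/chat/completions" \
    -H "Authorization: Bearer ${NVIDIA_API_KEY}" \
    -H "Content-Type: application/json" \
    -d "${REQUEST_BODY}"
fi
```

If the key is valid, you get back a normal chat completion JSON; if not, an error object telling you why.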

1. Connecting to Opencode

Connecting Opencode is super straightforward using its built-in slash command:

  1. Open the tool and run the /connect command.
  2. Search for and select Nvidia from the provider list.
  3. Enter your generated API Key.
  4. Run /models and select your desired model (e.g., zai-org/glm-5). That's it!
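If you prefer a declarative setup over the interactive flow, Opencode can also pick up the provider from a JSON config file. The snippet below is only a sketch of what that might look like, assuming Opencode's custom-provider schema with the `@ai-sdk/openai-compatible` adapter and your key in an `NVIDIA_API_KEY` environment variable; verify the exact fields against Opencode's docs.

```shell
# Write a project-local opencode.json (schema assumed; check the docs)
cat > opencode.json <<'EOF'
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "nvidia": {
      "npm": "@ai-sdk/openai-compatible",
      "options": {
        "baseURL": "https://integrate.api.nvidia.com/v1",
        "apiKey": "{env:NVIDIA_API_KEY}"
      },
      "models": {
        "zai-org/glm-5": {}
      }
    }
  }
}
EOF
```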

2. Connecting to Claude Code

To connect Anthropic's agent (Claude Code) to Nvidia, you only need a few environment variables. Nvidia's API is Anthropic-compatible (NIM supports /v1/messages), so just export these in your terminal before launching it:

# The model ID as it appears on Nvidia (e.g., for GLM 5.1)
export MODEL_NAME="zai-org/glm-5"

export ANTHROPIC_API_KEY="YOUR_API_KEY"
export ANTHROPIC_BASE_URL="https://integrate.api.nvidia.com/v1"

# Set the model for Claude Code to use by default
export ANTHROPIC_MODEL="${MODEL_NAME}"
export ANTHROPIC_DEFAULT_HAIKU_MODEL="${MODEL_NAME}"
export ANTHROPIC_DEFAULT_OPUS_MODEL="${MODEL_NAME}"
export ANTHROPIC_DEFAULT_SONNET_MODEL="${MODEL_NAME}"
export CLAUDE_CODE_SUBAGENT_MODEL="${MODEL_NAME}"

# Run the agent as usual
claude
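One caveat: these exports stick around for the rest of your terminal session, which is annoying if you also use Claude Code with your regular Anthropic account. A small shell function keeps them scoped to a single run (a sketch; `claude_nvidia` is a made-up name, and it expects your key in `NVIDIA_API_KEY`):

```shell
# Hypothetical helper: the subshell body (parentheses) keeps the exports
# scoped to one invocation instead of the whole terminal session.
claude_nvidia() (
  MODEL_NAME="zai-org/glm-5"

  export ANTHROPIC_API_KEY="${NVIDIA_API_KEY}"
  export ANTHROPIC_BASE_URL="https://integrate.api.nvidia.com/v1"
  export ANTHROPIC_MODEL="${MODEL_NAME}"
  export ANTHROPIC_DEFAULT_HAIKU_MODEL="${MODEL_NAME}"
  export ANTHROPIC_DEFAULT_OPUS_MODEL="${MODEL_NAME}"
  export ANTHROPIC_DEFAULT_SONNET_MODEL="${MODEL_NAME}"
  export CLAUDE_CODE_SUBAGENT_MODEL="${MODEL_NAME}"

  claude "$@"
)
```

Drop it in your shell rc file, and `claude_nvidia` behaves like `claude`, just pointed at Nvidia.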

Links & Resources

For those who want to dive deeper, here are the official links: