ComputeGauge MCP
Official. Provides cost intelligence and a reputation scoring system to help AI agents optimize spending.
Open the FastMCP connection interface and click “Install Now”
- In the FastMCP UI, click the “Install Now” button for the @computegauge/mcp integration.
- When the install form opens, paste the environment variable names and values described below into the matching FastMCP connection fields.
COMPUTEGAUGE_API_KEY (optional) — create & copy
- Sign up or log in to the ComputeGauge dashboard (ComputeGauge → Dashboard or Account area).
- In the dashboard look for “API Keys”, “Access Tokens”, or “Integrations” and create a new API key.
- Copy the key immediately and paste it into the FastMCP field named COMPUTEGAUGE_API_KEY.
- Note: the README lists COMPUTEGAUGE_API_KEY and COMPUTEGAUGE_DASHBOARD_URL as optional environment variables.
ANTHROPIC_API_KEY (optional) — create & copy
- Sign in to your Anthropic Console (Anthropic account).
- Navigate to the API Keys (or Admin API → API Keys) section.
- Create a new API key (give it a name), copy it immediately, and paste it into the FastMCP field named ANTHROPIC_API_KEY. (docs.anthropic.com)
OPENAI_API_KEY (optional) — create & copy
- Sign in to the OpenAI (platform) dashboard.
- Go to the API keys / developer keys area and create a new secret key.
- Copy the key immediately and paste it into the FastMCP field named OPENAI_API_KEY (or set it locally as export OPENAI_API_KEY="…"). (platform.openai.com)
GOOGLE_API_KEY (optional) — create & copy
- Open Google Cloud Console for the project that will be billed.
- Go to APIs & Services → Credentials → Create credentials → API key.
- After creation, restrict the key to the specific APIs and/or HTTP referrers you need, then copy it into the FastMCP field named GOOGLE_API_KEY. (docs.cloud.google.com)
OLLAMA_HOST / OLLAMA_MODELS (optional) — configure for local inference
- If you run Ollama locally, the default HTTP host is http://localhost:11434. If you expose Ollama elsewhere, set OLLAMA_HOST to that URL.
- Set OLLAMA_MODELS to a comma-separated list of model identifiers you want ComputeGauge to detect, e.g.:
- OLLAMA_MODELS="llama3.3:70b,qwen2.5:7b,deepseek-r1:14b"
- If you use Ollama Cloud instead of local, follow Ollama’s docs to obtain the Ollama cloud API key and endpoint and set appropriate ENV values. (docs.ollama.com)
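As a sketch, a local Ollama setup might export these values (the host is Ollama's default; the model list is an example, not a requirement) and split the comma-separated list as a quick sanity check:

```shell
# Example values only; adjust the host and model list to your setup.
export OLLAMA_HOST="http://localhost:11434"      # Ollama's default local endpoint
export OLLAMA_MODELS="llama3.3:70b,qwen2.5:7b,deepseek-r1:14b"

# Print each configured model on its own line as a quick sanity check.
echo "$OLLAMA_MODELS" | tr ',' '\n'
```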
COMPUTEGAUGE_BUDGET_TOTAL and per-provider budget ENVs (optional)
- Decide your session budget in USD and enter it in the FastMCP field COMPUTEGAUGE_BUDGET_TOTAL (e.g., "50").
- Optionally set per-provider monthly limits like COMPUTEGAUGE_BUDGET_ANTHROPIC or COMPUTEGAUGE_BUDGET_OPENAI if you want granular caps.
Other local provider ENVs (optional)
- If you have other local endpoints, set VLLM_HOST, LOCALAI_HOST, LLAMACPP_HOST, TGI_HOST, or CUSTOM_LOCAL_LLM_ENDPOINT values in the FastMCP form as needed.
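Pulled together, a local .env-style sketch covering the optional variables above might look like this. Every value is a placeholder (the key formats are illustrative, not real); set only the variables that apply to your setup:

```shell
# All values below are placeholders; omit any variable you don't use.
export COMPUTEGAUGE_API_KEY="cg_xxxxxxxx"        # from the ComputeGauge dashboard
export COMPUTEGAUGE_BUDGET_TOTAL="50"            # session budget in USD
export COMPUTEGAUGE_BUDGET_OPENAI="20"           # optional per-provider cap
export ANTHROPIC_API_KEY="sk-ant-xxxxxxxx"       # Anthropic Console
export OPENAI_API_KEY="sk-xxxxxxxx"              # OpenAI platform dashboard
export GOOGLE_API_KEY="AIzaxxxxxxxx"             # Google Cloud Console
export OLLAMA_HOST="http://localhost:11434"      # only if running Ollama locally
export VLLM_HOST="http://localhost:8000"         # only if running vLLM locally
```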
Security & storage reminders
- Copy secret keys immediately when shown; some dashboards (OpenAI/Anthropic) display secrets only once—store them securely (password manager or encrypted vault). (platform.openai.com)
- Add API key restrictions in provider consoles (where available) to reduce risk (OpenAI/Google/Anthropic support restrictions or scoped keys).
Save, install, and restart
After pasting each ENV value into the FastMCP connection form, click Save / Install to apply.
Restart your agent (e.g., Claude Desktop/Code or your agent runtime) so the new environment variables are picked up by the @computegauge/mcp server integration (the README shows restarting Claude after adding the MCP server entry).
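For a local (non-hosted) install, the MCP server entry in Claude Desktop's claude_desktop_config.json typically follows the standard mcpServers shape shown below. The npx command, arguments, and env values here are assumptions for illustration; check the ComputeGauge README for the exact entry:

```json
{
  "mcpServers": {
    "computegauge": {
      "command": "npx",
      "args": ["-y", "@computegauge/mcp"],
      "env": {
        "COMPUTEGAUGE_API_KEY": "cg_xxxxxxxx",
        "COMPUTEGAUGE_BUDGET_TOTAL": "50"
      }
    }
  }
}
```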
Verify
Use the ComputeGauge quickstart endpoints or the agent’s first session to call pick_model or session_cost and confirm providers are detected and the dashboard/credentials work.
If local models aren’t detected, double-check OLLAMA_HOST and OLLAMA_MODELS values and that the local server is reachable.
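The local-model check above can be sketched from a terminal (assumes curl is available; /api/tags is Ollama's standard model-listing endpoint):

```shell
# Probe the Ollama endpoint and report a human-readable status either way.
if curl -fsS --max-time 3 "${OLLAMA_HOST:-http://localhost:11434}/api/tags" >/dev/null 2>&1; then
  echo "Ollama reachable"
else
  echo "Ollama not reachable: check OLLAMA_HOST and that the server is running"
fi
```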