Consult LLM MCP escalates complex reasoning tasks to more powerful language models (OpenAI o3, Google Gemini 2.5 Pro, DeepSeek Reasoner) by forwarding markdown prompts with code context and git diffs, and returns responses with detailed cost tracking.
Obtain your API keys for the supported models:
OpenAI API Key (for the o3 model):
- Go to platform.openai.com/account/api-keys.
- Log in with your OpenAI account.
- Click "Create new secret key".
- Copy the generated API key.
Gemini API Key (for the Gemini 2.5 Pro model):
- Go to ai.google.dev.
- Log in with your Google account.
- Click "Get API key" or "Get started" (you may need to create a Google Cloud project).
- Follow the prompts to enable the Gemini API for your project.
- Copy your Gemini API key from the console.
DeepSeek API Key (for the DeepSeek Reasoner model):
- Go to platform.deepseek.com/api-keys.
- Log in or create a DeepSeek account.
- Click "Create API Key", provide a name, and confirm.
- Copy the generated API key.
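(Optional) Before moving on, you can sanity-check a key by listing the provider's models. The sketch below is illustrative only: it assumes Node.js 18+ (built-in fetch) and uses OpenAI's public GET /v1/models endpoint; Gemini and DeepSeek expose analogous model-listing routes, so consult each provider's API docs for the exact URLs.

```typescript
// check-key.ts — quick sanity check for an OpenAI key by listing models.
// Assumes Node.js 18+ (global fetch). This is a convenience sketch, not
// part of Consult LLM MCP itself.
async function checkOpenAIKey(): Promise<void> {
  const key = process.env.OPENAI_API_KEY;
  if (!key) {
    console.error("Set OPENAI_API_KEY before running this check.");
    process.exit(1);
  }
  const res = await fetch("https://api.openai.com/v1/models", {
    headers: { Authorization: `Bearer ${key}` },
  });
  if (res.ok) {
    const body = (await res.json()) as { data: unknown[] };
    console.log(`Key accepted; ${body.data.length} models visible.`);
  } else {
    console.error(`Key rejected: HTTP ${res.status}`);
  }
}

checkOpenAIKey().catch((err) => {
  console.error(err);
  process.exit(1);
});
```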
Open the FastMCP connection interface.
Click the "Install Now" button for the Consult LLM MCP server.
In the connection interface, fill in the required environment variables:
- For OpenAI: set OPENAI_API_KEY to the value you copied from the OpenAI website.
- For Gemini: set GEMINI_API_KEY to the value you copied from Google AI.
- For DeepSeek: set DEEPSEEK_API_KEY to the value you copied from DeepSeek.
(You can supply any or all keys, depending on which model(s) you wish to use.)
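For reference, here is a minimal sketch of the key-to-model mapping this setup implies. Only the environment variable and model names come from the configuration above; the function and lookup table are hypothetical and do not reflect the server's actual internals.

```typescript
// Hypothetical sketch: which models become callable given the keys you set.
// Only the env var names and model identifiers come from the setup steps above.
type ModelName = "o3" | "gemini-2.5-pro" | "deepseek-reasoner";

const KEY_FOR_MODEL: Record<ModelName, string> = {
  "o3": "OPENAI_API_KEY",
  "gemini-2.5-pro": "GEMINI_API_KEY",
  "deepseek-reasoner": "DEEPSEEK_API_KEY",
};

// Returns the models you can actually use with the keys you supplied.
function availableModels(env: NodeJS.ProcessEnv = process.env): ModelName[] {
  return (Object.keys(KEY_FOR_MODEL) as ModelName[]).filter(
    (model) => Boolean(env[KEY_FOR_MODEL[model]])
  );
}

console.log("Usable models:", availableModels());
```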
(Optional) Set the default model:
- If you want to override the default model, fill in CONSULT_LLM_DEFAULT_MODEL with one of: o3, gemini-2.5-pro, or deepseek-reasoner.
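The sketch below shows how such an override can be validated against the three supported values. The fallback to o3 when the variable is unset is an assumption made for illustration; the server's own documentation defines its real default.

```typescript
// Illustrative validation of CONSULT_LLM_DEFAULT_MODEL against the three
// supported values. Falling back to "o3" is an assumption for this sketch.
const SUPPORTED_MODELS = ["o3", "gemini-2.5-pro", "deepseek-reasoner"] as const;
type SupportedModel = (typeof SUPPORTED_MODELS)[number];

function resolveDefaultModel(raw: string | undefined): SupportedModel {
  if (!raw) return "o3"; // assumed fallback, not confirmed by the server docs
  if ((SUPPORTED_MODELS as readonly string[]).includes(raw)) {
    return raw as SupportedModel;
  }
  throw new Error(
    `CONSULT_LLM_DEFAULT_MODEL must be one of: ${SUPPORTED_MODELS.join(", ")}`
  );
}

console.log("Default model:", resolveDefaultModel(process.env.CONSULT_LLM_DEFAULT_MODEL));
```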
Save the configuration.
You are now ready to use Consult LLM MCP with your chosen models!