OpenAI MCP Integration Guide — Connect Any Tool to ChatGPT and the Agents SDK
OpenAI adopted the Model Context Protocol (MCP) in March 2025, making it possible to connect ChatGPT and the OpenAI API to thousands of external tools and data sources through a single standard. This guide covers everything you need to know about using MCP with OpenAI's products.
What is MCP?
MCP (Model Context Protocol) is an open standard — originally created by Anthropic and now governed by the Agentic AI Foundation under the Linux Foundation — that defines how AI models connect to external tools. Think of it as USB-C for AI: build one MCP server, and it works with ChatGPT, Claude, Gemini, Cursor, VS Code, and every other client that supports the protocol.
OpenAI's Three MCP Integration Points
OpenAI offers MCP support across three different surfaces:
1. ChatGPT Apps (End-User)
The simplest way to use MCP with OpenAI. Any ChatGPT user on a Team, Business, Enterprise, or Education plan can connect to MCP servers through the UI.
Setup:
- Go to Settings → Apps → Advanced settings
- Enable Developer Mode
- Add a connector by providing the server's public MCP endpoint URL
- ChatGPT automatically discovers available tools and uses them when relevant
OpenAI also maintains built-in connectors for Google Drive, Google Calendar, SharePoint, Dropbox, and Box. These require no server setup — just authorize and go.
2. Responses API (Developer)
The Responses API supports MCP as a first-class tool type. You can connect to any remote MCP server or use OpenAI's built-in connectors.
Remote MCP server:
```json
{
  "model": "gpt-4o",
  "tools": [{
    "type": "mcp",
    "server_label": "my-server",
    "server_url": "https://my-mcp-server.com/sse",
    "require_approval": "never"
  }],
  "input": "Search for open issues in our repo"
}
```
Built-in connector:
```json
{
  "tools": [{
    "type": "mcp",
    "connector_id": "gdrive",
    "require_approval": "never"
  }]
}
```
The API handles tool discovery automatically — it sends a `tools/list` request to the MCP server, presents the available tools to the model, and executes tool calls as needed.
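To make the remote-server request concrete, here is a sketch of how the same body could be assembled and sent from Python. The server label and URL are placeholders, and the actual API call (which needs a key and network access) is shown commented out:

```python
import json

def mcp_tool(server_label: str, server_url: str) -> dict:
    """Build an MCP tool entry for a Responses API request body."""
    return {
        "type": "mcp",
        "server_label": server_label,
        "server_url": server_url,
        "require_approval": "never",
    }

# Placeholder server label and URL.
request_body = {
    "model": "gpt-4o",
    "tools": [mcp_tool("my-server", "https://my-mcp-server.com/sse")],
    "input": "Search for open issues in our repo",
}
print(json.dumps(request_body, indent=2))

# With the official openai Python SDK, this body maps onto one call:
# from openai import OpenAI
# client = OpenAI()
# response = client.responses.create(**request_body)
```

Because the server handles execution, no tool-dispatch code is needed on your side; the response contains the final answer.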
3. OpenAI Agents SDK (Advanced)
The Agents SDK (available in Python and JavaScript) offers the deepest MCP integration with multiple transport types:
| Transport | Use Case |
|---|---|
| `HostedMCPTool` | OpenAI's cloud calls the MCP server for you |
| `MCPServerStreamableHttp` | Connect to HTTP-based MCP servers |
| `MCPServerSse` | Connect via Server-Sent Events |
| `MCPServerStdio` | Connect to local process-based servers |
Python example:
```python
from agents import Agent
from agents.mcp import MCPServerStreamableHttp

# The server object manages the HTTP connection and tool discovery.
mcp_server = MCPServerStreamableHttp(
    params={"url": "https://my-mcp-server.com/mcp"}
)

agent = Agent(
    name="my-agent",
    model="gpt-4o",
    mcp_servers=[mcp_server],
)
```
The SDK handles connection management, tool discovery, schema conversion, and error handling automatically.
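For local servers, the stdio transport only needs a command to launch. A minimal sketch, assuming the SDK's stdio transport accepts a params mapping with `command` and `args` (the server script name here is hypothetical):

```python
# Hypothetical local MCP server launched as a subprocess over stdio.
stdio_params = {
    "command": "python",           # executable that runs the server
    "args": ["my_mcp_server.py"],  # placeholder server script
}

# With the Agents SDK, this configuration would be passed as:
# from agents.mcp import MCPServerStdio
# mcp_server = MCPServerStdio(params=stdio_params)
```

This keeps the server entirely on your machine — useful for tools that touch the local filesystem.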
Authentication with OAuth 2.1
For MCP servers that require authentication, OpenAI supports OAuth 2.1 with PKCE. Your server needs to implement:
- A Protected Resource Metadata endpoint at `/.well-known/oauth-protected-resource`
- PKCE support in the authorization flow
- Resource parameter echoing in token responses
ChatGPT handles the OAuth flow in the UI — users just click "Authorize" when connecting to a protected server.
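As a sketch, the metadata document a protected server serves at `/.well-known/oauth-protected-resource` follows RFC 9728; the URLs and scope names below are placeholders:

```python
import json

# Placeholder Protected Resource Metadata (RFC 9728 fields).
metadata = {
    "resource": "https://my-mcp-server.com/mcp",
    "authorization_servers": ["https://auth.my-mcp-server.com"],
    "scopes_supported": ["mcp:read", "mcp:write"],
}
print(json.dumps(metadata, indent=2))
```

The `authorization_servers` entry tells the client where to start the OAuth flow for this resource.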
Best MCP Servers to Use with OpenAI
Here are the most popular MCP servers that work with all three OpenAI integration points:
- GitHub MCP — Repository management, issues, PRs, code search
- Playwright MCP — Browser automation and web scraping
- PostgreSQL MCP — Natural language database queries
- Firecrawl MCP — Web scraping and data extraction
- Slack MCP — Workspace messaging and channel management
- Docker MCP — Container management and orchestration
- Sentry MCP — Production error tracking and monitoring
- Linear MCP — Project management and issue tracking
Browse all available servers on the FastMCP Explore page.
Building Your Own MCP Server for OpenAI
The fastest way to build an MCP server is with FastMCP (Python):
```python
from fastmcp import FastMCP

mcp = FastMCP("my-server")

@mcp.tool
def search(query: str) -> list[dict]:
    """Search for documents matching the query."""
    # my_search_function stands in for your own data layer.
    return my_search_function(query)

@mcp.tool
def fetch(document_id: str) -> str:
    """Fetch a specific document by ID."""
    # my_fetch_function stands in for your own data layer.
    return my_fetch_function(document_id)
```
For ChatGPT apps specifically, OpenAI recommends implementing two key tools:
- `search` — Returns relevant results from your data source
- `fetch` — Retrieves specific document content
Deploy on Vercel, Cloudflare Workers, Supabase Edge Functions, or any hosting provider that can expose a public HTTPS endpoint.
MCP vs Custom Function Calling
Before MCP, the standard approach was OpenAI function calling — defining tools in each API request and handling execution in your backend. MCP is different:
| Aspect | Function Calling | MCP |
|---|---|---|
| Tool definitions | Defined per-request | Discovered automatically |
| Execution | Your backend handles it | Direct model-to-server |
| Portability | OpenAI-specific | Works with any MCP client |
| Setup | Code required | URL-based connection |
MCP doesn't replace function calling — they serve different purposes. Use MCP when you want portable, reusable tool integrations. Use function calling when you need tight control over tool execution in your application.
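For contrast with the URL-based MCP setup, a function-calling tool has to be declared explicitly in every request, and your backend must execute the call itself. A minimal sketch of such a definition — the function name and schema are illustrative, using the Chat Completions tool shape:

```python
import json

# Illustrative function-calling tool definition.
search_tool = {
    "type": "function",
    "function": {
        "name": "search_issues",
        "description": "Search open issues in the repository.",
        "parameters": {
            "type": "object",
            "properties": {
                "query": {"type": "string", "description": "Search terms"},
            },
            "required": ["query"],
        },
    },
}
print(json.dumps(search_tool, indent=2))
```

When the model emits a call to `search_issues`, your code parses the arguments, runs the search, and sends the result back in a follow-up message — the execution loop MCP handles for you.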
The Bigger Picture
OpenAI adopting MCP — a standard created by rival Anthropic — signals that the industry is converging on interoperability. With the Agentic AI Foundation (co-founded by Anthropic, OpenAI, and Block) now governing MCP under the Linux Foundation, and Google, Microsoft, and AWS as supporting members, MCP is becoming the universal standard for AI tool integration.
For developers, this means: build your MCP server once, and it works everywhere — ChatGPT, Claude, Gemini, Cursor, VS Code, and whatever comes next.