OpenAI MCP Integration Guide — Connect Any Tool to ChatGPT and the Agents SDK

OpenAI adopted the Model Context Protocol (MCP) in March 2025, making it possible to connect ChatGPT and the OpenAI API to thousands of external tools and data sources through a single standard. This guide covers everything you need to know about using MCP with OpenAI's products.

What is MCP?

MCP (Model Context Protocol) is an open standard — originally created by Anthropic and now governed by the Agentic AI Foundation under the Linux Foundation — that defines how AI models connect to external tools. Think of it as USB-C for AI: build one MCP server, and it works with ChatGPT, Claude, Gemini, Cursor, VS Code, and every other client that supports the protocol.

OpenAI's Three MCP Integration Points

OpenAI offers MCP support across three different surfaces:

1. ChatGPT Apps (End-User)

The simplest way to use MCP with OpenAI. Any ChatGPT user on a Team, Business, Enterprise, or Education plan can connect to MCP servers through the UI.

Setup:

  1. Go to Settings → Apps → Advanced settings
  2. Enable Developer Mode
  3. Add a connector by providing the server's public MCP endpoint URL
  4. ChatGPT automatically discovers available tools and uses them when relevant

OpenAI also maintains built-in connectors for Google Drive, Google Calendar, SharePoint, Dropbox, and Box. These require no server setup — just authorize and go.

2. Responses API (Developer)

The Responses API supports MCP as a first-class tool type. You can connect to any remote MCP server or use OpenAI's built-in connectors.

Remote MCP server:

{
  "model": "gpt-4o",
  "tools": [{
    "type": "mcp",
    "server_label": "my-server",
    "server_url": "https://my-mcp-server.com/sse",
    "require_approval": "never"
  }],
  "input": "Search for open issues in our repo"
}

Built-in connector:

{
  "tools": [{
    "type": "mcp",
    "connector_id": "gdrive",
    "require_approval": "never"
  }]
}

The API handles tool discovery automatically — it calls the server's tools/list method, presents the available tools to the model, and executes tool calls as needed.
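The two JSON bodies above can be assembled in Python before being passed to the official openai SDK. The helper below is a sketch under stated assumptions: the server label and URL are placeholders for your own deployment, and the commented-out SDK call requires an OPENAI_API_KEY.

```python
def mcp_tool(server_label=None, server_url=None, connector_id=None,
             require_approval="never"):
    """Build a Responses API `tools` entry for either a remote MCP
    server (server_label + server_url) or a built-in connector
    (connector_id)."""
    tool = {"type": "mcp", "require_approval": require_approval}
    if connector_id:
        tool["connector_id"] = connector_id
    else:
        tool["server_label"] = server_label
        tool["server_url"] = server_url
    return tool

# Remote server and built-in connector, matching the two JSON bodies above.
# The server URL is a placeholder for your own endpoint.
remote = mcp_tool(server_label="my-server",
                  server_url="https://my-mcp-server.com/sse")
gdrive = mcp_tool(connector_id="gdrive")

# With the official SDK this becomes (requires OPENAI_API_KEY):
#   from openai import OpenAI
#   response = OpenAI().responses.create(
#       model="gpt-4o", tools=[remote, gdrive],
#       input="Search for open issues in our repo")
#   print(response.output_text)
```

Keeping tool entries as plain dicts like this makes it easy to reuse the same configuration across requests or swap servers per environment.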

3. OpenAI Agents SDK (Advanced)

The Agents SDK (available in Python and JavaScript) offers the deepest MCP integration with multiple transport types:

Transport                   Use Case
HostedMCPTool               OpenAI's cloud calls the MCP server for you
MCPServerStreamableHttp     Connect to HTTP-based MCP servers
MCPServerSse                Connect via Server-Sent Events
MCPServerStdio              Connect to local process-based servers

Python example:

import asyncio

from agents import Agent, Runner
from agents.mcp import MCPServerStreamableHttp

async def main():
    # Server parameters are passed as a dict (MCPServerStreamableHttpParams);
    # the context manager opens and closes the connection for you
    async with MCPServerStreamableHttp(
        params={"url": "https://my-mcp-server.com/mcp"}
    ) as mcp_server:
        agent = Agent(
            name="my-agent",
            model="gpt-4o",
            mcp_servers=[mcp_server],
        )
        result = await Runner.run(agent, "Search for open issues in our repo")
        print(result.final_output)

asyncio.run(main())

The SDK handles connection management, tool discovery, schema conversion, and error handling automatically.

Authentication with OAuth 2.1

For MCP servers that require authentication, OpenAI supports OAuth 2.1 with PKCE. Your server needs to implement:

  1. A Protected Resource Metadata endpoint at /.well-known/oauth-protected-resource
  2. PKCE support in the authorization flow
  3. Resource parameter echoing in token responses

ChatGPT handles the OAuth flow in the UI — users just click "Authorize" when connecting to a protected server.
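As a rough illustration of step 1, the metadata document served at /.well-known/oauth-protected-resource follows RFC 9728. The sketch below uses hypothetical URLs; the exact fields your server needs depend on its authorization server.

```python
import json

# Sketch of an RFC 9728 Protected Resource Metadata document.
# Both URLs are placeholders for your own deployment.
metadata = {
    # The canonical URL of the MCP server being protected
    "resource": "https://my-mcp-server.com/mcp",
    # Authorization server(s) that can issue tokens for this resource
    "authorization_servers": ["https://auth.my-company.com"],
    # Bearer tokens arrive in the Authorization header
    "bearer_methods_supported": ["header"],
}

# Serve this JSON (with Content-Type: application/json) at
# /.well-known/oauth-protected-resource
document = json.dumps(metadata, indent=2)
```

Clients use this document to discover which authorization server to start the PKCE flow against, so the URLs must match what your OAuth provider actually issues tokens for.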

Best MCP Servers to Use with OpenAI

Here are the most popular MCP servers that work with all three OpenAI integration points:

  • GitHub MCP (official, remote) — Repository management, issues, PRs, code search
  • Playwright MCP — Browser automation and web scraping
  • PostgreSQL MCP (remote) — Natural language database queries
  • Firecrawl MCP (official) — Web scraping and data extraction
  • Slack MCP — Workspace messaging and channel management
  • Docker MCP — Container management and orchestration
  • Sentry MCP (official, remote) — Production error tracking and monitoring
  • Linear MCP (official, remote) — Project management and issue tracking

Browse all available servers on the FastMCP Explore page.

Building Your Own MCP Server for OpenAI

The fastest way to build an MCP server is with FastMCP (Python):

from fastmcp import FastMCP

mcp = FastMCP("my-server")

@mcp.tool
def search(query: str) -> list[dict]:
    """Search for documents matching the query."""
    # my_search_function is a placeholder for your own data-source logic
    return my_search_function(query)

@mcp.tool
def fetch(document_id: str) -> str:
    """Fetch a specific document by ID."""
    # my_fetch_function is likewise a placeholder
    return my_fetch_function(document_id)

if __name__ == "__main__":
    # Expose the server over HTTP so remote clients can reach it
    mcp.run(transport="http")

For ChatGPT apps specifically, OpenAI recommends implementing two key tools:

  • search — Returns relevant results from your data source
  • fetch — Retrieves specific document content

Deploy on Vercel, Cloudflare Workers, Supabase Edge Functions, or any hosting provider that can expose a public HTTPS endpoint.

MCP vs Custom Function Calling

Before MCP, the standard approach was OpenAI function calling — defining tools in each API request and handling execution in your backend. MCP is different:

Aspect              Function Calling            MCP
Tool definitions    Defined per-request         Discovered automatically
Execution           Your backend handles it     Direct model-to-server
Portability         OpenAI-specific             Works with any MCP client
Setup               Code required               URL-based connection

MCP doesn't replace function calling — they serve different purposes. Use MCP when you want portable, reusable tool integrations. Use function calling when you need tight control over tool execution in your application.
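To make the contrast concrete, here is a sketch of the same capability expressed both ways (the tool name, schema, and URL are illustrative, not from any real API): function calling requires a full JSON Schema in every request plus your own dispatch loop, while the MCP entry is just a pointer to a server that describes itself.

```python
# Function calling: you define the schema in every request and
# execute the tool in your own backend when the model asks for it.
function_tool = {
    "type": "function",
    "name": "search_issues",
    "description": "Search open issues in the repository",
    "parameters": {
        "type": "object",
        "properties": {"query": {"type": "string"}},
        "required": ["query"],
    },
}

# MCP: the server advertises its own tools; you only point at it.
# (Label and URL are placeholders.)
mcp_tool = {
    "type": "mcp",
    "server_label": "my-server",
    "server_url": "https://my-mcp-server.com/mcp",
    "require_approval": "never",
}

# With function calling, execution also falls on you: when a response
# contains a function call, you run it and send the result back in a
# follow-up request. With MCP, the API performs that round trip itself.
```

The trade-off in the table above falls out of this directly: the function-calling dict buys you full control over execution at the cost of per-request boilerplate, while the MCP dict delegates both discovery and execution.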

The Bigger Picture

OpenAI adopting MCP — a standard created by rival Anthropic — signals that the industry is converging on interoperability. With the Agentic AI Foundation (co-founded by Anthropic, OpenAI, and Block) now governing MCP under the Linux Foundation, and Google, Microsoft, and AWS as supporting members, MCP is becoming the universal standard for AI tool integration.

For developers, this means: build your MCP server once, and it works everywhere — ChatGPT, Claude, Gemini, Cursor, VS Code, and whatever comes next.
