Better Qdrant
Connects AI systems to the Qdrant vector database for semantic search through multiple embedding providers (OpenAI, OpenRouter, or a local Ollama instance).
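This is not the server's own code, but a minimal sketch of the flow it automates, assuming the standard Qdrant REST API and the OpenAI embeddings endpoint; the collection name `documents` and the model name are placeholders.

```typescript
// Minimal semantic-search flow: embed a query, then find the nearest vectors in Qdrant.
// Placeholder values: collection "documents", model "text-embedding-3-small".
const QDRANT_URL = process.env.QDRANT_URL ?? "http://localhost:6333";
const OPENAI_API_KEY = process.env.OPENAI_API_KEY!;

async function embed(text: string): Promise<number[]> {
  const res = await fetch("https://api.openai.com/v1/embeddings", {
    method: "POST",
    headers: { Authorization: `Bearer ${OPENAI_API_KEY}`, "Content-Type": "application/json" },
    body: JSON.stringify({ model: "text-embedding-3-small", input: text }),
  });
  const body: any = await res.json();
  return body.data[0].embedding;
}

async function semanticSearch(query: string) {
  const vector = await embed(query);
  // Qdrant nearest-neighbor search over the "documents" collection.
  const res = await fetch(`${QDRANT_URL}/collections/documents/points/search`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ vector, limit: 5, with_payload: true }),
  });
  const body: any = await res.json();
  return body.result; // [{ id, score, payload }, ...]
}

semanticSearch("how do I reset my password?").then(console.log);
```

To use the server itself you only need the configuration values below; the sketch is just to show what "semantic search via embeddings" means in practice.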
- Obtain your Qdrant server URL and (if required) API key
- If you are using a local Qdrant server, the URL is typically http://localhost:6333.
- For a remote Qdrant or managed cloud service, log in to your Qdrant host/provider and locate the HTTP endpoint and API key (if your deployment requires it).
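To confirm the endpoint (and key, if any) is reachable before installing, a minimal check against the standard Qdrant REST API might look like this sketch; the environment variable names mirror the configuration described below.

```typescript
// Reachability check for a Qdrant instance; adjust QDRANT_URL / QDRANT_API_KEY as needed.
const QDRANT_URL = process.env.QDRANT_URL ?? "http://localhost:6333";
const QDRANT_API_KEY = process.env.QDRANT_API_KEY; // often unset for local instances

async function checkQdrant(): Promise<void> {
  const headers: Record<string, string> = {};
  if (QDRANT_API_KEY) headers["api-key"] = QDRANT_API_KEY; // Qdrant reads the key from the "api-key" header
  const res = await fetch(`${QDRANT_URL}/collections`, { headers });
  if (!res.ok) throw new Error(`Qdrant responded with HTTP ${res.status}`);
  const body: any = await res.json();
  console.log("Qdrant is reachable; collections:", body.result?.collections ?? []);
}

checkQdrant().catch((err) => console.error("Qdrant check failed:", err));
```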
- (If using OpenAI for embeddings) Get your OpenAI API key
- Go to https://platform.openai.com/api-keys.
- Log in with your OpenAI account.
- Click "Create new secret key" and copy the token.
- (If using OpenRouter for embeddings) Get your OpenRouter API key
- Go to https://openrouter.ai/keys.
- Log in with your OpenRouter account.
- Click "Create Key" and copy the token.
- (If using a local Ollama instance for embeddings)
- Ensure Ollama is installed and running locally. By default, the endpoint is http://localhost:11434.
- No API key is needed for local Ollama.
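To confirm Ollama is up and can serve embeddings, a small sketch against its local HTTP API follows; `nomic-embed-text` is an example model that you would need to pull first (`ollama pull nomic-embed-text`).

```typescript
// Verify a local Ollama instance is reachable and can produce embeddings.
const OLLAMA_ENDPOINT = process.env.OLLAMA_ENDPOINT ?? "http://localhost:11434";

async function checkOllama(): Promise<void> {
  // List installed models to confirm the server is running.
  const tags = await fetch(`${OLLAMA_ENDPOINT}/api/tags`);
  if (!tags.ok) throw new Error(`Ollama responded with HTTP ${tags.status}`);

  // Request a single embedding from an example model.
  const res = await fetch(`${OLLAMA_ENDPOINT}/api/embeddings`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model: "nomic-embed-text", prompt: "hello world" }),
  });
  if (!res.ok) throw new Error(`Embedding request failed with HTTP ${res.status}`);
  const body: any = await res.json();
  console.log("Embedding dimension:", body.embedding.length);
}

checkOllama().catch((err) => console.error("Ollama check failed:", err));
```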
- Fill in the FastMCP connection interface
- Click the "Install Now" button for the Better Qdrant integration.
- In the FastMCP interface, provide the values you collected:
- QDRANT_URL (e.g., http://localhost:6333)
- QDRANT_API_KEY (if required by your Qdrant instance; otherwise, leave empty)
- OPENAI_API_KEY (if using OpenAI embeddings)
- OPENROUTER_API_KEY (if using OpenRouter embeddings)
- OLLAMA_ENDPOINT (if using a local Ollama server; defaults to http://localhost:11434)
- Save the configuration
- Click "Save" or "Apply" in the FastMCP interface to store your environment variable values.
Note: Only the API keys for services you actually use are required; others may be left empty.
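For reference, here is a hypothetical sketch of how a server like Better Qdrant might read these variables; the actual implementation may differ. It illustrates which values have sensible defaults and which can stay empty.

```typescript
// Hypothetical configuration loader; mirrors the environment variables listed above.
interface BetterQdrantConfig {
  qdrantUrl: string;
  qdrantApiKey?: string;
  openaiApiKey?: string;
  openrouterApiKey?: string;
  ollamaEndpoint: string;
}

function loadConfig(env: Record<string, string | undefined> = process.env): BetterQdrantConfig {
  return {
    qdrantUrl: env.QDRANT_URL ?? "http://localhost:6333",
    qdrantApiKey: env.QDRANT_API_KEY || undefined,          // optional for local instances
    openaiApiKey: env.OPENAI_API_KEY || undefined,          // only if using OpenAI embeddings
    openrouterApiKey: env.OPENROUTER_API_KEY || undefined,  // only if using OpenRouter embeddings
    ollamaEndpoint: env.OLLAMA_ENDPOINT ?? "http://localhost:11434",
  };
}

console.log(loadConfig());
```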
More for Database
Supabase MCP Server
Connect Supabase projects directly with AI assistants using the Model Context Protocol (MCP). This server standardizes communication between Large Language Models and Supabase, enabling AI to manage tables, query data, and interact with project features like edge functions, storage, and branching. Customize access with read-only or project-scoped modes and select specific tool groups to fit your needs. Integrated tools cover account management, documentation search, database operations, debugging, and more, empowering AI to assist with development, monitoring, and deployment tasks in your Supabase environment efficiently and securely.
ClickHouse
Unlock powerful analytics with the ClickHouse MCP Server—seamlessly run, explore, and manage SQL queries across ClickHouse clusters or with chDB’s embedded OLAP engine. This server offers easy database and table listing, safe query execution, and flexible access to data from files, URLs, or databases. Built-in health checks ensure reliability, while support for both ClickHouse and chDB enables robust data workflows for any project.
Postgres MCP Pro
Boost your Postgres database performance with Postgres MCP Pro, an AI-driven MCP server offering advanced index tuning, detailed explain plans, and comprehensive health checks. It combines proven optimization algorithms with schema intelligence for safe, context-aware SQL execution. Whether analyzing slow queries or recommending optimal indexes, Postgres MCP Pro empowers developers to improve efficiency and maintain database integrity. Designed for both development and production, it supports flexible transport options and robust access controls, making database management smarter, safer, and easier. Experience deterministic performance insights alongside AI assistance to keep your Postgres running at its best.
More for AI and Machine Learning
Blender
Experience seamless AI-powered 3D modeling by connecting Blender with Claude AI via the Model Context Protocol. BlenderMCP enables two-way communication, allowing you to create, modify, and inspect 3D scenes directly through AI prompts. Control objects, materials, lighting, and execute Python code in Blender effortlessly. Access assets from Poly Haven and generate AI-driven models using Hyper3D Rodin. This integration enhances creative workflows by combining Blender’s robust tools with Claude’s intelligent guidance, making 3D content creation faster, interactive, and more intuitive. Perfect for artists and developers seeking AI-assisted 3D design within Blender’s environment.
Video Edit (MoviePy)
MoviePy-based video editing server that provides comprehensive video and audio processing capabilities including trimming, merging, resizing, effects, format conversion, YouTube downloading, and text/image overlays through an in-memory object store for chaining operations efficiently.
ElevenLabs
Unleash powerful Text-to-Speech and audio processing with the official ElevenLabs MCP server. It enables MCP clients like Claude Desktop, Cursor, and OpenAI Agents to generate speech, clone voices, transcribe audio, and create unique sounds effortlessly. Customize voices, convert recordings, and build immersive audio scenes with easy-to-use APIs designed for creative and practical applications. This server integrates seamlessly, expanding your AI toolkit to bring rich, dynamic audio experiences to life across various platforms and projects.
TypeScript Refactoring
Provides TypeScript/JavaScript code analysis and refactoring capabilities using ts-morph, enabling intelligent code transformations with symbol renaming, file moving with import path corrections, cross-file reference updates, type signature analysis, and module dependency exploration across entire codebases.
Dual-Cycle Reasoner
Provides dual-cycle metacognitive reasoning framework that detects when autonomous agents get stuck in repetitive behaviors through statistical anomaly detection and semantic analysis, then automatically diagnoses failure causes and generates recovery strategies using case-based learning.
Ultra (Multi-AI Provider)
Unified server providing access to OpenAI O3, Google Gemini 2.5 Pro, and Azure OpenAI models with automatic usage tracking, cost estimation, and nine specialized development tools for code analysis, debugging, and documentation generation.