AI Memory
Production-ready semantic memory management server that stores, retrieves, and manages contextual knowledge.
Prepare Your PostgreSQL Database
- Ensure you have a running PostgreSQL server with the pgvector extension available.
- Create a new database (e.g., mcp_ai_memory) and a user if you haven't already.
- Enable the pgvector extension for your database:
  CREATE EXTENSION IF NOT EXISTS vector;
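The steps above can be scripted from a terminal. This is a minimal sketch assuming a local PostgreSQL install with `createdb` and `psql` on your PATH, the pgvector extension package already installed, and sufficient privileges; the database name is the example used throughout this guide:

```shell
# Create the database (example name; adjust to your setup).
createdb mcp_ai_memory

# Enable pgvector inside that database.
psql -d mcp_ai_memory -c "CREATE EXTENSION IF NOT EXISTS vector;"

# Verify the extension is active; prints the installed pgvector version.
psql -d mcp_ai_memory -c "SELECT extversion FROM pg_extension WHERE extname = 'vector';"
```

If `CREATE EXTENSION` fails, pgvector is likely not installed on the server host; install it via your package manager before retrying.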
Gather Your PostgreSQL Database Credentials
- You will need:
  - Host (e.g., localhost)
  - Port (default: 5432)
  - Database name (e.g., mcp_ai_memory)
  - Username
  - Password
Create the Connection URL
- Construct your PostgreSQL URL in the following format:
  postgresql://<username>:<password>@<host>:<port>/<database>
- Example:
  postgresql://myuser:mypassword@localhost:5432/mcp_ai_memory
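Assembling the URL from the individual credentials can be done in a shell, for example (the values are the placeholders from the steps above; if your password contains special characters such as @ or :, percent-encode them first):

```shell
# Example credentials; replace with your own.
DB_USER=myuser
DB_PASS=mypassword
DB_HOST=localhost
DB_PORT=5432
DB_NAME=mcp_ai_memory

# Combine the pieces into the connection URL expected by MEMORY_DB_URL.
MEMORY_DB_URL="postgresql://${DB_USER}:${DB_PASS}@${DB_HOST}:${DB_PORT}/${DB_NAME}"
echo "$MEMORY_DB_URL"
# → postgresql://myuser:mypassword@localhost:5432/mcp_ai_memory
```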
[Optional] Prepare Redis (For Advanced Caching/Async Features)
- Set up a Redis server if you want caching or async job processing.
- Default Redis URL: redis://localhost:6379
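If you do set up Redis, you can confirm it is reachable before configuring the server. A quick check with redis-cli, assuming a local default installation:

```shell
# Ping the Redis server at the default URL; prints "PONG" if it is running.
redis-cli -u redis://localhost:6379 ping
```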
[Optional] Choose Embedding Model
- You can specify an embedding model or use the default.
- Default: Xenova/all-mpnet-base-v2
- Alternative: Xenova/all-MiniLM-L6-v2 (smaller/faster, lower quality)
Fill out the ENV values in FastMCP
- Click the "Install Now" button to begin configuring the MCP AI Memory server.
- In the FastMCP connection interface, set or fill out the following ENV fields:
- MEMORY_DB_URL: your PostgreSQL URL from Step 3 (required)
- REDIS_URL: your Redis URL (optional)
- EMBEDDING_MODEL: model name (optional, defaults as above)
- Any other advanced ENV values as needed (see table below):
| Variable | Description | Default |
|---|---|---|
| MEMORY_DB_URL | PostgreSQL connection string (Required) | None |
| REDIS_URL | Redis connection (optional) | None (defaults to in-memory) |
| EMBEDDING_MODEL | Transformers.js model (optional) | Xenova/all-mpnet-base-v2 |
| LOG_LEVEL | Logging (optional) | info |
| CACHE_TTL | General cache (optional) | 3600 |
| MAX_MEMORIES_PER_QUERY | Max results (optional) | 10 |
| MIN_SIMILARITY_SCORE | Min similarity (optional) | 0.5 |
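Put together, a complete configuration might look like the following. These are illustrative values only; in FastMCP you enter them in the ENV fields rather than a shell, but the variable names are the same:

```shell
# Required: PostgreSQL connection string from Step 3.
export MEMORY_DB_URL="postgresql://myuser:mypassword@localhost:5432/mcp_ai_memory"

# Optional values; defaults from the table above are shown.
export REDIS_URL="redis://localhost:6379"          # omit to use in-memory caching
export EMBEDDING_MODEL="Xenova/all-mpnet-base-v2"
export LOG_LEVEL="info"
export CACHE_TTL="3600"
export MAX_MEMORIES_PER_QUERY="10"
export MIN_SIMILARITY_SCORE="0.5"
```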
Save the Configuration
- Complete setup in FastMCP and start the server.
Summary:
You are only required to provide a valid PostgreSQL connection string (MEMORY_DB_URL) and optionally a Redis server URL (REDIS_URL) and embedding model name. All these values should be entered directly in the FastMCP connection interface after clicking "Install Now".
How to Install AI Memory
Install AI Memory MCP server with one click through FastMCP. Choose your preferred AI development tool below:
Claude Desktop
Click "Claude Desktop" in Quick Start
Cursor IDE
Click "Cursor IDE" in Quick Start
VS Code
Click "VS Code" in Quick Start
Alternatives to AI Memory
Looking for similar MCP servers? Browse other servers in the same categories on FastMCP, or check out the similar servers listed below.
More for Memory Management
Context7
Discover Context7 MCP, a powerful tool that injects fresh, version-specific code docs and examples from official sources directly into your AI prompts. Say goodbye to outdated or incorrect API info—Context7 ensures your language model answers come with the latest coding references. By simply adding "use context7" to your prompt, you get precise, reliable library documentation and working code snippets without leaving your editor. Designed for smooth integration with many MCP-compatible clients and IDEs, it enhances AI coding assistants with accurate, real-time context that boosts developer productivity and confidence.
Memory Bank
Maintains persistent project context through a structured Memory Bank system of five markdown files that track goals, status, progress, decisions, and patterns with automatic timestamp tracking and workflow guidance for consistent documentation across development sessions.
Kiro Memory
Provides intelligent memory management and task tracking for software development projects with automatic project detection, semantic search, and SQLite-based persistence that maintains context across coding sessions through memory classification, relationship building, and context-aware task creation.
In Memoria
Provides persistent intelligence infrastructure for codebase analysis through hybrid Rust-TypeScript architecture that combines Tree-sitter AST parsing with semantic concept extraction, developer pattern recognition, and SQLite-based persistence to build contextual understanding of codebases over time, learning from developer behavior and architectural decisions.
More for AI and Machine Learning
Blender
Experience seamless AI-powered 3D modeling by connecting Blender with Claude AI via the Model Context Protocol. BlenderMCP enables two-way communication, allowing you to create, modify, and inspect 3D scenes directly through AI prompts. Control objects, materials, lighting, and execute Python code in Blender effortlessly. Access assets from Poly Haven and generate AI-driven models using Hyper3D Rodin. This integration enhances creative workflows by combining Blender’s robust tools with Claude’s intelligent guidance, making 3D content creation faster, interactive, and more intuitive. Perfect for artists and developers seeking AI-assisted 3D design within Blender’s environment.
Video & Audio Text Extraction
Extracts text from videos and audio files across platforms like YouTube, Bilibili, TikTok, Instagram, Twitter/X, Facebook, and Vimeo using Whisper speech recognition for transcription, content analysis, and accessibility improvements.
Video Edit (MoviePy)
MoviePy-based video editing server that provides comprehensive video and audio processing capabilities including trimming, merging, resizing, effects, format conversion, YouTube downloading, and text/image overlays through an in-memory object store for chaining operations efficiently.
Qwen Code
Bridges Qwen's code analysis capabilities through CLI integration, providing file-referenced queries with @filename syntax, automatic model fallback, and configurable execution modes for code review, codebase exploration, and automated refactoring workflows.
Similar MCP Servers
AGI MCP Server
Enables persistent memory for AI systems by providing tools for episodic, semantic, and procedural data storage through a vector-and-graph-enhanced database. It allows models to maintain long-term continuity using similarity search, thematic clustering, and identity tracking.
AgentKits Memory
A local, persistent memory system for AI coding assistants that stores decisions, patterns, and session context via MCP tools. It enables cross-session memory management using SQLite and optional vector search without external dependencies or cloud storage.
Cipher
Memory-powered agent framework that provides persistent memory capabilities across conversations and sessions using vector databases and embeddings, enabling context retention, reasoning pattern recognition, and shared workspace memory for team collaboration.
@contextable/mcp
A persistent AI memory server that enables storage and retrieval of context and project artifacts across conversations. It features full-text search, version history, and automatic content chunking using local SQLite or hosted cloud storage.
Doclea MCP
Provides persistent memory for AI coding assistants, storing and retrieving architectural decisions, patterns, and solutions across sessions using semantic search, while also offering git integration for commit messages and code expertise mapping.