AGI MCP Server
Enables persistent memory for AI systems by providing tools for episodic, semantic, and procedural d...
Open the FastMCP connection interface
- Click the "Install Now" button to open the FastMCP / MCP connection UI where you can add environment variables.
Clone the AGI Memory repository (where the DB runs)
- Run:
  ```bash
  git clone https://github.com/cognitivecomputations/agi-memory.git
  cd agi-memory
  ```

- This repo contains the DB setup and the example env file. (github.com)
Create and edit the database .env file to obtain the credentials
- Copy the example environment file and open it for editing:
  ```bash
  cp .env.local .env
  # edit .env with your database credentials
  ```

- In .env, set (or confirm) these values; they are the exact keys you will later paste into FastMCP (an example .env follows this list):
- POSTGRES_HOST (e.g., localhost)
- POSTGRES_PORT (default: 5432)
- POSTGRES_DB (e.g., agi_db)
- POSTGRES_USER (e.g., agi_user)
- POSTGRES_PASSWORD (choose a strong password)
- NODE_ENV (set to development)
- Save .env. (github.com)
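- For reference, a filled-in .env might look like the sketch below; the values are illustrative only (pick your own password) and must match your docker-compose settings:

  ```bash
  # .env (example values only)
  POSTGRES_HOST=localhost
  POSTGRES_PORT=5432
  POSTGRES_DB=agi_db
  POSTGRES_USER=agi_user
  POSTGRES_PASSWORD=change-me-to-a-strong-password
  NODE_ENV=development
  ```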
Start the memory database and confirm host/port
- Start the DB:
  ```bash
  docker compose up -d
  ```

- Wait for the DB to initialize, then tail the DB logs to confirm readiness (the DB typically listens on port 5432). Example:

  ```bash
  docker compose logs -f db
  ```

- If docker-compose maps the DB port to the host, use the mapped host and port (commonly POSTGRES_HOST=localhost and POSTGRES_PORT=5432). (github.com)
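- As an extra readiness check (assuming the compose service is named db, as in the logs command above), you can ask Postgres directly whether it is accepting connections:

  ```bash
  # prints "accepting connections" once the DB is ready
  docker compose exec db pg_isready -U <POSTGRES_USER> -d <POSTGRES_DB>
  ```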
(Optional) Generate a strong POSTGRES_PASSWORD
- Generate a secure password locally and paste it into .env:
  ```bash
  openssl rand -base64 32
  ```

- Store that password in your .env and also in the FastMCP fields below.
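- If you prefer to do this in one step, the sketch below generates a password and writes it into .env (GNU sed syntax; on macOS use `sed -i ''` instead of `sed -i`):

  ```bash
  PASS=$(openssl rand -base64 32)
  # replace the existing POSTGRES_PASSWORD line in .env
  sed -i "s|^POSTGRES_PASSWORD=.*|POSTGRES_PASSWORD=${PASS}|" .env
  grep POSTGRES_PASSWORD .env   # confirm the new value was written
  ```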
Verify you can connect to the DB (quick test)
- From the agi-memory directory, you can exec into the DB container or use psql to verify credentials:
  ```bash
  # from host (if port mapped)
  psql -h localhost -p 5432 -U <POSTGRES_USER> -d <POSTGRES_DB>

  # or, exec into the container
  docker compose exec db psql -U <POSTGRES_USER> -d <POSTGRES_DB>
  ```

- If the connection succeeds, your credentials are correct.
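- Optionally, you can also list the installed Postgres extensions; assuming the repo's init scripts install extensions such as pgvector, this is a quick way to confirm the schema setup actually ran:

  ```bash
  # generic catalog query; you would expect to see e.g. "vector" if pgvector is used
  docker compose exec db psql -U <POSTGRES_USER> -d <POSTGRES_DB> -c "SELECT extname, extversion FROM pg_extension;"
  ```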
Enter the values into the FastMCP connection interface
- In the FastMCP / MCP connection UI (opened via "Install Now"), create or edit the AGI MCP server entry and paste the environment variable names and values exactly as below:
- POSTGRES_HOST = <value from .env>
- POSTGRES_PORT = <value from .env>
- POSTGRES_DB = <value from .env>
- POSTGRES_USER = <value from .env>
- POSTGRES_PASSWORD = <value from .env>
- NODE_ENV = development
- Save the connection entry in the FastMCP UI.
Start or test the MCP server using the same env values
- To validate end-to-end (optional), run the MCP server locally with the env values:
  ```bash
  POSTGRES_HOST=localhost POSTGRES_PORT=5432 POSTGRES_DB=agi_db \
  POSTGRES_USER=agi_user POSTGRES_PASSWORD=your_password \
  NODE_ENV=development node mcp.js
  ```

- You should see a startup message like "Memory MCP Server running on stdio" if the connection works. (If you used the npx/remote option, ensure the FastMCP entry uses the same env values.)
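- Instead of typing the variables inline, you can load them from the .env you created earlier (a sketch assuming mcp.js sits in the directory you run it from, as in the command above):

  ```bash
  # export every variable defined in .env into the current shell, then start the server
  set -a; source .env; set +a
  node mcp.js
  ```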
If anything fails — double-check
- Confirm the DB container is running and that .env values match the docker-compose settings (port mapping, service names).
- Inspect docker compose logs for errors and re-run the verification steps above. (github.com)
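- A couple of quick checks (assuming the compose service is named db):

  ```bash
  docker compose ps db          # the db service should be running (ideally "healthy")
  docker compose port db 5432   # the printed host port must match POSTGRES_PORT in .env
  ```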
Notes and security
- Keep POSTGRES_PASSWORD and .env private — do not commit them to git.
- If you need to rotate the DB password later, update it in the database .env (or DB), then update the FastMCP entry with the new POSTGRES_PASSWORD.
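- A rotation could look like the sketch below (reusing the db service and psql access from the verification step; 'new_password' is a placeholder):

  ```bash
  # 1) change the password inside Postgres
  docker compose exec db psql -U <POSTGRES_USER> -d <POSTGRES_DB> \
    -c "ALTER USER <POSTGRES_USER> WITH PASSWORD 'new_password';"
  # 2) update POSTGRES_PASSWORD in .env, then paste the same value into the FastMCP entry
  ```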
More for Memory Management
Context7
Discover Context7 MCP, a powerful tool that injects fresh, version-specific code docs and examples from official sources directly into your AI prompts. Say goodbye to outdated or incorrect API info—Context7 ensures your language model answers come with the latest coding references. By simply adding "use context7" to your prompt, you get precise, reliable library documentation and working code snippets without leaving your editor. Designed for smooth integration with many MCP-compatible clients and IDEs, it enhances AI coding assistants with accurate, real-time context that boosts developer productivity and confidence.
Memory Bank
Maintains persistent project context through a structured Memory Bank system of five markdown files that track goals, status, progress, decisions, and patterns with automatic timestamp tracking and workflow guidance for consistent documentation across development sessions.
In Memoria
Provides persistent intelligence infrastructure for codebase analysis through hybrid Rust-TypeScript architecture that combines Tree-sitter AST parsing with semantic concept extraction, developer pattern recognition, and SQLite-based persistence to build contextual understanding of codebases over time, learning from developer behavior and architectural decisions.
Kiro Memory
Provides intelligent memory management and task tracking for software development projects with automatic project detection, semantic search, and SQLite-based persistence that maintains context across coding sessions through memory classification, relationship building, and context-aware task creation.
More for AI and Machine Learning
Blender
Experience seamless AI-powered 3D modeling by connecting Blender with Claude AI via the Model Context Protocol. BlenderMCP enables two-way communication, allowing you to create, modify, and inspect 3D scenes directly through AI prompts. Control objects, materials, lighting, and execute Python code in Blender effortlessly. Access assets from Poly Haven and generate AI-driven models using Hyper3D Rodin. This integration enhances creative workflows by combining Blender’s robust tools with Claude’s intelligent guidance, making 3D content creation faster, interactive, and more intuitive. Perfect for artists and developers seeking AI-assisted 3D design within Blender’s environment.
Video & Audio Text Extraction
Extracts text from videos and audio files across platforms like YouTube, Bilibili, TikTok, Instagram, Twitter/X, Facebook, and Vimeo using Whisper speech recognition for transcription, content analysis, and accessibility improvements.
Video Edit (MoviePy)
MoviePy-based video editing server that provides comprehensive video and audio processing capabilities including trimming, merging, resizing, effects, format conversion, YouTube downloading, and text/image overlays through an in-memory object store for chaining operations efficiently.
Qwen Code
Bridges Qwen's code analysis capabilities through CLI integration, providing file-referenced queries with @filename syntax, automatic model fallback, and configurable execution modes for code review, codebase exploration, and automated refactoring workflows.
Similar MCP Servers
AI Memory
Production-ready semantic memory management server that stores, retrieves, and manages contextual knowledge across sessions using PostgreSQL with pgvector for vector similarity search, featuring intelligent caching, multi-user support, memory relationships, automatic clustering, and background job processing for persistent AI memory and knowledge management systems.
AgentKits Memory
A local, persistent memory system for AI coding assistants that stores decisions, patterns, and session context via MCP tools. It enables cross-session memory management using SQLite and optional vector search without external dependencies or cloud storage.
@contextable/mcp
A persistent AI memory server that enables storage and retrieval of context and project artifacts across conversations. It features full-text search, version history, and automatic content chunking using local SQLite or hosted cloud storage.
Doclea MCP
Provides persistent memory for AI coding assistants, storing and retrieving architectural decisions, patterns, and solutions across sessions using semantic search, while also offering git integration for commit messages and code expertise mapping.