AGI MCP Server
A Model Context Protocol server that provides persistent memory capabilities for AI systems, enabling them to retain and recall context across sessions.
Open the FastMCP connection interface and click “Install Now”. This opens the form for the AGI MCP server’s environment variables; fill in the fields using the exact ENV names listed below.
Determine the database credentials you will use
- If you are setting up the AGI Memory database (recommended): clone the AGI Memory repo, copy its example env file, and edit it with your chosen credentials:
- git clone https://github.com/cognitivecomputations/agi-memory.git
- cd agi-memory
- cp .env.local .env
- Edit .env and set POSTGRES_HOST, POSTGRES_PORT, POSTGRES_DB, POSTGRES_USER, POSTGRES_PASSWORD, and any other database values.
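For reference, a minimal .env might look like the following. The values are illustrative placeholders; use your own credentials and keep any additional keys the repo’s example file defines:

```
POSTGRES_HOST=localhost
POSTGRES_PORT=5432
POSTGRES_DB=agi_db
POSTGRES_USER=agi_user
POSTGRES_PASSWORD=change_me_to_a_strong_password
NODE_ENV=development
```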
Start the AGI Memory database (if using the repo’s Docker setup) and wait for it to initialize
- docker compose up -d (run in the agi-memory directory)
- Wait ~2–3 minutes and confirm the DB is ready (docker compose logs -f db or docker compose ps).
Copy the exact ENV names (and their values from your agi-memory .env) into the FastMCP connection interface
- Required ENV keys to enter in FastMCP:
- POSTGRES_HOST (e.g., localhost)
- POSTGRES_PORT (e.g., 5432)
- POSTGRES_DB (e.g., agi_db)
- POSTGRES_USER (e.g., agi_user)
- POSTGRES_PASSWORD (your chosen DB password)
- NODE_ENV (e.g., development)
- Use the same values you put into agi-memory’s .env so the MCP server can connect to the database.
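To sanity-check these variables programmatically, a small Node.js helper can confirm every required key is present before the server starts. This is illustrative only; the actual mcp.js may read its configuration differently:

```javascript
// The required variable names come from the list above.
const REQUIRED = [
  "POSTGRES_HOST",
  "POSTGRES_PORT",
  "POSTGRES_DB",
  "POSTGRES_USER",
  "POSTGRES_PASSWORD",
];

// Build a Postgres connection config from an env object, failing fast on gaps.
function loadDbConfig(env = process.env) {
  const missing = REQUIRED.filter((key) => !env[key]);
  if (missing.length) {
    throw new Error(`Missing environment variables: ${missing.join(", ")}`);
  }
  return {
    host: env.POSTGRES_HOST,
    port: Number(env.POSTGRES_PORT),
    database: env.POSTGRES_DB,
    user: env.POSTGRES_USER,
    password: env.POSTGRES_PASSWORD,
  };
}
```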
If you need to create the database/user yourself (not using the supplied Docker), create them in Postgres
- Example using psql:
- CREATE ROLE agi_user WITH LOGIN PASSWORD 'your_password';
- CREATE DATABASE agi_db OWNER agi_user;
- (Ensure pgvector and Apache AGE extensions are installed per the AGI Memory setup if required.)
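If the AGI Memory schema requires them, the extensions mentioned above can be created in the same psql session (pgvector installs as the `vector` extension and Apache AGE as `age`; both must already be installed on the server, and creation typically requires superuser privileges):

```sql
CREATE EXTENSION IF NOT EXISTS vector;
CREATE EXTENSION IF NOT EXISTS age;
```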
Save the connection in FastMCP and start the MCP server via the interface
- After saving, start/enable the server from FastMCP (the “Install Now” flow will register the MCP server with those ENVs).
- Alternatively, you can test locally by running (from the agi-mcp-server directory) with the same ENVs:
- POSTGRES_HOST=... POSTGRES_PORT=... POSTGRES_DB=... POSTGRES_USER=... POSTGRES_PASSWORD=... NODE_ENV=development node mcp.js
- You should see a startup message like “Memory MCP Server running on stdio.”
Verify connectivity and check logs
- If using the repo’s docker: docker compose ps and docker compose logs -f db to confirm DB health.
- If launched via FastMCP/Claude Desktop, open the MCP logs (e.g., ~/Library/Logs/Claude/mcp-server-agi-memory.log) or the FastMCP log view to confirm successful connection.
If you change credentials later
- Update the agi-memory .env (or your Postgres user) and update the same ENV values in the FastMCP connection interface, then restart the MCP connection so the changes take effect.
Security reminder
- Use a strong password for POSTGRES_PASSWORD, keep the .env out of source control, and only paste credentials into the FastMCP interface when you trust the environment.