
AI Memory


Updated Sep 9, 2025
Not audited
A production-ready semantic memory management server that stores, retrieves, and manages contextual knowledge across sessions. It uses PostgreSQL with pgvector for vector similarity search, and adds intelligent caching, multi-user support, memory relationships, automatic clustering, and background job processing for persistent AI memory and knowledge-management systems.
  1. Prepare Your PostgreSQL Database

    • Ensure you have a running PostgreSQL server with the pgvector extension enabled.
    • Create a new database (e.g., mcp_ai_memory) and user if you haven't already.
    • Enable the pgvector extension for your database:
      CREATE EXTENSION IF NOT EXISTS vector;
      
  2. Gather Your PostgreSQL Database Credentials

    • You will need:
      • Host (e.g., localhost)
      • Port (default: 5432)
      • Database name (e.g., mcp_ai_memory)
      • Username
      • Password
  3. Create the Connection URL

    • Construct your PostgreSQL URL in the following format:
      postgresql://<username>:<password>@<host>:<port>/<database>
      
      • Example: postgresql://myuser:mypassword@localhost:5432/mcp_ai_memory
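A common pitfall when building the connection URL is a password containing reserved characters such as `@` or `:`, which must be percent-encoded or the URL will not parse. The helper below is a hypothetical sketch (not part of the server) showing one safe way to assemble the string:

```typescript
// Hypothetical helper showing how to build MEMORY_DB_URL safely.
// Special characters in the username or password must be percent-encoded.
function buildDatabaseUrl(
  user: string,
  password: string,
  host: string,
  port: number,
  database: string
): string {
  return `postgresql://${encodeURIComponent(user)}:${encodeURIComponent(password)}@${host}:${port}/${database}`;
}

// Example: a password containing "@" is encoded as "%40".
console.log(buildDatabaseUrl("myuser", "p@ss", "localhost", 5432, "mcp_ai_memory"));
// postgresql://myuser:p%40ss@localhost:5432/mcp_ai_memory
```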
  4. [Optional] Prepare Redis (For Advanced Caching/Async Features)

    • Set up a Redis server if you want caching or async job processing.
    • Default Redis URL: redis://localhost:6379
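When no Redis URL is provided, caching falls back to an in-memory store (see the variable table below). The sketch here illustrates the general idea of a TTL-based in-process cache; the names (`TtlCache`) and eviction strategy are illustrative assumptions, not the server's actual implementation:

```typescript
// Minimal sketch of an in-process TTL cache, the kind of fallback used
// when REDIS_URL is unset. Entries expire after ttlSeconds and are
// evicted lazily on read.
class TtlCache<V> {
  private store = new Map<string, { value: V; expiresAt: number }>();

  constructor(private ttlSeconds: number) {}

  set(key: string, value: V): void {
    this.store.set(key, {
      value,
      expiresAt: Date.now() + this.ttlSeconds * 1000,
    });
  }

  get(key: string): V | undefined {
    const entry = this.store.get(key);
    if (!entry) return undefined;
    if (Date.now() > entry.expiresAt) {
      this.store.delete(key); // expired: evict on read
      return undefined;
    }
    return entry.value;
  }
}

const cache = new TtlCache<string>(3600); // mirrors the CACHE_TTL default
cache.set("memory:42", "cached embedding result");
console.log(cache.get("memory:42")); // cached embedding result
```

A process-local cache like this disappears on restart and is not shared across workers, which is why Redis is recommended once you run more than one instance.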
  5. [Optional] Choose Embedding Model

    • You can specify an embedding model or use the default.
    • Default: Xenova/all-mpnet-base-v2
    • Alternative: Xenova/all-MiniLM-L6-v2 (smaller/faster, lower quality)
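The embedding model turns text into vectors that are compared by similarity; the `MIN_SIMILARITY_SCORE` and `MAX_MEMORIES_PER_QUERY` settings below then bound what a query returns. This sketch shows how such a threshold-and-limit filter plausibly works over cosine similarity; it is illustrative, not the server's actual pgvector query:

```typescript
// Cosine similarity between two equal-length vectors.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Rank stored memories against a query vector, dropping weak matches
// and capping the result count — mirroring MIN_SIMILARITY_SCORE (0.5)
// and MAX_MEMORIES_PER_QUERY (10) defaults.
function topMatches(
  query: number[],
  memories: { id: string; embedding: number[] }[],
  minScore = 0.5,
  limit = 10
): { id: string; score: number }[] {
  return memories
    .map((m) => ({ id: m.id, score: cosineSimilarity(query, m.embedding) }))
    .filter((m) => m.score >= minScore)
    .sort((a, b) => b.score - a.score)
    .slice(0, limit);
}

console.log(
  topMatches([1, 0], [
    { id: "a", embedding: [1, 0] }, // identical direction: score 1
    { id: "b", embedding: [0, 1] }, // orthogonal: score 0, filtered out
  ])
); // [{ id: "a", score: 1 }]
```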
  6. Fill out the ENV values in FastMCP

    • Click the "Install Now" button to begin configuring the MCP AI Memory server.
    • In the FastMCP connection interface, set or fill out the following ENV fields:
      • MEMORY_DB_URL: your PostgreSQL URL from Step 3 (Required)
      • REDIS_URL: your Redis URL (optional)
      • EMBEDDING_MODEL: model name (optional, defaults as above)
      • Any other advanced ENV values as needed (see table below):
Variable                  Description                                  Default
MEMORY_DB_URL             PostgreSQL connection string (required)      None
REDIS_URL                 Redis connection URL (optional)              None (falls back to in-memory cache)
EMBEDDING_MODEL           Transformers.js model name (optional)        Xenova/all-mpnet-base-v2
LOG_LEVEL                 Logging level (optional)                     info
CACHE_TTL                 Cache TTL in seconds (optional)              3600
MAX_MEMORIES_PER_QUERY    Maximum results per query (optional)         10
MIN_SIMILARITY_SCORE      Minimum similarity score (optional)          0.5
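Taken together, a complete configuration might look like the following `.env`-style fragment. All values here are placeholders drawn from the examples above; only MEMORY_DB_URL is required.

```
MEMORY_DB_URL=postgresql://myuser:mypassword@localhost:5432/mcp_ai_memory
REDIS_URL=redis://localhost:6379
EMBEDDING_MODEL=Xenova/all-mpnet-base-v2
LOG_LEVEL=info
CACHE_TTL=3600
MAX_MEMORIES_PER_QUERY=10
MIN_SIMILARITY_SCORE=0.5
```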
  7. Save the Configuration
    • Complete setup in FastMCP and start the server.

Summary:
You are only required to provide a valid PostgreSQL connection string (MEMORY_DB_URL) and optionally a Redis server URL (REDIS_URL) and embedding model name. All these values should be entered directly in the FastMCP connection interface after clicking "Install Now".
