Chroma Working Memory
Provides a persistent, searchable, automatically updated 'second brain' for development by integrating a local Chroma vector database into your workflow.
Install Chroma MCP Server (if not already done)
- Run the following command in your terminal:
pip install chroma-mcp-server
Prepare Your Storage and Logging Directories
- Decide on the folder(s) for persistent data storage and logging.
- For example, create folders called my_data and my_logs in your project directory if you want to use the paths ./my_data and ./my_logs.
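As a quick sketch, the two example folders above can be created from the project root (my_data and my_logs are just the placeholder names used in this example):

```shell
# Create the persistent-data and log directories used in this example
mkdir -p my_data my_logs
ls -d my_data my_logs
```

`mkdir -p` is safe to re-run; it does nothing if the directories already exist.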
Collect the Required ENV Values
- You will need the following values for configuration:
  - CHROMA_CLIENT_TYPE: Usually set to persistent for persistent storage.
  - CHROMA_DATA_DIR: Absolute or relative path to your persistent data directory, e.g., /path/to/your/data.
  - CHROMA_LOG_DIR: Absolute or relative path to your logging directory, e.g., /path/to/your/logs.
  - LOG_LEVEL, MCP_LOG_LEVEL, MCP_SERVER_LOG_LEVEL (optional): Set logging levels; commonly INFO is used.
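As an illustrative sketch, the same values could be set as environment variables in a shell session before launching the server manually. The directory paths below are example placeholders, not required locations:

```shell
# Example values only; point these at your own directories
export CHROMA_CLIENT_TYPE=persistent
export CHROMA_DATA_DIR="$PWD/my_data"
export CHROMA_LOG_DIR="$PWD/my_logs"
export LOG_LEVEL=INFO
export MCP_LOG_LEVEL=INFO
export MCP_SERVER_LOG_LEVEL=INFO
```

If you configure the server through the FastMCP interface instead (next step), you enter these same values in the connection form rather than exporting them yourself.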
Fill in Values in the FastMCP Connection Interface using "Install Now"
- Open the FastMCP integration interface.
- Click on the "Install Now" button for the Chroma MCP Server.
- In the connection form, fill in the following fields with the corresponding values:
  - CHROMA_CLIENT_TYPE → persistent
  - CHROMA_DATA_DIR → (your desired data directory path, e.g., /path/to/your/data)
  - CHROMA_LOG_DIR → (your desired log directory path, e.g., /path/to/your/logs)
  - Optional: Set LOG_LEVEL, MCP_LOG_LEVEL, MCP_SERVER_LOG_LEVEL to INFO or your preferred log level.
Save the Configuration
- Complete the setup by saving your changes in the FastMCP interface.
Note: No external API keys, tokens, or credentials are required for Chroma MCP Server—only local directory paths and log settings.
More for Memory Management
Context7
Discover Context7 MCP, a powerful tool that injects fresh, version-specific code docs and examples from official sources directly into your AI prompts. Say goodbye to outdated or incorrect API info—Context7 ensures your language model answers come with the latest coding references. By simply adding "use context7" to your prompt, you get precise, reliable library documentation and working code snippets without leaving your editor. Designed for smooth integration with many MCP-compatible clients and IDEs, it enhances AI coding assistants with accurate, real-time context that boosts developer productivity and confidence.
Memory Bank
Maintains persistent project context through a structured Memory Bank system of five markdown files that track goals, status, progress, decisions, and patterns with automatic timestamp tracking and workflow guidance for consistent documentation across development sessions.
More for Developer Tools
GitHub
Extend your developer tools with the GitHub MCP Server—a powerful Model Context Protocol server enhancing automation and AI interactions with GitHub APIs. It supports diverse functionalities like managing workflows, issues, pull requests, repositories, and security alerts. Customize available toolsets to fit your needs, enable dynamic tool discovery to streamline tool usage, and run the server locally or remotely. With read-only mode and support for GitHub Enterprise, this server integrates deeply into your GitHub ecosystem, empowering data extraction and intelligent operations for developers and AI applications. Licensed under MIT, it fosters flexible and advanced GitHub automation.
Desktop Commander
Desktop Commander MCP transforms Claude Desktop into a powerful AI assistant for managing files, running terminal commands, and editing code with precision across your entire system. It supports in-memory code execution, interactive process control, advanced search and replace, plus comprehensive filesystem operations including reading from URLs and negative offset file reads. With detailed audit and fuzzy search logging, it enables efficient automation, data analysis, and multi-project workflows—all without extra API costs. Designed for developers seeking smarter automation, it enhances productivity by integrating all essential development tools into a single, intelligent chat interface.
Figma Context
Unlock seamless design-to-code with Framelink Figma MCP Server, letting AI coding tools access your Figma files directly. It simplifies Figma API data to supply only relevant design layouts and styles, boosting AI accuracy in implementing designs across frameworks. Specifically built for use with tools like Cursor, it transforms design metadata into precise code in one step. This server streamlines the workflow by providing clean, focused context, enabling faster and more reliable design-driven development. Enjoy a powerful bridge between design and coding that enhances productivity and code quality with minimal fuss.
Chrome DevTools
Provides direct Chrome browser control through DevTools for web automation, debugging, and performance analysis using accessibility tree snapshots for reliable element targeting, automatic page event handling, and integrated performance tracing with actionable insights.
Microsoft Docs
Access official Microsoft documentation instantly with the Microsoft Learn Docs MCP Server. This cloud service implements the Model Context Protocol (MCP) to enable AI tools like GitHub Copilot and others to perform high-quality semantic searches across Microsoft Learn, Azure, Microsoft 365 docs, and more. It delivers up to 10 concise, context-relevant content chunks in real time, ensuring up-to-date, accurate information. Designed for seamless integration with any MCP-compatible client, it helps AI assistants ground their responses in authoritative, current Microsoft resources for better developer support and productivity.