Cipher
Memory-powered agent framework that provides persistent memory capabilities across conversations and...
Copy the .env Example File (Optional, if not already present)
- Inside your project directory, run: cp .env.example .env
- This creates a .env file with all the environment variable placeholders you need.
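The copy step above, as runnable shell (using a scratch directory and a stand-in .env.example for illustration; in practice you would run only the cp line from your project root):

```shell
# Work in a scratch directory with a stand-in .env.example
cd "$(mktemp -d)"
printf 'OPENAI_API_KEY=\nANTHROPIC_API_KEY=\n' > .env.example

# The actual step: copy the example into place unless .env already exists
[ -f .env ] || cp .env.example .env

# Confirm the placeholders carried over (prints the variable names)
grep -oE '^[A-Z_]+' .env
```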
Locate Your API Keys
- You’ll need to gather at least one API key for the LLM (Language Model) you want to use. The supported providers are OpenAI, Anthropic, Gemini, and Qwen.
- OpenAI: Get your API key at https://platform.openai.com/api-keys
- Anthropic: Get your API key at https://console.anthropic.com/account/keys
- Gemini: Get your API key at https://aistudio.google.com/app/apikey
- Qwen: Check the documentation or your Qwen provider for details (often via https://platform.openai.com for compatible APIs)
(Optional) Set Up Vector Store or Chat History
- If you wish to use an external vector store (like Qdrant or Milvus) or PostgreSQL for chat history, also obtain corresponding URIs and API keys from your database/vector store service provider.
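For reference, the external-store settings in .env might look like the fragment below. All values are illustrative placeholders, not real endpoints or credentials; take the actual URIs and keys from your provider's dashboard:

```shell
# Qdrant vector store (example values only)
VECTOR_STORE_TYPE=qdrant
VECTOR_STORE_URL=https://your-cluster.example.com:6333
VECTOR_STORE_API_KEY=your-qdrant-api-key

# PostgreSQL chat history (example connection string only)
CIPHER_PG_URL=postgresql://user:password@localhost:5432/cipher
```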
Open the FastMCP “Install Now” Connection Interface
- Click the “Install Now” button in your MCP/IDE interface for Byterover Cipher.
Enter Environment Variable Values
In the interface, fill in the relevant ENV fields using your keys:
- For OpenAI: OPENAI_API_KEY=sk-...
- For Anthropic: ANTHROPIC_API_KEY=sk-ant-...
- For Gemini: GEMINI_API_KEY=...
- For Qwen: QWEN_API_KEY=...
If using advanced features, fill in:
- VECTOR_STORE_TYPE, VECTOR_STORE_URL, and VECTOR_STORE_API_KEY (for Qdrant or Milvus).
- CIPHER_PG_URL for PostgreSQL chat history.
- AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, and AWS_DEFAULT_REGION for AWS Bedrock.
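The "at least one LLM key" requirement above can be expressed as a small POSIX-shell sanity check (a sketch; Cipher's own startup validation may differ):

```shell
# check_llm_key: succeeds if at least one supported LLM API key is set
check_llm_key() {
  [ -n "$OPENAI_API_KEY" ] || [ -n "$ANTHROPIC_API_KEY" ] \
    || [ -n "$GEMINI_API_KEY" ] || [ -n "$QWEN_API_KEY" ]
}

# Example: warn before launching if no key is configured
check_llm_key || echo "warning: no LLM API key set in the environment" >&2
```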
Save and Complete Installation
- Confirm and save the values in the FastMCP connection interface. The integration will use these environment variables when running Byterover Cipher.
For more details, refer to the official Cipher documentation or LLM provider setup.