DollhouseMCP
A comprehensive Model Context Protocol (MCP) server that enables dynamic AI persona management with …
Open the FastMCP connection interface
- Open the FastMCP connection dialog that contains the "Install Now" button.
- Enter the environment variable names and values into that interface before clicking "Install Now".
Prepare the required environment variable names
- At minimum, prepare these keys (values are explained in the following steps):
  - GITHUB_TOKEN
  - DOLLHOUSE_PORTFOLIO_DIR
  - (optional) POSTHOG_API_KEY
  - (optional telemetry flags) DOLLHOUSE_TELEMETRY_OPTIN, DOLLHOUSE_TELEMETRY_NO_REMOTE, DOLLHOUSE_TELEMETRY
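As a sketch, the keys above could be staged in a local shell session before copying them into the FastMCP form. Every value below is a hypothetical placeholder, not a real credential:

```shell
# Hypothetical placeholder values -- substitute your own before use.
export GITHUB_TOKEN="ghp_exampleToken123"                    # created in the next step
export DOLLHOUSE_PORTFOLIO_DIR="$HOME/.dollhouse/portfolio"  # chosen in a later step
export POSTHOG_API_KEY="phc_exampleKey456"                   # optional
export DOLLHOUSE_TELEMETRY_OPTIN="true"                      # optional telemetry flag
```

Staging them locally first makes it easy to double-check spelling before pasting each key/value pair into the form.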
Create a GitHub Personal Access Token (GITHUB_TOKEN)
- Sign in to github.com with the account you will use for portfolio/GitHub sync.
- Go to Settings → Developer settings → Personal access tokens → Tokens (classic) or Fine‑grained tokens → Generate new token.
- Select the scopes your deployment needs (common choices):
  - For repository sync: repo (or equivalent fine‑grained repository access)
  - If the server will trigger workflows: workflow
  - If you need org-level access: read:org (or the appropriate fine‑grained permissions)
- Create the token and copy it immediately (GitHub shows it only once).
- Paste this token as the value for GITHUB_TOKEN in the FastMCP interface.
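Before pasting the token, a quick local sanity check can catch copy/paste mistakes. The prefixes below are GitHub's documented token formats; the token value is a hypothetical placeholder:

```shell
# Hypothetical placeholder -- use the token you just copied.
GITHUB_TOKEN="ghp_exampleToken123"

# Classic tokens begin with "ghp_"; fine-grained tokens with "github_pat_".
case "$GITHUB_TOKEN" in
  ghp_*|github_pat_*) RESULT="plausible" ;;
  *)                  RESULT="unexpected prefix" ;;
esac
echo "token format: $RESULT"

# Optional live check (requires network access):
#   curl -H "Authorization: Bearer $GITHUB_TOKEN" https://api.github.com/user
```

A format check is no substitute for the live API call, but it catches truncated or partially pasted tokens immediately.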
Choose a portfolio directory path (DOLLHOUSE_PORTFOLIO_DIR)
- On the machine that will run the MCP server, decide on a directory to hold your portfolio (e.g., /home/you/.dollhouse/portfolio or C:\Users\You\.dollhouse\portfolio).
- Create the directory if it does not exist and ensure the MCP process can read/write it.
- Enter that full path as the value for DOLLHOUSE_PORTFOLIO_DIR in the FastMCP interface.
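Creating the directory and confirming read/write access takes a few shell commands. The path below is a throwaway example under the system temp directory; substitute the real path you entered in FastMCP:

```shell
# Example path only -- substitute your actual DOLLHOUSE_PORTFOLIO_DIR.
DOLLHOUSE_PORTFOLIO_DIR="${TMPDIR:-/tmp}/dollhouse-demo/portfolio"

mkdir -p "$DOLLHOUSE_PORTFOLIO_DIR"   # create it (and any parents) if missing
chmod 700 "$DOLLHOUSE_PORTFOLIO_DIR"  # restrict access to the current user

# Confirm the MCP process (running as this user) can read and write it.
touch "$DOLLHOUSE_PORTFOLIO_DIR/.write-test" \
  && rm "$DOLLHOUSE_PORTFOLIO_DIR/.write-test" \
  && echo "portfolio directory is writable"
```

The `chmod 700` is a precaution, not a requirement stated by the project: the portfolio may eventually hold synced private content, so limiting it to the owning user is a sensible default.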
(Optional) Obtain a PostHog API key (POSTHOG_API_KEY)
- Only needed if you want to supply your own PostHog project key (the README notes a default embedded write-only key for community telemetry).
- Sign in to your PostHog account and open the project you want to use.
- Go to Project Settings → API keys (or Keys & Tokens) and copy the project write/API key.
- Paste that key as POSTHOG_API_KEY in the FastMCP interface.
(Optional) Set telemetry flags
- If you want to control telemetry behavior, set one or more of:
  - DOLLHOUSE_TELEMETRY_OPTIN=true (enable remote PostHog telemetry)
  - DOLLHOUSE_TELEMETRY_NO_REMOTE=true (local-only telemetry, no PostHog)
  - DOLLHOUSE_TELEMETRY=false (disable all telemetry)
- Enter the chosen flag(s) and boolean value(s) in the FastMCP interface.
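The flags are plain booleans, so the three configurations above can be sketched as mutually exclusive shell exports (flag names from the list above; pick one and leave the others unset):

```shell
# Option A: opt in to remote PostHog telemetry.
export DOLLHOUSE_TELEMETRY_OPTIN="true"

# Option B: keep telemetry local only -- no data leaves the machine.
# export DOLLHOUSE_TELEMETRY_NO_REMOTE="true"

# Option C: disable all telemetry entirely.
# export DOLLHOUSE_TELEMETRY="false"
```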
Enter values into the FastMCP connection interface
- For each environment variable above, add a new entry in the FastMCP form:
  - Key: the exact env name (e.g., GITHUB_TOKEN)
  - Value: the token or path
  - Mark sensitive fields (GITHUB_TOKEN, POSTHOG_API_KEY) as secret if the interface supports it.
Click "Install Now"
- After all desired environment variables are filled in, click the Install Now button to apply the configuration and start the MCP server.
Verify the server and token access
- Once installed/restarted, test the MCP with a simple command (for example via Claude Desktop): list_elements type="personas"
- If you configured GitHub sync, verify repository access by running a sync command or checking that portfolio operations succeed.
- If telemetry was enabled, check logs (e.g., ~/.dollhouse/telemetry.log) or PostHog project events (if you provided a custom key).
If a token is lost or needs rotation
- Generate a new token using the same provider steps above.
- Update the value in the FastMCP connection interface (replacing the old value).
- Re-run Install Now or restart the MCP server to pick up the change.