RSS Feed Manager
Enables AI to access and manage RSS feed content by parsing OPML files, fetching articles, and filtering them.
Prepare Your MySQL Database
- Make sure you have access to a MySQL server. You will need the following credentials:
  - DB_HOST: The database server address (default: localhost).
  - DB_PORT: The database server port (default: 3306).
  - DB_USERNAME: Your MySQL username (default: root).
  - DB_PASSWORD: Your MySQL user password (default: 123456).
  - DB_DATABASE: The name of the database to use (default: mcp_rss).
- If your MySQL server is not running, you can start one with Docker:
  docker run -itd --name mysql-test -p 3306:3306 -e MYSQL_ROOT_PASSWORD=123456 mysql
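The defaults above can be sketched as a small helper that merges environment overrides onto the documented values. This is illustrative only; the function name and structure are assumptions, not the server's actual code:

```python
import os

# Defaults mirror the documented values; real deployments should
# override them through the environment.
DEFAULTS = {
    "DB_HOST": "localhost",
    "DB_PORT": "3306",
    "DB_USERNAME": "root",
    "DB_PASSWORD": "123456",
    "DB_DATABASE": "mcp_rss",
}

def db_settings(env=None):
    """Merge environment overrides onto the documented defaults."""
    env = os.environ if env is None else env
    return {key: env.get(key, default) for key, default in DEFAULTS.items()}

# With an empty environment, every documented default applies.
print(db_settings({}))
```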
Obtain or Create an OPML File
- Prepare an OPML file that lists your RSS feed subscriptions. This file usually ends with .opml.
- Place this file somewhere accessible on your system.
- Note the full path to this OPML file. You will need to enter it as the OPML_FILE_PATH value.
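A minimal OPML file might look like this (the feed title and URL are placeholders, not real subscriptions):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<opml version="2.0">
  <head>
    <title>My Subscriptions</title>
  </head>
  <body>
    <outline text="Example Blog" type="rss" xmlUrl="https://example.com/feed.xml"/>
  </body>
</opml>
```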
Fill in the FastMCP Connection Interface
- In the FastMCP connection interface, use the “Install Now” button and provide the following environment variables:
  - DB_HOST (e.g., localhost or the address of your MySQL server)
  - DB_PORT (e.g., 3306)
  - DB_USERNAME (e.g., root)
  - DB_PASSWORD (the password for your MySQL user)
  - DB_DATABASE (e.g., mcp_rss)
  - OPML_FILE_PATH (the full path to your .opml file)
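Before saving, it can help to confirm that every required variable is set. A hypothetical validation helper (not part of the server) might look like:

```python
import os

# All six variables from the list above are required.
REQUIRED = [
    "DB_HOST", "DB_PORT", "DB_USERNAME",
    "DB_PASSWORD", "DB_DATABASE", "OPML_FILE_PATH",
]

def missing_variables(env=None):
    """Return the names of required variables that are unset or empty."""
    env = os.environ if env is None else env
    return [name for name in REQUIRED if not env.get(name)]

# Example: only DB_HOST is provided, so five names come back.
print(missing_variables({"DB_HOST": "localhost"}))
```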
(Optional) Adjust RSS Fetch Interval
- If you want to customize how frequently RSS feeds are refreshed, set RSS_UPDATE_INTERVAL (in minutes) in the FastMCP connection interface.
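Since RSS_UPDATE_INTERVAL is given in minutes, a polling loop would convert it to seconds before sleeping. A sketch of that conversion follows; the 60-minute fallback here is an assumption, not the server's documented default:

```python
import os

def update_interval_seconds(env=None, default_minutes=60):
    """Convert RSS_UPDATE_INTERVAL (minutes) into seconds for a timer."""
    env = os.environ if env is None else env
    minutes = int(env.get("RSS_UPDATE_INTERVAL", default_minutes))
    return minutes * 60

# 30 minutes -> 1800 seconds
print(update_interval_seconds({"RSS_UPDATE_INTERVAL": "30"}))
```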
Save the Configuration
- After entering all required values, save or apply the configuration in the FastMCP interface.
You must fill in all marked variables in the connection interface for proper server operation.