Higress AI Search
Enhances AI model responses with real-time search results from various search engines through Higress ai-search.
Determine the LLM Model Name
Identify which AI model you wish to use with the Higress AI-Search MCP Server (e.g., qwen-turbo). If you are unsure which models are available, consult the documentation for your Higress setup or use the default qwen-turbo.
(Optional) Prepare the Higress Service URL
If your Higress instance is running on a non-default address, note the URL (e.g., http://localhost:8080/v1/chat/completions). If unsure, you can use the default value.
(Optional) List Internal Knowledge Bases
If you wish to enhance the AI's responses with internal company documents, provide a descriptive list (e.g., “Employee handbook, company policies, internal process documents”). This step is optional.
Open the FastMCP Connection Interface
Locate and open the FastMCP interface where the server connection environment variables are configured.
Click the “Install Now” Button
Use the pre-made “Install Now” button to add the Higress AI-Search MCP Server.
Fill in the Required ENV Values
In the FastMCP connection interface, enter the following values (see the example configuration after these steps):
MODEL: (required) Enter your chosen model name (e.g., qwen-turbo).
HIGRESS_URL: (optional) Enter your Higress API endpoint or leave as http://localhost:8080/v1/chat/completions.
INTERNAL_KNOWLEDGE_BASES: (optional) Add any internal resource descriptions if you wish to enable custom knowledge search.
Save and Apply
Confirm the values and save the configuration in FastMCP.
No external keys or tokens are required for initial setup beyond the above values.
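For reference, the same three environment variables map onto a standard MCP client configuration. The sketch below is an illustration only and is not taken from this page: the command and package name (uvx higress-ai-search-mcp-server) and the server key are assumptions about how the upstream project is launched locally, and the values are examples. If you install through the FastMCP “Install Now” flow, the hosted interface collects these values for you instead.

```json
{
  "mcpServers": {
    "higress-ai-search": {
      "command": "uvx",
      "args": ["higress-ai-search-mcp-server"],
      "env": {
        "MODEL": "qwen-turbo",
        "HIGRESS_URL": "http://localhost:8080/v1/chat/completions",
        "INTERNAL_KNOWLEDGE_BASES": "Employee handbook, company policies, internal process documents"
      }
    }
  }
}
```

Only MODEL is required; the other two entries can be omitted to fall back to the defaults described above.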
More for Web Search
DuckDuckGo
Experience fast and reliable DuckDuckGo web search with this TypeScript MCP server. It offers a simple search interface supporting customizable queries, result counts, and safe search levels. Built-in rate limiting ensures fair usage with up to 1 request per second and 15,000 per month. The server returns well-formatted Markdown results, making it easy to integrate and display search data. Designed to demonstrate core Model Context Protocol concepts, it also includes helpful debugging tools to inspect communication. Perfect for developers wanting seamless DuckDuckGo integration via MCP with efficient error handling and robust controls.
Microsoft Docs
Access official Microsoft documentation instantly with the Microsoft Learn Docs MCP Server. This cloud service implements the Model Context Protocol (MCP) to enable AI tools like GitHub Copilot and others to perform high-quality semantic searches across Microsoft Learn, Azure, Microsoft 365 docs, and more. It delivers up to 10 concise, context-relevant content chunks in real time, ensuring up-to-date, accurate information. Designed for seamless integration with any MCP-compatible client, it helps AI assistants ground their responses in authoritative, current Microsoft resources for better developer support and productivity.
Brave Search
Integrate Brave Search's web and local search capabilities into AI assistants like Claude. By implementing a Model Context Protocol (MCP) server, it enables the AI to leverage Brave Search for both general web queries and local business lookups, providing tools that enhance the assistant's ability to deliver relevant and up-to-date information.
Exa Search
Empower AI assistants like Claude with real-time web data using the Exa MCP Server. This Model Context Protocol server connects AI models to the Exa AI Search API, enabling safe, up-to-date web searches across diverse tools such as academic papers, company data, LinkedIn, Wikipedia, GitHub, and more. Its flexible toolset enhances research, competitor analysis, and content extraction, providing comprehensive information for smarter AI interactions. Designed for seamless integration with Claude Desktop, the Exa MCP Server boosts AI capabilities by delivering fast, reliable, and controlled access to the latest online information.
Perplexity
Unlock real-time, web-wide research for Claude with Sonar API integration via MCP. Perplexity Ask MCP Server enables natural-language queries by connecting Claude to Perplexity’s Sonar, bringing fresh internet search capabilities directly into your AI workflows. The system is designed for easy expansion and can also be used with compatible apps like Cursor. Ideal for users seeking accurate, up-to-date information in conversational settings, the server’s flexible architecture empowers seamless integration of advanced research tools with your favorite AI platforms.
Similar MCP Servers
Kagi Search
Supercharge your AI tools with fast web search and summarization via the Kagi MCP server. This server connects your Model Context Protocol-compatible apps to advanced search and summarizer features, making it easy to find real-time information and generate quick summaries from web content, articles, or videos. Customize settings such as summarizer engine and logging for flexible performance tailored to your workflow. Ideal for boosting productivity in research or automation tasks, the Kagi MCP server streamlines smart data retrieval with seamless integration into your existing environments.