
Apache Airflow

Updated Nov 21, 2025
Not audited
Integrates with Apache Airflow clusters through REST API to provide complete workflow management including DAG operations, task monitoring, pool and variable management, XCom data access, and performance analytics with event logging and import error tracking.
  1. Obtain Your Airflow Connection Credentials

    • For Airflow 2.x (API v1):
      • The default username and password for the official Airflow Docker/Docker Compose environment is usually:
        • Username: airflow
        • Password: airflow
      • You can verify or change these credentials either via the Airflow web interface (Admin → Users) or by checking the .env file used in your Airflow deployment.
    • For Airflow 3.x (API v2):
      • The default username and password are also typically:
        • Username: airflow
        • Password: airflow
      • For API v2, if your Airflow uses JWT tokens (FabAuthManager), refer to your Airflow’s authentication configuration or the admin panel for your token or credentials.
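For Docker Compose deployments, the credentials mentioned above can be checked from the shell instead of the web UI. This sketch assumes the stock docker-compose setup, where the admin account is configured via the `_AIRFLOW_WWW_USER_USERNAME` / `_AIRFLOW_WWW_USER_PASSWORD` variables in `.env`; run it from your deployment directory:

```shell
# Show any admin credential overrides in the docker-compose .env file;
# fall back to the documented defaults if none are set.
CREDS=$(grep -E '^_AIRFLOW_WWW_USER_(USERNAME|PASSWORD)=' .env 2>/dev/null \
  || echo "no overrides found; defaults are airflow / airflow")
echo "$CREDS"
```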
  2. Find the Airflow API Base URL

    • If using the official Airflow-Docker-Compose test project:
      • For Airflow 2.x: Default base URL is http://localhost:38080/api
      • For Airflow 3.x: Default base URL is http://localhost:48080/api
    • If using your own Airflow instance:
      • The base URL will be http(s)://<your-airflow-host>:<port>/api
      • Confirm the API endpoint version (/api/v1 or /api/v2) depending on your Airflow version.
  3. Set the API Version

    • Decide which Airflow version you are connecting to:
      • Use v1 for Airflow 2.x
      • Use v2 for Airflow 3.x
    • Example: AIRFLOW_API_VERSION=v1 or AIRFLOW_API_VERSION=v2
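Putting steps 2 and 3 together: the versioned endpoint the MCP server calls is simply the base URL joined with the API version. The values below assume the local Airflow 2.x test cluster; substitute your own host and version:

```shell
# Build the versioned endpoint from the base URL and API version.
AIRFLOW_API_BASE_URL="http://localhost:38080/api"   # your Airflow host/port
AIRFLOW_API_VERSION="v1"                            # v1 for Airflow 2.x, v2 for 3.x
DAGS_ENDPOINT="${AIRFLOW_API_BASE_URL}/${AIRFLOW_API_VERSION}/dags"
echo "$DAGS_ENDPOINT"   # → http://localhost:38080/api/v1/dags
```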
  4. (Optional for “streamable-http” mode) Create a Secret Key for Remote Authentication

    • For remote access in production, you should secure the MCP server with a token.
    • Create a strong random secret key (32+ characters recommended).
      • You can generate one using openssl rand -hex 32 or an online password generator.
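The generation step looks like this in a shell, using the `openssl` command mentioned above (`rand -hex 32` emits 32 random bytes as 64 hex characters, which satisfies the 32+ character recommendation):

```shell
# Generate a 32-byte random key, printed as 64 hex characters.
REMOTE_SECRET_KEY=$(openssl rand -hex 32)
echo "Generated key of length ${#REMOTE_SECRET_KEY}"   # 64 hex chars = 32 bytes
```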
  5. Fill in the FastMCP Connection Interface

    • Click the "Install Now" button for the MCP-Airflow-API integration.
    • In the FastMCP interface, when prompted for environment variables, enter the following values:
      • AIRFLOW_API_VERSION: v1 (for Airflow 2.x) or v2 (for Airflow 3.x)
      • AIRFLOW_API_BASE_URL: The base URL of your Airflow API (e.g., http://localhost:38080/api)
      • AIRFLOW_API_USERNAME: Your Airflow username (e.g., airflow)
      • AIRFLOW_API_PASSWORD: Your Airflow password (e.g., airflow)
      • If using remote/“streamable-http” mode with authentication:
        • REMOTE_AUTH_ENABLE: true
        • REMOTE_SECRET_KEY: Your generated secret key
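For reference, a complete set of these variables for the local Airflow 2.x test cluster might look like the following. The values are the illustrative test-cluster defaults from the steps above; never reuse them in production, and replace the placeholder key with the one you generated in step 4:

```shell
AIRFLOW_API_VERSION=v1
AIRFLOW_API_BASE_URL=http://localhost:38080/api
AIRFLOW_API_USERNAME=airflow
AIRFLOW_API_PASSWORD=airflow
# Only needed for remote/streamable-http mode:
REMOTE_AUTH_ENABLE=true
REMOTE_SECRET_KEY="paste-your-generated-key-here"   # placeholder, not a real key
```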
  6. Save and Complete the Integration

    • After entering all required values, save the configuration in FastMCP.
    • Test the connection to make sure MCP-Airflow-API can access your Airflow cluster.
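A quick smoke test of the same connection can be run from the command line before saving. This is a sketch: host, port, and credentials assume the local test cluster, while the `/dags` endpoint itself is part of the standard Airflow REST API:

```shell
# Request the DAG list with basic auth and report the HTTP status code.
# 200 means the URL and credentials are correct; 401 means bad credentials;
# 000 means the host is unreachable.
STATUS=$(curl -s -o /dev/null -w '%{http_code}' \
  -u airflow:airflow \
  "http://localhost:38080/api/v1/dags?limit=1" || true)
echo "HTTP status: $STATUS"
```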

Tip:
If you use the provided test Airflow clusters, all default credentials and URLs will already match the examples above.

Security Note:
Always use strong unique passwords and, for remote deployments, enable authentication and HTTPS.

How to Install Apache Airflow

Install Apache Airflow MCP server with one click through FastMCP. Choose your preferred AI development tool below:

  • Claude Desktop: click "Claude Desktop" in Quick Start
  • Cursor IDE: click "Cursor IDE" in Quick Start
  • VS Code: click "VS Code" in Quick Start

Alternatives to Apache Airflow

Looking for similar MCP servers? Browse other servers in the same categories on FastMCP.


