by yw0nam
A gateway server that bridges the Model Context Protocol (MCP) with the Agent-to-Agent (A2A) protocol, enabling MCP-compatible AI assistants (like Claude) to seamlessly interact with A2A agents.
This project serves as an integration layer between two cutting-edge AI agent protocols:
Model Context Protocol (MCP): Developed by Anthropic, MCP allows AI assistants to connect to external tools and data sources. It standardizes how AI applications and large language models connect to external resources in a secure, composable way.
Agent-to-Agent Protocol (A2A): Developed by Google, A2A enables communication and interoperability between different AI agents through a standardized JSON-RPC interface.
By bridging these protocols, this server allows MCP clients (like Claude) to discover, register, communicate with, and manage tasks on A2A agents through a unified interface.
The package is now available on PyPI!
```bash
# Run with default settings (stdio transport)
uvx mcp-a2a-gateway

# Run with HTTP transport for web clients
MCP_TRANSPORT=streamable-http MCP_PORT=10000 uvx mcp-a2a-gateway

# Run with custom data directory
MCP_DATA_DIR="/Users/your-username/Desktop/a2a_data" uvx mcp-a2a-gateway

# Run with a specific version
uvx mcp-a2a-gateway==0.1.6

# Run with multiple environment variables
MCP_TRANSPORT=stdio MCP_DATA_DIR="/custom/path" LOG_LEVEL=DEBUG uvx mcp-a2a-gateway
```
```bash
# Clone and run locally
git clone https://github.com/yw0nam/MCP-A2A-Gateway.git
cd MCP-A2A-Gateway

# Run with uv
uv run mcp-a2a-gateway

# Run with uvx from the local directory
uvx --from . mcp-a2a-gateway

# Run with a custom environment for development
MCP_TRANSPORT=streamable-http MCP_PORT=8080 uvx --from . mcp-a2a-gateway
```
Cloud-deployed agents are also supported.
- Agent Management
- Communication
- Task Management
- Transport Support
Before you begin, ensure you have the following installed:
Option 1: Direct Run with uvx (Recommended)
Run directly without installation using uvx:

```bash
uvx mcp-a2a-gateway
```
Option 2: Local Development
```bash
git clone https://github.com/yw0nam/MCP-A2A-Gateway.git
cd MCP-A2A-Gateway

# Run with uv
uv run mcp-a2a-gateway

# Or run with uvx from the local directory
uvx --from . mcp-a2a-gateway
```
Option 3: HTTP (For Web Clients)
Start the server with HTTP transport:
```bash
# Using uvx
MCP_TRANSPORT=streamable-http MCP_HOST=0.0.0.0 MCP_PORT=10000 uvx mcp-a2a-gateway
```
Option 4: Server-Sent Events
Start the server with SSE transport:
```bash
# Using uvx
MCP_TRANSPORT=sse MCP_HOST=0.0.0.0 MCP_PORT=10000 uvx mcp-a2a-gateway
```
The server can be configured using the following environment variables:
| Variable | Default | Description |
|---|---|---|
| `MCP_TRANSPORT` | `stdio` | Transport type: `stdio`, `streamable-http`, or `sse` |
| `MCP_HOST` | `0.0.0.0` | Host for HTTP/SSE transports |
| `MCP_PORT` | `8000` | Port for HTTP/SSE transports |
| `MCP_PATH` | `/mcp` | HTTP endpoint path |
| `MCP_DATA_DIR` | `data` | Directory for persistent data storage |
| `MCP_REQUEST_TIMEOUT` | `30` | Request timeout in seconds |
| `MCP_REQUEST_IMMEDIATE_TIMEOUT` | `2` | Immediate response timeout in seconds |
| `LOG_LEVEL` | `INFO` | Logging level: `DEBUG`, `INFO`, `WARNING`, or `ERROR` |
Example .env file:
```bash
# Transport configuration
MCP_TRANSPORT=stdio
MCP_HOST=0.0.0.0
MCP_PORT=10000
MCP_PATH=/mcp

# Data storage
MCP_DATA_DIR=/Users/your-username/Desktop/data/a2a_gateway

# Timeouts
MCP_REQUEST_TIMEOUT=30
MCP_REQUEST_IMMEDIATE_TIMEOUT=2

# Logging
LOG_LEVEL=INFO
```
The A2A MCP Server supports multiple transport types:
- `stdio` (default): Uses standard input/output for communication
- `streamable-http` (recommended for web clients): HTTP transport with streaming support
- `sse`: Server-Sent Events transport
For HTTP/SSE Transport
Add the following to your VS Code `settings.json` for SSE or HTTP transport:
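The original configuration block was not captured here. As a hedged sketch only (the `mcp.servers` settings shape and the `/mcp` endpoint path are assumptions based on common VS Code MCP setups and this project's defaults, not verbatim from the source), an HTTP entry could look like:

```json
{
  "mcp": {
    "servers": {
      "mcp-a2a-gateway": {
        "type": "http",
        "url": "http://localhost:10000/mcp"
      }
    }
  }
}
```

For SSE, set `"type": "sse"` and point the URL at the server's SSE endpoint instead.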
For STDIO Transport - Using uvx (Published Package)
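The captured text omits the actual JSON; a minimal sketch, assuming the same `mcp.servers` settings shape:

```json
{
  "mcp": {
    "servers": {
      "mcp-a2a-gateway": {
        "command": "uvx",
        "args": ["mcp-a2a-gateway"],
        "env": {
          "MCP_TRANSPORT": "stdio"
        }
      }
    }
  }
}
```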
For STDIO Transport - Using uvx (Local Development)
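A sketch under the same assumptions; replace `/path/to/MCP-A2A-Gateway` with the path to your local checkout:

```json
{
  "mcp": {
    "servers": {
      "mcp-a2a-gateway": {
        "command": "uvx",
        "args": ["--from", "/path/to/MCP-A2A-Gateway", "mcp-a2a-gateway"],
        "env": {
          "MCP_TRANSPORT": "stdio"
        }
      }
    }
  }
}
```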
For STDIO Transport - Using uv (Local Development)
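A sketch under the same assumptions, running the local checkout through `uv`:

```json
{
  "mcp": {
    "servers": {
      "mcp-a2a-gateway": {
        "command": "uv",
        "args": ["--directory", "/path/to/MCP-A2A-Gateway", "run", "mcp-a2a-gateway"],
        "env": {
          "MCP_TRANSPORT": "stdio"
        }
      }
    }
  }
}
```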
Using uvx (Published Package)
Add this to your `claude_config.json`:
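The original snippet was not captured; a minimal sketch in Claude's `mcpServers` format (the server key name and `env` values are illustrative, not taken from the source):

```json
{
  "mcpServers": {
    "mcp-a2a-gateway": {
      "command": "uvx",
      "args": ["mcp-a2a-gateway"],
      "env": {
        "MCP_TRANSPORT": "stdio",
        "MCP_DATA_DIR": "/Users/your-username/Desktop/a2a_data"
      }
    }
  }
}
```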
Using uvx (Local Development)
Add this to your `claude_config.json`:
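A sketch under the same assumptions; replace `/path/to/MCP-A2A-Gateway` with the path to your local checkout:

```json
{
  "mcpServers": {
    "mcp-a2a-gateway": {
      "command": "uvx",
      "args": ["--from", "/path/to/MCP-A2A-Gateway", "mcp-a2a-gateway"],
      "env": {
        "MCP_TRANSPORT": "stdio"
      }
    }
  }
}
```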
Using uv (Local Development)
Add this to your `claude_config.json`:
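A sketch under the same assumptions, running the local checkout through `uv`:

```json
{
  "mcpServers": {
    "mcp-a2a-gateway": {
      "command": "uv",
      "args": ["--directory", "/path/to/MCP-A2A-Gateway", "run", "mcp-a2a-gateway"],
      "env": {
        "MCP_TRANSPORT": "stdio"
      }
    }
  }
}
```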
The server exposes the following MCP tools for integration with LLMs like Claude:
`register_agent`: Register an A2A agent with the bridge server
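The example that accompanied this tool was not captured. As an illustration only (the `url` parameter name and the port are assumptions, not confirmed by the source), a call registering a locally running A2A agent might look like:

```json
{
  "tool": "register_agent",
  "arguments": {
    "url": "http://localhost:41242"
  }
}
```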
#### Option 1: Using the Release Script

Run the release script with the type of version bump you want:

```bash
./release.sh patch
./release.sh minor
./release.sh major
```
The script will:
1. Check that you are on the main branch with a clean working directory
2. Automatically bump the version in `pyproject.toml`
3. Build and test the package locally
4. Commit the version change and create a git tag
5. Push to GitHub, triggering automated PyPI publishing
#### Option 2: Manual Tag Creation
```bash
# Update version in pyproject.toml manually
# Then create and push a tag
git add pyproject.toml
git commit -m "chore: bump version to 0.1.7"
git tag v0.1.7
git push origin main
git push origin v0.1.7
```
To enable automated publishing, add your PyPI API token to GitHub Secrets:

1. Get a PyPI API token: Create an API token on PyPI; it will start with `pypi-`.
2. Add it to GitHub Secrets: Store the token as a repository secret named `PYPI_API_TOKEN` (a GitHub CLI sketch follows this list).
3. Test the workflow: Push a version tag and confirm that the publishing workflow runs.
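If you use the GitHub CLI, the secret can also be added from a terminal; this sketch assumes `gh` is installed and authenticated for the repository:

```bash
# Store the PyPI token as a repository secret named PYPI_API_TOKEN
# (gh prompts for the secret value interactively)
gh secret set PYPI_API_TOKEN
```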
For emergency releases or local testing:
```bash
# Build and get manual publish instructions
./publish.sh

# Or publish directly (with credentials configured)
uv build
uv publish
```