by Deploya-labs
Project Note: This MCP server implementation builds upon the browser-use/web-ui foundation. Core browser automation logic and configuration patterns are adapted from the original project.
AI-driven browser automation server implementing the Model Context Protocol (MCP) for natural language browser control.
Add the server to your Claude Desktop configuration file:

- On macOS: `~/Library/Application\ Support/Claude/claude_desktop_config.json`
- On Windows: `%APPDATA%/Claude/claude_desktop_config.json`
*Configuration content*
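The configuration content itself is omitted above. As a rough, non-authoritative sketch, an entry for this server might look like the following; the `browser-use` key, the checkout path, and the example `env` values are illustrative and should be adapted to your setup:

```json
{
  "mcpServers": {
    "browser-use": {
      "command": "uv",
      "args": ["--directory", "/path/to/mcp-browser-use", "run", "mcp-server-browser-use"],
      "env": {
        "MCP_MODEL_PROVIDER": "anthropic",
        "ANTHROPIC_API_KEY": "YOUR_API_KEY",
        "MCP_USE_VISION": "true"
      }
    }
  }
}
```

The environment variables the server accepts are listed in the provider table further below.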
```bash
# Install dev dependencies
uv sync

# Run with debugger
npx @modelcontextprotocol/inspector uv --directory . run mcp-server-browser-use
```
Set `MCP_USE_VISION=true` to enable screenshot analysis. The server supports multiple LLM providers through environment variables. The available options for `MCP_MODEL_PROVIDER` are:
| Provider | Value | Required Env Variables |
|---|---|---|
| Anthropic | `anthropic` | `ANTHROPIC_API_KEY`, `ANTHROPIC_ENDPOINT` (optional) |
| OpenAI | `openai` | `OPENAI_API_KEY`, `OPENAI_ENDPOINT` (optional) |
| Azure OpenAI | `azure_openai` | `AZURE_OPENAI_API_KEY`, `AZURE_OPENAI_ENDPOINT` |
| DeepSeek | `deepseek` | `DEEPSEEK_API_KEY`, `DEEPSEEK_ENDPOINT` (optional) |
| Gemini | `gemini` | `GOOGLE_API_KEY` |
| Mistral | `mistral` | `MISTRAL_API_KEY`, `MISTRAL_ENDPOINT` (optional) |
| Ollama | `ollama` | `OLLAMA_ENDPOINT` (optional, defaults to `localhost:11434`) |
| OpenRouter | `openrouter` | `OPENROUTER_API_KEY`, `OPENROUTER_ENDPOINT` (optional) |
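As an illustration (not prescriptive), selecting a provider might look like this, however your setup supplies environment variables (shell exports or the client's `env` block); the API key value is a placeholder:

```bash
# Cloud provider: Anthropic
MCP_MODEL_PROVIDER=anthropic
ANTHROPIC_API_KEY=YOUR_API_KEY        # required
# ANTHROPIC_ENDPOINT=...              # optional custom endpoint

# Local alternative: Ollama (no API key needed)
# MCP_MODEL_PROVIDER=ollama
# OLLAMA_ENDPOINT=http://localhost:11434   # optional; defaults to localhost:11434

MCP_USE_VISION=true                   # enable screenshot analysis
```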
Additional model settings are configured through environment variables:

- `MCP_TEMPERATURE` (default: 0.3)
- `MCP_MODEL_NAME`
- For Ollama, `num_ctx` and `num_predict` are configurable

This project extends browser-use/web-ui under the MIT License. Special thanks to the original authors for their browser automation framework.
MIT - See LICENSE for details.