
MCP Browser Use

by Deploya-labs


Project Note: This MCP server implementation builds upon the browser-use/web-ui foundation. Core browser automation logic and configuration patterns are adapted from the original project.

AI-driven browser automation server implementing the Model Context Protocol (MCP) for natural language browser control.


Features

  • 🧠 MCP Integration - Full protocol implementation for AI agent communication
  • 🌐 Browser Automation - Page navigation, form filling, and element interaction
  • 👁️ Visual Understanding - Screenshot analysis and vision-based interactions
  • 🔄 State Persistence - Maintain browser sessions between tasks
  • 🔌 Multi-LLM Support - OpenAI, Anthropic, Azure OpenAI, DeepSeek, Gemini, Mistral, Ollama, and OpenRouter integration

Quick Start

Prerequisites

  • Python 3.11 or higher
  • uv (fast Python package installer)
  • Chrome/Chromium browser
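
A quick sanity check of these prerequisites from a terminal might look like the sketch below; the uv installer URL is Astral's official script, and the browser binary names are assumptions that vary by platform.

# Confirm the Python version (3.11 or higher is required)
python3 --version

# Install uv if it is not already available (official Astral installer script)
curl -LsSf https://astral.sh/uv/install.sh | sh

# Check that a Chrome/Chromium binary is on the PATH (binary names vary by platform)
which google-chrome chromium chromium-browser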

Installation

Claude Desktop

On macOS: ~/Library/Application\ Support/Claude/claude_desktop_config.json
On Windows: %APPDATA%/Claude/claude_desktop_config.json

*Configuration content*
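
The exact configuration block is not reproduced in this listing. As a rough sketch, a typical MCP server entry in claude_desktop_config.json has the shape below; the uvx command, the "browser-use" server name, and the Anthropic key shown here are assumptions, so follow the project documentation for the exact values.

{
  "mcpServers": {
    "browser-use": {
      "command": "uvx",
      "args": ["mcp-server-browser-use"],
      "env": {
        "MCP_MODEL_PROVIDER": "anthropic",
        "ANTHROPIC_API_KEY": "your-api-key-here",
        "MCP_USE_VISION": "true"
      }
    }
  }
}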

Local Development

*Configuration content*
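
For a local checkout, the entry typically points uv at the repository directory instead, mirroring the debugger command in the Development section below; the directory path here is a placeholder.

{
  "mcpServers": {
    "browser-use": {
      "command": "uv",
      "args": ["--directory", "/path/to/mcp-browser-use", "run", "mcp-server-browser-use"],
      "env": {
        "MCP_MODEL_PROVIDER": "anthropic",
        "ANTHROPIC_API_KEY": "your-api-key-here"
      }
    }
  }
}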

Development

# Install dev dependencies
uv sync

# Run with debugger
npx @modelcontextprotocol/inspector uv --directory . run mcp-server-browser-use

Troubleshooting

  • Browser Conflicts: Close all Chrome instances before starting.
  • API Errors: Verify API keys in environment variables match your LLM provider.
  • Vision Support: Ensure MCP_USE_VISION=true for screenshot analysis.
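
A minimal shell sketch for these checks, assuming a Unix-like system and Anthropic as the provider:

# Close any running Chrome/Chromium instances before starting the server
pkill -f chrome || true

# Enable vision support and confirm the provider key is present in the environment
export MCP_USE_VISION=true
echo "${ANTHROPIC_API_KEY:+ANTHROPIC_API_KEY is set}"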

Provider Configuration

The server supports multiple LLM providers through environment variables. Here are the available options for MCP_MODEL_PROVIDER:

Provider | Value | Required Env Variables
Anthropic | anthropic | ANTHROPIC_API_KEY, ANTHROPIC_ENDPOINT (optional)
OpenAI | openai | OPENAI_API_KEY, OPENAI_ENDPOINT (optional)
Azure OpenAI | azure_openai | AZURE_OPENAI_API_KEY, AZURE_OPENAI_ENDPOINT
DeepSeek | deepseek | DEEPSEEK_API_KEY, DEEPSEEK_ENDPOINT (optional)
Gemini | gemini | GOOGLE_API_KEY
Mistral | mistral | MISTRAL_API_KEY, MISTRAL_ENDPOINT (optional)
Ollama | ollama | OLLAMA_ENDPOINT (optional, defaults to localhost:11434)
OpenRouter | openrouter | OPENROUTER_API_KEY, OPENROUTER_ENDPOINT (optional)

Notes:

  • For endpoints marked as optional, default values will be used if not specified
  • Temperature can be configured using MCP_TEMPERATURE (default: 0.3)
  • Model can be specified using MCP_MODEL_NAME
  • For Ollama models, additional context settings like num_ctx and num_predict are configurable
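
Putting the table and notes together, a provider is typically selected by exporting the relevant variables before launching the server; the key and model name below are illustrative placeholders, not values confirmed by the project.

# Example: use Anthropic as the provider (key and model name are placeholders)
export MCP_MODEL_PROVIDER=anthropic
export ANTHROPIC_API_KEY=your-api-key-here
export MCP_MODEL_NAME=claude-3-5-sonnet-20241022
export MCP_TEMPERATURE=0.3

# Example: use a local Ollama instance instead (no API key required)
# export MCP_MODEL_PROVIDER=ollama
# export OLLAMA_ENDPOINT=http://localhost:11434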

Credits

This project extends browser-use/web-ui under the MIT License. Special thanks to the original authors for their browser automation framework.

License

MIT - See LICENSE for details.
