by spences10
This repository is no longer maintained.
The functionality of this tool is now available in mcp-omnisearch, which combines multiple MCP tools in one unified package.
Please use mcp-omnisearch instead.
A Model Context Protocol (MCP) server for integrating Jina.ai's Reader
API with LLMs. This server provides efficient and comprehensive web
content extraction capabilities, optimized for documentation and web
content analysis.
This server requires configuration through your MCP client. Here are
examples for different environments:
Add this to your Cline MCP settings:
*Configuration content*
For WSL environments, add this to your Claude Desktop configuration:
*Configuration content*
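Since the configuration content itself is not included here, the following is a minimal sketch of the standard MCP client configuration shape. The server key name and the `npx` invocation are assumptions; adapt the command to however you install this package:

```json
{
  "mcpServers": {
    "jinaai-reader": {
      "command": "npx",
      "args": ["-y", "mcp-jinaai-reader"],
      "env": {
        "JINAAI_API_KEY": "your-api-key-here"
      }
    }
  }
}
```

The `env` block is where the required `JINAAI_API_KEY` variable described below is supplied to the server process.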
The server requires the following environment variable:
JINAAI_API_KEY: Your Jina.ai API key (required)

The server implements a single MCP tool with configurable parameters:
Convert any URL to LLM-friendly text using Jina.ai Reader.
Parameters:

- url (string, required): URL to process
- no_cache (boolean, optional): Bypass cache for fresh results
- format (string, optional): Response format ("json" or "stream")
- timeout (number, optional): Maximum time in seconds to wait for page load
- target_selector (string, optional): CSS selector to focus on specific content
- wait_for_selector (string, optional): CSS selector to wait for before extraction
- remove_selector (string, optional): CSS selector to exclude from the result
- with_links_summary (boolean, optional): Gather all links at the end of the response
- with_images_summary (boolean, optional): Gather all images at the end of the response
- with_generated_alt (boolean, optional): Add generated alt text to images
- with_iframe (boolean, optional): Include iframe content in the result

Development:

npm install
npm run build
npm run dev

Publishing:

npm run build
npm publish
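To illustrate how the tool parameters described above fit together, here is a sketch of the arguments an MCP client might send in a tool call. The tool name `read_url` and all argument values are assumptions for illustration, not taken from this README:

```json
{
  "name": "read_url",
  "arguments": {
    "url": "https://example.com/docs",
    "timeout": 30,
    "with_links_summary": true,
    "remove_selector": "nav, footer"
  }
}
```

Optional parameters that are omitted simply fall back to the server's defaults.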
Contributions are welcome! Please feel free to submit a Pull Request.
MIT License - see the LICENSE file for details.