# Open Deep Research MCP Server
An AI-powered research assistant that performs deep, iterative research on any topic. It combines search engines, web scraping, and AI to explore topics in depth and generate comprehensive reports. Available as a Model Context Protocol (MCP) tool or standalone CLI.
## Quick Start
- Clone and install:

```bash
git clone https://github.com/Ozamatash/deep-research
cd deep-research
npm install
```
- Set up the environment in `.env.local`:

```bash
# Copy the example environment file
cp .env.example .env.local
```
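After copying, fill in the API keys the app needs. The sketch below shows the general shape; the exact variable names here are assumptions, so verify them against the keys listed in `.env.example`:

```bash
# API key for Firecrawl (search and scraping) - name assumed, check .env.example
FIRECRAWL_KEY="your_firecrawl_api_key"
# API key for the model that generates queries and reports - name assumed
OPENAI_KEY="your_openai_api_key"
```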
- Build:

```bash
# Build the server
npm run build
```
- Run the CLI version:

```bash
npm run start "Your research query here"
```
- Test the MCP server with Claude Desktop:

Follow the guide at the bottom of the MCP server quickstart to add the server to Claude Desktop:
https://modelcontextprotocol.io/quickstart/server
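The quickstart walks through editing Claude Desktop's `claude_desktop_config.json`. As a rough sketch, an entry for this server would look like the following; the path to the built entry point is an assumption, so point `args` at whatever file `npm run build` actually produces:

```json
{
  "mcpServers": {
    "deep-research": {
      "command": "node",
      "args": ["/absolute/path/to/deep-research/dist/mcp-server.js"]
    }
  }
}
```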
## Features
- Performs deep, iterative research by generating targeted search queries
- Controls research scope with depth (how deep) and breadth (how wide) parameters
- Generates follow-up questions to better understand research needs
- Produces detailed markdown reports with findings and sources
- Available as a Model Context Protocol (MCP) tool for AI agents
- For now, the MCP version doesn't ask follow-up questions
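To illustrate how the depth and breadth parameters shape a request, here is a sketch of an MCP tool call; the tool name and exact argument names are illustrative assumptions, not the server's confirmed schema:

```json
{
  "name": "deep-research",
  "arguments": {
    "query": "Recent advances in solid-state battery manufacturing",
    "depth": 2,
    "breadth": 4
  }
}
```

Higher depth lets each line of inquiry iterate further; higher breadth generates more parallel search queries at each step.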
## How It Works
## Advanced Setup
### Using Local Firecrawl (Free Option)
Instead of using the Firecrawl API, you can run a local instance. Use the official repo, or my fork, which uses SearXNG as the search backend so you don't need a search API key:
- Set up local Firecrawl:

```bash
git clone https://github.com/Ozamatash/localfirecrawl
cd localfirecrawl
# Follow setup in the localfirecrawl README
```
- Update `.env.local`:

```bash
FIRECRAWL_BASE_URL="http://localhost:3002"
```
### Optional: Observability
Add observability to track research flows, queries, and results using Langfuse:
```bash
# Add to .env.local
LANGFUSE_PUBLIC_KEY="your_langfuse_public_key"
LANGFUSE_SECRET_KEY="your_langfuse_secret_key"
```
The app works normally without observability if no Langfuse keys are provided.
## License
MIT License