Web search using free multi-engine search (no API keys required) — supports Bing, Baidu, DuckDuckGo, Brave, Exa, and CSDN.
A Model Context Protocol (MCP) server that scrapes results from multiple search engines, providing free web search without any API keys.
The fastest way to get started:
```bash
# Basic usage
npx open-websearch@latest

# With environment variables (Linux/macOS)
DEFAULT_SEARCH_ENGINE=duckduckgo ENABLE_CORS=true npx open-websearch@latest

# Windows PowerShell
$env:DEFAULT_SEARCH_ENGINE="duckduckgo"; $env:ENABLE_CORS="true"; npx open-websearch@latest

# Cross-platform (requires cross-env; used for local development)
npm install -g open-websearch
npx cross-env DEFAULT_SEARCH_ENGINE=duckduckgo ENABLE_CORS=true open-websearch
```
Environment Variables:
| Variable | Default | Options | Description |
|---|---|---|---|
| `ENABLE_CORS` | `false` | `true`, `false` | Enable CORS |
| `CORS_ORIGIN` | `*` | Any valid origin | CORS origin configuration |
| `DEFAULT_SEARCH_ENGINE` | `bing` | `bing`, `duckduckgo`, `exa`, `brave` | Default search engine |
| `USE_PROXY` | `false` | `true`, `false` | Enable HTTP proxy |
| `PROXY_URL` | `http://127.0.0.1:7890` | Any valid URL | Proxy server URL |
| `PORT` | `3000` | `1-65535` | Server port |
Common configurations:
```bash
# Enable proxy for restricted regions
USE_PROXY=true PROXY_URL=http://127.0.0.1:7890 npx open-websearch@latest

# Full configuration
DEFAULT_SEARCH_ENGINE=duckduckgo ENABLE_CORS=true USE_PROXY=true PROXY_URL=http://127.0.0.1:7890 PORT=8080 npx open-websearch@latest
```
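The defaults from the table above can be modeled as a small config loader. This sketch is illustrative only (the server's actual internal names may differ); it shows how each environment variable falls back to its documented default:

```javascript
// Resolve the documented configuration from environment variables,
// falling back to the defaults listed in the table above.
function loadConfig(env = process.env) {
  return {
    enableCors: env.ENABLE_CORS === "true",                      // default: false
    corsOrigin: env.CORS_ORIGIN ?? "*",                          // default: *
    defaultSearchEngine: env.DEFAULT_SEARCH_ENGINE ?? "bing",    // default: bing
    useProxy: env.USE_PROXY === "true",                          // default: false
    proxyUrl: env.PROXY_URL ?? "http://127.0.0.1:7890",          // default proxy URL
    port: Number(env.PORT ?? 3000),                              // default: 3000
  };
}
```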
To build from source:

```bash
npm install
npm run build
```
Cherry Studio:

```json
{
  "mcpServers": {
    "web-search": {
      "name": "Web Search MCP",
      "type": "streamableHttp",
      "description": "Multi-engine web search with article fetching",
      "isActive": true,
      "baseUrl": "http://localhost:3000/mcp"
    }
  }
}
```

VSCode (Claude Dev Extension):

```json
{
  "mcpServers": {
    "web-search": {
      "transport": {
        "type": "streamableHttp",
        "url": "http://localhost:3000/mcp"
      }
    },
    "web-search-sse": {
      "transport": {
        "type": "sse",
        "url": "http://localhost:3000/sse"
      }
    }
  }
}
```

Claude Desktop:

```json
{
  "mcpServers": {
    "web-search": {
      "transport": {
        "type": "streamableHttp",
        "url": "http://localhost:3000/mcp"
      }
    },
    "web-search-sse": {
      "transport": {
        "type": "sse",
        "url": "http://localhost:3000/sse"
      }
    }
  }
}
```
Quick deployment using Docker Compose:

```bash
docker-compose up -d
```
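For reference, a minimal `docker-compose.yml` sketch matching the image and environment variables documented below (the service name and engine choice are illustrative; adjust to your setup):

```yaml
services:
  web-search:
    image: ghcr.io/aas-ee/open-web-search:latest
    ports:
      - "3000:3000"
    environment:
      - ENABLE_CORS=true
      - CORS_ORIGIN=*
      - DEFAULT_SEARCH_ENGINE=duckduckgo
```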
Or use Docker directly:

```bash
docker run -d --name web-search -p 3000:3000 -e ENABLE_CORS=true -e CORS_ORIGIN=* ghcr.io/aas-ee/open-web-search:latest
```
Environment variable configuration:

| Variable | Default | Options | Description |
|---|---|---|---|
| `ENABLE_CORS` | `false` | `true`, `false` | Enable CORS |
| `CORS_ORIGIN` | `*` | Any valid origin | CORS origin configuration |
| `DEFAULT_SEARCH_ENGINE` | `bing` | `bing`, `duckduckgo`, `exa`, `brave` | Default search engine |
| `USE_PROXY` | `false` | `true`, `false` | Enable HTTP proxy |
| `PROXY_URL` | `http://127.0.0.1:7890` | Any valid URL | Proxy server URL |
| `PORT` | `3000` | `1-65535` | Server port |
Then configure in your MCP client:
```json
{
  "mcpServers": {
    "web-search": {
      "name": "Web Search MCP",
      "type": "streamableHttp",
      "description": "Multi-engine web search with article fetching",
      "isActive": true,
      "baseUrl": "http://localhost:3000/mcp"
    },
    "web-search-sse": {
      "name": "Web Search MCP",
      "type": "sse",
      "description": "Multi-engine web search with article fetching",
      "isActive": true,
      "baseUrl": "http://localhost:3000/sse"
    }
  }
}
```
The server provides five tools: `search`, `fetchLinuxDoArticle`, `fetchCsdnArticle`, `fetchGithubReadme`, and `fetchJuejinArticle`.
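Over the streamable HTTP transport, MCP tools are invoked with JSON-RPC 2.0 `tools/call` requests. As a sketch of what a raw call looks like (most clients build this for you; the helper name here is hypothetical):

```javascript
// Build an MCP JSON-RPC 2.0 "tools/call" request body.
function buildToolCall(id, toolName, args) {
  return {
    jsonrpc: "2.0",
    id,
    method: "tools/call",
    params: { name: toolName, arguments: args },
  };
}

// Example: a search request, to be POSTed as JSON to http://localhost:3000/mcp
const body = buildToolCall(1, "search", { query: "mcp server", limit: 3 });
```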
`search` parameters:

```typescript
{
  "query": string,      // Search query
  "limit": number,      // Optional: number of results to return (default: 10)
  "engines": string[]   // Optional: engines to use (bing, baidu, linuxdo, csdn, duckduckgo, exa, brave, juejin); default: bing
}
```
Usage example:
```javascript
use_mcp_tool({
  server_name: "web-search",
  tool_name: "search",
  arguments: {
    query: "search content",
    limit: 3, // Optional parameter
    engines: ["bing", "csdn", "duckduckgo", "exa", "brave", "juejin"] // Optional parameter, supports multi-engine combined search
  }
})
```
Response example:
```json
[
  {
    "title": "Example Search Result",
    "url": "https://example.com",
    "description": "Description text of the search result...",
    "source": "Source",
    "engine": "Engine used"
  }
]
```
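Because multi-engine queries return a single flat array in the shape above, a client will often want to regroup results per engine. A small illustrative helper (not part of the server's API):

```javascript
// Group an array of search results (shape shown above) by their "engine" field.
function groupByEngine(results) {
  const grouped = {};
  for (const r of results) {
    (grouped[r.engine] ??= []).push(r);
  }
  return grouped;
}
```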
Used to fetch complete content of CSDN blog articles.
```typescript
{
  "url": string  // URL from CSDN search results using the search tool
}
```
Usage example:
```javascript
use_mcp_tool({
  server_name: "web-search",
  tool_name: "fetchCsdnArticle",
  arguments: {
    url: "https://blog.csdn.net/xxx/article/details/xxx"
  }
})
```
Response example:
```json
[
  {
    "content": "Example search result"
  }
]
```
Used to fetch complete content of Linux.do forum articles.
```typescript
{
  "url": string  // URL from linuxdo search results using the search tool
}
```
Usage example:
```javascript
use_mcp_tool({
  server_name: "web-search",
  tool_name: "fetchLinuxDoArticle",
  arguments: {
    url: "https://xxxx.json"
  }
})
```
Response example:
```json
[
  {
    "content": "Example search result"
  }
]
```
Used to fetch README content from GitHub repositories.
```typescript
{
  "url": string  // GitHub repository URL (supports HTTPS and SSH formats)
}
```
Usage example:
```javascript
use_mcp_tool({
  server_name: "web-search",
  tool_name: "fetchGithubReadme",
  arguments: {
    url: "https://github.com/Aas-ee/open-webSearch"
  }
})
```
Supported URL formats:

- `https://github.com/owner/repo`
- `https://github.com/owner/repo.git`
- `git@github.com:owner/repo.git`
- `https://github.com/owner/repo?tab=readme`
Response example:
```json
[
  {
    "content": "<div align=\"center\">\n\n# Open-WebSearch MCP Server..."
  }
]
```
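All four accepted URL formats above reduce to an owner/repo pair. A hypothetical helper showing one way to normalize them (the server's actual parsing may differ):

```javascript
// Extract { owner, repo } from the GitHub URL formats listed above:
// HTTPS, HTTPS with .git, SSH, and HTTPS with a query string.
function parseGithubRepo(url) {
  const m = url.match(
    /^(?:https:\/\/github\.com\/|git@github\.com:)([^/]+)\/([^/?#]+?)(?:\.git)?(?:[?#].*)?$/
  );
  return m ? { owner: m[1], repo: m[2] } : null;
}
```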
Used to fetch complete content of Juejin articles.
```typescript
{
  "url": string  // Juejin article URL from search results
}
```
Usage example:
```javascript
use_mcp_tool({
  server_name: "web-search",
  tool_name: "fetchJuejinArticle",
  arguments: {
    url: "https://juejin.cn/post/7520959840199360563"
  }
})
```
Supported URL format:

- `https://juejin.cn/post/{article_id}`
Response example:
```json
[
  {
    "content": "🚀 Open-source AI web search tool: Open-WebSearch MCP fully upgraded, with multi-engine support + streaming responses..."
  }
]
```
Since this tool works by scraping multi-engine search results, please note the following important limitations:

- Rate Limiting
- Result Accuracy
- Legal Terms
- Search Engine Configuration: the default engine can be set via the `DEFAULT_SEARCH_ENGINE` environment variable
- Proxy Configuration: enable with `USE_PROXY=true` and set the proxy address via `PROXY_URL`
Issue reports and feature improvement suggestions are welcome!
If you want to fork this repository and publish your own Docker image, you need the following configuration:

To enable automatic Docker image building and publishing, add the following secrets in your GitHub repository settings (Settings → Secrets and variables → Actions):

Required secrets:

- `GITHUB_TOKEN`: automatically provided by GitHub (no setup needed)

Optional secrets (for Alibaba Cloud ACR):

- `ACR_REGISTRY`: your Alibaba Cloud Container Registry URL (e.g., `registry.cn-hangzhou.aliyuncs.com`)
- `ACR_USERNAME`: your Alibaba Cloud ACR username
- `ACR_PASSWORD`: your Alibaba Cloud ACR password
- `ACR_IMAGE_NAME`: your image name in ACR (e.g., `your-namespace/open-web-search`)

The repository includes a GitHub Actions workflow (`.github/workflows/docker.yml`) that automatically builds and publishes images.

Trigger conditions:

- Pushes to the `main` branch
- Version tags (`v*`)

Builds and pushes to GitHub Container Registry (GHCR) and, if configured, Alibaba Cloud ACR.

Image tags:

- `ghcr.io/your-username/open-web-search:latest`
- `your-acr-address/your-image-name:latest` (if ACR is configured)

Push to the `main` branch or create a version tag, and users can then run:

```bash
docker run -d --name web-search -p 3000:3000 -e ENABLE_CORS=true -e CORS_ORIGIN=* ghcr.io/your-username/open-web-search:latest
```