Explore MCP Servers and Clients
Herald - The Self-Hosted MCP Bridge Between Claude Chat and Claude Code
You're on the couch. On your phone. You open Claude Chat and type: "Refactor the auth middleware in my-api to use JWT instead of session cookies. Run the tests." Four minutes later, it's done. Branch created, code refactored, tests passing, changes committed. Your workstation did all the work. You never opened your laptop. That's Herald.

The Problem
Claude Chat and Claude Code are two brilliant tools that live in completely separate worlds. You've been copy-pasting between them. Or worse: you've been waiting until you're back at your desk. That's over.

The Solution
Herald is a self-hosted MCP server that bridges Claude Chat to Claude Code using Anthropic's official Custom Connectors protocol. One Go binary. Zero hacks.
Synter
MCP server for AI agents to manage ad campaigns across Google, Meta, LinkedIn, Microsoft, Reddit, TikTok, and more. Create campaigns, pull performance data, generate AI creatives, and optimize budgets. For conversion tracking, it also integrates with Google Analytics, PostHog, HubSpot, Salesforce, Stripe, and more.
Chart Pane
MCP App that renders interactive Chart.js charts and dashboards inline in AI conversations. Supports bar, line, area, pie, doughnut, scatter, and radar charts with multi-chart dashboard grids.
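Since Chart Pane renders standard Chart.js charts, the shape of a chart it displays can be sketched as a plain Chart.js configuration. This is a minimal bar-chart spec; the exact payload format Chart Pane expects from the model is an assumption, but the structure below is ordinary Chart.js config:

```json
{
  "type": "bar",
  "data": {
    "labels": ["Q1", "Q2", "Q3", "Q4"],
    "datasets": [
      { "label": "Revenue", "data": [120, 190, 150, 220] }
    ]
  },
  "options": { "responsive": true }
}
```

The other supported types (line, area, pie, doughnut, scatter, radar) differ mainly in the `type` field and dataset shape.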
Mint Club V2
Trade bonding curve tokens on Base via Mint Club V2. Buy, sell, swap (smart routing via bonding curves + Uniswap V3/V4), create tokens, check prices and balances. 10 tools for AI-powered DeFi on Base.
Aluvia
Give your AI agent unblockable browser access. Aluvia routes traffic through premium mobile carrier IPs to bypass 403s, CAPTCHAs, and WAFs automatically.
SAME - Stateless Agent Memory Engine
I gave Claude a mass grave of 200 markdown files and now it remembers my entire project between sessions. No cloud, no API keys, one 10 MB Go binary, and private. Stateless Agent Memory Engine is your synapse to the CLI.

SAME (Stateless Agent Memory Engine) is persistent memory for AI coding agents. It indexes your markdown notes locally (Obsidian vaults, Logseq graphs, or plain folders) and surfaces relevant context automatically through a 6-gate relevance chain. Roughly 80% of prompts are correctly skipped, so your agent isn't drowning in context it doesn't need.

12 MCP tools (9 read, 3 write): semantic search across your knowledge base, filtered search by domain/tag, federated search across multiple vaults, session context with pinned notes and handoffs, saving decisions, and creating handoffs for the next session.

SQLite + Ollama embeddings on localhost; falls back to keyword search without Ollama. No outbound network calls, no telemetry, no accounts. Your notes never leave your machine. Works with Claude Code (hooks + MCP), Cursor, Windsurf, and any MCP client via stdio transport.
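Registering a stdio MCP server follows the usual client config shape. A hedged sketch for an MCP client such as Claude Code, assuming a binary named `same` with a hypothetical `serve` subcommand and `--vault` flag (none of these names are confirmed by the project):

```json
{
  "mcpServers": {
    "same": {
      "command": "same",
      "args": ["serve", "--vault", "/home/you/notes"]
    }
  }
}
```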
MCP Server 12306
12306 MCP Server is a high-performance train-ticket query backend built on the Model Context Protocol (MCP). It exposes real-time data from the official 12306 service through a standardized interface, including remaining-ticket availability, station information, train stop schedules, and transfer/connection planning.
VoidMob MCP
Mobile proxies, Non-VoIP SMS verifications, and global eSIM data plans for AI agents and MCP clients. 18 tools for renting phone numbers, purchasing mobile proxies, buying eSIM data plans, and managing crypto wallet deposits.
Scrapi
Web scraping MCP server for AI agents. Bypass anti-bot systems and get clean, LLM-ready Markdown content. Supports both stdio and Streamable HTTP transports. Built on 8+ years of production scraping infrastructure.
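Because Scrapi speaks both stdio and Streamable HTTP, an MCP client can register it either way. A hedged sketch of the two configurations (the `scrapi` command name, its flag, and the endpoint URL are placeholders, not documented values):

```json
{
  "mcpServers": {
    "scrapi-local": {
      "command": "scrapi",
      "args": ["--stdio"]
    },
    "scrapi-remote": {
      "type": "http",
      "url": "https://scrapi.example.com/mcp"
    }
  }
}
```

The stdio form runs the server as a local child process; the Streamable HTTP form points the client at a remote endpoint, which suits shared or hosted deployments.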
SafeDep
SafeDep MCP Server protects AI coding agents from supply chain attacks by checking every open source package before installation. When your AI suggests a package, SafeDep validates it against our threat intelligence database, built from continuous scanning, behavioral analysis, and human security researcher verification. Malicious packages are blocked instantly; safe packages install without friction. We detect threats in hours, not the 24-48 hours it typically takes for public disclosure. It's the same intelligence that caught Shai-Hulud and S1ngularity.
Rival MCP
rival-mcp is an MCP server for querying AI model comparison data from rival.tips. It lets AI coding assistants (Claude Code, Cursor, Windsurf, and any MCP-compatible client) natively query model benchmarks, pricing, capabilities, and side-by-side comparisons without leaving your editor.

Quick Start

```
npx rival-mcp
```

No API key required. All data is served from the public rival.tips API.

Configuration

Claude Code: add to your .claude/settings.json (project-level) or ~/.claude/settings.json (global):

```json
{
  "mcpServers": {
    "rival": {
      "command": "npx",
      "args": ["-y", "rival-mcp"]
    }
  }
}
```

Claude Desktop: add the same block to ~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows). Cursor: add it to .cursor/mcp.json. Windsurf: add it to your Windsurf MCP config.

Available Tools

list-models: list all AI models with optional filtering.
- provider (string, optional): filter by provider (OpenAI, Anthropic, Google, Meta, Mistral, etc.)
- category (string, optional): filter by category (flagship, reasoning, coding, small, free, image-gen)
- capability (string, optional): filter by capability (chat, code, vision, image-gen, function-calling)
- q (string, optional): free-text search across name, ID, provider, and description
Example prompts: "List all Anthropic models", "Show me free models", "What models support vision?"

get-model: get detailed information about a specific model, including benchmarks, pricing, capabilities, unique features, and provider availability.
- id (string, required): model ID, e.g. gpt-4.1, claude-3.7-sonnet, gemini-2.5-pro
Example prompts: "Get details on claude-3.7-sonnet", "What are the benchmarks for gpt-4.1?"

compare-models: compare 2-3 models side by side, including benchmarks, pricing, capabilities, and shared challenges.
- models (string, required): comma-separated model IDs (2-3), e.g. gpt-4.1,claude-3.7-sonnet
Example prompts: "Compare GPT-4.1 vs Claude 3.7 Sonnet", "How does Gemini 2.5 Pro stack up against GPT-4.1 and Claude Sonnet?"

search-models: search for models by name, description, or capability when you don't know the exact model ID.
- query (string, required): search query, e.g. vision, cheap coding, fast reasoning
Example prompts: "Find models good at coding", "Search for cheap reasoning models"

Development

```
npm install    # install dependencies
npm run dev    # run in development mode
npm run build  # build for production
npm start      # run the built server
```

How It Works

This MCP server communicates over stdio (standard input/output) using the Model Context Protocol. When an AI assistant needs model comparison data, it calls the appropriate tool, which fetches data from the rival.tips public API and returns structured JSON. The server exposes no resources or prompts, only tools. All data is read-only and publicly available.

Data Source

All model data comes from rival.tips, an AI model comparison platform featuring:
- 60+ AI models with benchmarks, pricing, and capability data
- Side-by-side comparisons with shared challenge responses
- Community-driven AI duel voting and rankings
- Pre-generated showcase responses across coding, creative, and reasoning tasks

License: MIT
Patreon
Give AI assistants access to your Patreon creator data. View campaigns, patrons, tiers, and posts. Privacy-hardened: no patron emails or notes are ever requested.
MCP Hive
MCP Hive is a hub that connects MCP server consumers to paid MCP servers. For example, a paid MCP server might serve protected content on behalf of the original copyright owner, or data built by expert systems, such as financial and sports data repositories.