Use the Predexon API with AI assistants via Model Context Protocol
The Predexon MCP server lets AI assistants like Claude, Cursor, Windsurf, Codex, and others directly query prediction market data from Polymarket, Kalshi, and Dflow.
Model Context Protocol (MCP) is an open standard that allows AI assistants to interact with external tools and data sources. With the Predexon MCP server, you can ask your AI assistant questions like:
- “What are the top prediction markets by volume?”
- “Show me the price history for this market”
- “Find matching markets across Polymarket and Kalshi”
Edit your Claude Desktop configuration file:

- macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
- Windows: `%APPDATA%\Claude\claude_desktop_config.json`
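Claude Desktop reads MCP servers from an `mcpServers` object in that file. A minimal sketch of what the entry might look like; the package name `@predexon/mcp` and the `PREDEXON_API_KEY` variable are assumptions, so check the Predexon documentation for the actual command and environment variables:

```json
{
  "mcpServers": {
    "predexon": {
      "command": "npx",
      "args": ["-y", "@predexon/mcp"],
      "env": {
        "PREDEXON_API_KEY": "your-api-key"
      }
    }
  }
}
```

Restart Claude Desktop after saving the file so the new server is picked up.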
Cursor supports both global and project-specific MCP configuration:

- Global (all projects): `~/.cursor/mcp.json`
- Project-specific: `.cursor/mcp.json` in your project root
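Cursor's `mcp.json` uses the same `mcpServers` schema as Claude Desktop. A sketch under the same assumptions (package name and environment variable are placeholders to be confirmed against the Predexon docs):

```json
{
  "mcpServers": {
    "predexon": {
      "command": "npx",
      "args": ["-y", "@predexon/mcp"],
      "env": {
        "PREDEXON_API_KEY": "your-api-key"
      }
    }
  }
}
```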
In Windsurf, you can add MCP servers from the MCP icon in the Cascade panel, or via Windsurf Settings > Cascade > MCP Servers. See the Windsurf MCP docs for details.
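Windsurf also supports editing the MCP configuration file directly; it typically lives at `~/.codeium/windsurf/mcp_config.json` (path worth verifying against the Windsurf docs) and uses the same `mcpServers` shape. A sketch with the same assumed package name and API key variable:

```json
{
  "mcpServers": {
    "predexon": {
      "command": "npx",
      "args": ["-y", "@predexon/mcp"],
      "env": {
        "PREDEXON_API_KEY": "your-api-key"
      }
    }
  }
}
```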
VS Code supports MCP servers through GitHub Copilot:

- Global: open your user `mcp.json` via the Command Palette
- Project-specific: `.vscode/mcp.json` in your workspace
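VS Code's `mcp.json` uses a `servers` key (not `mcpServers`) and supports an `inputs` array for prompting secrets instead of hardcoding them. A sketch, again assuming the `@predexon/mcp` package name and `PREDEXON_API_KEY` variable:

```json
{
  "inputs": [
    {
      "type": "promptString",
      "id": "predexon-api-key",
      "description": "Predexon API key",
      "password": true
    }
  ],
  "servers": {
    "predexon": {
      "command": "npx",
      "args": ["-y", "@predexon/mcp"],
      "env": {
        "PREDEXON_API_KEY": "${input:predexon-api-key}"
      }
    }
  }
}
```

With this shape, VS Code prompts for the key the first time the server starts and stores it securely rather than leaving it in the config file.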