# Connect with AI agents
Prop AI Deals is AI-native by design. Every major AI agent ecosystem can discover and use our API documentation without a single line of glue code on your side. There are three integration layers, in order of richness:

| Layer | Best for | URL |
|---|---|---|
| MCP Server | Claude Code, Cursor, Claude Desktop, Continue, Cline, any MCP client | https://docs.propaideals.co.uk/mcp |
| llms.txt | ChatGPT browsing, Perplexity, Claude search, You.com | https://docs.propaideals.co.uk/llms.txt |
| llms-full.txt | One-shot context loading for any LLM | https://docs.propaideals.co.uk/llms-full.txt |
## MCP Server (recommended)
Our docs are exposed as a Model Context Protocol server with two tools:

- `search_prop_ai_deals`: semantic + keyword search across all our API docs
- `query_docs_filesystem_prop_ai_deals`: read individual doc pages directly
### Claude Code

Add the MCP server with one CLI command:
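A sketch of that command, assuming the `claude mcp add` syntax with an HTTP transport; the server name `propaideals` is our choice, not mandated:

```shell
# Register the hosted docs server under the name "propaideals"
claude mcp add --transport http propaideals https://docs.propaideals.co.uk/mcp
```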
### Cursor

Add to your `~/.cursor/mcp.json`:
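A minimal entry, assuming Cursor's `mcpServers` map with a `url` field for HTTP servers (check the MCP docs for your Cursor version):

```json
{
  "mcpServers": {
    "propaideals": {
      "url": "https://docs.propaideals.co.uk/mcp"
    }
  }
}
```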
### Claude Desktop

Edit your `claude_desktop_config.json` (located in `~/Library/Application Support/Claude/` on macOS, `%APPDATA%\Claude\` on Windows):
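One possible entry, sketched under the assumption that your Claude Desktop build launches MCP servers over stdio only; the community `mcp-remote` bridge used to reach the HTTP endpoint is our assumption, not part of the original instructions:

```json
{
  "mcpServers": {
    "propaideals": {
      "command": "npx",
      "args": ["-y", "mcp-remote", "https://docs.propaideals.co.uk/mcp"]
    }
  }
}
```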
Restart Claude Desktop and confirm that the propaideals server shows as connected.
### Continue (VS Code / JetBrains)

Edit `~/.continue/config.json`:
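Continue's MCP configuration schema has varied across releases; the shape below, including the transport `type` string, is an assumption based on its `experimental.modelContextProtocolServers` block and should be checked against your installed version:

```json
{
  "experimental": {
    "modelContextProtocolServers": [
      {
        "transport": {
          "type": "streamable-http",
          "url": "https://docs.propaideals.co.uk/mcp"
        }
      }
    ]
  }
}
```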
### Cline (VS Code)

In Cline settings → MCP Servers → Add Server → enter `https://docs.propaideals.co.uk/mcp`.
### Other MCP clients
Any MCP-compatible client should work. The transport is HTTP streaming (not stdio), and authentication is public read-only: no API key is needed to read the docs.

## llms.txt (for non-MCP agents)
If your agent doesn’t speak MCP yet, it almost certainly knows about llmstxt.org. Our llms.txt is a structured Markdown index of all the API docs:
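The layout follows llmstxt.org conventions. The excerpt below is an invented illustration of the shape only; the page titles, summary, and links are placeholders, not the real index:

```markdown
# Prop AI Deals

> Programmatic access to property deals, yields, and search.

## Docs

- [Authentication](https://docs.propaideals.co.uk/authentication): illustrative entry
- [Search](https://docs.propaideals.co.uk/search): illustrative entry
```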
## llms-full.txt (one-shot context)
For agents that prefer to load all docs into context at once (instead of fetching pages incrementally), we provide a 2,400+ line concatenated file. It is a good fit for:
- Long-context models that can ingest the whole thing in a single message
- Pre-loading the docs into a system prompt
- Building a custom RAG index of our docs offline
- Agents that don’t want to make N HTTP calls during a conversation
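For example, pre-loading the docs into a system prompt can be as simple as prepending an instruction header to the downloaded file. A Unix-shell sketch; the filenames and the stand-in download step are our own:

```shell
# In real use, fetch the concatenated docs first:
#   curl -s https://docs.propaideals.co.uk/llms-full.txt -o propai-docs.txt
# A stand-in file is created here so the sketch runs on its own.
printf 'Prop AI Deals API reference...\n' > propai-docs.txt

# Prepend a short instruction header to form a reusable system prompt.
{
  printf 'You can call the Prop AI Deals API. Full reference follows.\n\n'
  cat propai-docs.txt
} > system-prompt.txt
```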
## What this means for your apps
If you’re building a property tool and want users to be able to ask natural-language questions like “Find me 3-bed houses under £400k with at least 7% yield in Manchester”, you have three options:

- Use our AI chat endpoint directly: we run the LLM, you get back property cards. Costs 5× a normal request.
- Connect your AI agent to our MCP server so it can look up the right endpoint to call, then call it itself with your `paid_*` key. The agent does the orchestration; you pay our standard rate.
- Embed our `llms.txt` in your system prompt so any LLM you use already knows our API surface.
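To make the second option concrete: the agent uses MCP only to find the right endpoint, then issues an ordinary authenticated request. Everything below (host, path, query parameters, key) is a hypothetical placeholder invented for illustration, not taken from our docs:

```shell
# Hypothetical endpoint and placeholder key: substitute the real ones your
# agent discovered via search_prop_ai_deals.
curl -s "https://api.propaideals.co.uk/v1/properties?city=Manchester&max_price=400000&min_beds=3&min_yield=7" \
  -H "Authorization: Bearer paid_your_key_here"
```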
## Verifying the MCP server
You can check that the MCP server is reachable without an MCP client by sending a JSON-RPC request directly; a `tools/list` call should return both tools, `search_prop_ai_deals` and `query_docs_filesystem_prop_ai_deals`.
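A minimal probe, assuming the HTTP streaming transport described above (the MCP streamable HTTP spec requires an Accept header listing both JSON and SSE):

```shell
# List the server's tools over plain JSON-RPC; expect both tool names in the reply.
curl -s https://docs.propaideals.co.uk/mcp \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -d '{"jsonrpc":"2.0","id":1,"method":"tools/list","params":{}}'
```

Note that some MCP servers require an `initialize` handshake before `tools/list`; if you get an error rather than a tool listing, that is the likely cause.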
## Status & support
The MCP server is hosted by Mintlify and inherits their uptime. We monitor it as part of our status page. If you have an issue connecting an AI agent or want a feature, email api@propaideals.co.uk.