Your agent needs real data,
not a scraped mess.
Seven typed APIs — pricing, intent, jobs, social, real estate, competitor changes and news — behind one MCP server. Deterministic, cached, and priced per tool call, not per token burned re-parsing markdown.
# One MCP server. Seven tools.

```python
from anthropic import Anthropic

client = Anthropic()

resp = client.messages.create(
    model="claude-opus-4-7",
    mcp_servers=[{"url": "https://mcp.freshgeo.com", "name": "freshgeo"}],
    messages=[{
        "role": "user",
        "content": "What's the current price of SKU-841 vs its top 3 competitors?"
    }],
)
# → Agent calls freshgeo.pricing.sku, gets typed JSON,
#   zero re-parsing, cached for the next 15 minutes.
```
Most teams get here the hard way.
✗ Tavily, Exa, Serp, Brave — great for blogs, bad for structured entities. Your agent burns 2-4k tokens per call re-parsing markdown to extract a price or a headcount. Wrong job.
Every FreshGeo response is a deterministic, typed object. Fields are stable, prices are numbers, dates are ISO. Zero extraction prompts, zero hallucinated fields.
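Concretely, "typed" means the agent never regexes a price out of prose. A minimal sketch, assuming an illustrative response shape (field names here are hypothetical, not the documented schema):

```python
import json
from datetime import datetime

# Hypothetical shape of a FreshGeo pricing response — field names are
# illustrative, not the documented schema.
raw = json.dumps({
    "sku": "SKU-841",
    "price": 24.99,              # a number, not "£24.99" scraped from HTML
    "currency": "GBP",
    "observed_at": "2025-06-01T09:30:00+00:00",  # ISO 8601 timestamp
    "sources": ["https://example.com/product/841"],
})

record = json.loads(raw)
price = record["price"]                                   # already a float — no extraction prompt
observed = datetime.fromisoformat(record["observed_at"])  # parses directly
```

Because fields are stable and machine-typed, the same two lines work on every call, for every SKU.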
✗ Firecrawl, ScrapingBee, Bright Data. You get HTML, you still have to teach the agent how to extract from each site. Blocks, layout changes, rate limits — your problem.
FreshGeo runs the crawl + extraction + normalisation upstream. The agent sees clean entities keyed by GTIN, UPRN, ticker or company domain. Source evidence URLs included per field.
✗ Clearbit for firmographics, Cognism for contacts, ZoomInfo for headcount, Bombora for intent, Crayon for competitor changes. Five auths, five schemas, no joins. Your agent can't link a news event to a pricing change.
Seven domains of data stitched through a shared entity graph. One FreshGeo company ID joins pricing, hiring, intent, news and social. Your agent reasons across datasets natively.
✗ Six months of headless Chrome, proxy rotation, and weekly breakage. You'll still be babysitting them next year. And compliance hasn't signed off on the source list.
99.95% uptime, signed data-use licences, SOC 2 Type II (in progress), UK-hosted with DPA. Procurement and security review pre-cleared.
Four things no scraper and no search API will give you.
MCP-native, out of the box
One endpoint — mcp.freshgeo.com — registers seven tools with typed parameters, descriptions and scopes. Claude, GPT, Gemini, Cursor and any MCP-compatible runtime pick them up without code.
Deterministic + cacheable
Same query returns the same answer for 15 minutes. Every response carries a cache_id so you can replay it in evals, audit logs, or production retries. Hallucination-safe by construction.
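In practice, replay looks like indexing responses by their cache_id. A minimal sketch — the dict here stands in for whatever persistence your eval pipeline uses, and the cache_id value is illustrative:

```python
# Minimal replay harness keyed on cache_id. The store is a plain dict
# standing in for your eval pipeline's persistence layer.
replay_store: dict[str, dict] = {}

def record_response(resp: dict) -> None:
    """Index a FreshGeo response by its cache_id for later replay."""
    replay_store[resp["cache_id"]] = resp

def replay(cache_id: str) -> dict:
    """Return the exact response a production run saw — no re-fetch."""
    return replay_store[cache_id]

live = {"cache_id": "c_7f3a", "sku": "SKU-841", "price": 24.99}
record_response(live)

# Later, in an eval, audit, or production retry:
replayed = replay("c_7f3a")
```

The eval sees byte-for-byte what the production agent saw, which is what makes regression tests against live data meaningful.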
Per-agent keys, hard caps
Mint a key per agent or user session with a per-tool scope, call cap and rate limit. Runaway loops hit a wall at your chosen threshold. No metered overages unless you opt in.
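FreshGeo enforces caps server-side per key; the sketch below mirrors that behaviour client-side so a runaway loop fails fast in your own code too. Class and key names are illustrative:

```python
# Local mirror of the server-side hard cap: count calls per session key
# and raise instead of silently accruing usage.
class CallCapExceeded(RuntimeError):
    pass

class CappedKey:
    def __init__(self, key: str, max_calls: int):
        self.key = key
        self.max_calls = max_calls
        self.calls = 0

    def charge(self) -> None:
        """Count one tool call; hard-stop at the cap, no overage."""
        if self.calls >= self.max_calls:
            raise CallCapExceeded(f"{self.key} hit its cap of {self.max_calls}")
        self.calls += 1

session_key = CappedKey("agent-session-123", max_calls=3)
for _ in range(3):
    session_key.charge()   # first three calls succeed; the fourth raises
```

A runaway loop hits `CallCapExceeded` on call four rather than a surprise invoice at month end.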
Entity graph across seven domains
A pricing event, a news article, a hiring surge and a competitor tech change all resolve to the same FreshGeo company ID. Your agent can reason across datasets without stitching.
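What the shared ID buys you is an exact-key join, not fuzzy name matching. A sketch with illustrative IDs and field names:

```python
# Records from different FreshGeo tools join on one company_id —
# IDs, tool names, and details below are illustrative.
from collections import defaultdict

events = [
    {"company_id": "fg_01HX", "tool": "pricing", "detail": "list price cut 8%"},
    {"company_id": "fg_01HX", "tool": "jobs",    "detail": "14 new sales roles"},
    {"company_id": "fg_01HX", "tool": "news",    "detail": "Series B announced"},
    {"company_id": "fg_02ZK", "tool": "pricing", "detail": "new SKU tier"},
]

by_company: dict[str, list[str]] = defaultdict(list)
for e in events:
    by_company[e["company_id"]].append(f'{e["tool"]}: {e["detail"]}')

# One company's cross-domain timeline, ready for the agent to reason over:
timeline = by_company["fg_01HX"]
```

No embedding similarity, no "is Acme Ltd the same as Acme Limited?" — the graph resolved that upstream.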
Before FreshGeo → with FreshGeo
| Before | With FreshGeo |
|---|---|
| 40% of tool calls fail or retry on long-tail sites | Deterministic typed JSON, p95 <200ms cached |
| Agent burns 2-4k tokens parsing scraped HTML | Clean fields, zero re-parse tokens |
| Five vendors, five auths, five schemas | One MCP server, seven tools, one key |
| "Sometimes hallucinates prices or dates" | Entity-linked, timestamped, cache_id per call |
| Runaway loops send you a £4,000 overage | Per-agent caps, hard-stop at your quota |
| Compliance hasn't signed off on your scrape list | Licensed sources, SOC 2 (in progress), DPA |
Who this is for.
FreshGeo is sharpest for teams shipping agents into production — not research prototypes, not chatbots that answer from their weights.
Teams shipping production agents on Claude, GPT or Gemini
You've moved past "it works in a notebook". You care about p95 latency, token cost per session, deterministic evals, and not waking up to a £4,000 overage.
RAG stacks that graduated to agents
Your retrieval is great for documents. But now the agent needs current prices, live hiring signals, or today's news — not the snapshot in your vector DB.
AI vertical SaaS
You're building an agent for sales, retail pricing, real estate, compliance or PE. The data is the product. Scraping is a liability, not a moat.
Agent framework users
LangChain, LlamaIndex, Mastra, Agno, Google ADK. You want seven reliable tools, not seven integration projects.
Not sure it's a fit?
Read the full agent integration guide →

Seven tools. One agent.
Each is a real data product on its own — documented, licensed, priced transparently. Together, they're a grounding layer your agent can actually trust.
Teams shipping agents into production.
"FreshGeo's pricing API replaced three vendors. Data quality and latency are exceptional — we cut integration time from weeks to hours."
"Intent signals transformed our pipeline. 3.2x improvement in qualified leads within 60 days. The ROI paid for itself in week one."
"Finally, a real estate API that scales. 500K daily queries handled flawlessly with sub-50ms responses."
"The risk signals API gives us an edge our competitors don't have. Early warning on layoffs and funding changes is invaluable."
From your agent to licensed sources in one hop.
Claude, GPT, Gemini, or custom runtime — MCP-compatible.
One endpoint. Seven typed tools. Per-agent keys + scopes.
Stable IDs link pricing, hiring, news, competitor and social data.
15-min determinism window. cache_id per call. Sources[] per field.
Public-web crawls, licensed feeds, partner APIs. UK-hosted.
Questions agent builders ask us.
What is FreshGeo?
FreshGeo is a grounding API for AI agents. Seven typed data endpoints — competitor pricing, buying intent, jobs, social, real estate, competitor monitoring and news/risk — exposed as one MCP server and one REST API so agents can fetch real-world facts without scraping the web.
How is this different from a web search API like Tavily or Exa?
Search APIs return ranked web pages in markdown. Agents still have to reason over unstructured text. FreshGeo returns deterministic typed JSON — prices, headcount, listings, events — so the agent spends zero tokens extracting fields. Different job, same stack.
Can I plug it straight into Claude, OpenAI or Cursor?
Yes. Point any MCP-compatible client at mcp.freshgeo.com and the seven APIs appear as tools with typed parameters and per-tool auth scopes. There is also a REST surface and official SDKs for Python, Node, Go and Ruby for function-calling setups.
Why do agents need this instead of scraping?
Scrapers give you HTML that the LLM must re-parse on every call, burning tokens and introducing hallucinations. FreshGeo caches typed responses, dedupes across sources, and returns a cache_id per call so you can replay the exact response in evals or production retries.
How do you stop a runaway agent from burning our budget?
Every agent or user session can have its own API key with a hard call cap, per-tool scope, and rate limits. Usage is visible per key in the dashboard, and overages are off by default — we hard-stop at your quota rather than send you a surprise invoice.
Point your agent at real data.
Free tier, no card. MCP server goes live the moment you get your key. UK-hosted, UK-supported, procurement-ready.
Monthly notes on agent data infrastructure.
No spam, no product launches you don't care about. Unsubscribe in one click.