REST for per-property dossiers. SQL for analytics over 2,500+ datasets. AI for natural-language investigations. Time-travel built in. $0 egress forever.
Same data. Three ways to query it. All metered against the same credit balance.
/api/v1/property/* · /api/v1/owner/* · /api/v1/area/*
Curated per-entity dossiers. Pre-joined zoning + violations + permits + ownership per property. Fixed cost per call. Good when you know exactly what you're asking.
/api/v1/data/query
DuckLake SQL over 2,500+ civic datasets via R2 parquet. Time-travel (`as_of`), Parquet output (`format=parquet`), and federated joins against Postgres all supported. Credit cost scales with bytes scanned.
/api/v1/answers
Natural-language Q&A over the same data. The system routes between tools and the knowledge graph, cites every source, and explains its reasoning. Good for investigators + exploratory research.
Pass your key from the MUNIMIND_API_KEY environment variable.

# Fetch a curated property dossier by BBL
curl https://api.munimind.com/api/v1/property/1000010010/summary \
-H "X-API-Key: $MUNIMIND_API_KEY"

The /api/v1/data/query endpoint runs sandboxed DuckDB over 2,500+ datasets stored as parquet on Cloudflare R2. Joins, aggregations, window functions, CTEs: all supported. Credit cost is roughly 1 credit per 10 MB scanned.
curl -X POST https://api.munimind.com/api/v1/data/query \
-H "X-API-Key: $MUNIMIND_API_KEY" \
-H "Content-Type: application/json" \
-d '{
"sql": "SELECT borough, count(*) AS n FROM lake.main.\"nyc__hpd_violations\" WHERE inspection_date >= '\''2025-01-01'\'' GROUP BY borough ORDER BY n DESC"
}'

Every dataset we ingest is snapshotted. Pass as_of to query the state at any point in history: legal-grade for diligence and audit.
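Conceptually, as_of resolves to the newest snapshot taken at or before the requested timestamp. A sketch of that selection logic; the snapshot bookkeeping here is illustrative, not the actual DuckLake internals:

```python
from datetime import datetime

def snapshot_for(as_of, snapshot_times):
    """Return the newest snapshot at or before `as_of` (ISO-8601 strings)."""
    cutoff = datetime.fromisoformat(as_of)
    eligible = [t for t in snapshot_times if datetime.fromisoformat(t) <= cutoff]
    return max(eligible, default=None)  # None -> no data existed yet

snaps = ["2026-01-01T00:00:00", "2026-01-10T00:00:00", "2026-02-01T00:00:00"]
print(snapshot_for("2026-01-15T00:00:00", snaps))  # -> 2026-01-10T00:00:00
```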
# DuckLake time-travel — 'what did this look like on a past date?'
curl -X POST https://api.munimind.com/api/v1/data/query \
-H "X-API-Key: $MUNIMIND_API_KEY" \
-H "Content-Type: application/json" \
-d '{
"sql": "SELECT count(*) FROM lake.main.\"nyc__dob_violations\"",
"as_of": "2026-01-15T00:00:00"
}'

Pass format: "parquet" for Arrow-compatible binary output. pandas / Polars / DuckDB read it natively with zero copies.
# Return results as Apache Parquet for pandas/polars/DuckDB clients
curl -X POST https://api.munimind.com/api/v1/data/query \
-H "X-API-Key: $MUNIMIND_API_KEY" \
-H "Content-Type: application/json" \
-d '{"sql":"SELECT * FROM lake.main.\"nyc__taxi_zones\"","format":"parquet"}' \
--output taxi_zones.parquet

Include an Idempotency-Key header on any POST: the same key + body within 24 h returns the cached response, flagged with Idempotency-Replay: true. You never pay for a retry twice.
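That replay behavior can be modeled as a cache keyed on (Idempotency-Key, request body): the same pair returns the stored response without recomputing; the same key with a different body is a conflict. A toy in-memory model (the real service enforces this server-side, including the 24 h window omitted here):

```python
class IdempotencyCache:
    def __init__(self):
        self._store = {}  # key -> (body, response)

    def handle(self, key, body, compute):
        if key in self._store:
            stored_body, response = self._store[key]
            if stored_body != body:
                return 422, "Idempotency-Key already used with different body"
            return 200, response  # replay: no recomputation, no extra charge
        response = compute(body)
        self._store[key] = (body, response)
        return 200, response

cache = IdempotencyCache()
calls = []
run = lambda body: (calls.append(body), f"result:{body}")[1]
print(cache.handle("k1", "q1", run))  # first call computes
print(cache.handle("k1", "q1", run))  # retry replays the cached result
print(cache.handle("k1", "q2", run))  # same key, new body -> 422
print(len(calls))                     # compute ran only once
```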
# Safe retries — same key + same body returns the cached response
# (within 24h). Per IETF idempotency-key-header draft.
curl -X POST https://api.munimind.com/api/v1/data/query \
-H "X-API-Key: $MUNIMIND_API_KEY" \
-H "Idempotency-Key: $(uuidgen)" \
-H "Content-Type: application/json" \
-d '{"sql":"SELECT * FROM lake.main.\"nyc__taxi_zones\" LIMIT 10"}'

# Filter rows PostgREST-style, no SQL needed
curl 'https://api.munimind.com/api/v1/data/nyc.dob_violations/rows?where=bbl.eq.1000010010&limit=100' \
-H "X-API-Key: $MUNIMIND_API_KEY"Content-negotiate Arrow IPC stream via Accept: application/vnd.apache.arrow.stream for 10-100x smaller payloads. ADBC/pandas/polars/DuckDB read it zero-copy. Same credit cost as JSON.
# Request Apache Arrow IPC stream — 10-100x smaller than JSON,
# zero-copy into pandas / polars / DuckDB.
curl -X POST https://api.munimind.com/api/v1/data/query \
-H "X-API-Key: $MUNIMIND_API_KEY" \
-H "Content-Type: application/json" \
-H "Accept: application/vnd.apache.arrow.stream" \
-d '{"sql":"SELECT * FROM lake.main.\"nyc__taxi_zones\""}' \
--output taxi_zones.arrow
# Python — native Arrow reader:
python3 -c "
import pyarrow.ipc as ipc
with open('taxi_zones.arrow','rb') as f:
reader = ipc.open_stream(f)
df = reader.read_all().to_pandas()
print(df.head())"Point + radius query returns everything nearby as a GeoJSON FeatureCollection. Parcels come from PostGIS; permits + violations attach via BBL. Drop straight into Mapbox / Deck.gl / Leaflet. 3 credits / call, radius capped at 5 km.
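Client-side, a point + radius filter reduces to a haversine distance test over feature coordinates. A pure-geometry sketch of what the server does with PostGIS (the feature shape below assumes GeoJSON point features with [lng, lat] coordinates):

```python
import math

def haversine_m(lng1, lat1, lng2, lat2):
    """Great-circle distance in meters between two lng/lat points."""
    r = 6_371_000  # mean Earth radius, meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = p2 - p1
    dl = math.radians(lng2 - lng1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def within_radius(features, lng, lat, radius_m):
    """Keep GeoJSON point features whose coordinates fall inside the radius."""
    return [f for f in features
            if haversine_m(lng, lat, *f["geometry"]["coordinates"]) <= radius_m]
```

E.g. a parcel centroid ~64 thousandths of a degree of longitude away at NYC latitudes is over 5 km out, so a 250 m query drops it.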
# Point-radius GeoJSON — Mapbox-style
curl "https://api.munimind.com/api/v1/area/tilequery?lng=-73.986&lat=40.736&radius_m=250&city=nyc&types=parcels,permits,violations&limit=200" \
-H "X-API-Key: $MUNIMIND_API_KEY"
# Returns { type: "FeatureCollection", features: [...] }

Our agent routes between tools, cites every source, and explains its reasoning. Good for investigators doing exploratory work.
curl -X POST https://api.munimind.com/api/v1/answers \
-H "X-API-Key: $MUNIMIND_API_KEY" \
-H "Content-Type: application/json" \
-d '{"question":"Who are the top 3 owners on the Lower East Side by HPD violations last year?"}'

Every request sends X-API-Key: mm_live_.... Keys are scoped to a user, tier, and rate limit. Lookup is hash-based: we never store keys in plain text. Canary tokens auto-detect compromise.
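Hash-based lookup means only a digest of each key is stored; an incoming key is re-hashed and matched against the store. A minimal sketch: SHA-256 is an assumed choice of hash, and the mm_live_example123 key value is hypothetical.

```python
import hashlib

def key_digest(api_key: str) -> str:
    """One-way digest of an API key; the raw key is never persisted."""
    return hashlib.sha256(api_key.encode()).hexdigest()

# Server side: store only digests plus key metadata.
issued = {key_digest("mm_live_example123"): {"tier": "pro", "rate_limit": 100}}

def authenticate(presented_key):
    """Re-hash the presented key and look it up; None means 401."""
    return issued.get(key_digest(presented_key))

print(authenticate("mm_live_example123"))  # matches the stored digest
print(authenticate("mm_live_wrong"))       # None -> reject with 401
```

A database dump therefore leaks no usable credentials, only digests.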
Every response carries the IETF draft RateLimit + RateLimit-Policy headers plus the de-facto X-RateLimit-* set. You always know exactly how much budget you have left.
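Reading the remaining budget is a small parsing exercise. A sketch that assumes a simple `limit=100, remaining=72, reset=30` key=value layout; the IETF draft's field format has changed across revisions, so adjust to whatever header shape the API actually emits:

```python
def parse_ratelimit(header: str) -> dict:
    """Parse a 'limit=100, remaining=72, reset=30'-style RateLimit header.
    The layout is an assumption; check a live response before relying on it."""
    out = {}
    for part in header.split(","):
        k, _, v = part.strip().partition("=")
        out[k] = int(v)
    return out

rl = parse_ratelimit("limit=100, remaining=72, reset=30")
print(rl["remaining"])  # requests left in the current window -> 72
```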
Everything costs credits. Credits roll over monthly. Use our live usage page for per-endpoint spend, top-scanning queries, and projected run-rate.
Per-endpoint costs range from 1 credit up to roughly 50 credits per call.

401: missing or invalid key.
402: insufficient credits. Top up at /pricing.
422: Idempotency-Key already used with a different body.
429: rate limit exceeded. Check the Retry-After header.
400: sandbox rejection (DDL, multi-statement, etc.).

First 5,000 credits free. No credit card. Upgrade when you need more throughput or overage pricing.
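A client-side dispatch over those status codes might look like this; the action names are illustrative, and Retry-After is read as seconds per standard HTTP semantics:

```python
def next_action(status: int, headers: dict) -> str:
    """Map MuniMind error statuses to a suggested client action."""
    if status == 401:
        return "fix-key"              # missing or invalid X-API-Key
    if status == 402:
        return "top-up"               # insufficient credits -> /pricing
    if status == 422:
        return "new-idempotency-key"  # key reused with a different body
    if status == 429:
        return f"retry-after-{int(headers.get('Retry-After', 1))}s"
    if status == 400:
        return "fix-sql"              # sandbox rejection (DDL, multi-statement)
    return "ok"

print(next_action(429, {"Retry-After": "7"}))  # -> retry-after-7s
```

Pair this with the Idempotency-Key header so the eventual retry is never double-billed.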
Dataset catalog, schemas, and first-100-row previews are always free + unauthenticated. Try any dataset before signing up.
Auto-generated from our OpenAPI spec. All 57 endpoints fully typed. Regenerates on every API change.
pip install munimind

Plug MuniMind into Claude Desktop / Code / Cursor. 7 tools + 5 prompts + 4 resources.
MCP setup →

Sign up, grab 5,000 free credits, and ship your first query in 60 seconds.