White-label AI chat app. Nothing hits our servers.

One UI for OpenAI, Anthropic, Google & local models. API calls go directly from your device to the provider. All history stays on the user's device.
We never see your prompts.

Launch Technical Preview

Web App · 100% Local Storage

Discuss Your Use Case

Tell us what to build next

The Core Architecture

Zero-Routing Engine

We are a client, not a proxy. Your API requests go directly from your device to the provider. We cannot log your prompts because we never touch them.
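The zero-routing pattern can be sketched in a few lines of client-side code. This is an illustrative sketch, not Chatcore's actual implementation: the helper name `buildDirectRequest` and the model id are ours, while the endpoint shape is OpenRouter's OpenAI-compatible chat completions API.

```typescript
// Illustrative sketch: the request goes straight from the browser to the
// provider. There is no app server in the path to proxy or log it.
type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

// Hypothetical helper name; endpoint and body shape follow OpenRouter's
// OpenAI-compatible chat completions API.
function buildDirectRequest(
  apiKey: string,
  model: string,
  messages: ChatMessage[],
) {
  return {
    url: "https://openrouter.ai/api/v1/chat/completions",
    init: {
      method: "POST",
      headers: {
        Authorization: `Bearer ${apiKey}`, // key never leaves the device except to the provider
        "Content-Type": "application/json",
      },
      body: JSON.stringify({ model, messages }),
    },
  };
}

// Usage in the browser: fetch(req.url, req.init). Device to provider, nothing in between.
const req = buildDirectRequest("sk-...", "anthropic/claude-sonnet-4", [
  { role: "user", content: "Hello" },
]);
```

Because the app never sees the request, there is nothing server-side to subpoena, leak, or log.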

Local-First Storage

Your competitive advantage is your history. We store all conversational data inside an SQLite database directly on your device.
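A local-first history store can be as simple as two SQLite tables on the device. The schema below is a hedged sketch only; the table and column names are our assumptions, not Chatcore's actual layout.

```sql
-- Illustrative schema; Chatcore's real tables may differ.
CREATE TABLE conversations (
  id         INTEGER PRIMARY KEY,
  title      TEXT NOT NULL,
  created_at TEXT NOT NULL DEFAULT (datetime('now'))
);

CREATE TABLE messages (
  id              INTEGER PRIMARY KEY,
  conversation_id INTEGER NOT NULL REFERENCES conversations(id),
  role            TEXT NOT NULL CHECK (role IN ('system', 'user', 'assistant')),
  content         TEXT NOT NULL,
  created_at      TEXT NOT NULL DEFAULT (datetime('now'))
);
```

Because the database file lives on the device (via WASM SQLite in the browser), exporting, backing up, or deleting your entire history is a local file operation.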

Native Performance

Built on SolidJS and SQLite. Designed for 60FPS performance even with million-token contexts. No loading spinners, no lag.

Looking for White-Labeling Partners

Sovereign Cloud

You control the compute. We provide a frontend that stores nothing server-side. Self-host Chatcore within your private network or deploy via desktop apps with no external dependencies.

Edge & Offline

Works without internet. No cloud sync required. Local storage and local inference support mean the app functions mid-ocean, on the factory floor, or in the field.

Legal, Medical & Defense

Cloud-stored AI logs create discovery risk, GDPR exposure, and malpractice liability. Chatcore acts like a digital legal pad — history stays strictly on the practitioner's device.

Model & API Providers

You built the backend. We built the frontend. License the Chatcore engine and go to market with a production-grade chat UI in days, not months.

System Integrators

Stop rebuilding chat UIs on every engagement. License our white-label solution, apply your client's branding, and focus your billable hours on the hard problems.

Technical Roadmap

Live Now

  • Universal access to the latest models from OpenAI, Anthropic, and Google via OpenRouter.
  • WASM SQLite — full local data persistence in the browser.
  • Rendering support for 1M+ token windows at 60FPS.
  • Conversation branching and versioning.

Coming Soon

  • Direct API connectors: Azure OpenAI, AWS Bedrock, Google Vertex AI, Google AI Studio, OpenAI, and Anthropic — no intermediary.
  • Local inference: native support for Ollama / Llama.cpp servers.
  • Desktop native: standalone apps for Windows, macOS, and Linux.
  • End-to-end encrypted cloud sync (device-to-device) and SSO.
  • Local PDF parsing and DOCX redlining.
  • Team management and self-hosted instances.
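Local inference follows the same direct-from-device pattern as the cloud providers. A minimal sketch, assuming a default Ollama server on the same machine: the helper name is ours, while the endpoint and body shape are Ollama's `/api/chat` API.

```typescript
// Sketch: talk to a local Ollama server. No data leaves the machine.
type Msg = { role: "system" | "user" | "assistant"; content: string };

// Hypothetical helper; Ollama listens on port 11434 by default.
function buildOllamaRequest(model: string, messages: Msg[]) {
  return {
    url: "http://localhost:11434/api/chat",
    init: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      // stream: false requests a single JSON response instead of NDJSON chunks
      body: JSON.stringify({ model, messages, stream: false }),
    },
  };
}

// Usage: fetch(r.url, r.init) against a locally running model.
const r = buildOllamaRequest("llama3", [{ role: "user", content: "Hi" }]);
```

The same chat UI therefore works offline: swap the provider URL for localhost and the conversation never touches a network.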

Engineering History

Chatcore is built by the founders of MapHub and OpenFreeMap.

143 TB/month

Proven Scale

We run the OpenFreeMap infrastructure, serving global map tiles at 143 TB/month. We know how to build low-latency systems.

10 Years Bootstrapped

Sustainable Business

We have run MapHub.net profitably for a decade. We answer to our customers, not to VCs.

Craftsmanship

Small & Dedicated

A small team building high-performance tools that last.

Shape the Roadmap

We are building the V1 feature set right now.
Tell us what to prioritize.