Operationally
A real CTO builds with you, then helps you graduate to your own infrastructure when production needs it. Mandaire is shaped the same way.
The commercial relationship
The managed surface exists so you can ship without engineering your own substrate while the product is still finding its shape. When the product is ready to run on your own cloud account (AWS, GCP, your own boxes, whatever your stage requires), Mandaire helps you graduate the production system out, the way a CTO would handle the migration. The hosting is the build-time scaffolding, not the product.
After migration, the decision history, the taste memory, and the audit log continue with you. The CTO layer compounds across projects. It is not locked to the production hosting. You graduate the product hosting, not the relationship.
Mandaire never deploys to production, migrates a database, changes anything user-visible, or takes any other irreversible action without your sign-off. Approval is a single word; the work waits until you give it; the gate is never bypassed.
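The shape of that gate can be sketched in a few lines. This is a hypothetical illustration, not Mandaire's real implementation: the class names, the `APPROVAL_WORD` constant, and the queue semantics are all assumptions made for the sketch.

```python
from dataclasses import dataclass

APPROVAL_WORD = "approve"  # hypothetical; the real sign-off word is whatever you configure


@dataclass
class ProposedAction:
    description: str
    irreversible: bool


class ApprovalGate:
    """Holds irreversible actions until sign-off; the gate is never bypassed."""

    def __init__(self):
        self.pending: list[ProposedAction] = []

    def propose(self, action: ProposedAction) -> str:
        if action.irreversible:
            self.pending.append(action)  # the work waits here
            return "waiting for sign-off"
        return self.execute(action)      # reversible work proceeds immediately

    def sign_off(self, word: str) -> list[str]:
        if word != APPROVAL_WORD:
            return []                    # wrong word: nothing runs, nothing is lost
        ready, self.pending = self.pending, []
        return [self.execute(a) for a in ready]

    def execute(self, action: ProposedAction) -> str:
        return f"executed: {action.description}"
```

The point of the pattern is that the dangerous path has no side door: an irreversible action can only leave the queue through `sign_off`, and a wrong word leaves the queue intact.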
Pricing
The managed surface runs at $500/month in beta. The anchor is not Cursor's $20 or any other AI builder tool. The anchor is the missing CTO function: what you would pay a fractional CTO, what you are losing in wrong-build time, what you are spending in AI tokens on builds that go sideways. Against that baseline, $500/month is the cost of one bad week.
$500/month (beta)
Full stack including GPU. You bring your Claude API key. Mandaire provides the reasoning layer, the decision ledger, the build environment, and the graduation path. For founders in the window where getting the build right is the constraint.
Free (open source)
Self-host the full open-source stack on your own infrastructure. Right for teams with compliance requirements, sovereignty preferences, or the technical capacity to operate it. The managed and self-hosted tiers are not a quality split.
The LLM stack
Every other AI builder runs on a single LLM provider end-to-end. Cursor, Bolt, Replit, Devin: one model class, one cost curve. Mandaire routes across three layers:
Frontier reasoning (Anthropic, OpenAI, Google) for the architectural calls, the refusals, the taste memory synthesis. Used selectively. Paid to the provider directly by you.
Local model on the GPU we provide. Handles the bulk of the decision-ledger maintenance, the escalation routing, the build-plan drafting. No per-token cost to you after the flat monthly fee.
Your existing client. Claude Desktop, the ChatGPT app, Gemini, Cursor, whichever you already pay for. Mandaire exposes tools via MCP; the client consumes them. You do not need a new chat surface.
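The three layers above amount to a routing table: escalate selectively to frontier models, keep the bulk of the work on the flat-cost local model, and surface everything else in the client you already have. The sketch below is illustrative only; the task categories, layer names, and the default-to-client fallback are assumptions of this sketch, not Mandaire's real routing logic.

```python
# Three routing targets, mirroring the layers described above.
FRONTIER = "frontier"  # Anthropic/OpenAI/Google: paid by you, used selectively
LOCAL = "local"        # model on the provided GPU: no per-token cost after the flat fee
CLIENT = "client"      # your existing chat surface, consuming tools over MCP

# Hypothetical task taxonomy; the real system's categories may differ.
ROUTES = {
    "architectural_call": FRONTIER,
    "refusal": FRONTIER,
    "taste_memory_synthesis": FRONTIER,
    "ledger_maintenance": LOCAL,
    "escalation_routing": LOCAL,
    "build_plan_draft": LOCAL,
}


def route(task_type: str) -> str:
    # Unrecognized work defaults to your own client rather than burning
    # frontier tokens; that default is an assumption of this sketch.
    return ROUTES.get(task_type, CLIENT)
```

The economics follow directly from the table: only the rows mapped to `FRONTIER` incur per-token cost on your API key, which is why the frontier layer is described as used selectively.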
The setup is one step: bring your Claude API key. We never see your credentials and we never resell tokens. The full architecture, the deterministic disclosure layer that makes BYO renderer structurally safe, and the three reasoner profiles live at mandaire.org.
What stays yours
You can leave without losing the system. Export the full record and self-host any time. Graduation is supported, not penalized. The artifact trail you built over 90 days (the decision ledger, the taste memory, the refused-paths log) goes with you. No lock-in is an architectural commitment, not a preference.
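One way to picture "export the full record": the three artifact stores named above, bundled into a single portable archive you can carry to a self-hosted install. The file names, layout, and function signature here are assumptions for illustration, not Mandaire's actual export format.

```python
import json
import zipfile
from pathlib import Path


def export_record(dest: Path, decision_ledger, taste_memory, refused_paths) -> Path:
    """Bundle the three artifact stores into one portable zip archive."""
    archive = dest / "mandaire-export.zip"
    with zipfile.ZipFile(archive, "w") as zf:
        zf.writestr("decision_ledger.json", json.dumps(decision_ledger, indent=2))
        zf.writestr("taste_memory.json", json.dumps(taste_memory, indent=2))
        zf.writestr("refused_paths.json", json.dumps(refused_paths, indent=2))
    return archive
```

Because the payload is plain JSON in a standard zip container, nothing in the archive depends on the managed service to be readable.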
The encryption module is open-source under the AGPL, which means any competent engineer can decrypt and read your exported archive without Mandaire's involvement. The self-host path is the same codebase as the managed surface.