Cloud-connected for frontier models. WASM-in-browser for offline web. Desktop-isolated for air-gapped security. Same codebase, same tools — you choose where your code and data live. Build across Rust, C++, C#, Go, Python, JS, and WASM.
Each project tackles a different layer of the polyglot AI problem. All are under active development: built in production and being prepared for open release. Coming soon.
Three deployment modes from one codebase. Cloud-connected with frontier AI models. WASM-in-browser for offline web — no server, no data leaves the tab. Windows native via WebView2 and Skia for air-gapped, isolated desktop. Multiple CSS themes with Smart JS/TS customisation.
One cps.ai.toml at your workspace root, extending Cargo’s TOML conventions to Rust, C++, C#, Go, Python, and JS. Shared versions, build pipelines, and AI context in one place. TOML and JSON all the way down.
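A minimal sketch of what such a manifest might contain. The section and key names below are illustrative assumptions, not the actual cps.ai.toml schema:

```toml
# Hypothetical cps.ai.toml layout — keys are illustrative, not the real schema.
[workspace]
version = "1.4.0"                       # one version shared by every language target
languages = ["rust", "cpp", "csharp", "go", "python", "js"]

[build]
pipeline = ["codegen", "compile", "parity-test"]

[ai]
context = ["docs/architecture.md", "specs/"]   # files surfaced to the model
```

The design choice mirrors Cargo: one declarative manifest at the root, with per-language detail derived from it rather than duplicated.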
Specify a function once in TOML. MathIR parses it into a typed abstract syntax tree (AST), then emits deterministic, semantically identical implementations in seven languages, plus Excel Lambda. AI proposes; the AST enforces. Frontier models and deterministic codegen working in harmony.
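The core idea can be sketched in a few lines: a tiny typed expression AST and an emitter that renders the same tree as source text for several targets. This is a hand-rolled illustration in the spirit of MathIR, not its real node types or emitter API:

```python
from dataclasses import dataclass
from typing import Union

# Minimal expression AST — illustrative stand-in for a MathIR-style IR.
@dataclass(frozen=True)
class Num:
    value: float

@dataclass(frozen=True)
class Var:
    name: str

@dataclass(frozen=True)
class BinOp:
    op: str          # "+", "*"
    left: "Expr"
    right: "Expr"

Expr = Union[Num, Var, BinOp]

def emit(e: Expr) -> str:
    """Render an expression; fully parenthesised, so output is deterministic."""
    if isinstance(e, Num):
        return f"{e.value}"
    if isinstance(e, Var):
        return e.name
    return f"({emit(e.left)} {e.op} {emit(e.right)})"

def emit_function(name: str, arg: str, body: Expr, lang: str) -> str:
    """Wrap one AST in per-language function syntax."""
    expr = emit(body)
    if lang == "python":
        return f"def {name}({arg}):\n    return {expr}"
    if lang == "rust":
        return f"fn {name}({arg}: f64) -> f64 {{ {expr} }}"
    if lang == "js":
        return f"function {name}({arg}) {{ return {expr}; }}"
    raise ValueError(f"unknown target: {lang}")

# f(x) = x*x + 2*x + 1, specified once, emitted everywhere.
body = BinOp("+",
             BinOp("+", BinOp("*", Var("x"), Var("x")),
                        BinOp("*", Num(2.0), Var("x"))),
             Num(1.0))
```

Because the emitter is a pure function of the tree, the same spec always yields byte-for-byte identical output per target, which is what makes an AI-proposed change reviewable and enforceable.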
Same test vectors, every language, every commit. If any implementation disagrees, the build fails. Parity is a CI gate, not a hope.
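A parity gate of this kind is simple to sketch: one table of shared vectors, every implementation checked against it, and a nonzero exit if anything disagrees. Function names, tolerances, and the stand-in implementations below are assumptions for illustration:

```python
import math
import sys

# Shared vectors for f(x) = (x + 1)^2 — the same table every language checks.
TEST_VECTORS = [(0.0, 1.0), (1.0, 4.0), (3.0, 16.0)]

def f_reference(x: float) -> float:   # stand-in for, say, the Rust build's output
    return (x + 1.0) ** 2

def f_candidate(x: float) -> float:   # stand-in for another language's output
    return x * x + 2.0 * x + 1.0

def parity_check(impls: dict, vectors, tol: float = 1e-12) -> list:
    """Return every (impl, input, got, expected) disagreement across all impls."""
    failures = []
    for name, fn in impls.items():
        for x, expected in vectors:
            got = fn(x)
            if not math.isclose(got, expected, rel_tol=tol, abs_tol=tol):
                failures.append((name, x, got, expected))
    return failures

failures = parity_check({"reference": f_reference, "candidate": f_candidate},
                        TEST_VECTORS)
if failures:
    sys.exit(1)   # nonzero exit fails the CI job — parity is enforced, not assumed
```

In CI, each language binary would be driven through the same vector table (e.g. over stdin/stdout), so a single divergent implementation blocks the merge.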