A Transaction Science Platform

AI that costs joules, not tokens.
Free. Always on. No off switch.

EOC resolves every query through a four-stage pipeline and only invokes a neural model when nothing cheaper can produce an answer that survives the check. Energy is the unit. The substrate runs in a browser, runs on commodity CPUs, and cannot be turned off because no single entity runs it.

4 stages · 6 refine operators · 100× typical energy drop · 0 off switches

The Pipeline

Four stages. Generation last.

Every query resolves through the same path: state-construct → retrieve → refine → check. A large language model is one of six refine operators, ordered last — reached only when nothing cheaper has produced an answer that survives the check. Cost compounds down the pipeline; most work never gets near the bottom.
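The fall-through can be sketched in Rust, the reference implementation's language. This is a minimal illustration under assumed names: `TypedProblem`, `Candidate`, and the toy stage bodies are mine, not EOC-1's actual types; only the stage order and the "check before falling through" shape come from the text above.

```rust
// Illustrative four-stage resolve path. All names and stage bodies are toys;
// only the ordering (state-construct → retrieve → refine → check) is EOC's.

#[derive(Debug, Clone, PartialEq)]
struct TypedProblem(String);
#[derive(Debug, Clone, PartialEq)]
struct Candidate(String);

// Stage 1: parsers and schema resolution only; no language model here.
fn state_construct(request: &str) -> TypedProblem {
    TypedProblem(request.trim().to_lowercase())
}

// Stage 2: cheapest source first (lookup → KG → vector → doc),
// collapsed here into a single toy lookup.
fn retrieve(p: &TypedProblem) -> Option<Candidate> {
    if p.0.contains("known") {
        Some(Candidate("cached".to_string()))
    } else {
        None
    }
}

// Stage 3: six operator families cheapest-first, collapsed into one stand-in.
fn refine(p: &TypedProblem) -> Candidate {
    Candidate(format!("refined:{}", p.0))
}

// Stage 4: schema conformance plus constraints; no language model here either.
fn check(c: &Candidate) -> bool {
    !c.0.is_empty()
}

fn resolve(request: &str) -> Option<Candidate> {
    let p = state_construct(request);
    // If retrieval already holds an answer that survives the check, stop;
    // refine (and eventually generation) is reached only on fall-through.
    if let Some(c) = retrieve(&p) {
        if check(&c) {
            return Some(c);
        }
    }
    let c = refine(&p);
    check(&c).then_some(c)
}
```

The shape is the point: generation sits at the bottom of `resolve`, and most requests return before control ever reaches it.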

01
state-construct
Type the work
02
retrieve
Ground the work
03
refine
Do the work
04
check
Validate the work
01 · state-construct

Type the work

Parsers and schema resolution turn a request into a typed problem and a typed target. A language model is forbidden in this stage — it produces nothing here, it only obscures.

cost: nJ–µJ · LLMs forbidden
02 · retrieve

Ground the work

Pull what's already known, cheapest source first: direct lookup → knowledge-graph traversal → vector recall → document scan. Most answers exist; finding them costs almost nothing.

cost: nJ–mJ · lookup → KG → vector → doc
03 · refine

Do the work

Six operator families, ordered cheapest-first. The pipeline tries each in turn and stops at the first whose output passes the check. Generation is the sixth, not the first.

cost: µJ–J · six families, cheapest first
04 · check

Validate the work

Schema conformance plus constraint satisfaction, run against the typed target. A candidate that fails is rejected and the pipeline moves to the next operator. A language model is forbidden here too.

cost: µJ · schema + constraints · LLMs forbidden
Refine — six operator families, ordered by energy
01 · Deterministic state machine µJ
02 · Local search µJ
03 · State-graph mJ
04 · Constrained search mJ
05 · Tiny-recursive net mJ–J
06 · LLM-in-loop J

Deterministic state machine · local search · state-graph · constrained search · tiny-recursive net · LLM-in-loop. Each family carries a stable identifier and a conformance suite. The order is the policy: try cheap, fall through, generate only if you must.
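"The order is the policy" can be made concrete with a small Rust sketch: families carry stable identifiers, sit in a fixed cheapest-first order, and the loop stops at the first output that passes the check. The `Family` struct, the toy `run` functions, and the example IDs are illustrative, not EOC-5's registry schema.

```rust
// Illustrative cheapest-first fall-through over operator families.
// Family shape and IDs are assumptions for this sketch.

struct Family {
    id: &'static str,                // stable identifier (registered under EOC-5)
    run: fn(&str) -> Option<String>, // attempt; None means this family can't handle it
}

fn refine_cheapest_first(
    input: &str,
    families: &[Family], // caller supplies them already ordered by energy
    check: fn(&str) -> bool,
) -> Option<(&'static str, String)> {
    for f in families {
        if let Some(out) = (f.run)(input) {
            if check(&out) {
                // Record which family produced the passing answer; that ID
                // is what lands on the cost receipt.
                return Some((f.id, out));
            }
        }
    }
    None // nothing, including the LLM, produced a conformant answer
}
```

Because the expensive family is simply last in the slice, reaching it requires every cheaper family to have declined or failed the check first.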

The Fourth Truth

Refinement before generation.

When the input is structured and the output must be structured, draft-and-revise over a candidate beats autoregressive generation on cost, reliability, and auditability. Generation is the rare fallback, not the default. A meeting created from a natural-language request resolves at roughly 150 mJ on the pipeline — against roughly 3–5 J for a pure language-model pass: about 20–30× lower energy for the same result, with a record of how it got there.

The cheap operators are not approximations of the expensive one. They are the right tool for most of the work, and the pipeline reaches the expensive one only when the work genuinely needs it.

The Unit

Tokens are the concealment.

The per-token billing unit is not a measure of work — it's a measure of one architecture's output, priced as if it were. It hides energy: joules per token vary across orders of magnitude. It hides work: a 5,000-token wrong answer bills 100× a 50-token right one. And it hides architecture: lookups, refinement loops, and knowledge-graph traversals do real work and generate no billable tokens at all.

Cost receipt
task calendar_event/inst_00482
proficiency P2
energy 152 mJ
corpus eval-001 · v0.1
hardware ref-hw-a
attestation blake3:9c4f…a17e
pipeline SC·RET·REF·CHK
operator constrained-search/03
result schema-conformant

One task. The joules it cost, the proficiency it hit, the corpus and hardware it ran against, and the content-addressed attestation that lets anyone re-run it. A number you can point to — not a token count.
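The receipt above can be read as a typed record. The sketch below mirrors a subset of its fields in Rust; the struct, field types, and the millijoule storage choice are mine, not a wire format from the spec.

```rust
// Illustrative cost receipt: one task, billed in energy, with the provenance
// needed to re-run it. Field names follow the receipt shown above.

struct CostReceipt {
    task: &'static str,        // e.g. a schema instance like "calendar_event/…"
    proficiency: &'static str, // Eval-001 level: P1 / P2 / P3
    energy_mj: u32,            // the unit is joules; stored here in millijoules
    corpus: &'static str,      // published, content-addressed eval corpus + version
    hardware: &'static str,    // reference hardware profile
    attestation: &'static str, // content ID; anyone holding it can re-run the task
    operator: &'static str,    // which refine family produced the passing answer
}

impl CostReceipt {
    /// The bill is energy per task. A wrong answer doesn't cost more by
    /// being longer, because nothing here counts output symbols.
    fn energy_joules(&self) -> f64 {
        self.energy_mj as f64 / 1000.0
    }
}
```

Note what the record does not contain: a token count. Every field is either a physical quantity or a content-addressed pointer someone else can resolve.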

Joules, not tokens

The unit is energy — the physical quantity the work actually consumes — not a count of one model's output symbols. Energy is comparable across architectures; tokens are not.

Per task, not per token

The bill is attached to the task — "make this calendar event," "answer this question" — with the joules it took to reach a result that passed the check. A wrong answer doesn't cost more by being longer.

Open corpus, not vendor benchmark

Proficiency is measured against a published evaluation corpus anyone can inspect and run — not a private benchmark whose contents and scoring you have to take on faith.

Attested, not asserted

Each result carries a content-addressed attestation: the pipeline path, the operator, the corpus version, the hardware profile. Reproducible by anyone holding the content ID, with no trust relationship required.

What EOC denominates in

Joules per task at a stated proficiency, against an open corpus, on reference hardware, certified by attestation.

Every term in that sentence is checkable. The proficiency standard (Eval-001/002) is open. The corpora are published and content-addressed. The reference hardware profiles are specified. The certificate is a hash anyone can verify. "Per-token" tells you how a model talked; "joules per task" tells you what the work cost — and lets you compare it to anything else that does the same work.

The Spec

An open RFC stack.

EOC is defined by documents, not by a product. The spec text is permissively licensed. The conformance test vectors and the open corpora are published so any implementation can verify itself — without a trust relationship with the authors. The protocol's identity lives in its vectors and content-addressing, not in a brand.

EOC-1

The substrate architecture

The four-stage pipeline — state-construct → retrieve → refine → check. Version 0.2 supersedes v0.1's tier model; the patch and addendum revisions are tracked. Addendum A states the non-goals — explicitly, token economics: EOC does not denominate, bill, or reason in tokens.

v0.2 · supersedes v0.1 tiers · Addendum A: non-goals
EOC-2

The wire protocol

Capability exchange, gossip, and signed envelopes between nodes. Browser-deployable over WebSocket, friendly to WebCrypto. The reference implementation of the protocol is roughly 2k lines — small enough to read in an afternoon.

~2k LOC · browser-deployable · signed envelopes
EOC-3

Artifact distribution

Content-addressed, BitTorrent-style fetch for models, state machines, and evaluation corpora. Anyone holding a content ID can serve the artifact; there is no authoritative endpoint. Roughly 1.5k lines in the reference implementation.

~1.5k LOC · content-addressed · no authoritative host
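"No authoritative host" follows from a simple check: an artifact is accepted iff its bytes hash to the requested content ID, so who served the bytes is irrelevant. The sketch below illustrates that check with `std`'s `DefaultHasher` standing in for the real content hash (the receipts elsewhere on this page show BLAKE3); nothing here is EOC-3's wire format.

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

// Illustrative content-addressed acceptance. DefaultHasher is a stand-in;
// a real implementation would use a cryptographic hash such as BLAKE3.

fn content_id(bytes: &[u8]) -> u64 {
    let mut h = DefaultHasher::new();
    bytes.hash(&mut h);
    h.finish()
}

fn accept_artifact(requested_id: u64, fetched: &[u8]) -> bool {
    // The host's identity never enters the decision; only the hash match does.
    content_id(fetched) == requested_id
}
```

This is why "anyone holding a content ID can serve the artifact": the verification is on the bytes, so every host is interchangeable and none is special.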
EOC-4 / DCY

Deterministic Cypher

A deterministic, terminating subset of openCypher for the knowledge-graph retrieve operator. Every query halts; every query is reproducible. Roughly 3k lines in the reference implementation.

~3k LOC · deterministic · terminating
EOC-5

The Operator Family Registry

Stable identifiers, conformance tests, and anti-capture governance for the pipeline's six operator families. Adding, versioning, or deprecating a family happens here, in the open, under fixed rules.

stable IDs · conformance suites · anti-capture rules
Eval-001

The proficiency standard

Schema-conformant generation. Defines proficiency levels P1/P2/P3 with energy envelopes for each, an open corpus of worked instances, and content-addressed certificates. A claim of "P2 at 150 mJ" means something specific and checkable.

P1 / P2 / P3 · energy envelopes · open corpus
Eval-002

Retrieval & provenance

Reliability, provenance correctness, fabrication rate, and omission rate for the retrieve stage. Measures whether what comes back is real, attributed, complete, and not invented.

reliability · provenance · fabrication · omission
The OS Theorem

The energy-first operating system

The OS the substrate is designed to run on: capability-bounded, single-language (Rust), a kernel of fifty thousand lines or fewer, idle-by-default, heterogeneous-substrate-native, with J/W as the scheduler's objective function. Energy isn't a metric the OS reports — it's the thing the OS optimizes.

≤50k-LOC kernel · capability-bounded · J/W scheduling
Robust to renaming

The protocol's identity is its conformance vectors and content-addressing — not any brand.

"EOC" is a name on a set of documents. The documents define test vectors; an implementation either passes them or doesn't. Artifacts are content-addressed; a content ID resolves to one thing regardless of who hosts it. Strip the name off the whole stack and nothing about how it works changes. That's the point: a substrate that can't be captured can't be captured by branding it, either.

The Reference Implementation

One reference implementation. Many expected.

Transaction Science ships a reference implementation in Rust — small by design, deterministic where it matters, and browser-deployable. It exists to make the spec concrete and to give every other implementation something to check itself against. The substrate's security is monoculture-bounded until a second implementation exists; we'd like that to change.

Reference implementation — at a glance
wire protocol (EOC-2) ~2k LOC
deterministic cypher (EOC-4) ~3k LOC
artifact distribution (EOC-3) ~1.5k LOC
pipeline + lockstep core small by design
language Rust
transport WebSocket · WebCrypto-friendly

The core lockstep and pipeline machinery is intentionally compact — a thing small enough to audit is a thing that gets audited. The line counts are a ceiling to hold, not a baseline to grow from.

Small by design

The wire protocol is roughly 2k lines, DCY roughly 3k, artifact distribution roughly 1.5k. Each piece is sized to be read end to end. Smallness is the security property, not an accident of an early version.

Deterministic, therefore auditable

State-construct and check are bit-deterministic. Refine is deterministic given a seed. A reproducibility failure isn't a quirk — it's evidence of tampering, and it's reviewable as such.
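"Deterministic given a seed" means a re-run with the attested seed must reproduce the attested output bit-for-bit. The sketch below illustrates that verification; the tiny LCG and the `u64` refine body are stand-ins I've invented for the example, not the reference implementation's operator.

```rust
// Illustrative seeded determinism. The LCG (Knuth's MMIX constants) is the
// operator's only source of randomness in this toy, so output is a pure
// function of (input, seed).

fn seeded_refine(input: u64, seed: u64) -> u64 {
    let r = seed
        .wrapping_mul(6364136223846793005)
        .wrapping_add(1442695040888963407);
    input ^ r
}

/// Re-run with the attested seed and compare against the attested output.
/// A mismatch isn't a quirk; it's reviewable evidence that one of the two
/// runs diverged.
fn verify_attested(input: u64, seed: u64, attested_output: u64) -> bool {
    seeded_refine(input, seed) == attested_output
}
```

The auditing story rests on exactly this property: because the honest path is bit-reproducible, any deviation is a signal rather than noise.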

Browser-deployable

The transport is WebSocket; the crypto is WebCrypto-friendly. A node can run in a browser tab on a commodity CPU. The substrate doesn't require a data center to participate — it requires a browser.

Conformance-gated

Every implementation must pass the published test vectors before it joins the production network. Conformance is the membership test — not a relationship with the authors, not a certification body's stamp.
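Conformance-as-membership can be shown in a few lines: an implementation is in iff it reproduces every published vector. The `(input, expected)` pair shape is an assumption for this sketch; real EOC vectors are content-addressed artifacts with their own format.

```rust
// Illustrative conformance gate. The vector shape is assumed; the principle
// (pass all published vectors, nothing else consulted) is the text's.

fn conforms(candidate: fn(&str) -> String, vectors: &[(&str, &str)]) -> bool {
    // No relationship with the authors enters the decision; only the vectors do.
    vectors
        .iter()
        .all(|(input, expected)| candidate(input) == *expected)
}
```

Anyone can run this gate against anyone's implementation, which is what makes it a membership test rather than a permission.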

A call for second implementations

A substrate with one implementation has one failure mode. We'd like a second one.

The reference implementation is a starting point, not the substrate. The conformance vectors are public so anyone can build their own — in another language, on other assumptions, with another team's review — and prove it interoperates. Until that second implementation exists, the substrate's robustness is bounded by a monoculture. The spec, the vectors, and the open corpora are all the invitation we can offer; the rest is someone else picking it up.

Services

The substrate is open. The services are how we sustain it.

Transaction Science's commercial layer sits at the edges of the protocol, not inside it: hosted nodes, certification, registry operation, implementation support. The protocol stays everyone's; the services are a business built around keeping it healthy. Customers are AI vendors, enterprises running inference at scale, and anyone who needs energy-attributed compute.

Managed EOC nodes

Hosted, energy-honest AI compute

Run inference on a substrate that reports its joules. For teams that need energy-attributed compute — CSRD, other regulatory reporting, internal carbon accounting — this is the alternative to inference billed in tokens: every task comes with the energy it cost, on reference hardware, with the attestation attached.

you get: hosted nodes · per-task joules · attestations · SLA
Certification-as-a-service

Get a proficiency-at-joules number audited

An AI vendor that wants to claim a number — "P2 at 150 mJ on this corpus" — gets it audited against Eval-001 and Eval-002 on reference hardware and receives a content-addressed certificate. The claim becomes something a customer can verify rather than something they have to trust.

you get: audit run · Eval-001/002 scoring · content-addressed cert
Registry hosting

Operate the Operator Family Registry

Publish and version operator families against EOC-5: stable identifiers, conformance suites, public review windows. The registry has to run somewhere under fixed rules; Transaction Science operates the canonical instance — under open governance, not as its owner.

you get: family publication · versioning · conformance hosting
Reference-implementation support

Commercial support, integration, conformance review

Support contracts on the reference implementation, integration help for teams adopting it, and conformance review for teams building their own. The implementation is open; the engineering time around it is the product.

you get: support · integration · conformance review
The flagship implementation

TX Science AI is the flagship EOC implementation: its memoizing cascade is the four-stage pipeline — state-construct, retrieve, refine cheapest-first, check — running in production. It's one implementation among the many the spec is built to admit, not the substrate itself.

Steward, not owner

We don't own the protocol. We steward it.

Publishing the spec, shipping the reference implementation, running the services that keep it healthy — that's stewardship, and it's a business. It isn't ownership. Anyone can implement EOC; no one can capture it. The moat is at the edges — the hosted nodes, the certification, the registry operation, the support — never in the protocol, which has no owner by construction.

Governance

No token. No foundation. No off switch.

EOC is built so that no single party — including its authors — can own the AI substrate. That's not a policy choice that could be reversed later; it's a property of the construction. There's no protocol-level coin to confiscate, no foundation to capture, no endpoint that's authoritative, no switch to flip.

No protocol-level token

The substrate can't be financially captured by its authors because there's nothing financial at the protocol level to capture. No coin to confiscate. No stake to centralize. The unit of account is joules, and joules aren't issued by anyone.

No foundation to capture

The functions a foundation would normally perform — hosting, coordination, dispute resolution — are handled by distributed, organic mechanisms. The defense against foundation capture is not having a foundation.

Forkable by design

Permissive licensing means anyone can fork the spec and the reference implementation. But the protocol's identity is its conformance vectors: a fork that diverges fails conformance, and the divergence is reviewable. Forking is allowed; silent drift isn't.

No off switch

No single party runs it. No single endpoint is authoritative. Artifacts are content-addressed, so anyone holding the content ID can serve them. There's no console with a kill switch because there's no operator to own the console.

Open governance for the registry

The canonical Operator Family Registry is governed by at least three independent organizations, with no more than one third of them vendors of EOC implementations, and public review windows on every change. The registry has rules, and the rules are visible.
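The composition rule above — at least three independent organizations, no more than one third of them vendors — is mechanical enough to state as a check. The `Org` struct is an invention for this sketch, not EOC-5's governance schema.

```rust
// Illustrative check of the registry's composition rule.

struct Org {
    is_vendor: bool, // does this organization sell an EOC implementation?
}

fn governance_valid(orgs: &[Org]) -> bool {
    let vendors = orgs.iter().filter(|o| o.is_vendor).count();
    // "No more than one third" without floating point: 3 × vendors ≤ total.
    orgs.len() >= 3 && vendors * 3 <= orgs.len()
}
```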

The impossibility

Every property above subtracts a lever someone could pull. What's left is a substrate that nobody — including Transaction Science — can switch off, gate, or own.

What the substrate is for

The impossibility is the product.

EOC exists to make a particular thing impossible — for any one party, including its authors, to own the AI substrate. The internet is free and no one runs it; AI isn't, yet. The four-stage pipeline makes AI cheap enough to be free; the ownerless construction makes it stay that way. That impossibility — the thing nobody can do to it — is what's being built.