ONIOKO Beta
Observational Perception Models

See What the Human Eye Cannot

ONIOKO builds perception systems that extend human observation. We don't label. We don't judge. We make the invisible visible.

Optical-grade signal clarity
Premium human interface
EU AI Act aligned architecture
Perception Layer
4-stage OPM
Signal extraction, crossmodal divergence, contextual insight, and longitudinal memory arranged as one continuous optical system.
Signal Integrity
45 AU
Frame-level facial and vocal mapping rendered in a calmer, instrument-grade visual language.
Live Sync
Ambient motion mirrors continuous readout instead of decorative noise.
Basic Principle

OPMs don't tell you what someone feels. They show you what your eyes missed.

The ONIOKO Principle

Pipeline

The OPM Pipeline

Four specialized layers work in sequence to extract, compare, interpret, and track observable human expressions across video, audio, and text channels.

OPM

CYGNUS

Signal Extraction

The perception layer. CYGNUS processes raw video and audio to extract observable signals: facial Action Units (FACS), vocal prosody patterns (pitch, tempo, rhythm), and postural dynamics. No interpretation happens here. Pure signal extraction at frame-level precision.

45 Action Units · Vocal Prosody · Postural Mapping · Frame-level Sync · Multi-channel
01
Signal Reel · CYGNUS Feed
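To make the extraction stage concrete, here is a minimal sketch of what a frame-level CYGNUS record could look like. The schema and field names are illustrative assumptions, not the actual feed format:

```python
from dataclasses import dataclass, field

# Hypothetical schema for one frame of CYGNUS output.
# Only observable signals appear here -- no interpretation fields.
@dataclass
class FrameSignals:
    timestamp_ms: int
    action_units: dict[str, float] = field(default_factory=dict)  # e.g. {"AU12": 0.8}
    pitch_hz: float = 0.0      # vocal fundamental frequency
    tempo_wpm: float = 0.0     # speech tempo, words per minute
    posture_lean: float = 0.0  # -1 (withdrawn) .. +1 (forward)

frame = FrameSignals(
    timestamp_ms=1200,
    action_units={"AU12": 0.8, "AU6": 0.5},
    pitch_hz=210.0,
)
print(frame.action_units["AU12"])  # 0.8
```

Note that the record carries intensities and timings only; any comparison or meaning-making is deferred to later layers.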
OPM

ORACLE

Pattern Recognition

The crossmodal engine. ORACLE compares signals across channels simultaneously. When vocal warmth contradicts facial tension, ORACLE flags the incongruence. It doesn't say why. It shows where signals diverge, applying Crossmodal Rules that define congruent versus incongruent expression patterns.

Crossmodal Comparison · Incongruence Detection · Temporal Alignment · Signal Correlation
02
Crossmodal Reel · ORACLE Feed
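The divergence-flagging idea can be sketched in a few lines. This is an assumed, simplified rule (opposite-direction movement above a threshold), not ORACLE's actual Crossmodal Rules:

```python
def divergence_windows(facial: list[float], vocal: list[float],
                       threshold: float = 0.3) -> list[int]:
    """Return frame indices where two normalized channel signals move in
    opposite directions by more than `threshold`. This flags *where*
    signals diverge -- it never says why."""
    flags = []
    for i in range(1, min(len(facial), len(vocal))):
        d_f = facial[i] - facial[i - 1]
        d_v = vocal[i] - vocal[i - 1]
        # Opposite signs and both changes large enough to matter.
        if d_f * d_v < 0 and abs(d_f) > threshold and abs(d_v) > threshold:
            flags.append(i)
    return flags

# Facial tension rises while vocal warmth drops at index 2:
print(divergence_windows([0.1, 0.1, 0.9], [0.8, 0.8, 0.1]))  # [2]
```

The output is a list of locations, deliberately free of any emotional label.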
OPM

LUCID

Contextual Interpretation

The insight layer. LUCID takes the patterns ORACLE identified and generates contextual, human-readable observations. In coaching, it might note that verbal confidence didn't match visible tension. In clinical support, it provides FACS-based observation notes for the practitioner. Available for coaching, clinical, and enterprise contexts.

LLM Interpretation · Context Presets · Coaching Mode · Clinical Mode
03
Insight Reel · LUCID Feed
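The shape of a LUCID-style note can be illustrated with a template that renders a flagged pattern as observational language. The schema and phrasing are assumptions for illustration; the real layer uses LLM interpretation with context presets:

```python
def observation_note(pattern: dict) -> str:
    """Render a flagged pattern as a human-readable observation.
    The wording describes what diverged and when -- never an
    inferred inner state."""
    return (
        f"Between {pattern['start_s']}s and {pattern['end_s']}s, "
        f"{pattern['channel_a']} and {pattern['channel_b']} "
        f"moved in different directions."
    )

note = observation_note({
    "start_s": 42, "end_s": 47,
    "channel_a": "vocal warmth", "channel_b": "facial tension",
})
print(note)
```

The key design constraint is visible in the template itself: it names channels, directions, and a time window, and nothing else.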
OPM

TRACE

Longitudinal Memory

The memory layer. TRACE tracks patterns across sessions, building a temporal map of behavioral evolution. It can surface, for example, that a particular incongruence appeared in three consecutive sessions, or that postural confidence has improved over time. Available for coaching, clinical, and enterprise contexts.

Cross-session Tracking · Trend Analysis · Behavioral Baselines · Progress Mapping
04
Timeline Reel · TRACE Feed
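Cross-session recurrence can be sketched as a simple tally over per-session pattern labels. This is an assumed data model, and for brevity it counts total occurrences rather than checking that sessions are consecutive:

```python
from collections import Counter

def recurring_patterns(sessions: list[set[str]],
                       min_sessions: int = 3) -> set[str]:
    """Surface pattern labels that appear in at least `min_sessions`
    sessions -- e.g. the same incongruence recurring across meetings."""
    counts = Counter(label for session in sessions for label in session)
    return {label for label, n in counts.items() if n >= min_sessions}

sessions = [
    {"facial/vocal divergence", "tempo drop"},
    {"facial/vocal divergence"},
    {"facial/vocal divergence", "posture withdrawal"},
]
print(recurring_patterns(sessions))  # {'facial/vocal divergence'}
```

Again the output is a set of observable pattern labels, not a judgment about the person.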

Observe,
Don't Label

Most "emotion AI" systems claim to know what someone feels. They assign labels like "happy" or "angry" based on facial expressions. That approach is fundamentally flawed, scientifically contested, and legally problematic under the EU AI Act.

ONIOKO takes a different path. OPM observes what is objectively visible: muscle activations, vocal patterns, posture shifts. It compares these signals across channels to detect incongruences. It never claims to know the internal state. The human observer always makes the final interpretation.

OPM detects: AU12 activates while vocal pitch rises during the same response window.
OPM communicates: Visible expression and vocal delivery move in different directions.
OPM never says: "This person is genuinely delighted" or "that smile is authentic."

OPM detects: AU4 remains active throughout verbal self-assessment.
OPM communicates: Sustained observable tension appears whenever self-evaluation begins.
OPM never says: "This person lacks confidence" or "they feel ashamed."

OPM detects: Speech tempo drops sharply while posture withdraws at a topic shift.
OPM communicates: A coordinated slowdown appears at one specific conversational moment.
OPM never says: "They are shutting down" or "they are hiding the truth."

OPM detects: AU6+AU12 and vocal warmth stay aligned across the full segment.
OPM communicates: Cross-channel congruence remains stable through the interaction.
OPM never says: "This person is honest" or "they really mean it."
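The "never says" column amounts to a guard on output vocabulary. A minimal sketch of such a guard, assuming a hypothetical banned-term list (the real vocabulary and filtering logic are not described here):

```python
# Illustrative guard: observational output must not contain
# inner-state labels. The banned list below is a sketch, not
# ONIOKO's actual vocabulary.
BANNED_TERMS = {"happy", "angry", "ashamed", "honest",
                "delighted", "confident"}

def is_observational(text: str) -> bool:
    """True if the text avoids emotion/state labels entirely."""
    words = {w.strip('."\',').lower() for w in text.split()}
    return words.isdisjoint(BANNED_TERMS)

print(is_observational("AU12 activates while vocal pitch rises"))  # True
print(is_observational("This person is genuinely delighted"))      # False
```

A production system would need richer matching (stemming, phrases), but the principle is the same: signal descriptions pass, state attributions do not.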
Visual: the Thermometer Principle
Applications

Applications become products when the interface matches the instrument.

ONIOKO packages the OPM architecture into focused experiences. Each product turns the same observation engine into a more usable, context-shaped workflow.

Where OPM Works

One perception engine, configured for the context. Each vertical gets the modules it needs and nothing it doesn't.

EDUCATION · LIVE READOUT
Engagement Signal View
🎓

Education

AI learning companions that perceive student engagement in real time. Tutoring systems that notice when observable signals suggest confusion before a student says a word. Fully EU AI Act compliant with CYGNUS and ORACLE only.

CYGNUS · ORACLE
Explore Education
COACHING · SESSION VIEW
Feedback on Delivery
🏆

Coaching & Training

See your own incongruences. OPM shows coaches and their clients where verbal intention and visible expression diverge. Personal communication improvement grounded in observable patterns.

CYGNUS · ORACLE · LUCID · TRACE
Explore Coaching
LEADERSHIP · PERFORMANCE
Conversation Confidence
💼

Sales & Leadership

Train teams to read and project confidence. Private to the individual; no HR access, no surveillance. Leaders practice high-stakes conversations with real-time perception feedback.

CYGNUS · ORACLE · LUCID · TRACE
Explore Sales
CLINICAL · SESSION MAP
Observation Support Layer
💊

Clinical Support

A perception layer for therapists and clinicians. FACS-based observation assistance that helps practitioners notice what unfolds in session. Not a diagnostic tool. A second pair of trained eyes.

CYGNUS · ORACLE · LUCID · TRACE
Explore Clinical
SPEAKING · AUDIENCE LOOP
Stage Presence Feedback
🎤

Public Speaking

Practice with an avatar audience that perceives your delivery. Get feedback on where your confidence shows and where your signals tell a different story.

CYGNUS · ORACLE · LUCID
Explore Speaking
RESEARCH · DATASET VIEW
Pattern Scale Analysis
🔬

Behavioral Research

A consistent observation instrument at scale. Where human coders introduce fatigue and drift, OPM applies the same detection criteria across thousands of hours.

CYGNUS · ORACLE · LUCID · TRACE
Explore Research

Ready to extend your
human perception?

Whether you're exploring OPM for education, coaching, research, or enterprise, we'd like to hear what you're building.

Common Questions

Questions serious buyers, partners and compliance teams actually ask before they adopt OPM in production.

Does OPM recognize emotions?
No. OPM does not infer emotions or mental states. It detects observable signals and cross-channel congruence patterns. That distinction is exactly why education-safe and workplace-safe configurations can be deployed responsibly. Read the compliance documentation.

Does the system ever claim to know what someone feels?
Never. The system reports observable expression and signal relationships, not hidden feelings. The final interpretation remains with the human professional using the instrument.

How is OPM different from emotion recognition?
Emotion recognition tries to classify inner states from outer behavior. OPM stops one step earlier. It documents what is visible, measurable, and crossmodally aligned or misaligned, without making the inferential leap.

How does ONIOKO stay compliant in education?
In education, ONIOKO can be configured with CYGNUS and ORACLE only. LUCID and TRACE stay off, which prevents emotional profiling and longitudinal behavioral tracking of students.
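This kind of per-context gating can be expressed as a preset table. The preset names and toggle mechanism below are assumptions for illustration; the actual configuration surface is not described here:

```python
# Sketch of per-context module gating, assuming modules are
# toggled by name per deployment context.
PRESETS = {
    "education": {"CYGNUS", "ORACLE"},  # LUCID/TRACE stay off
    "coaching":  {"CYGNUS", "ORACLE", "LUCID", "TRACE"},
    "clinical":  {"CYGNUS", "ORACLE", "LUCID", "TRACE"},
}

def enabled(context: str, module: str) -> bool:
    """Is a given OPM module active in this deployment context?"""
    return module in PRESETS.get(context, set())

print(enabled("education", "TRACE"))  # False -- no longitudinal tracking of students
print(enabled("coaching", "TRACE"))   # True
```

Because gating happens at the module boundary, a disabled layer never receives signals at all, rather than merely hiding its output.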

Is raw video and audio stored?
Most configurations process video and audio in real time and retain only derived descriptors or approved observation outputs. Storage policies depend on the context, consent model, and feature set.

Can OPM run in real time?
Yes. CYGNUS and ORACLE are designed for live readouts, while specialized configurations such as CYGNUS Lite and ORACLE RT support faster-response deployment patterns.

Can we deploy only part of the pipeline?
Yes. OPM is modular. Some deployments stop at raw signal extraction and pattern recognition, while others activate contextual insight and longitudinal tracking only where appropriate.

Who controls outputs, retention, and access?
That depends on the deployment contract, but the architecture is built so outputs, retention logic, and access controls can be scoped tightly per customer, product, and role.

Can compliance teams audit what the system does?
Yes. The architecture is explicitly designed for explainability at the signal layer. Compliance teams can inspect what was detected, what was communicated, and what the system was never allowed to claim.

Does OPM replace the human expert?
No. It is an observational instrument, not an autonomous authority. The human expert remains the decision-maker, interpreter, and accountable actor.

Can OPM be embedded in our own products?
Yes. OPM can power ONIOKO-native products or be embedded into customer-specific experiences, dashboards, coaching tools, and research workflows.

Can different sectors use different configurations?
That is exactly what the architecture layer is for. Different sectors can activate different modules, context presets, and guardrails without changing the core observational logic.