The Large Feelings Model.
Large Language Models gave software a voice.
The Large Feelings Model gives it empathy.
The Missing Dimension
From brute-force outputs to human attunement.
For the past decade, AI has been obsessed with generation and output. Predicting the next word, rendering the next pixel, or optimizing for the next click. But despite this massive compute power, software remains entirely deaf to the human sitting on the other side of the screen.
Current AI systems compensate for this lack of understanding with scale: more data, more notifications, more infinite scrolls. They push too far, for too long, because they have no mechanism to perceive human fatigue or frustration.
The LFM shifts the focus from generation to attunement. It is a new class of foundational intelligence designed to infer emotional and cognitive states from everyday digital interactions, allowing software to care appropriately.
Generative AI (LLMs)
Maps the relationship between words. Predicts what should be said next.
Predictive AI (Analytics)
Maps the relationship between clicks. Predicts what a user will buy next.
Perceptive AI (The LFM)
Maps the relationship between physics and feeling. Predicts how an experience is unfolding, right now.
Deployment
How do you access the LFM?
We engineered the LFM to be accessible through a zero-friction platform. Whether you want to analyze past behavior or dynamically alter future experiences, you have complete control.
1. See the emotion.
The Affect Analyser. A macro-level SaaS dashboard built for Product Managers and UX Researchers. Connect the LFM to your existing event data to turn 2D funnels into 3D emotional heatmaps. Slice and dice your user base by Red/Green Frustration and Cognitive Load.
2. Shape the experience.
Real-Time APIs & SDKs. A micro-level nervous system for developers. Consume the LFM's outputs directly in your code via WebSockets or Webhooks. Dynamically alter UI, spawn interventions, or prevent churn based on sub-120ms emotional indices.
Digital Body Language
How does the LFM see?
Rather than relying on invasive inputs like cameras, microphones, or constant self-report surveys, our approach focuses on habitual actions people already perform.
Touch Dynamics
We measure flight time between taps, dwell time on the glass, swipe velocity, and screen pressure. A hesitant, hovering finger tells a completely different story than a rapid, forceful strike.
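The two timing features above can be sketched in a few lines of Python. The event shape here, one (down, up) timestamp pair per tap, is an assumption for illustration; any touch SDK exposes equivalent data under its own names.

```python
# Illustrative event shape: one (down_ms, up_ms) pair per tap, in order.
def touch_features(events):
    dwell = [up - down for down, up in events]            # finger on the glass
    flight = [events[i + 1][0] - events[i][1]             # gap between taps
              for i in range(len(events) - 1)]
    return {
        "mean_dwell_ms": sum(dwell) / len(dwell),
        "mean_flight_ms": sum(flight) / len(flight) if flight else 0.0,
    }

taps = [(0, 90), (250, 330), (600, 700)]
print(touch_features(taps))  # {'mean_dwell_ms': 90.0, 'mean_flight_ms': 215.0}
```

Long flight times with short dwells read as hesitation; short flights with long, forceful dwells read as urgency.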
Device Kinematics
By capturing background gyroscope and accelerometer data, the LFM detects the physical micro-tremors generated by the user's pulse and muscle tension, which correlate strongly with arousal.
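A rough way to isolate that tremor component is to subtract a short moving average (the slow, voluntary motion) from the accelerometer magnitude and measure what remains. This is a hedged sketch of the idea, not the LFM's actual signal pipeline.

```python
import math

def tremor_energy(accel, window=5):
    """RMS of the high-frequency residual of accelerometer magnitudes.

    Subtracting a short moving average removes slow, voluntary motion;
    what remains approximates the micro-tremor component.
    """
    mag = [math.sqrt(x * x + y * y + z * z) for x, y, z in accel]
    residual = []
    for i in range(len(mag)):
        lo, hi = max(0, i - window // 2), min(len(mag), i + window // 2 + 1)
        residual.append(mag[i] - sum(mag[lo:hi]) / (hi - lo))
    return math.sqrt(sum(r * r for r in residual) / len(residual))

still = [(0.0, 0.0, 9.8)] * 20   # a perfectly still device
print(tremor_energy(still))      # 0.0
```

A real pipeline would use a proper band-pass filter tuned to tremor frequencies; the moving-average residual is just the simplest stand-in.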
100% Content Blind
The LFM only looks at the physics of the interaction. We never record what words are typed, what buttons are named, or what is on the screen. We extract profound insight strictly from the kinematics.
Empirical Foundations
Emotion isn't a category. It's a coordinate.
Legacy sentiment analysis tries to force human psychology into six basic buckets (Happy, Sad, Angry, etc.). This is like trying to paint a masterpiece using only six crayons.
The LFM treats emotion as a continuous 3D coordinate system, allowing it to distinguish between "mild irritation" and "white-hot rage" with mathematical precision.
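For instance, in a valence-arousal-dominance space, both of those states sit in the "negative valence, high arousal" region, yet as points they are far apart. The coordinates below are invented purely for illustration.

```python
import math

# Invented coordinates in valence-arousal-dominance space, each axis in [-1, 1].
mild_irritation = (-0.3, 0.2, 0.1)
white_hot_rage = (-0.9, 0.9, 0.6)

def vad_distance(a, b):
    return math.dist(a, b)  # Euclidean distance between two affect points

# A six-bucket classifier would file both under "Angry"; as coordinates
# they are clearly distinct states.
print(round(vad_distance(mild_irritation, white_hot_rage), 3))  # 1.049
```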
LFM Outputs
Emotional states and context
Time to Value
The Fidelity Curve
You don't need a year to get value out of the LFM. It provides actionable metrics on Day 1 and evolves into a deeply personalized predictive engine.
Day 1: Zero-Prompt Analytics
Drop in the SDK. Without asking the user a single question, we use baseline clustering against population-wide Affective Phenotypes. You instantly unlock Red/Green Frustration indicators and Brain Juice (Cognitive Load) levels mapped directly to your UI.
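The zero-prompt baseline step can be sketched as a nearest-centroid assignment. The phenotype names, centroids, and features below are all invented for illustration; they are not the LFM's actual clustering.

```python
import math

# Invented population "Affective Phenotype" centroids over two kinematic
# features: (mean dwell time in ms, tremor energy). In practice features
# would be normalized so one axis cannot dominate the distance.
PHENOTYPES = {
    "deliberate": (140.0, 0.02),
    "agitated": (60.0, 0.15),
    "fatigued": (220.0, 0.05),
}

def assign_phenotype(features):
    """Zero-prompt baseline: nearest centroid, no per-user labels required."""
    return min(PHENOTYPES, key=lambda name: math.dist(PHENOTYPES[name], features))

print(assign_phenotype((72.0, 0.12)))  # agitated
```

Because the centroids come from population-wide data, this works on a user's very first session, before any individual calibration.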
Weeks 2-4: SAAQ Calibration
As the model gathers more kinematic data, we introduce sparse, highly targeted SAAQ Prompts to gather ground-truth labels. The platform's Fidelity Score climbs as the model fine-tunes the baseline to the individual user's unique physical patterns.
Months 1+: True Prediction
The system transitions from a real-time monitor into a sequence predictor, unlocking full V.A.D. (valence, arousal, dominance) coordinate mapping and 15-minute predictive horizons. Just as an LLM predicts the next word, the LFM predicts the next emotional state, allowing you to intervene before a user churns.
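To make the "next emotional state" idea concrete, here is a toy first-order Markov predictor over discretized states, trained on invented data; the production LFM sequence model is presumably far richer than this.

```python
from collections import Counter, defaultdict

# Invented discretized state history; the real LFM presumably works on
# continuous V.A.D. trajectories rather than labels like these.
history = ["calm", "focused", "frustrated", "churn_risk",
           "calm", "focused", "frustrated", "churn_risk"]

transitions = defaultdict(Counter)
for prev, nxt in zip(history, history[1:]):
    transitions[prev][nxt] += 1

def predict_next(state):
    """Most frequently observed successor of `state`."""
    return transitions[state].most_common(1)[0][0]

print(predict_next("frustrated"))  # churn_risk
```

An intervention hook would fire whenever the predicted state is a risk state, which is exactly the "intervene before a user churns" step.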
The Impact
Intelligent systems across every industry.
The LFM doesn't just analyze data; it drives real-world outcomes by making software fundamentally more human-centric.
Gaming
Balance difficulty dynamically based on player frustration to maintain the perfect "Flow State".
Health & Wellbeing
Longitudinally track cognitive load and emotional baselines to proactively suggest clinical check-ins.
E-Commerce
Remove UX friction at the exact moment a user is emotionally compromised to save the cart.
The Clinical Mandate
Scientists. Not Tech Bros.
nx10 was born in a therapy room, not a Silicon Valley incubator. We are a science company. Our goal is not to build a better slot machine; our goal is to build technology that respects human limits. Every capability added to the LFM must pass our Internal Ethics Board. Power must be handled with care.
Read our Company Ethos
Ready to integrate the LFM?
Get your API keys and deploy our native SDKs today.
Not ready to integrate?
Join our newsletter to get updates on Large Feelings Model architecture, scientific publications, and early access features.
