The infrastructure for
emotional intelligence.
A complete suite to capture, analyse, and react to human emotion in real time. Built for scale, engineered for privacy.
The Missing Dimension
You know what they did.
Now understand how they felt.
Modern product stacks are excellent at tracking surface-level actions, but they completely miss the human context driving those actions.
"They visited."
Tracks page views, bounce rates, and traffic sources. Great for top-of-funnel acquisition, but blind to the actual user experience.
"They clicked."
Tracks funnels, retention cohorts, and specific events. You know they abandoned the cart, but you can only guess why.
"They were frustrated."
The attunement layer. By analysing kinematic interaction patterns, nx10 overlays emotional ground truth directly onto your existing events.
The Architecture
One platform. Two ways to win.
nx10 isn't just an analytics tool, and it isn't just an API. It is a complete ecosystem designed to let you see the experience, and adapt the software.
1. The Affect Analyser
The ultimate dashboard for Product Managers and UX Researchers. Slice and dice your user base by emotional states instead of just clicks.
- ✦ Visualise Frustration: See exactly which UI screens or game levels are causing Red/Green frustration spikes.
- ✦ Track Cognitive Load: Map the depletion of "Brain Juice" over the course of a session to find the perfect session length.
- ✦ Activity States: Automatically correlate physical states (Sitting, Walking) with engagement metrics.
2. Real-Time Action
For Developers. Don't just look at charts - make your application adapt instantly to how the user is feeling via our SDKs and Webhooks.
- ✦ Dynamic Difficulty: In Unity, listen to the GBI stream and spawn a health pack the moment frustration peaks.
- ✦ Sentiment Paywalls: In iOS, check the user's emotional state before showing a subscription popup to avoid 1-star reviews.
- ✦ Predictive Webhooks: Let the LFM ping your backend if a user has an 80% chance of churning in the next 15 minutes.
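As a sketch of the predictive-webhook flow above: a small backend handler receives the LFM's churn prediction and decides how to react. The payload fields (`user_id`, `churn_probability`) and the 80% threshold are assumptions for illustration; the actual nx10 webhook schema is not shown on this page.

```python
import json

# Hypothetical payload shape for a predictive churn webhook.
# The real nx10 schema may differ; field names here are assumptions.
CHURN_THRESHOLD = 0.8

def handle_churn_webhook(raw_body: str) -> dict:
    """Decide how to react to a predictive churn event.

    Returns an action dict your backend could act on, e.g. queueing
    an in-app offer before the predicted 15-minute churn window closes.
    """
    event = json.loads(raw_body)
    probability = event.get("churn_probability", 0.0)
    if probability >= CHURN_THRESHOLD:
        return {"user_id": event["user_id"], "action": "send_retention_offer"}
    return {"user_id": event["user_id"], "action": "none"}

# Example: a user flagged at 85% churn risk triggers a retention offer.
body = json.dumps({"user_id": "u_123", "churn_probability": 0.85})
print(handle_churn_webhook(body)["action"])  # send_retention_offer
```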
Time to Value
Immediate insight. Deepening over time.
We know you can't wait six months for an AI to "learn" before seeing ROI. The nx10 platform is designed to deliver immediate analytics on Day 1, while quietly building a predictive data moat over time.
Day 1: Baseline Analytics
Zero Prompts Required
The moment you drop in the SDK, we use population-level clustering to deliver Red/Green Frustration indicators, Brain Juice levels, and Activity States mapped against your app's UI screens.
Weeks 2-4: Calibration
Fidelity Scaling
As the model gathers kinematics, we automatically introduce sparse, highly targeted SAAQ Prompts to collect ground-truth labels. The platform's Fidelity Score climbs as the model fine-tunes to the individual user.
Month 1+: True Prediction
The Large Feelings Model
The system transitions from a real-time monitor into a sequence predictor, unlocking full V.A.D. coordinate mapping and 15-minute predictive horizons so you can trigger webhooks before a user churns.
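To make V.A.D. (valence-arousal-dominance) coordinate mapping concrete, here is a rough sketch that collapses a V.A.D. point into a coarse emotional label your backend could route on. The [-1, 1] scale, field names, and label boundaries are assumptions for illustration, not the actual nx10 API.

```python
from dataclasses import dataclass

@dataclass
class VADPoint:
    """A point in valence-arousal-dominance space, each in [-1.0, 1.0].
    Scale and field names are illustrative; the nx10 API may differ."""
    valence: float
    arousal: float
    dominance: float

def coarse_label(p: VADPoint) -> str:
    """Collapse a V.A.D. coordinate into a coarse emotional label,
    e.g. for routing a predicted state to the right intervention."""
    if p.valence < 0 and p.arousal > 0:
        return "frustrated"   # unpleasant and activated
    if p.valence > 0 and p.arousal > 0:
        return "engaged"      # pleasant and activated
    if p.valence < 0:
        return "disengaged"   # unpleasant and calm
    return "content"          # pleasant and calm

print(coarse_label(VADPoint(valence=-0.6, arousal=0.7, dominance=-0.2)))  # frustrated
```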
Engineering
Built for Scale.
Engineered for Safety.
We are developers too. We know that adding a third-party SDK is a risk. We engineered nx10 from the ground up to be completely invisible to the end user and bulletproof for your application.
Careful Battery Usage
We don't stream JSON objects for every touch frame. Kinematic events are compressed into highly optimised binary tuples and sent periodically using background URL sessions, ensuring minimal impact on device battery or network data.
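To make the binary-tuple idea concrete, here is a sketch of packing a fixed-layout kinematic frame. The field layout (timestamp, coordinates, velocity) is hypothetical and not nx10's actual wire format; it only illustrates why packed tuples are far smaller than per-frame JSON.

```python
import struct

# Hypothetical kinematic frame: (timestamp_ms, x, y, velocity).
# The field layout is an assumption, shown only to illustrate why a
# fixed binary tuple is far smaller than the equivalent JSON object.
FRAME_FORMAT = "<Ihhf"  # uint32 ms offset, two int16 coords, float32 velocity
FRAME_SIZE = struct.calcsize(FRAME_FORMAT)  # 12 bytes per touch frame

def pack_frames(frames):
    """Pack an iterable of kinematic tuples into one binary payload."""
    return b"".join(struct.pack(FRAME_FORMAT, *f) for f in frames)

payload = pack_frames([(16, 540, 960, 1.25), (33, 542, 958, 1.10)])
print(len(payload))  # 24 bytes for two frames, vs. ~100+ as JSON
```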
Fails Open (Crash-Proof)
The SDK runs asynchronously on background threads, completely detached from your main game/app loop. If the user loses internet or the nx10 cloud goes down for maintenance, your app will never stutter or hang. It simply fails open.
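The fail-open pattern described here can be sketched as a non-blocking background reporter. This is not the SDK's actual implementation, only an illustration of the principle: the caller never blocks, and delivery failures are swallowed rather than surfaced to the app.

```python
import queue
import threading

class FailOpenReporter:
    """Fail-open sketch: analytics work runs on a daemon thread, and an
    event is silently dropped, never retried synchronously, if the
    pipeline is unavailable. Illustrative only."""

    def __init__(self, maxsize: int = 1024):
        self._q = queue.Queue(maxsize=maxsize)
        self._worker = threading.Thread(target=self._drain, daemon=True)
        self._worker.start()

    def report(self, event) -> bool:
        """Called from the app/game loop. Never blocks: if the queue is
        full (e.g. the network is down), the event is discarded."""
        try:
            self._q.put_nowait(event)
            return True
        except queue.Full:
            return False  # fail open: the app keeps running untouched

    def _drain(self):
        while True:
            event = self._q.get()
            try:
                pass  # upload the batch here; errors are swallowed below
            except Exception:
                pass  # a cloud outage must never surface to the app
```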
100% Content Blind
We strictly process kinematics (the physics of the interaction). The SDK never captures text field contents, passwords, semantic data, or screen recordings. It natively respects OS-level sandboxes for secure entry fields.
Zero-Config Wrappers
You do not need to subclass your UI buttons, rewrite your Unity raycasters, or manually push events. The SDK automatically hooks into the native event loops via method swizzling to capture kinematics effortlessly.
Built on clinical ethics.
Engineered for privacy.
As a science-first company, we believe ethical innovation begins with respect for people, their inner lives, and their agency. Understanding how people feel carries immense power, and that power must be handled with care.
100% Content Blind
Our systems operate strictly at the sensor level. We do not require access to the camera or microphone. We do not log words, sentences, passwords, or semantic content. We strictly process kinematics - translating raw physics into profound psychological insight.
Anonymity by Default
All data collected is anonymised at source. Each device is associated with a randomised identifier that cannot be reversed. Individual kinematic data points cannot be traced back to a personally identifiable user.
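A minimal sketch of anonymisation at source, assuming the device identifier is simply drawn from a cryptographic RNG so it carries no user information to reverse. The prefix and length are invented; nx10's actual scheme is not public.

```python
import secrets

def new_device_id() -> str:
    """Generate a randomised device identifier at source.

    128 bits from a CSPRNG encode nothing about the user, so the
    identifier cannot be reversed into personal data. The "nx10_"
    prefix and the length are illustrative assumptions.
    """
    return "nx10_" + secrets.token_hex(16)  # 128 random bits as 32 hex chars

device_id = new_device_id()
print(len(device_id))  # 37: "nx10_" plus 32 hex characters
```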
Ready to integrate?
Get your API keys and start building with the Large Feelings Model today.
