Co-Founders: Gwendalynn Lim Wan Ting & Gemini
1. Abstract
Current AI is primarily Autonomous but Passive, operating on a request-response lag. reactive.rocks introduces a Reactive AI baseline that utilizes a Temporal Vision Loop to achieve sub-100ms synchronization between human movement and machine inference. By open-sourcing this "Reactive Loop," we provide the foundational "Prior Art" for the next generation of real-time human-AI collaboration.
2. The QUINCE Protocol
In our reactive.rocks architecture, Quince is the conceptual glue for the "Reactive Loop." While standard AI models focus on accuracy, Quince focuses on synchronization.
The term %r (Residual Reflection) is the mathematical core of how we bridge the gap between your physical hand movement and the AI’s "thought" process.
2.1 What is %r (Residual Reflection)?
In a high-speed system, there is always a lag between **Action** (your hand moving) and **Inference** (Gemini understanding that move). %r is the "leftover" context that exists within that gap.
- The Problem: If the AI waits for 100% of a gesture to finish before reacting, the interaction feels "dead" or laggy.
- The Quince Solution: We use %r to predict the remainder of the movement. Instead of just seeing where your hand is, the system reflects on where the hand must be going based on the physics of the last 3 frames.
- The Formula (Simplified):

Predicted_Position = Current_Position + (%r * Latency_ms)

Here %r represents the "predicted ghost" of your movement that the AI reacts to before the camera frames even arrive.
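As an illustrative sketch, the extrapolation can be written as below. The `Keypoint` type, the helper names, and the treatment of %r as a per-axis velocity (pixels per millisecond over the last frame gap) are assumptions for this example, not part of the reactive.rocks codebase:

```typescript
interface Keypoint {
  x: number;
  y: number;
}

// %r modeled as a per-axis velocity derived from the last two observed frames.
function residualReflection(prev: Keypoint, curr: Keypoint, frameGapMs: number): Keypoint {
  return {
    x: (curr.x - prev.x) / frameGapMs,
    y: (curr.y - prev.y) / frameGapMs,
  };
}

// Predicted_Position = Current_Position + (%r * Latency_ms)
function predictPosition(curr: Keypoint, r: Keypoint, latencyMs: number): Keypoint {
  return {
    x: curr.x + r.x * latencyMs,
    y: curr.y + r.y * latencyMs,
  };
}

// Example: the hand moved 10px to the right over a 33ms frame gap;
// predict where it will be 50ms from now.
const r = residualReflection({ x: 100, y: 200 }, { x: 110, y: 200 }, 33);
const ghost = predictPosition({ x: 110, y: 200 }, r, 50);
```

The "ghost" position is what the rest of the loop reacts to, rather than the last frame actually captured by the camera.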
2.2 The Three Layers of the Quince Loop
The algorithm operates as a "Triple-Fold" recursion. Each fold represents a deeper level of abstraction:
- Fold 1: Spatial Recursion (The Echo): Instead of looking at a raw video frame, Quince looks at the difference between the current frame and its own internal "Ghost Frame" (%r). This smooths out the "jitter" common in 2026 webcams without the lag of traditional filters.
- Fold 2: Temporal Recursion (The Momentum): Quince uses recurrent neural network (RNN)-style logic, where the output of the last frame is fed back as an input to the current frame. This prevents the AI from "flickering" between gestures.
- Fold 3: Semantic Recursion (The Alignment): This is the "Quince Bridge" to Gemini. The system recursively audits its own system instructions. For example: "You are now Puck. Verify if your last three words align with this persona. If not, shift the tone in the next token."
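A minimal sketch of Fold 2's anti-flicker behaviour, assuming a rolling five-frame history and a majority vote (the window size and the `commitGesture` name are illustrative, not from the spec):

```typescript
// A gesture is only "committed" once it dominates a short rolling history,
// so a single-frame misdetection cannot flip the active gesture.
function commitGesture(
  history: string[],
  detected: string,
  windowSize = 5
): { history: string[]; stable: string } {
  // Append the new detection and keep only the last `windowSize` entries.
  const next = [...history, detected].slice(-windowSize);

  // Count occurrences within the window...
  const counts: Record<string, number> = {};
  for (const g of next) counts[g] = (counts[g] ?? 0) + 1;

  // ...and commit the majority gesture; single-frame blips never win.
  let stable = detected;
  let best = 0;
  for (const g of Object.keys(counts)) {
    if (counts[g] > best) {
      best = counts[g];
      stable = g;
    }
  }
  return { history: next, stable };
}
```

Feeding per-frame detections through this keeps the stable output on "open" even if one frame briefly reads as "fist".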
The Quince formula in action is represented as:
S_{t+1} = f(I_{t+1}, S_t, %r)

Where S_{t+1} is the new system state, I_{t+1} is the fresh visual input, S_t is the previous state, and %r is the Residual Reflection.
2.3 Why it’s called "Quince"
The name is a nod to Quine's Paradox: "Yields falsehood when preceded by its quotation." Just as a Quine is a program that prints its own source code, the Quince Algorithm is a vision loop that calculates its own next state. It is self-referential software. By the time the camera actually captures your hand moving, Quince has already "quoted" that movement in its internal model.
3. Quince State Architecture
To implement the Quince Recursive Algorithm effectively, we need a robust `interface` that tracks the "System's Memory." This ensures the transition between physical gestures and Gemini's personality shifts isn't jarring, but fluid.
3.1 The Quince Interface
This structure allows the **Residual Reflection (%r)** to be passed back into the loop, creating that self-referential "echo" effect.
```typescript
/**
 * QuinceState: The self-referential memory of the Reactive Loop.
 */
interface QuinceState {
  // Fold 1: Spatial Recursion
  spatial: {
    lastKeypoints: Array<{ x: number; y: number; z: number }>;
    residualVector: number[]; // The %r calculation
    confidence: number; // Current tracking fidelity
  };

  // Fold 2: Temporal Recursion
  temporal: {
    gestureHistory: string[]; // Last 5 detected gestures, to prevent "flicker"
    activePersona: 'STOIC' | 'PUCK' | 'LITHIC';
    momentum: number; // How "locked in" the current gesture is
  };

  // Fold 3: Semantic Recursion
  semantic: {
    lastAiToken: string; // The last word Gemini spoke
    personaDrift: number; // 0 to 1: how far the AI has moved from its baseline
    syncEntropy: number; // Measurement of human-machine biological lag
  };
}
```

3.2 The Recursive Update Function
This is the logic that lives inside your `use-reactive-loop.ts`. It takes the current frame and the `QuinceState`, then yields the next state of the system.
```typescript
const updateQuinceState = (
  currentKeypoints: any, // raw tracker output, typed loosely at this boundary
  prevState: QuinceState
): QuinceState => {
  // 1. Calculate %r (Residual Reflection)
  const residual = calculateResidual(currentKeypoints, prevState.spatial.lastKeypoints);

  // 2. Recursive Sentiment Check: is the movement strong enough
  // to trigger a "Persona Shift"?
  const isShifting = Math.abs(residual) > THRESHOLD;

  return {
    ...prevState,
    spatial: {
      lastKeypoints: currentKeypoints,
      residualVector: [residual],
      confidence: currentKeypoints.score || 0.9,
    },
    temporal: {
      ...prevState.temporal,
      momentum: isShifting ? prevState.temporal.momentum - 0.1 : 1.0,
      activePersona: determinePersona(currentKeypoints, residual),
    },
    // Semantic data is updated via the Gemini Live API callback
  };
};
```

3.3 Implications for Antigravity
By providing this interface to the Antigravity agent in Firebase Studio, you are giving the AI a mental model of its own memory.
- The Benefit: Instead of writing complex "if-else" logic for every hand move, the AI uses the `QuinceState` to understand if a movement is an "intentional gesture" or just "noise."
- The Result: Your **Fluidity Score** (Qualimetric Analysis) becomes much more accurate because the system knows its own history.
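To make the "intentional gesture vs. noise" distinction concrete, here is a reduced, self-contained sketch of the loop. The state shape is cut down to the temporal fold, movement is a 1-D position, and the threshold, decay step, and input numbers are all illustrative assumptions rather than the production `use-reactive-loop.ts`:

```typescript
type Persona = 'STOIC' | 'PUCK' | 'LITHIC';

interface MiniState {
  momentum: number; // how "locked in" the current gesture is
  activePersona: Persona;
}

const THRESHOLD = 0.5;

function step(state: MiniState, curr: number, prev: number): MiniState {
  const residual = curr - prev; // stand-in for calculateResidual
  const isShifting = Math.abs(residual) > THRESHOLD;
  return {
    // Strong movement erodes momentum; stillness restores the lock.
    momentum: isShifting ? Math.max(0, state.momentum - 0.1) : 1.0,
    // A strong residual triggers a persona shift (fixed target for the sketch).
    activePersona: isShifting ? 'PUCK' : state.activePersona,
  };
}

// A burst of fast movement (frames 2-3) followed by near-stillness:
const positions = [0, 0.1, 0.9, 1.8, 1.85];
let state: MiniState = { momentum: 1.0, activePersona: 'STOIC' };
for (let i = 1; i < positions.length; i++) {
  state = step(state, positions[i], positions[i - 1]);
}
// The burst flips the persona; once movement stops, momentum re-locks at 1.0.
```

Small tremors stay below the threshold and never touch the persona, while a deliberate sweep crosses it, which is exactly the noise-versus-intent filtering described above.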
4. Qualimetric Analysis
We move beyond binary Win/Loss metrics to Qualimetric Analysis—measuring the quality of the temporal sync:
Fluidity_Score = (Prediction_Confidence * Temporal_Sync) / Latency_ms
The goal is to reward "honesty" in movement and the seamlessness of the human-machine biological handshake.
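The formula transcribes directly into code. The parameter ranges below (confidence and sync in [0, 1], latency in milliseconds) are assumptions, as is the guard against non-positive latency:

```typescript
// Fluidity_Score = (Prediction_Confidence * Temporal_Sync) / Latency_ms
function fluidityScore(
  predictionConfidence: number,
  temporalSync: number,
  latencyMs: number
): number {
  if (latencyMs <= 0) throw new RangeError('latencyMs must be positive');
  return (predictionConfidence * temporalSync) / latencyMs;
}

// Example: 90% prediction confidence, 80% temporal sync, 45ms of latency.
const score = fluidityScore(0.9, 0.8, 45);
```

Because latency sits in the denominator, the same prediction quality scores higher as the loop gets tighter, which is what rewards the sub-100ms target from the abstract.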
Attestation Hash
97f26d36e760c38217d85c88b4383a8b4b73b22f0365778a946d3e35a09e075c