Color Dome Runtime Stack v1 (CDRS) is the end-to-end system that turns sensing, control state, pattern logic, and transport into physical dome light output.
What flows through it. CV sensing produces tracked people. Tracked people become downstream control signals. Runtime logic maps those signals into color and state decisions. Pattern and render stages turn decisions into frames. Transfer and transport deliver frames to the physical lighting system.
Two sensing-driven paths. The CV layer emits occupancy blobs for generic dome behavior (compatible with all existing patterns) and a parallel colorcube_signal for Color Cube expressive control. Both coexist; Color Cube does not replace occupancy behavior.
Production sensing path. The current production path is depth-only: floor-referenced depth segmentation, relative height, PersonState, lifecycle, presence, then downstream transport. RGB-D pose and highest-hand logic are no longer the default operator mental model.
This page. The flowchart below is the detailed runtime map. This summary is the short orientation layer above it.
Safeguards — validation, fallback, and error handling across all stages.
Sensor Acquisition (CV subsystem)
Acquires depth frames from the Orbbec Femto Mega and compares them against a saved floor reference. The production path is depth-only; RGB-assisted pose inference is no longer the default sensing path. The source_interface.py abstraction still decouples hardware access from downstream consumers and also supports synthetic, replay, and off source modes.
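A minimal sketch of what a source_interface.py-style abstraction could look like: one frame-source contract that hardware, synthetic, replay, and off modes all satisfy. The class and field names below (FrameSource, DepthFrame, SyntheticSource, OffSource) are illustrative assumptions, not the production API.

```python
from dataclasses import dataclass
from typing import Optional, Protocol

import numpy as np


@dataclass
class DepthFrame:
    """One acquired depth frame: millimeter depths plus a capture timestamp."""
    depth_mm: np.ndarray  # (H, W) uint16 depth image
    timestamp: float


class FrameSource(Protocol):
    """Minimal contract a depth source satisfies; hardware, synthetic,
    replay, and off sources all implement the same two methods, so
    downstream consumers never touch hardware access directly."""
    def read(self) -> Optional[DepthFrame]: ...
    def close(self) -> None: ...


class SyntheticSource:
    """Synthetic mode: a flat floor at 3000 mm with one raised blob, so the
    downstream pipeline can run with no camera attached."""
    def __init__(self, shape=(480, 640)):
        self._shape = shape
        self._t = 0.0

    def read(self) -> Optional[DepthFrame]:
        depth = np.full(self._shape, 3000, dtype=np.uint16)
        depth[200:280, 300:360] = 1300  # person-height region above the floor
        self._t += 1 / 30
        return DepthFrame(depth_mm=depth, timestamp=self._t)

    def close(self) -> None:
        pass


class OffSource:
    """Off mode: yields no frames, letting downstream stages idle cleanly."""
    def read(self) -> Optional[DepthFrame]:
        return None

    def close(self) -> None:
        pass
```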
Floor subtraction, morphology, merge dilation, and connected-component extraction isolate person-like bodies above the floor. For each retained component, the system derives occupancy center, relative height, and metric camera-frame position. This is the main production sensing step on Kurt.
Floor Subtraction → Morphology + Merge → Center + Relative Height + Metric Extract
emits: observations[] (occupancy center + relative height + metric centroid)
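The core of this stage can be sketched as floor subtraction followed by connected-component extraction. This is a simplified stand-in: the production morphology and merge-dilation steps are elided, thresholds are invented, and labeling is done with a plain stdlib flood fill rather than an optimized routine.

```python
from collections import deque

import numpy as np


def segment_people(depth_mm, floor_mm, min_height_mm=300, min_area_px=50):
    """Floor-subtract a depth frame and extract person-like components.

    Returns observations, each with an occupancy center (pixel coords),
    relative height above the floor (mm), and pixel area. Thresholds are
    illustrative, not production values.
    """
    # Height above the saved floor reference; pixels closer to the camera
    # than the floor have smaller depth values.
    height = floor_mm.astype(np.int32) - depth_mm.astype(np.int32)
    mask = height > min_height_mm

    labels = np.zeros(mask.shape, dtype=np.int32)
    observations = []
    next_label = 0
    for sy, sx in zip(*np.nonzero(mask)):
        if labels[sy, sx]:
            continue  # already part of an extracted component
        next_label += 1
        labels[sy, sx] = next_label
        queue, pixels = deque([(sy, sx)]), []
        # 4-connected flood fill collects one component.
        while queue:
            y, x = queue.popleft()
            pixels.append((y, x))
            for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                if (0 <= ny < mask.shape[0] and 0 <= nx < mask.shape[1]
                        and mask[ny, nx] and not labels[ny, nx]):
                    labels[ny, nx] = next_label
                    queue.append((ny, nx))
        if len(pixels) < min_area_px:
            continue  # reject noise-sized components
        ys, xs = zip(*pixels)
        observations.append({
            "center_px": (float(np.mean(xs)), float(np.mean(ys))),
            "relative_height_mm": float(max(height[y, x] for y, x in pixels)),
            "area_px": len(pixels),
        })
    return observations
```

Converting each center to a metric camera-frame position would additionally use the camera intrinsics, which are omitted here.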
Associates observations to persistent tracks using metric centroid proximity and tracker state. Maintains person_id across frames and brief occlusions. The CV-internal PersonState is the source of truth. Under the production depth-only path, centroid semantics are WorldCoord(cam_x_m, height_m, cam_y_m), and joints are not the primary control artifact.
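A minimal sketch of the association step under stated assumptions: greedy nearest-pair matching on metric centroid distance with a distance gate. The production tracker's state machine and occlusion handling are simplified away, and the gate value is invented.

```python
import numpy as np


def associate(tracks, observations, max_dist_m=0.75):
    """Match observations to existing tracks by metric-centroid proximity,
    nearest pair first, gated at max_dist_m.

    tracks: {person_id: (x_m, y_m)}; observations: [(x_m, y_m), ...].
    Returns (matches, unmatched_track_ids, unmatched_obs_indices), which a
    tracker would use to update, age out, or spawn tracks.
    """
    pairs = []
    for tid, tpos in tracks.items():
        for oi, opos in enumerate(observations):
            d = float(np.linalg.norm(np.asarray(tpos) - np.asarray(opos)))
            if d <= max_dist_m:
                pairs.append((d, tid, oi))
    pairs.sort()  # consume closest pairs first
    matches, used_t, used_o = [], set(), set()
    for d, tid, oi in pairs:
        if tid in used_t or oi in used_o:
            continue
        matches.append((tid, oi))
        used_t.add(tid)
        used_o.add(oi)
    unmatched_tracks = [t for t in tracks if t not in used_t]
    unmatched_obs = [i for i in range(len(observations)) if i not in used_o]
    return matches, unmatched_tracks, unmatched_obs
```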
Blob path (generic occupancy): PersonState → BlobSignal. Only confirmed tracks emit blobs; lost tracks decay confidence over a short hold window; tentative and deleted tracks emit nothing. blob_id uses format person_track_<person_id>.
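The blob-emission rules above can be sketched directly. The dict field names and the hold-window default are illustrative assumptions, not the production BlobSignal schema; the state names and the person_track_<person_id> id format come from the text.

```python
def blobs_from_person_states(person_states, now, hold_sec=0.5):
    """Derive blob signals from CV-internal PersonStates: confirmed tracks
    emit at full confidence, lost tracks decay linearly to zero over a short
    hold window, and tentative/deleted tracks emit nothing."""
    blobs = []
    for ps in person_states:
        if ps["state"] == "confirmed":
            conf = 1.0
        elif ps["state"] == "lost":
            age = now - ps["lost_at"]
            if age >= hold_sec:
                continue  # hold window expired; blob disappears
            conf = 1.0 - age / hold_sec
        else:  # tentative / deleted emit nothing
            continue
        blobs.append({
            "blob_id": f"person_track_{ps['person_id']}",
            "position": ps["centroid"],
            "confidence": round(conf, 3),
        })
    return blobs
```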
Color Cube path (expressive control): PersonState → centroid/height transport → colorcube_signal. In the current production path, Color Cube uses tracked planar position plus relative height rather than highest-hand selection. This is a Color Cube-specific derivation, not a generic occupancy rule.
Presence Lifecycle + Color Derivation (Control → Color Interpolation)
Maps incoming signals to presence_id via the Presence Lifecycle Manager. For Color Cube: uses colorcube_signal with depth-only control-space position (planar sweep → H/S, relative height → V). For blob/occupancy: uses blob_id with centroid position. EMA low-pass filter with circular hue interpolation (hsv_alpha) smooths transitions. A color state machine manages the per-presence lifecycle: detect → hold → fade → fallback. Confidence and presence are distinct: confidence is measurement quality; presence is the smoother participation signal.
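A minimal sketch of the two pieces named above: the depth-only control-space mapping (planar sweep → H/S, relative height → V) and an EMA low-pass with circular hue interpolation. Function names, range constants, and the hsv_alpha default are illustrative assumptions.

```python
import math


def derive_hsv(cam_x_m, cam_y_m, height_m, radius_m=3.0, max_height_m=2.2):
    """Depth-only Color Cube mapping: planar sweep angle drives hue, planar
    radius drives saturation, relative height drives value. The normalizing
    constants here are invented for illustration."""
    h = (math.atan2(cam_y_m, cam_x_m) / math.tau) % 1.0
    s = min(1.0, math.hypot(cam_x_m, cam_y_m) / radius_m)
    v = min(1.0, max(0.0, height_m / max_height_m))
    return (h, s, v)


def ema_hsv(prev, new, hsv_alpha=0.2):
    """EMA low-pass with circular hue interpolation: hue blends along the
    shortest arc of the hue circle, so 0.95 -> 0.05 crosses the 0/1 seam
    instead of sweeping the long way around."""
    ph, ps, pv = prev
    nh, ns, nv = new
    dh = ((nh - ph + 0.5) % 1.0) - 0.5   # signed shortest hue delta in [-0.5, 0.5)
    h = (ph + hsv_alpha * dh) % 1.0
    s = ps + hsv_alpha * (ns - ps)
    v = pv + hsv_alpha * (nv - pv)
    return (h, s, v)
```

The circular delta is the important detail: a naive per-channel EMA on hue would swing red → cyan → red when a participant crosses the hue seam.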
get_render_snapshot() produces an atomic snapshot at the frame boundary: mode, pattern, params, pattern_fn. The engine assembles the full params dict by merging cv_blobs, cell_colors, cells, and synthetic_presence with control-plane state from ShowState / RuntimeControlState. Frame time t is engine-owned and monotonic.
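A sketch of the snapshot assembly under stated assumptions: ShowState and the CV-side state are stood in by plain dicts, and the snapshot fields mirror the names in the text. Only the merge shape is the point; the real control-plane objects are richer.

```python
from dataclasses import dataclass
from typing import Any, Callable, Dict


@dataclass
class RenderSnapshot:
    """Immutable per-frame view of everything the render step needs."""
    mode: str
    pattern: str
    params: Dict[str, Any]
    pattern_fn: Callable


def get_render_snapshot(show_state, cv_state, registry):
    """Assemble an atomic frame-boundary snapshot: control-plane params from
    ShowState merged with live CV inputs (cv_blobs, cell_colors, cells,
    synthetic_presence). Both state arguments are illustrative dicts."""
    params = dict(show_state.get("params", {}))  # copy so the snapshot is stable
    params.update({
        "cv_blobs": cv_state.get("cv_blobs", []),
        "cell_colors": cv_state.get("cell_colors", {}),
        "cells": cv_state.get("cells", {}),
        "synthetic_presence": cv_state.get("synthetic_presence", []),
    })
    pattern = show_state["pattern_id"]
    return RenderSnapshot(
        mode=show_state["mode"],
        pattern=pattern,
        params=params,
        pattern_fn=registry[pattern],
    )
```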
Resolves pattern_id to a function via the Pattern Registry. Executes the Pattern Contract: pattern_fn(t, xyz, params, out=frame). The function writes per-LED linear RGB into frame[N, 3] in [0, 1]. t is engine-owned monotonic time. xyz is the LED position array. params carries all live inputs. Patterns are pure functions with no side effects.
Pattern Registry Lookup → pattern_fn() Execute → Frame Write
emits: frame[N, 3] (linear RGB [0, 1])
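A hypothetical pattern that satisfies the Pattern Contract as stated: pure function of engine time t, LED positions xyz, and params, writing linear RGB into the provided out buffer. The pattern itself (radial_pulse) and its params are invented for illustration.

```python
import numpy as np


def radial_pulse(t, xyz, params, out):
    """Example Pattern Contract implementation: a radial sine pulse. Pure
    function of (t, xyz, params); its only effect is writing linear RGB in
    [0, 1] into out[N, 3]."""
    speed = params.get("speed", 1.0)
    r = np.linalg.norm(xyz, axis=1)                  # distance from dome origin
    wave = 0.5 + 0.5 * np.sin(2 * np.pi * (r - speed * t))
    out[:, 0] = wave          # red carries the pulse
    out[:, 1] = 0.2 * wave    # dimmer green
    out[:, 2] = 1.0 - wave    # blue in counter-phase
    np.clip(out, 0.0, 1.0, out=out)
    return out
```

Because patterns are pure and write only into the engine-owned frame buffer, the engine can swap pattern_fn between frames (e.g. during a crossfade) without any teardown.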
Output Transfer Function + Universe Routing (Transport contract)
Applies the output transfer function: TransferConfig.gamma (typically 2.0–2.4) for perceptual correction, TransferConfig.brightness multiplier for global scaling, hard-clip to [0, 1]. MappingConfig.rgb_order permutation matches controller hardware channel ordering. UniversePlan routes LED index ranges to sACN universe numbers and controller IPs.
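The transfer steps above can be sketched in a few lines. The function name is illustrative; the gamma/brightness/clip/permute sequence follows the described contract, with TransferConfig and MappingConfig fields passed as plain arguments.

```python
import numpy as np


def apply_transfer(frame, gamma=2.2, brightness=1.0, rgb_order=(0, 1, 2)):
    """Output transfer sketch: per-channel gamma for perceptual correction,
    global brightness multiplier, hard clip to [0, 1], then channel
    permutation to the controller's wire order (e.g. rgb_order=(1, 0, 2)
    for GRB hardware)."""
    out = np.clip(frame, 0.0, 1.0) ** gamma       # linear -> perceptual
    out = np.clip(out * brightness, 0.0, 1.0)     # global scale + hard clip
    return out[:, list(rgb_order)]                # hardware channel ordering
```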
Validates frame LED count against topology identity (mapping_id, topology_version, csv_sha256). Enforces target FPS via a rate limiter. Per UniversePlan entry: packs 512-slot DMX universe buffers, encodes them as sACN (E1.31) or Art-Net (ArtDmx, OpCode 0x5000), and transmits via socket.sendto() unicast to each controller IP (port 5568 for sACN).
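A sketch of just the routing-and-packing step under stated assumptions: UniversePlan is stood in by a list of (universe, led_start, led_stop, ip) tuples, and the real sACN/E1.31 framing layers (root/framing/DMP headers) are deliberately omitted.

```python
import numpy as np


def pack_universes(frame_u8, plan, channels_per_universe=510):
    """Split an [N, 3] uint8 frame into per-universe DMX payloads.

    Each payload is padded to the full 512 DMX slots; RGB LEDs use 510 of
    those slots (170 LEDs per universe). Returns (ip, universe, payload)
    tuples ready for protocol framing and sendto().
    """
    packets = []
    for universe, start, stop, ip in plan:
        payload = frame_u8[start:stop].reshape(-1).tobytes()
        assert len(payload) <= channels_per_universe, "range exceeds universe"
        payload = payload.ljust(512, b"\x00")  # pad to a full 512-slot universe
        packets.append((ip, universe, payload))
        # Transmission (framing omitted) would look roughly like:
        #   sock.sendto(e131_wrap(universe, payload), (ip, 5568))
    return packets
```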
Direct pattern execution — no cv
No CV, no occupancy logic. Executes the top-level pattern_id from ShowState.
Smoke Test (smoke_test) — diagnostic
Diagnostic mode. Deterministic output for pipeline-alive verification. Safe brightness envelope.
Index Chase Test (index_chase_test) — diagnostic
LED index validation. A light chases through canonical LED order to verify topology mapping end-to-end.
Occupancy Production v1 (occupancy_production_v1) — production · cv-driven
Production visitor behavior (Mode 3). Zero-person: runs ambient_pattern_id. Entry: crossfades over transition_duration_sec to occupancy_pattern_id. Multi-person: deterministic per-cell color assignment bound to cell_id. Authority: docs/mode3_occupancy_production_spec.md
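The entry crossfade in Mode 3 can be sketched as a linear blend between the ambient and occupancy frames. The function name and the 2.0 s default are illustrative, not the production transition_duration_sec value.

```python
import numpy as np


def crossfade(ambient, occupancy, t_since_entry, transition_duration_sec=2.0):
    """Blend from the ambient frame to the occupancy frame over
    transition_duration_sec after a visitor enters; clamps at full
    occupancy weight once the transition completes."""
    w = min(1.0, max(0.0, t_since_entry / transition_duration_sec))
    return (1.0 - w) * ambient + w * occupancy
```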
Cell Field Sandbox v1 (cell_field_sandbox_v1)
Experimental mode (Mode 4). Cell physics + CV, unbounded from production constraints. Promotes to Mode 3 only through explicit validation and a changelog entry.