Color Dome Runtime Stack

Color Dome Runtime Stack v1 (CDRS) is the end-to-end system that turns sensing, control state, pattern logic, and transport into physical dome light output.

  • What flows through it. CV sensing produces tracked people. Tracked people become downstream control signals. Runtime logic maps those signals into color and state decisions. Pattern and render stages turn decisions into frames. Transfer and transport deliver frames to the physical lighting system.
  • Two sensing-driven paths. The CV layer emits occupancy blobs for generic dome behavior (compatible with all existing patterns) and a parallel colorcube_signal for Color Cube expressive control. Both coexist; Color Cube does not replace occupancy behavior.
  • Production sensing path. The current production path is depth-only: floor-referenced depth segmentation, relative height, PersonState, lifecycle, presence, then downstream transport. RGB-D pose and highest-hand logic are no longer the default operator mental model.
  • This page. The flowchart below is the detailed runtime map. This summary is the short orientation layer above it.
Legend: TCD contract term · Implementation detail · Control-plane input · Subsystem boundary · Safeguard / fallback
Data Plane (flowchart summary)
Sensor Acquire → depth_frame + floor_ref → Segmentation + Metric Extract → components[] (center + height) → Tracking + PersonState → cv_blobs + colorcube_signal → Presence + Color → cell_colors: dict[cell_id→rgb] → Frame Snapshot → render_snapshot {pattern_fn, params} → Pattern Execute → frame[N, 3] linear RGB → Transfer + Routing → routed universes[] → Protocol + Transport → sACN E1.31 UDP packets → LEDs (physical light output)

Control Plane: ShowState · mode_id · pattern_id · cells · TransferConfig · UniversePlan
Safeguards: Validation, fallback, and error handling across all stages
Sensor Acquisition CV subsystem

Acquires depth frames from the Orbbec Femto Mega and compares them against a saved floor reference. The production path is depth-only; RGB-assisted pose inference is no longer the default sensing path. The source_interface.py abstraction still decouples hardware access from downstream consumers and also supports synthetic, replay, and off source modes.

Depth Frame + Floor Reference Acquire
emits: depth_frame[H,W] + floor_ref[H,W]
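The acquisition abstraction can be sketched as a small protocol. The names below (DepthSource, SyntheticSource, next_frame) are illustrative assumptions, not the actual source_interface.py API; only the idea of interchangeable hardware / synthetic / replay / off backends comes from the text above.

```python
from typing import Optional, Protocol
import numpy as np

class DepthSource(Protocol):
    # Each backend (hardware, synthetic, replay, off) yields depth frames
    # so downstream consumers never touch camera-specific code.
    def next_frame(self) -> Optional[np.ndarray]: ...

class SyntheticSource:
    """Flat-floor synthetic frames: lets the pipeline run with no camera."""
    def __init__(self, shape=(480, 640), floor_depth_m=3.0):
        self.shape, self.floor = shape, floor_depth_m

    def next_frame(self):
        return np.full(self.shape, self.floor, dtype=np.float32)
```

Swapping in a replay or "off" backend is then a matter of providing another class with the same `next_frame` shape.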
Depth Segmentation + Metric Extraction CV subsystem

Floor subtraction, morphology, merge dilation, and connected-component extraction isolate person-like bodies above the floor. For each retained component, the system derives occupancy center, relative height, and metric camera-frame position. This is the main production sensing step on Kurt.

Floor Subtraction
Morphology + Merge
Center + Relative Height + Metric Extract
emits: observations[] (occupancy center + relative height + metric centroid)
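A minimal sketch of the floor-subtraction step, assuming a single person-like component (the real pipeline adds morphology, merge dilation, and multi-component labeling; the function name and threshold are illustrative):

```python
import numpy as np

def extract_observation(depth_frame, floor_ref, min_height_m=0.3):
    # Per-pixel height above the floor: a person is closer to the camera
    # than the floor reference, so floor_ref - depth is positive there.
    height = floor_ref - depth_frame
    mask = height > min_height_m           # keep person-like pixels only
    if not mask.any():
        return None                        # nothing above the floor
    ys, xs = np.nonzero(mask)
    return {
        "center_px": (float(xs.mean()), float(ys.mean())),  # occupancy center
        "relative_height_m": float(height[mask].max()),     # tallest point
    }
```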
Multi-Person Tracking + PersonState Derivation CV subsystem → CV boundary

Associates observations to persistent tracks using metric centroid proximity and tracker state. Maintains person_id across frames and brief occlusions. The CV-internal PersonState is the source of truth. Under the production depth-only path, centroid semantics are WorldCoord(cam_x_m, height_m, cam_y_m), and joints are not the primary control artifact.

Blob path (generic occupancy): PersonStateBlobSignal. Only confirmed tracks emit blobs; lost tracks decay confidence over a short hold window; tentative and deleted tracks emit nothing. blob_id uses format person_track_<person_id>.

Color Cube path (expressive control): PersonState → centroid/height transport → colorcube_signal. In the current production path, Color Cube uses tracked planar position plus relative height rather than highest-hand selection. This is a Color Cube-specific derivation, not a generic occupancy rule.

Track Association
Kalman Smooth
PersonState Emit
CV Boundary
PersonState
PersonState→BlobSignal Adapt
BlobSignal Emit
PersonState
Centroid/Height Control Map
colorcube_signal Emit
emits: cv_blobs: list[BlobSignal] (occupancy) + colorcube_signal (Color Cube depth-only control)
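The PersonState→BlobSignal rule can be sketched as follows. The dict field names and the decay constant are assumptions; only the confirmed-only emission, the short confidence-decay hold for lost tracks, and the blob_id format come from the contract above.

```python
def adapt_blobs(person_states, hold_decay=0.8, drop_below=0.05):
    # Confirmed tracks emit blobs; lost tracks decay confidence over a
    # short hold window; tentative and deleted tracks emit nothing.
    blobs = []
    for ps in person_states:
        if ps["status"] == "confirmed":
            blobs.append({"blob_id": f"person_track_{ps['person_id']}",
                          "confidence": ps["confidence"]})
        elif ps["status"] == "lost":
            c = ps["confidence"] * hold_decay
            if c > drop_below:
                blobs.append({"blob_id": f"person_track_{ps['person_id']}",
                              "confidence": c})
    return blobs
```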
Presence Lifecycle + Color Derivation Control → Color Interpolation

Maps incoming signals to presence_id via the Presence Lifecycle Manager. For Color Cube: uses colorcube_signal with depth-only control-space position (planar sweep → H/S, relative height → V). For blob/occupancy: uses blob_id with centroid position. An EMA low-pass filter with circular hue interpolation (hsv_alpha) smooths transitions. A color state machine manages the per-presence lifecycle: detect → hold → fade → fallback. Confidence and presence are distinct: confidence is measurement quality; presence is the smoothed participation signal.

Signal→Presence Match
Position→HSV Sphere Map
HSV EMA Filter
Color State Machine
cell_colors Emit
emits: cell_colors: dict[cell_id → [r,g,b]] (linear RGB)
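The circular hue interpolation can be sketched like this; the helper name is illustrative and hsv_alpha is assumed to act as a standard EMA coefficient:

```python
def ema_hue(prev_h, new_h, alpha):
    # Blend along the shortest arc of the hue circle, so a step from
    # 0.95 to 0.05 moves through 1.0/0.0 rather than back through 0.5.
    delta = ((new_h - prev_h + 0.5) % 1.0) - 0.5   # signed shortest delta
    return (prev_h + alpha * delta) % 1.0
```

S and V blend with an ordinary linear EMA; only hue needs the wrap-around.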
Frame-Boundary State Assembly Engine runtime

get_render_snapshot() produces an atomic snapshot at the frame boundary: mode, pattern, params, pattern_fn. The engine assembles the full params dict by merging cv_blobs, cell_colors, cells, and synthetic_presence with control-plane state from ShowState / RuntimeControlState. Frame time t is engine-owned and monotonic.

get_render_snapshot()
Params Dict Assembly
consumes: cv_blobs, cell_colors, cells + synthetic_presence (control plane), ShowState (control plane)
emits: render_snapshot {mode_id, pattern_fn, t, params}
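A sketch of the frame-boundary assembly. The dict shapes here are assumptions for illustration; the real engine works with ShowState / RuntimeControlState objects rather than plain dicts.

```python
def get_render_snapshot(show_state, cv_blobs, cell_colors, cells,
                        synthetic_presence, t):
    # Merge control-plane config with live data-plane inputs into one
    # atomic, per-frame params dict; nothing mutates it mid-frame.
    params = dict(show_state.get("mode_config", {}))
    params.update(cv_blobs=cv_blobs, cell_colors=cell_colors,
                  cells=cells, synthetic_presence=synthetic_presence)
    return {"mode_id": show_state["mode_id"],
            "pattern_fn": show_state["pattern_fn"],
            "t": t,            # engine-owned monotonic frame time
            "params": params}
```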
Pattern Execution Pattern contract

Resolves pattern_id to a function via the Pattern Registry. Executes the Pattern Contract: pattern_fn(t, xyz, params, out=frame). The function writes per-LED linear RGB into frame[N, 3] in [0, 1]. t is engine-owned monotonic time. xyz is the LED position array. params carries all live inputs. Patterns are pure functions with no side effects.

Pattern Registry Lookup
pattern_fn() Execute
Frame Write
emits: frame[N, 3] (linear RGB [0,1])
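A toy pattern satisfying the contract. The signature pattern_fn(t, xyz, params, out=frame) is from the contract above; the gradient itself is an illustrative stand-in for a real pattern.

```python
import numpy as np

def toy_gradient(t, xyz, params, out):
    # Pure function: reads t/xyz/params, writes linear RGB in [0, 1]
    # into the caller-owned frame[N, 3] buffer. No side effects.
    brightness = params.get("brightness", 1.0)
    z = xyz[:, 2]
    g = (z - z.min()) / max(float(np.ptp(z)), 1e-9)  # height → [0, 1]
    out[:] = brightness * g[:, None]                 # gray vertical gradient
```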
Output Transfer Function + Universe Routing Transport contract

Applies the output transfer function: TransferConfig.gamma (typically 2.0–2.4) for perceptual correction, TransferConfig.brightness multiplier for global scaling, hard-clip to [0, 1]. MappingConfig.rgb_order permutation matches controller hardware channel ordering. UniversePlan routes LED index ranges to sACN universe numbers and controller IPs.

Gamma Correction
Brightness Scale
Clamp [0, 1]
RGB Channel Reorder
Universe Routing
emits: routed_universes: list[UniversePayload]
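The transfer chain above can be sketched in a few lines (the config values are illustrative defaults; the real stage reads them from TransferConfig and MappingConfig):

```python
import numpy as np

def apply_transfer(frame, gamma=2.2, brightness=0.8, rgb_order="RGB"):
    # Gamma for perceptual correction, then global brightness, then hard clip.
    out = np.clip(frame, 0.0, 1.0) ** gamma * brightness
    out = np.clip(out, 0.0, 1.0)
    # Permute channels to match controller wiring (MappingConfig.rgb_order).
    perm = ["RGB".index(c) for c in rgb_order]
    return out[:, perm]
```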
Protocol Encoding + Network Transport Transport contract

Validates frame LED count against topology identity (mapping_id, topology_version, csv_sha256). Enforces target FPS via a rate limiter. Per UniversePlan: packs 512-byte DMX universe buffers, encodes to sACN (E1.31) or Art-Net (OpCode 0x5000), and transmits via socket.sendto() unicast to each controller IP on port 5568.

Frame Validate
FPS Rate Limit
DMX Universe Pack
sACN/Art-Net Encode
UDP sendto()
LED Controllers
emits: sACN E1.31 packets → UDP → controller_ip:5568 → physical LEDs
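The universe-packing step can be sketched as below. 170 LEDs per universe is the common sACN convention (170 × 3 = 510 of the 512 DMX slots) and is assumed here rather than taken from UniversePlan:

```python
def pack_universes(frame_bytes: bytes, leds_per_universe: int = 170):
    # Split the flat RGB byte stream into per-universe DMX payloads;
    # each payload holds at most 510 channel bytes.
    chans = leds_per_universe * 3
    return [frame_bytes[i:i + chans]
            for i in range(0, len(frame_bytes), chans)]
```

Each payload would then be wrapped in an E1.31 packet and sent with socket.sendto() to the controller IP that UniversePlan assigns to that universe.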
Runtime Control State — what the dome is doing and how it’s configured right now
ShowState
The authoritative snapshot of what the dome should be doing right now: mode, pattern, parameters, and configuration.
RuntimeControlState
Thread-safe state machine that validates and applies ShowState changes at frame boundaries.
mode_id
Which runtime behavior is active (e.g., demo, smoke_test, occupancy_production_v1).
pattern_id
Which render function the engine should execute this frame.
mode_config
Per-mode parameter overrides: transition durations, ambient pattern, occupancy thresholds.
Fixed Timestep t
Engine-owned monotonic frame clock; the single source of time truth for the runtime.
Simulation + Sensing Inputs — data from physics simulation and test harnesses that feed into frame assembly
synthetic_presence
Fake occupancy signals injected for testing; lets the pipeline run without a real person in the dome.
cells (Cell Physics)
Per-frame cell geometry from the physics simulation: centers, radii, lifecycle state, age.
Output Configuration — how frames are transformed, routed, and delivered to physical hardware
TransferConfig
How to convert linear math colors into what LEDs actually need: gamma curve, brightness ceiling, clamping.
MappingConfig
Which RGB channel goes to which wire; maps the engine’s color order to what the physical controllers expect.
UniversePlan
Which LEDs go into which DMX universe, and which controller IP receives each one.
topology_version
Identity stamp for the LED layout; if this doesn’t match, the frame gets rejected before it hits the wire.
ShowState, RuntimeControlState, mode_config → Frame Snapshot
mode_id → Frame Snapshot (render branch select)
pattern_id → Pattern Execution (registry lookup)
Fixed Timestep t → Frame Snapshot (engine-owned frame time)
synthetic_presence, cells → Frame Snapshot (params dict assembly)
TransferConfig, MappingConfig → Transfer + Routing
UniversePlan, topology_version → Transfer + Routing, Protocol + Transport

Runtime Modes (mode_id)

Demo demo
Direct pattern execution. No CV, no occupancy logic. Executes the top-level pattern_id from ShowState.
no cv
Smoke Test smoke_test
Diagnostic mode. Deterministic output for pipeline-alive verification. Safe brightness envelope.
diagnostic
Index Chase Test index_chase_test
LED index validation. A light chases through canonical LED order to verify topology mapping end-to-end.
diagnostic
Occupancy Production v1 occupancy_production_v1
Production visitor behavior (Mode 3). Zero-person: runs ambient_pattern_id. Entry: crossfades over transition_duration_sec to occupancy_pattern_id. Multi-person: deterministic per-cell color assignment bound to cell_id. Authority: docs/mode3_occupancy_production_spec.md
production · cv-driven
Cell Field Sandbox v1 cell_field_sandbox_v1
Experimental mode (Mode 4). Cell physics + CV, unbounded from production constraints. Promotes to Mode 3 via explicit validation and changelog entry only.
experimental · cv-driven

Pattern Registry (pattern_id)

Demo Gradient demo_gradient
Vertical gradient with radial shimmer. Params: seed, brightness, speed, blend, palette.
demo
Demo Wave demo_wave
Azimuthal wave bands with height modulation. Params: seed, brightness, speed, blend, palette.
demo
Ambient Baseline v1 ambient_baseline_v1
Mode 3 zero-occupancy baseline. Slow dome-wide motion. Default brightness 0.55, speed 0.35, blend 0.7.
mode 3 · ambient
Occupancy Response v1 occupancy_response_v1
Mode 3 occupancy pattern. Cell-aware with mitosis pulse behavior, per-cell color from cell_colors. Params: cell_glow_radius, mitosis_pulse_strength.
mode 3 · production
Occupancy Voronoi v1 occupancy_voronoi_v1
Voronoi partition with person-derived cell colors. Params: boundary_softness, cell_edge_gradient_strength, boundary_quiver_amplitude, boundary_quiver_speed. Grayscale ambient fallback at zero occupancy.
mode 3 · advanced
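A registry of this shape is often implemented with a registration decorator. This is a sketch under that assumption, not the actual registry code; the demo-pattern fallback for unknown ids mirrors the safeguard behavior, and the placeholder body is illustrative.

```python
_REGISTRY = {}

def register(pattern_id):
    # Decorator-style registration: patterns self-register by id on import.
    def deco(fn):
        _REGISTRY[pattern_id] = fn
        return fn
    return deco

@register("demo_gradient")
def demo_gradient(t, xyz, params, out):
    out[:] = 0.0  # placeholder body for illustration

def resolve(pattern_id):
    # Unknown pattern_id falls back to a demo pattern instead of raising.
    return _REGISTRY.get(pattern_id, _REGISTRY["demo_gradient"])
```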
Sensor Acquisition Camera lock fail → _UnavailableAdapter → emit empty cv_blobs[]
Segmentation Floor subtraction yields no valid person-like components → emit empty cv_blobs[] and no expressive control signal
Blob Adaptation BlobSignal schema validation failure → ValueError → log cv_source_error, use last valid frame
Presence Lifecycle No blob match → skip to fallback_rgb (black). Confidence < 0.6 → hold (hold_ms=200) → fade (fallback_fade_ms=400) → fallback_rgb
Pattern Execution Pattern not found in registry → demo_pattern() fallback. Pattern exception → safe zero frame (all black)
Network Transport LED count / mapping_id mismatch → fallback state. DMX channel boundary exceeded → fallback state
Network Transport OSError on sendto() → log transport_fallback_enter → apply fallback policy: "blackout" (all-zeros) or "hold_last" (replay)
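The transport fallback policy choice can be sketched as a single function (the signature is illustrative; only the "blackout" and "hold_last" semantics come from the table above):

```python
def fallback_frame(policy, last_good, n_leds):
    # "blackout": all-zeros frame; "hold_last": replay the last valid
    # frame, falling back to blackout if none exists yet.
    if policy == "hold_last" and last_good is not None:
        return last_good
    return [(0, 0, 0)] * n_leds
```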

Single-Person End-to-End Trace

One person enters the dome. Follow the data from sensor acquisition to LED output.

1. Sensor: Orbbec acquires a live depth frame matching the saved floor-reference mode and resolution.
2. Subtract: Floor subtraction isolates one person-like vertical component above the floor.
3. Extract: Occupancy center + relative height derived: center=(0.32m, 0.15m), relative_height=1.12m.
4. PersonObservation: Emits the metric centroid in camera space and feeds the tracker.
5. Tracker: Track association: person_id="p_001", status=confirmed, hit_streak=12, age=47 frames.
6. PersonState: Emits centroid semantics: WorldCoord(cam_x_m, height_m, cam_y_m).
7a. BlobSignal: Blob adapter: confirmed → blob_id="person_track_p_001", confidence=0.88, presence=0.9 (occupancy path).
7b. Color Cube: Depth-only centroid/height transport → control-space position emitted via colorcube_signal.
8. HSV Map: Color Cube control-space position → HSV: H/S from planar sweep, V from relative height (position.z).
9. EMA Filter: HSV low-pass: hsv_alpha=0.15, circular hue blend + linear S/V.
10. Color State: Confidence 0.88 ≥ 0.6 → active → HSV→RGB = [0.098, 0.065, 0.064].
11. Snapshot: render_snapshot: mode_id=occupancy_production_v1, params assembled with cv_blobs + cell_colors.
12. Pattern: pattern_fn(t, xyz, params) reads cell_colors → writes frame[N, 3].
13. Transfer: Output transfer: gamma=2.2 gives 0.098^2.2 ≈ 0.006; brightness=0.8 gives 0.006 × 0.8 ≈ 0.005.
14. Transport: sACN E1.31 encode → UDP sendto(controller_ip:5568) → LED on.
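The transfer arithmetic in step 13 can be checked directly, using the gamma and brightness values from the trace:

```python
v = 0.098 ** 2.2        # gamma correction of the red channel
v *= 0.8                # global brightness scale
print(round(v, 4))      # → 0.0048
```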