How to build a consistent color pipeline for LED volumes in UE5 - from OCIO config to camera verification on the day.
A chart pass through the full pipeline - wall, camera, and dailies - is the fastest way to confirm alignment.
An LED volume is a display driving a light source - not a grading monitor. The image on the wall is real-time pixels captured through a camera with its own response curve, IDT, and show pipeline. Until that full chain is verified, wall-to-grade mismatches are expected. This note covers how to set up OCIO correctly in UE5 and what to check when the wall doesn't match.
Run these during prep, tech scout, or day-zero before any creative work starts.
1) Agree the ACES version and OCIO config with your DIT and post pipeline before anything else.
2) Set the wall target display space - P3-D65 is standard for most modern LED stages; confirm with the vendor.
3) Build a clean chain: camera IDT -> ACEScg working space -> look transforms -> display ODT.
4) Validate the ODT on a calibrated reference monitor before the wall is involved.
5) Load the same OCIO config into UE5's nDisplay configuration and version-lock it - wall, dailies, and DI need to reference the same file.
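The chain in steps 1-5 maps onto an OCIO v2 config roughly like the sketch below. This is illustrative only - the role assignments, display names, and view names are assumptions for this example, and the colorspace definitions are elided. In production, start from the official ASWF ACES OCIO configs rather than hand-writing one.

```yaml
ocio_profile_version: 2

# Illustrative sketch - not a drop-in ACES config.
search_path: luts

roles:
  scene_linear: ACEScg      # working space for the whole chain (step 3)
  color_timing: ACEScct     # space for look/grade transforms

displays:
  # Wall and reference monitor share the same P3-D65 output transform,
  # so validating on the monitor (step 4) validates the wall's ODT too.
  LED Wall:
    - !<View> {name: P3-D65 Output, colorspace: P3-D65 - Display}
  Reference Monitor:
    - !<View> {name: P3-D65 Output, colorspace: P3-D65 - Display}

# colorspaces: (definitions elided - taken from the show's ACES config)
```

Because UE5, the dailies tool, and the DI suite all read this same file, a look change lands everywhere at once - which is exactly why step 5 insists on version-locking it.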
When the wall doesn't match, it's almost always one of three things: an unverified camera pipeline, a wall calibrated to the wrong display target, or OCIO configs that differ between tools.
Early in prep you're comparing a live, partial pipeline against a final-grade reference. The mismatch is expected - it means the chain isn't verified yet, not that something is broken. When the camera pipeline is locked, the wall is calibrated to the correct display target, and the OCIO config is consistent across every tool in the room, the wall will match. That alignment is the goal of the verification process, not a happy accident.
Shoot a chart, skin tone, and neutral gray in the volume and route the camera feed through the exact OCIO pipeline used for dailies. If it matches within tolerance, lock the chain and document every setting. If it doesn't, change one variable at a time and log each change - guessing in parallel wastes time and creates new unknowns.
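The "within tolerance" call above can be made objective with a per-patch delta-E check. A minimal sketch, assuming chart patches have already been sampled to CIELAB on both sides of the chain; the patch names, values, and the 2.0 threshold are illustrative assumptions, and CIE76 is used for simplicity (a show may standardize on CIEDE2000 instead).

```python
import math

def delta_e_ab(lab1, lab2):
    """CIE76 delta-E*ab between two (L*, a*, b*) triples."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(lab1, lab2)))

def verify_patches(reference, captured, tolerance=2.0):
    """Compare captured patch values (after the dailies OCIO transform)
    against reference values. Returns (passed, {patch_name: delta_e})."""
    report = {name: delta_e_ab(reference[name], captured[name])
              for name in reference}
    return all(de <= tolerance for de in report.values()), report
```

Logging the full report for every change - not just the pass/fail - is what makes the one-variable-at-a-time approach pay off: each log entry ties one setting change to one measured shift.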