RENDERING THE SPACE
One Environment, Ten Perspectives
The chamber’s cubic form and compressed vertical view created a hard constraint for asset production: every frame had to maintain correct perspective while filling all ten vertical screens without distortion. Using a custom camera blueprint in Unreal Engine, I generated panoramic renders. From each frame, I cropped a horizontal band to match the display height, then distributed it in TouchDesigner across ten projectors, ensuring seamless 360° coverage.
CineChamber calibration at Gray Area/Grand Theater, San Francisco, Screenshots · Maintaining a seamless image requires regularly realigning ten projectors · Photograph Gary Boodhoo
Projection Viewport Map, Production Notes · How I segmented a 360° panorama into ten viewports for CineChamber’s projection system · 3D Design Gary Boodhoo
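The segmentation step can be sketched as simple arithmetic: divide the panorama into equal-width vertical slices, one per projector. This is an illustrative sketch, not the production tooling; the 7680×1200 panorama size and the `viewport_crops` helper are assumptions for demonstration.

```python
# Sketch of the viewport segmentation step: split one panoramic frame
# into ten equal-width vertical slices, one per projector.
# The 7680x1200 panorama resolution is an illustrative assumption,
# not the production value.

def viewport_crops(pano_width, pano_height, num_viewports=10):
    """Return (left, top, right, bottom) crop boxes covering the panorama."""
    slice_width = pano_width // num_viewports
    boxes = []
    for i in range(num_viewports):
        left = i * slice_width
        # The final slice absorbs any rounding remainder so the ten
        # crops tile the full width with no gap at the seam.
        right = pano_width if i == num_viewports - 1 else left + slice_width
        boxes.append((left, 0, right, pano_height))
    return boxes

crops = viewport_crops(7680, 1200)
```

Each box can then be fed to any image-cropping step (After Effects in the actual pipeline); because adjacent boxes share an edge, the reassembled ring of projections stays continuous.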
AESTHETICS
I thought spectacle was enough. It wasn’t.
Early content tests with video game–style environments looked good but lacked heart. While the scenes were visually rich, they didn’t evoke a personal connection. Even so, these initial attempts validated the technical direction. The rendering pipeline worked as intended, which allowed a shift toward a more grounded approach.
Mechanical Environment, Production Test · I used a third-party scene from the Unreal Marketplace to examine how atmosphere and motion translated to the venue · Technical Direction & Design Gary Boodhoo
LOW-STAKES LIDAR
Low-res scans, high-res presence
Using my iPhone’s LiDAR, I began casually scanning everyday objects and familiar spaces. I experimented with walking scans and layered exposures—capturing motion by moving with the subject or letting them shift during the scan. The results were rough but expressive, more like memory than measurement. Imperfections became a feature, turning scan noise into atmospheric texture.
Scanning While Moving, LiDAR Capture · Motion appears as overlapping point distributions in the scan; CloudCompare, an open-source tool for viewing 3D scan data, is shown here navigating the dataset · 3D Design Gary Boodhoo
Multiple Exposures, LiDAR Capture · A mid-scan position shift creates duplicate figures in the point cloud · 3D Design Gary Boodhoo
FILLING THE ROOM
Ten Screens, One Voice
I processed point clouds using Niagara, a visual scripting tool for creating and managing particle effects within Unreal Engine. Each point became a responsive particle, animated by sound, motion, and spatial logic. Niagara had a bit of a learning curve, but also introduced room for happy accidents. This became a way to compose across surfaces and fill the room with movement and rhythm.
Pipeline Overview:
LiDAR Capture (iPhone) → Point Cloud Processing (MeshLab, CloudCompare) → Import to Unreal Engine → Particle Animation (Niagara) → Panoramic Render (Unreal) → Cropping & Video Encoding Per Viewport (After Effects) → Synchronized Video Playback to 10-unit Projection Array (TouchDesigner)
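The point cloud processing step, before the data reaches Unreal, amounts to parsing raw scan points and rescaling them into a predictable range so each point can drive a particle spawn position. The sketch below assumes a plain-text XYZ export (one `x y z` triple per line); the function names and the unit-cube normalization are illustrative choices, not the exact production workflow.

```python
# Sketch: normalize raw LiDAR points into a unit cube so each point can
# drive a particle spawn position. The whitespace-delimited XYZ text
# format is an assumption; actual exports may be PLY or other formats.

def load_xyz(lines):
    """Parse 'x y z' text lines into a list of (x, y, z) floats."""
    points = []
    for line in lines:
        parts = line.split()
        if len(parts) >= 3:
            points.append(tuple(float(v) for v in parts[:3]))
    return points

def normalize(points):
    """Uniformly scale points into [0, 1]^3, preserving aspect ratio."""
    mins = [min(p[i] for p in points) for i in range(3)]
    maxs = [max(p[i] for p in points) for i in range(3)]
    # One scale factor for all axes keeps the scan's proportions intact.
    scale = max(maxs[i] - mins[i] for i in range(3)) or 1.0
    return [tuple((p[i] - mins[i]) / scale for i in range(3)) for p in points]
```

Keeping the scan noise rather than filtering it out is what turns measurement error into the atmospheric texture described above.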
Fragmented Living Room, LiDAR Capture · Reassigning video feeds to different projectors allows one side of the cube to mirror the other · Technical Direction & Design Gary Boodhoo
POINT CLOUD CINEMA
Projecting memory and motion into space
XOHOLO demonstrated a virtuality based on projecting space outward, not inward. That shift made it possible to share the experience with others. LiDAR snapshots of familiar things became an unexpectedly rich resource. Point cloud cinema emerged from their rhythmic transformations through projected space.
XOHOLO · 3D Design Gary Boodhoo · Sound Design Yoann Resmond
Outcome