
Process
Contemporary mechanical hallucinations
From the start there was a mismatch: live video runs at 30 frames per second, while DeepDream hallucinations emerge slowly. A completed image might take 20 seconds to fully resolve, so I didn't wait for one to finish. It was more important to watch the dream emerge in real time than to see it perfectly sharp right away.
Multi-scale image processing with DeepDream · DeepDream operates in successive stages, each yielding a sharper, higher-resolution output. Showing intermediate steps kept the system visually responsive.
A game of hide-and-seek · Stillness triggers dreaming; motion returns to live video. Refreshing the screen with partial dream updates kept the interaction feeling responsive.
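To show the staged, multi-scale rendering concretely, here is a minimal sketch of octave-style processing that yields every intermediate pass for display. Everything in it is illustrative rather than the installation's actual code: dream_step is a stand-in (an unsharp mask, so the sketch runs end to end) for the CNN gradient-ascent step a real DeepDream implementation performs, and the function and parameter names are my own.

```python
import cv2
import numpy as np

def dream_step(img):
    # Stand-in for one gradient-ascent step on a CNN layer.
    # An unsharp mask keeps this sketch runnable without a model.
    blur = cv2.GaussianBlur(img, (0, 0), 3)
    return np.clip(cv2.addWeighted(img, 1.5, blur, -0.5, 0), 0, 255)

def render_dream(frame, n_octaves=4, octave_scale=1.4, iters_per_octave=10):
    """Dream over an octave pyramid, yielding every intermediate
    image so the screen can refresh long before the render finishes."""
    base = frame.astype(np.float32)
    # Build the pyramid from full resolution down to the smallest octave.
    octaves = [base]
    for _ in range(n_octaves - 1):
        h, w = octaves[-1].shape[:2]
        octaves.append(cv2.resize(
            octaves[-1], (int(w / octave_scale), int(h / octave_scale))))
    # Dream coarse-to-fine, carrying the dreamed detail up each octave.
    detail = np.zeros_like(octaves[-1])
    for octave_base in reversed(octaves):
        h, w = octave_base.shape[:2]
        detail = cv2.resize(detail, (w, h))   # upscale previous dream detail
        img = octave_base + detail            # reinject the original image
        for _ in range(iters_per_octave):
            img = dream_step(img)
            yield np.clip(img, 0, 255).astype(np.uint8)  # partial update
        detail = img - octave_base            # keep only what the dream added

# Usage: dream on one captured frame, showing partial results as they emerge.
cap = cv2.VideoCapture(0)
ok, frame = cap.read()
if ok:
    for partial in render_dream(frame):
        cv2.imshow('dream', partial)
        if cv2.waitKey(1) == 27:  # Esc aborts mid-dream
            break
cap.release()
```

Because render_dream is a generator, the display loop can show each partial pass the moment it exists instead of blocking for the full 20-second render.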
Hardware
The physical pipeline
For every exhibition, my hardware rig was disassembled, transported, and reassembled onsite, then broken down, packed, and stored again. It struck me how this physical pipeline mirrored the image generation pipeline. Over time, my logistics became rigorously simple, with labeled components tracked at every stage, including:
Small form factor Core i7 workstation, 16 GB RAM
NVIDIA Titan X (Pascal) GPU
30" 4K monitor
Logitech C920 webcam (2)
65" HDTV
Mobile flat panel AV cart
Mobile computer cart
TV backlighting
USB control keypad
C-stands and mounts
Venue-specific lighting
Cabling
Power adaptors and strips
DeepDream Vision Quest installation rig
Software
Hide and seek
As I mentioned, the system dreamed only when the video feed was nearly still. I counted pixel differences between successive frames; when that count fell below a threshold, the system began hallucinating. When motion picked up again, the live feed smoothly faded back in. The experience was a game of hide-and-seek, and transitioning smoothly between the live and dreaming states became the key driver of my development efforts.
Motion detection monitor screen · Observing motion by counting pixel changes between frames. Once the scene settled under a threshold, the system slipped into dreaming.
Time-series visualization of pixel changes · Plotting pixel changes (purple) to separate motion from stillness (green). A rolling average (orange) helped the system adapt to distance, background, and light.
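As a rough sketch of how that loop might look, here is frame differencing with an adaptive rolling baseline and a crossfade between the live and dreaming states. The names and constants are illustrative, cv2.stylization stands in for the actual DeepDream output, and the installation's adaptive logic was more involved than this.

```python
import cv2
import numpy as np
from collections import deque

MOTION_HISTORY = deque(maxlen=120)  # rolling window of recent motion samples
STILLNESS_RATIO = 0.5               # "still" = below half the moving baseline

def motion_level(prev_gray, gray):
    """Count pixels that changed between consecutive grayscale frames."""
    diff = cv2.absdiff(prev_gray, gray)
    _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    return cv2.countNonZero(mask)

cap = cv2.VideoCapture(0)
ok, frame = cap.read()
prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
blend = 0.0  # 0.0 = pure live feed, 1.0 = pure dream

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    level = motion_level(prev_gray, gray)
    prev_gray = gray

    # The rolling average adapts the threshold to distance, background,
    # and light. Only motion samples feed the baseline, so a long
    # stillness can't drag the threshold down with it.
    baseline = np.mean(MOTION_HISTORY) if MOTION_HISTORY else level
    still = level < baseline * STILLNESS_RATIO
    if not still:
        MOTION_HISTORY.append(level)

    # Ease into dreaming when still; snap back faster on motion.
    blend = min(blend + 0.02, 1.0) if still else max(blend - 0.05, 0.0)

    if blend > 0.0:
        dream = cv2.stylization(frame)  # stand-in for the DeepDream frame
        out = cv2.addWeighted(frame, 1.0 - blend, dream, blend, 0)
    else:
        out = frame
    cv2.imshow('vision quest', out)
    if cv2.waitKey(1) == 27:  # Esc exits
        break

cap.release()
cv2.destroyAllWindows()
```

The asymmetric blend rates match the interaction described above: the dream fades in gradually, but a wave of the hand brings the live feed back almost immediately.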
Exhibition history
2019
Beyond Human Nightlife Event, California Academy of Sciences, San Francisco, CA, August 23
2018
The 5th Last Festival, Stanford Linear Accelerator, Stanford, CA, April 26–29
If So, What? (ISW) Festival, Palace of Fine Arts, San Francisco, CA, April 26–29
Lost Chord Awards for the Ritual, Sacred and Folk Arts, Qal’bu Maryam Women’s Mosque, Berkeley, CA, June 2
CODAME Art + Tech Festival, The Midway, San Francisco, CA, June 6–7
Noise Floor 2, Red Victorian, San Francisco, CA, August 25
2017
The 4th Last Festival, Hammer Theater, San Jose, CA, April 7–8
living.room, Soma Grand, San Francisco, CA, October 7
Noise Floor 1, The East Cut, San Francisco, CA, December 9
2016
CODAME Art + Tech Festival, Hotel Zetta, San Francisco, CA, November 11
Outcome