Interacting with DeepDream, generative AI demo at GDC
This case study examines my artwork DeepDream Vision Quest, first presented as a poster session at the Game Developers Conference (GDC) in 2016. I built an art installation that dreamed about what it saw.
Context and Curiosity
Where the idea came from
Instead of identifying objects, DeepDream amplifies patterns it already knows, feeding each result back into itself until faint hints swell into full-blown hallucinations. Google released the project publicly in June 2015, and the code went viral almost immediately. For a while, the internet was full of these hallucinatory images, but I wasn’t interested in the memes. I was fascinated by how images grew out of noise, like a Polaroid photo of hyperspace.
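That feedback loop is, at its core, gradient ascent on a layer's activation energy: nudge the image so the network's filters respond more strongly, then repeat. Here is a minimal sketch of the idea, with a single hand-written edge filter standing in for a trained network layer (an assumption for illustration; the actual project ran a full pretrained CNN):

```python
import numpy as np

def correlate_wrap(img, kernel):
    """2-D cross-correlation with wrap-around borders, in pure NumPy."""
    r = kernel.shape[0] // 2
    out = np.zeros_like(img)
    for i in range(kernel.shape[0]):
        for j in range(kernel.shape[1]):
            out += kernel[i, j] * np.roll(img, (r - i, r - j), axis=(0, 1))
    return out

def dream_step(img, kernel, lr=0.1):
    """One DeepDream-style step: gradient ascent on 0.5 * ||activation||^2,
    pushing the image toward whatever the filter already detects."""
    act = correlate_wrap(img, kernel)                # the "layer" response
    grad = correlate_wrap(act, kernel[::-1, ::-1])   # d(energy)/d(img)
    img = img + lr * grad / (np.abs(grad).max() + 1e-8)
    return np.clip(img, 0.0, 1.0)

def dream(img, kernel, steps=20):
    """Feed each result back into itself -- the loop that turns
    faint hints into full-blown hallucinations."""
    for _ in range(steps):
        img = dream_step(img, kernel)
    return img

edge = np.array([[1.,  0., -1.],
                 [2.,  0., -2.],
                 [1.,  0., -1.]])  # toy vertical-edge "pattern detector"

rng = np.random.default_rng(0)
noise = rng.random((64, 64)) * 0.1 + 0.45  # near-gray noise
energy = lambda im: float((correlate_wrap(im, edge) ** 2).sum())
before, after = energy(noise), energy(dream(noise, edge))
# the filter's energy grows: the image now "contains" the pattern
# it was looking for, grown out of noise
```

With a real network the same loop runs per layer via backpropagation, but the mechanism is identical: the image drifts toward the patterns the model already knows.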

Journey
Bringing it to GDC
The challenge was turning DeepDream from a slow generative process into an interactive magic mirror people could step in front of and instantly grasp.
Bringing this creative AI project to the GDC floor took months of coding and finding the right way to explain it to my colleagues.
Experiments
From Still Frame to Dreamscape
I deepened my skills in Python, the go-to language for sharing machine learning experiments on GitHub, and began with still images, tweaking parameters to see how the algorithm responded to different inputs. Soon I was making movies by batch-processing footage frame by frame, but something was missing. Was this really just a weird video effect?
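A frame-by-frame batch pipeline can be sketched in a few lines. Blending a fraction of the previous dreamed frame into the next input keeps hallucinations from flickering away between frames; the `blend` value and the stand-in `dream_fn` here are hypothetical, not the production code:

```python
import numpy as np

def dream_frames(frames, dream_fn, blend=0.4):
    """Process a movie frame by frame, mixing each dreamed result
    into the next input so hallucinations persist across frames."""
    out, prev = [], None
    for frame in frames:
        x = frame if prev is None else (1.0 - blend) * frame + blend * prev
        prev = dream_fn(x)
        out.append(prev)
    return out

# stand-in "dream": a mild contrast boost, just to exercise the pipeline
boost = lambda im: np.clip(0.5 + 1.2 * (im - 0.5), 0.0, 1.0)
frames = [np.full((4, 4), v) for v in (0.2, 0.5, 0.8)]
dreamed = dream_frames(frames, boost)
```

Swapping `boost` for an actual DeepDream pass turns this into the kind of offline renderer that produced the Shining sequences below.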
Overlook, a Haunted House · DeepDream visualization of image sequences from Stanley Kubrick’s The Shining (1980) · Art Gary Boodhoo
The breakthrough came when I built a pipeline that fed webcam images into the algorithm in real time, letting users step in, hold a pose, and watch the dreamscape reshape around their behavior. Now it was personal: moving the camera, or moving in front of it, made the viewer a participant in the dream.
System Flow Diagram · Motion-triggered rendering pipeline for live dreaming
Live Dreaming · In this early experiment the hallucinations are triggered by a moving light source
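The motion-triggered gating in that pipeline comes down to a simple frame-difference test: if the scene is still, the dream keeps feeding back into itself; if something moves, the input snaps back to the live camera. A sketch of that decision (the threshold value is an assumption; in the installation it would be tuned by eye):

```python
import numpy as np

MOTION_THRESHOLD = 0.02  # assumed value, tuned by eye in practice

def next_dream_input(prev_cam, cam, dreamed):
    """If the scene is still, feed the dream back into itself so the
    hallucination deepens; if the viewer moves, reset to the live frame."""
    moved = float(np.mean(np.abs(cam - prev_cam))) > MOTION_THRESHOLD
    return cam if moved else dreamed

still = np.full((8, 8), 0.5)        # two identical camera frames
shaken = still + 0.1                # viewer stepped, or camera moved
dream_state = np.full((8, 8), 0.9)  # the current hallucination

# holding still lets the dream continue; motion makes it vanish
```

In the live system this check runs once per render tick, with the camera frames coming from an OpenCV capture loop.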
Audience Reactions
Faces, gestures, and unexpected connections
After my talks, people played with the installation, holding dramatic poses, interacting with strangers, and testing the limits. The “rules” of the game emerged quickly. Stay still and the dream would unfold; move, and it vanished.
As a UX designer, I was fascinated by how easily participants ascribed intention to the machine. That’s when I realized it wasn’t just trippy hallucinations for a crowd. It might just be a magnet for self-reflection.
GDC 2016 Play Session · Selected participant portraits
Outcome





