Interacting with machine learning in public, generative art at GDC
This case study examines my artwork DeepDream Vision Quest, first presented at the Game Developers Conference in 2016 as a poster session. I built an art installation that dreamed about what it saw.

Years before generative AI was mainstream, I presented early ideas for making interactive art with deep learning, using the conference as a platform for introducing game developers to the technology.
Photograph TK Rex

Contents
Context · Where the idea came from
Journey · Bringing it to GDC
Breakthrough · From still frames to dreamscapes
Audience · Unexpected connections
Outcome
Context
Where the idea came from
Instead of identifying objects, DeepDream amplifies patterns the network already knows, feeding each result back into itself until faint hints swell into full-blown hallucinations. The original DeepDream code went viral in mid-2015, after Google released the project publicly in June of that year. For a while the internet was full of these hallucinatory images, but I wasn’t interested in the memes. I was fascinated by how images grew out of noise, like a Polaroid photo of hyperspace.
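The mechanics are simple enough to sketch. The original release was built on Caffe; below is a minimal, illustrative PyTorch equivalent of the feedback loop. The layer choice, step size, and iteration count here are my assumptions, not the original settings.

```python
import torch
from torchvision import models

# Minimal sketch of the DeepDream feedback loop (not the original Caffe code).
# Gradient ascent on the input amplifies whatever a chosen layer already
# responds to; each result is fed back in as the next input.

model = models.googlenet(weights="IMAGENET1K_V1").eval()

activations = {}
model.inception4c.register_forward_hook(          # layer choice is an assumption
    lambda module, inp, out: activations.update(target=out)
)

def dream_step(img, lr=0.02):
    img = img.clone().requires_grad_(True)
    model(img)
    loss = activations["target"].norm()           # "amplify what you already see"
    loss.backward()
    with torch.no_grad():
        img += lr * img.grad / (img.grad.abs().mean() + 1e-8)
    return img.detach()

def deepdream(img, steps=20):
    # img: a normalized (1, 3, H, W) float tensor
    for _ in range(steps):
        img = dream_step(img)                     # feed the result back into itself
    return img
```

The full effect also re-runs this loop across several image scales ("octaves"), which is what lets small hints swell into large structures.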




Dogslug (2015): A glimpse into the machine's subconscious. Before the polished realism of modern diffusion models, early AI art was characterized by this distinct, hallucinogenic style.
Creator Alexander Mordvintsev




Visualizing the Basset Hound class in GoogLeNet (a pioneering 2014 vision model) by optimizing random noise.
Research Audun M. Øygard
Journey
Bringing it to GDC
GDC reframed my experimentation as a deliverable: no onboarding, no facilitation, constant turnover, and zero tolerance for latency or failure. People had to get it in seconds, which shifted the goal from interesting output to legible behavior. The challenge was turning DeepDream from a slow, generative process into an interactive mirror people could step in front of and instantly grasp.
This magic-mirror metaphor guided everything. Mirrors require no instructions, but if the system hesitated, the illusion broke. Reliability was the core UX requirement.

Bringing this creative AI project to the GDC floor took months of coding, but the hard part was finding the right way to explain it to my colleagues.
Photograph TK Rex

Breakthrough
From still frames to dreamscapes
I deepened my skills in Python, the go-to language for sharing machine learning experiments on GitHub, and began with still images, tweaking parameters to see how the algorithm responded to different inputs. Soon I was making movies by batch-processing images frame by frame, but something was missing. Was this really just a weird video effect?
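Those early movies came from an offline pass over extracted frames, roughly like the sketch below. The paths and the run_deepdream wrapper are hypothetical stand-ins for my actual pipeline.

```python
from pathlib import Path
import cv2

# Offline, frame-by-frame pass over a video (illustrative sketch).
# run_deepdream is a hypothetical wrapper that converts a BGR frame to a
# tensor, runs the dream loop sketched earlier, and converts it back.
frames = sorted(Path("frames").glob("*.png"))   # frames extracted beforehand
Path("dreamed").mkdir(exist_ok=True)

for i, path in enumerate(frames):
    img = cv2.imread(str(path))
    out = run_deepdream(img)
    cv2.imwrite(f"dreamed/{i:05d}.png", out)

# Reassemble, e.g.: ffmpeg -framerate 24 -i dreamed/%05d.png movie.mp4
```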
Overlook, a Haunted House · Recombinant media from The Shining (1980). Feature amplification dissolves the footage into a dense, recursive landscape of latent imagery.
Generative Video Gary Boodhoo, Skinjester Studio
The breakthrough came when I built a pipeline that fed webcam images into the algorithm in real time, letting users step in, hold a pose, and watch the dreamscape reshape around their behaviors. Now it was personal. Moving the camera, or moving in front of it, included the viewer as a participant in the dream.

By counting pixel deltas between webcam frames, the pipeline transitions between the raw camera feed and DeepDream processing, allowing users to enter the hallucination simply by holding still.
System Design Gary Boodhoo, Skinjester Studio

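In code, the gate is little more than a smoothed frame difference driving a crossfade. Here is a minimal OpenCV sketch of the idea; the thresholds and easing rates are illustrative, and run_deepdream is again a hypothetical stand-in for the dream loop.

```python
import cv2

# Motion gate: frame-to-frame pixel deltas drive a crossfade between the
# raw camera feed and the DeepDream render. Hold still and the dream fades
# in; move and it snaps back to the mirror. Parameter values are guesses.

cap = cv2.VideoCapture(0)
ok, prev = cap.read()
blend = 0.0                       # 0 = raw camera, 1 = full hallucination

while ok:
    ok, frame = cap.read()
    if not ok:
        break
    delta = cv2.absdiff(frame, prev).mean()   # mean pixel delta between frames
    prev = frame

    if delta < 4.0:               # low motion: ease into the dream
        blend = min(1.0, blend + 0.02)
    else:                         # motion: snap back toward the camera
        blend = max(0.0, blend - 0.15)

    # run_deepdream: hypothetical wrapper returning a same-size BGR frame.
    dreamed = run_deepdream(frame)
    out = cv2.addWeighted(frame, 1.0 - blend, dreamed, blend, 0)

    cv2.imshow("DeepDream Vision Quest", out)
    if cv2.waitKey(1) == 27:      # Esc to exit
        break
```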
Live dreaming with motion as modulation. The moving light generates sufficient pixel difference between frames to drive the transition between the live camera feed and the hallucinated output.
Video Capture DeepDream Vision Quest
Audience
Unexpected connections
After my talks, people played with the installation, holding dramatic poses, interacting with strangers, and testing the limits. The “rules” of the game emerged quickly: stay still and the dream would unfold; move, and it vanished.
As a UX designer, I was fascinated by how easily participants ascribed intention to the machine. That’s when I realized it wasn’t just trippy hallucinations for a crowd. It might just be a magnet for self-reflection.
Outcome
What started as a one-day tech demo at GDC 2016 grew into a four-year touring installation, with DeepDream Vision Quest appearing at festivals, galleries, and museums across the Bay Area.