What the Robot Saw

what-the-robot-saw.com

After a five-year continuous run, What the Robot Saw’s cinemagical algorithms have been revamped for the online cinema of 2025.

Welcome to the contrarian algorithm. In the new world order of robots and talking heads, a reality exists that’s part life, part cinema, part business and part algorithm. It’s a secret cinematic world seen only by robots — the software robots who analyze content that may never be viewed by human eyes.

‘What the Robot Saw’ is a live, continuously generated robot film, curated, analyzed, and edited using contrarian search algorithms, computer vision, and neural-network-based image and sound classification. Reversing the conventional logic of social media algorithms, The Robot catches glimpses of some of the least viewed and least subscribed recent videos on YouTube, featuring first-person narratives by some of the people commercial algorithms conceal.

The Robot’s algorithms analyze, edit, mix, compile and title the film as methodically as a robot might analyze its subjects: pixel by pixel and bit by bit. The film is a fanciful interpretation of the online zeitgeist where self-representation and performed selves meet market-driven classification and curation.




Please turn on the sound. ‘What the Robot Saw’ is intended for fullscreen viewing if you can.
If the stream isn’t live, or to scrub through the live or recent videos, view them on Twitch.
Thrice-daily intermissions last four minutes; the stream resumes on the hour. Archives are linked from the Videos page.

An invisible audience of software robots continually analyzes and curates content on the Internet. Videos by non-“YouTube stars” that algorithms don’t promote to the top of the search rankings or the “recommended” sidebar may be seen by few or no human viewers. For these videos, robots may be the primary audience. In ‘What the Robot Saw,’ the Robot is an AI voyeur turned director: classifying and magnifying the online personas of the subjects of a never-ending film. The Robot continuously makes its way through the world of low-engagement online video, organizing and describing the people and scenes it features in its documentary. The film is generated and titled algorithmically from among the least viewed and subscribed YouTube videos uploaded in recent hours and days. ‘What the Robot Saw’ documents the complicated relationship between the world’s surveillant and curatorial AI robots and the humans who are both their subjects and their stars — people they superficially observe and analyze but cannot comprehend.
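The contrarian selection the Robot performs can be pictured as an inversion of the usual popularity ranking: rather than surfacing the most viewed recent uploads, surface the least. This is only a toy sketch under assumptions — the `least_viewed` helper and the video records are hypothetical illustrations, not the project’s actual search algorithms, which are not published here.

```python
# Toy sketch of a "contrarian" ranking: instead of promoting the most
# popular recent uploads, surface the least viewed and least subscribed.
# The field names and the least_viewed helper are illustrative only.

def least_viewed(videos: list[dict], n: int = 3) -> list[dict]:
    """Return the n videos with the fewest views, ties broken by subscriber count."""
    return sorted(videos, key=lambda v: (v["views"], v["subscribers"]))[:n]

recent = [
    {"title": "vlog #847",        "views": 2_100_000, "subscribers": 500_000},
    {"title": "my first video",   "views": 3,         "subscribers": 0},
    {"title": "morning thoughts", "views": 12,        "subscribers": 4},
    {"title": "product review",   "views": 45_000,    "subscribers": 9_000},
]

for v in least_viewed(recent):
    print(v["title"])
```

A conventional recommender would sort the same list in descending order; flipping the sort key is the simplest possible expression of the "reversed logic" described above.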

Robot meets Resting Bitch Face and Other Adventures. As it makes its way through the film, the Robot adds lower-third supers: periodic section titles, derived from its image-recognition-based groupings and interpreted through the Robot’s vaguely poetic perspective; and frequent identifiers, for the many human interviewees in its documentary. The identifiers — talking-head-style descriptions like “Confused-Looking Female, age 22-34” — are generated using Amazon Rekognition, a popular commercial face detection and recognition service. Rekognition’s feature set offers a glimpse into how computer vision robots, marketers, and others choose to categorize humans. While attempting to adhere to Rekognition’s guidelines that differentiate a person’s appearance from their actual internal emotional state, the Robot titles each person as it analyzes/perceives them — as marketers and others using similar software do. When you’re a computer vision robot, appearance is everything. Pixels don’t have internal states.
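For readers curious how such an identifier could be assembled, here is a minimal sketch based on the shape of Amazon Rekognition’s `DetectFaces` response, which reports an estimated `AgeRange`, an apparent `Gender`, and a list of apparent `Emotions` with confidences. The `make_identifier` helper and the label format are assumptions for illustration, not the project’s actual code; a real pipeline would call `detect_faces` on a video frame via `boto3` rather than use a hard-coded sample.

```python
# Sketch: turning one Rekognition-style FaceDetail record into a
# talking-head lower-third label. make_identifier is a hypothetical helper.

def make_identifier(face_detail: dict) -> str:
    """Build a label like 'Confused-Looking Female, age 22-34' from a FaceDetail."""
    # Rekognition reports apparent emotions with confidences; take the strongest.
    emotions = sorted(face_detail.get("Emotions", []),
                      key=lambda e: e["Confidence"], reverse=True)
    look = emotions[0]["Type"].capitalize() + "-Looking " if emotions else ""
    gender = face_detail["Gender"]["Value"]   # apparent gender, e.g. "Female"
    age = face_detail["AgeRange"]             # estimated range, e.g. 22-34
    return f"{look}{gender}, age {age['Low']}-{age['High']}"

# Sample record shaped like one entry of DetectFaces' FaceDetails output:
sample = {
    "Gender":   {"Value": "Female", "Confidence": 99.1},
    "AgeRange": {"Low": 22, "High": 34},
    "Emotions": [{"Type": "CONFUSED", "Confidence": 87.2},
                 {"Type": "CALM",     "Confidence": 9.5}],
}

print(make_identifier(sample))  # → Confused-Looking Female, age 22-34
```

Note that every field here describes appearance only — the service labels how a face looks, not how the person feels, which is exactly the distinction the paragraph above draws.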

Full info at what-the-robot-saw.com.
What the Robot Saw v1.0 debuted in 2020. Version 2.0 debuted in early 2025.

Artist, Director, Lead Developer, AI Wrangler, Coffee Girl: Amy Alexander (contact)
Additional software development and sound design contributions: Curt Miller