Psyched to have What the Robot Saw as part of the xCoAx 2020 conference/exhibition, July 8th – 10th! http://www.xcoax.org/
8th Conference on Computation, Communication, Aesthetics & X
Although originally scheduled for Graz, Austria, this year xCoAx will be online.
What the Robot Saw’s page is here. It includes a video presentation by yours truly, and some other goodies.
“xCoAx is an exploration of the intersection where computational tools and media meet art and culture, in the form of a multi-disciplinary enquiry on aesthetics, computation, communication and the elusive X factor that connects them all.
xCoAx has been an occasion for international audiences to meet and exchange ideas, in search for interdisciplinary synergies among computer scientists, artists, media practitioners, and theoreticians at the thresholds between digital arts and culture.”
Also, I’ve written a new paper about the Robot and algorithmic bias. It’s an extended version of the xCoAx paper, with additional writing on algorithmic bias issues in the context of visibility within social networks: The Algorithm is the Message: What the Robot Saw
What the Robot Saw has recently moved to Twitch. (Though you can always find it at the WTRS homepage, regardless of where the stream is.) Despite its being gaming-centered, Twitch seems to be working out well: the compression algorithms are presumably designed for computer-animated games — so, minus a few nitpicks, the image quality for WTRS’s graphics is much better than it was on YouTube. It’s easy for the Robot to start and stop streams throughout the day. And best of all — so far their “community guidelines” algorithms aren’t knocking the Robot offline at every turn. (For the record, the Robot is an upstanding member of its community — just look at all the recycling it does… )
To be fair, I understand YouTube’s predicament in trying to train AIs to filter out troll videos: I know it’s hard because I’ve been training the Robot to do that kind of thing for over a year. In the Robot’s case, it curates a small number of videos to begin with, and if it overaggressively culls content, it doesn’t knock that content offline entirely. In YouTube’s case, it’s a bigger deal, with a potentially bigger impact. ‘What the Robot Saw’ is a silly robot film, but it’s about the serious concern of making videos (and by extension, their humans) invisible if they don’t fit the mass popularity algorithm. Usually it happens by burying them in recommendations and search results. Now the “misfits” are being removed entirely. The Robot will be OK. But what — and who — else is not getting seen that should be? YouTube has become a global meta-medium that influences how we perceive the culture of the moment. And its search algorithms, then, are a meta-meta-medium. So — this matters.
YouTube has said they’re trying to keep videos that spread disinformation about the Coronavirus off their site. And stopping disinformation is a good thing. But they’ve got a dilemma, because at the intersection of monetization, popularity, and fear lies disinformation. AIs are notoriously imperfect moderators, so a lot of disinfo still gets out, while some amount of legitimate content gets censored. Both parts of this are worth keeping an eye on. Which will be hard to do, since not even Robots can see what isn’t there.
For now, here’s what the Robot does see. (BTW, I’ve finally had a chance to do some timing work on the animation and other cleanup. So, here we go:)
Hi all, hope you’re doing ok in this batshit crazy time…
Not the most important thing on the plate right now, but it’s been a strange time for “What the Robot Saw” too. I’d always thought the 24/7 YouTube archives would be some kind of strange time capsule of a slice of the world on the internet. I hadn’t anticipated that it would become a time capsule of a world moving *into* the internet. But as our collective Elvis left the building, so, naturally, did YouTube’s employees. That’s of course a good health move right now. But interestingly, they left a robot in charge of policing YouTube. Apparently their robot doesn’t like my robot — it started removing all of ‘What the Robot Saw’s’ new streams from YouTube, labeling them as Community Violations. (You can’t make this stuff up folks; AIs finally *are* taking over, and they’ve got their own robo-hegemony… )
Robowars aside, it seems to me like this is a moment when “What the Robot Saw” can be an interesting participant, as it watches so many of us move into YouTube. So I’m trying to keep it going by moving to Facebook Live. I haven’t yet worked out the tech to get it to run continuously, create the archives, etc. So for now it’ll just run as much as I can keep it running manually.
OK, here’s what I’ve been working on. It’s net art! (Not exactly like my old 90’s net art, but…)
And I’m happy to say that the amazing Curt Miller will be once again working with me to do some sound work on the project.
Is it really, really done? No.** Is it live already? Yes. So why not have a peek? I imagine I’ll be awkwardly promoting it for real soon enough:
Welcome to Robot TV. ‘What the Robot Saw’ (v 0.1 alpha) is a perpetual, robot-generated livestream film, curated and edited algorithmically from among the least viewed and subscribed YouTube videos uploaded over the past few hours. A Robot/AI filmmaker makes its way through the world of online video, focusing its attention on people who don’t usually get attention.
If the stream isn’t live, you can find recent archives here.
An invisible audience of software robots continually analyze content on the Internet. Videos by non-“YouTube stars” that algorithms don’t promote to the “recommended” sidebar or award “verified” status may be seen by few or no human viewers. For these videos, robots may be the primary audience. In ‘What the Robot Saw,’ the Robot is both AI voyeur and film director, depicting and magnifying the online personas of the subjects of a never-ending film.
Using computer vision, neural networks, and other robotic ways of seeing and understanding, the Robot continually selects, edits, and arranges recently uploaded public YouTube clips from among those with low subscriber and view counts, focusing on personal videos. A loose, stream-of-consciousness structure is developed as the Robot organizes the clips as a drift through neural network-determined groupings. As the Robot generates the film, it streams it live back to YouTube for public viewing. The film navigates a slice of social media that’s overlooked by the usual search and recommendation algorithms, thus largely only visible to robotic-algorithmic voyeurs.
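The Robot’s actual pipeline isn’t published, but the two steps described above — keeping only recently uploaded clips with low view and subscriber counts, then ordering them as a “drift” through neural network-determined groupings — can be sketched roughly. This is a minimal, hypothetical illustration: the clip records, thresholds, and `embedding` vectors are made-up placeholders (a real version would pull metadata from the YouTube Data API and compute embeddings with a neural network), and the greedy nearest-neighbor walk is just one simple way to produce a drift-like ordering.

```python
import math

# Assumed thresholds for "least viewed and subscribed" -- not the project's
# actual numbers, just placeholders for the sketch.
MAX_VIEWS = 50
MAX_SUBSCRIBERS = 100

def select_overlooked(clips):
    """Keep only clips below the (hypothetical) view/subscriber thresholds."""
    return [c for c in clips
            if c["views"] <= MAX_VIEWS and c["subscribers"] <= MAX_SUBSCRIBERS]

def drift_order(clips):
    """Greedy nearest-neighbor walk through embedding space: each clip is
    followed by the most similar remaining clip, yielding a loose 'drift'
    through groupings rather than hard cuts between clusters."""
    if not clips:
        return []
    path = [clips[0]]
    remaining = clips[1:]
    while remaining:
        last = path[-1]["embedding"]
        nearest = min(remaining,
                      key=lambda c: math.dist(last, c["embedding"]))
        remaining.remove(nearest)
        path.append(nearest)
    return path

# Toy clip metadata (hypothetical; real embeddings would be high-dimensional).
clips = [
    {"id": "a", "views": 3,    "subscribers": 5,  "embedding": (0.0, 0.0)},
    {"id": "b", "views": 9000, "subscribers": 2,  "embedding": (0.5, 0.5)},
    {"id": "c", "views": 12,   "subscribers": 40, "embedding": (5.0, 5.0)},
    {"id": "d", "views": 1,    "subscribers": 0,  "embedding": (0.1, 0.2)},
]
ordered = drift_order(select_overlooked(clips))
print([c["id"] for c in ordered])  # prints ['a', 'd', 'c'] -- 'b' is too popular
```

The greedy walk is deliberately simple; the point is only that once clips carry neural-network feature vectors, a continuous film order can fall out of nothing more than “play the most similar thing next.”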
As time zones around the globe sleep and wake, ‘What the Robot Saw’ follows the circadian rhythms of the world’s uploads. So tune in now and then. Robots never sleep*.
* For now, the live stream runs eighteen hours a day; there are one-hour “intermissions” every four hours (and as needed for maintenance). Archives of some recent streams are available on the Videos page or on the YouTube Channel.
‘What the Robot Saw’ is a non-commercial project.
** This is version 0.1-alpha, an initial implementation. There’s work still to be done in terms of structure, timing, sound, and AI. Versions focused on different content will also likely be spawned in the future.
Although the YouTube live stream is central to the project, the technical limitations of live streaming mean the image and sound quality are not ideal and may vary with network conditions. A high quality stream can be generated locally for art installations and screenings.
I’ve been coding away this summer, trying to get my new social media video algorithmic curation project ready. Lots of fun with everything from neural net and computer vision-based video classification to algorithmically-based sound and picture editing. It’ll be a live stream, plus available for installations. It’s almost ready for beta, so stay tuned!
I’ve had a couple of requests to see one of my works-in-progress — my first foray into working with generative machine learning. So here it is, an alternative to DeepFakes: DeepReals.
The First Three Minutes
Every frame from the first three minutes of Christine Blasey Ford’s and Brett Kavanaugh’s testimony before the Senate Judiciary Committee, September 2018.
Also still working on the time-based online project I mentioned in my previous post. Since it runs in real time, that one’s a bit more involved! Hopefully will have that one in beta over the summer when I have some bigger chunks of time to work on tricky things!
My bad, I’ve been overly sporadic about my sporadic updates again! Been busy, but here’s a brief one:
Still working on new online/installation project. Buzzwordy keywords: Real-time video, social media, algorithmic subjectivity, computer vision. Hopefully will have the equipment soon to get a beta version online, but mainly focusing on teaching til early June.
Teaching! This quarter I’m teaching a new “special topics” seminar course in computer vision/machine learning/algorithmic bias practice and critical contemporary issues. Also teaching one of our sections of “ICAM Senior Projects,” where I get to mentor some of our fabulous ICAM senior undergrads in their computing-in-the-arts graduation projects.