The PIGS / AlgoCurator FAQ / Self-Interview
(Vaguely in the self-interview style launched by my ol’ invisible pal Plagiarist in the 1990s)
June/July 2018

Q: What is PIGS?
A: The Percussive Image Gestural System, a system for facilitating live visuals for improvisation by way of drawing gestures and (mostly) striking drums. A paper on PIGS as visual improvisation research is currently in the works, discussing PIGS’s use of algorithms and the drum kit metaphor to develop an improvisational performance structure. PIGS consists of software (written in Max) and various hardware interfaces. None of the hardware interfaces are terribly exotic; the focus in PIGS is on how the software algorithms and percussion approach allow for improvisation of structured, fluid, live visuals.

Q: How is PIGS 2.0 (2018) different from PIGS 1.0 (2016)?
A1: It’s closer to the original intent of PIGS, which was developed by the thoughtful, rational, impulsive and aggressive Amy Alexander. PIGS 1.0 allowed for slow and moderato playing, but was neither controllable enough to satisfactorily develop structured improvisations nor responsive enough for faster or more aggressive playing styles. As with a drum kit, you should ultimately be able to perform a range of styles with it, but one of those should eventually be punk rock.

A2: It uses the AlgoCurator to dynamically curate and download recent YouTube content for both video (PIGS / Amy Alexander) and audio (“The Farm” / Curt Miller.) Curt and Amy then improvise together using new content downloaded just before the performance. Since the content is new to us, the performance is part “improvisation” (as in music), part “improv” (as in theatre.)

Q: What is the AlgoCurator?
A: The AlgoCurator is a Python script that uses OpenCV (computer vision) and other strategies to curate content for PIGS from YouTube videos of the past several hours. Loosely provoked by Stan VanDerBeek’s idea of the “Culture Intercom,” the current implementation of the AlgoCurator considers that the current pessimism around YouTube is largely the result of algorithmic bias, and proposes an alternative algorithm that could reveal a more utopian video network within YouTube. In its initial composition, “Utopian Algorithm #1,” the AlgoCurator seeks out personal narratives from among the least viewed content on YouTube. By running counter to YouTube algorithms’ emphasis on viewer engagement, the AlgoCurator seeks to weed out commercially, politically, and otherwise strategically generated content.

Looked at another way: Although the AlgoCurator is necessarily subject to some algorithmic biases, by deliberately applying its own desired bias to more popular, “intelligent” algorithms that seek to tell us what’s important, the AlgoCurator hopes to reveal less sensational narratives within YouTube. By navigating around popularity — which is often fueled by sensationalism — the AlgoCurator often finds that the chorus of quieter voices sounds.*
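The selection logic described above can be illustrated with a minimal Python filter. Everything here is an assumption for illustration: the metadata field names (`view_count`, `hours_old`, `has_speech`) and the thresholds stand in for the AlgoCurator’s actual computer-vision and metadata checks, and are not the real “Utopian Algorithm #1” criteria.

```python
# Illustrative sketch: prefer recently uploaded, little-viewed videos that
# appear to contain a personal narrative. The fields and thresholds are
# placeholders, not the AlgoCurator's actual curation rules.

def select_candidates(videos, max_views=50, max_hours_old=8):
    """Keep candidates that run counter to popularity-driven ranking."""
    return [
        v for v in videos
        if v["view_count"] <= max_views      # among the least viewed
        and v["hours_old"] <= max_hours_old  # uploaded in the past several hours
        and v["has_speech"]                  # stand-in for narrative detection
    ]

if __name__ == "__main__":
    pool = [
        {"id": "a", "view_count": 3,    "hours_old": 2,  "has_speech": True},
        {"id": "b", "view_count": 9000, "hours_old": 1,  "has_speech": True},
        {"id": "c", "view_count": 12,   "hours_old": 30, "has_speech": True},
    ]
    print([v["id"] for v in select_candidates(pool)])  # → ['a']
```

In practice the “has_speech” stand-in is where the interesting (and subjective) work lives: deciding, algorithmically, what counts as a personal narrative.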

Q. Is this a utopian view or a dystopian one?
A. Yes.

Q: When does the AlgoCurator download the videos?
A: As close to performance time as reasonably possible.

Q: Why don’t you run it live during performance?
A: The AlgoCurator downloads and analyzes well over a hundred YouTube videos for a twenty-minute performance, then edits and converts the videos it accepts to HAP-encoded (video) and WAV (audio) files for performance. Even running parallel threads, this process currently takes at least an hour.
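The pipeline described above can be sketched as a thread pool over candidate videos. This is purely illustrative: `process` is a stub standing in for the real per-video download, OpenCV analysis, and HAP/WAV transcoding steps (which are what make the real run take an hour or more), and the acceptance rule is a deterministic placeholder, not the actual curation logic.

```python
# Illustrative sketch of the offline curation pipeline: many candidate
# videos processed in parallel threads ahead of the performance.
from concurrent.futures import ThreadPoolExecutor

def process(video_id):
    """Stub for one video's download -> analysis -> transcode chain.
    Returns the id if the video is 'accepted', else None.
    (Placeholder rule: accept ids of even length.)"""
    accepted = len(video_id) % 2 == 0
    return video_id if accepted else None

def curate(video_ids, workers=8):
    """Run the per-video chain across a thread pool, keeping accepted ids."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = pool.map(process, video_ids)  # preserves input order
    return [vid for vid in results if vid is not None]
```

Threads (rather than processes) suit this shape of work because the heavy steps are I/O- and subprocess-bound: downloading, and shelling out to a transcoder.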

Q: Bah! It’s not live! I like things that are live!
A: The improvisation is very much live; that’s the point of PIGS and The Farm. And since we haven’t seen the YouTube content before performance, and the content is very recently uploaded (and generally very recently created), it may as well be live. Amy Alexander did a lot of “live content” Google search performances in the early 2000s and generally found the process unsatisfying: more style than substance (especially when dealing with previously created, dynamically accessed content.) As an illustration: if we were, hypothetically, to plan PIGS’s content ahead of time but download it live, it would technically be “live.” But effectively it would be less so. We are not really interested in technicalities.

Q: Is PIGS live coded?
A: No. It took years (part-time, with lots of reworks) to develop the software. If a project this complex were live coded, the performativity would be severely impacted. It would then be less live for both performers and audience. It is, on the other hand, “live data,” in that the data gestures are created live and re-used as structures with variations.

Q: But what about live streams? YouTube has live streams; why not use them?
A: There are technical limitations that make it difficult to use live streams in a graphics-intensive work like PIGS. It might be possible in the future to use a few live streams within a performance, but, for reasons stated above, this isn’t Amy’s priority.

Q: You snob/lazy person/traitor! What have you got against liveness?
A: Nothing; in fact liveness is the point of PIGS. But we believe in a critical approach to liveness that focuses on overall performance and relationships of performers to audience and performers to one another rather than technically centered definitions.

Q: Dynamically downloaded internet content based on search algorithms is an old idea; people (including you, Amy) have been doing it since the 1990s. Why do it again?
A: Paintings of sitters go back millennia. But if we were to say, “Oh, just another painting of a person sitting in a chair” about each one, it would be reductive; we understand different portraits by different artists in different eras to be distinct from one another. Put another way: it’s not what you do; it’s how you do it. The compositions created by the AlgoCurator and performed by Amy and Curt have distinct styles: from the AlgoCurator’s algorithm (subjectively authored by Amy), from the PIGS and Farm visual and aural software algorithms (subjectively authored by Amy and Curt), and from Amy and Curt’s performance styles. Technology-based art is often evaluated reductively based on what a particular technical process within it does, rather than on a nuanced evaluation of a work as a whole.** We think it’d be a good idea for people to do the latter.

Q: What is the future for the AlgoCurator?
A: “Utopian Algorithm #1” is the first implementation of the AlgoCurator. It can be easily modified to create more specific implementations based on, e.g., geographical location, current events, or topics, as well as to create a continually running stream of the “global YouTube narrative” that more closely resembles VanDerBeek’s 1960s idea. Amy is also interested in pushing the “AI” algorithms further to develop a more pleasingly / arrogantly opinionated curator.

* Some intros to algorithmic bias:
Casacuberta, David. “Algorithmic Injustice.” CCCB LAB, 2017.
Kirkpatrick, Keith. “Battling Algorithmic Bias.” Communications of the ACM 59.10 (2016): 16-17.
Danks, David, and Alex London. “Algorithmic Bias in Autonomous Systems.” Proceedings of IJCAI 2017: 4691-4697. doi:10.24963/ijcai.2017/654.
Garcia, Megan. “Racist in the Machine.” World Policy Journal 33.4 (2016): 111-117.
Alexander, Amy. “About… Software, Surveillance, Scariness, Subjectivity (and SVEN).” Transdisciplinary Digital Art: Sound, Vision and the New Screen. Springer, Berlin, Heidelberg, 2008. 467-475.
Levin, Sam. “A Beauty Contest Was Judged by AI and the Robots Didn’t Like Dark Skin.” The Guardian, 2016.
(That last one obviously raises a whole slew of questions about the ethics of beauty contests in general… but the article includes some useful discussion and links.)

**There are a lot of reasons for this, from a hyperfocus on algorithmic processes to a phobia of them. Given the role of algorithms and their subjectivity in the present global political conditions, we think a nuanced approach to algorithms is probably a good idea.