Percussive Image Gestural System (PIGS)

PIGS is a software/hardware/percussive instrument designed for liveness in improvised visual performance through the use of silent percussion — i.e. visuals are performed by playing drums. PIGS has two broad aims:

  • To propose a formalized approach to visual improvisation that is structured while remaining intuitive and fluid.
  • To make the human aspect of visual performance more relatable for audiences.

"Utopian Algorithm #1" — PIGS (Percussive Image Gestural System) Studio Rehearsal/Demo, June 2018

In “Utopian Algorithm #1,” a non-cynical algorithm seeks to find the Internet’s lost Utopia by thwarting social media popularity algorithms to reveal the videos nobody gets to see.

PIGS’s approach to structured improvisation combines approaches from experimental abstract animation, musical performance, and object-oriented programming to allow for the performance of visually rhythmic structures that can be played (by playing drums) while maintaining “liveness” — i.e. the connection between a performer’s action and the result that is intuitive to both performer and audience.

Although PIGS is performed by playing (silent) drums, PIGS research does not involve sonic-visual (color/pitch, etc.) correspondences.

Writing

  • “On PIGS”: Chapter-length interview with audiovisual developers/performers Amy Alexander and Curt Miller.
    Amy Alexander and Curt Miller discuss mixed-mode improvisation with PIGS (visuals) and The Farm (audio). They discuss historical visual, music, and programming practices, and how these trajectories feed into the development of PIGS. The artists also discuss the specificities of mixed-modal collaborative improvisation, including the impact of representational content (algorithmically curated YouTube videos). They conclude with a discussion of PIGS as audiovisual performance research and propose some ideas for the future role of “frameless” visuals in music ensemble performance.
  • The PIGS/AlgoCurator FAQ
    “By deliberately pushing against biased “intelligent” algorithms that seek to tell us what’s important, the AlgoCurator hopes to find vestiges of real life among the universe. By navigating around popularity — which is often fueled by sensationalism — the AlgoCurator finds that the chorus of quieter voices sound a lot different than the amplified stars.”
  • See also… Inside PIGS (video).
    Amy Alexander explains how PIGS works and discusses its approach to liveness and structured improvisation.

More Videos

“June 8th, 2018” (excerpt of real-time PIGS animation)

PIGS (Percussive Image Gestural System) Live Studio Jam – People Blowing Things Up on YouTube. October 2016

Here’s PIGS’s first live performance at UCSD’s Qualcomm Institute. The QI crew did a nice job of capturing the performance interfaces:

Rocket's Red Glare at IDEAS

Current PIGS performance interfaces include iPads, a Leap Motion controller, and quiet MIDI drums. Strokes are scribbled using the iPads and Leap Motion and can then be replayed with variations by striking the MIDI drums. (This in some ways resembles how traditional drums work: each strike of the same drum or cymbal generates roughly the same pitch but may vary in loudness, choking, etc.) With each drum stroke, the PIGS system allows for an assortment of variations on the original scribble in both duration and form. The performer may also use this functionality to create theme-and-variations or looping structures. Individual drums/video layers may also be set to auto-loop, allowing the performer to improvise on the other drums/layers against the rhythms of the looping background layers.
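The stroke-replay idea described above can be sketched in a few lines of code. This is a minimal illustration, not the actual PIGS implementation: the class and field names are assumptions, and real MIDI input handling is omitted. The point is the drum analogy, where each hit replays the same stored gesture with small variations in duration and size, the way repeated strikes of one drum share a pitch but differ in loudness and articulation.

```python
import random


class Stroke:
    """A recorded gesture: a list of (x, y) points plus a base duration in seconds."""
    def __init__(self, points, duration):
        self.points = points
        self.duration = duration


class DrumLayer:
    """Hypothetical sketch of one drum/video layer: a stored stroke is
    replayed with per-hit variation in duration and form."""
    def __init__(self, stroke, duration_jitter=0.3, scale_jitter=0.2):
        self.stroke = stroke
        self.duration_jitter = duration_jitter  # +/- fraction of base duration
        self.scale_jitter = scale_jitter        # +/- fraction of base size
        self.auto_loop = False                  # layer can repeat on its own

    def on_drum_hit(self, velocity):
        """Return a varied copy of the stored stroke for one drum strike.
        Velocity (0.0-1.0) scales the gesture, like loudness on a real drum."""
        dur = self.stroke.duration * (
            1 + random.uniform(-self.duration_jitter, self.duration_jitter))
        scale = velocity * (
            1 + random.uniform(-self.scale_jitter, self.scale_jitter))
        points = [(x * scale, y * scale) for (x, y) in self.stroke.points]
        return Stroke(points, dur)  # the original stroke stays unchanged
```

Setting `auto_loop = True` on a layer would correspond to the looping background layers mentioned above, against which the performer improvises on the remaining drums.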

While PIGS uses musical instruments and strategies as general models for thinking about performativity and temporal structures, care is taken not to attempt to simply translate musical approaches to visual ones. Rather than approaching audiovisual integration as a matter of synchronization of sound and image, the idea is to create an instrument that is performable as a part in a duet or ensemble (analogous to the way various instruments in an ensemble play different musical parts even though they are performing the same piece). Likewise, while twentieth-century gesture- and drawing-based abstract animators like Len Lye and Walter Ruttmann are progenitors, PIGS combines abstract drawing with live action, and it integrates contemporary visual influences from cell phone videos and YouTube, CGI, concert light shows and holograms.

Our research aim in developing PIGS is not to create an “end user” tool for other artists, but to present an expanded approach to live visuals and collaboration between visualists and musicians. Some of the specific issues PIGS addresses are: means of developing a structured approach to visual improvisation; alternatives to rectangular screen space in live cinematic composition; performance interfaces that are both intuitive for the performer and contribute to a sense of “liveness” for the audience — i.e., allowing the audience to relate to the sense that the visuals, although computationally generated, are being performed by a fallible human. Although these problems are by no means completely solved by PIGS, we hope to contribute to the discourse around these issues through its performance and presentation.

PIGS has been performed eight times as of June 2018. In the course of the first seven shows, three different pre-composed “compositions for PIGS” were performed. Each composition involves building and editing a specific set of video content for performance. Curt Miller created a software instrument in parallel with PIGS, in which he combines live clarinet and talk box with real-time processing of recorded source material. Curt’s instrument focuses on facilitating musical improvisation with visuals. The first four PIGS performances were audiovisual collaborations between Amy and Curt.

AlgoCurator

As of Spring 2018, PIGS compositions can be created using a pseudo/anti-artificial intelligence. A “curator” algorithm analyzes newly uploaded YouTube videos for specified characteristics. Loosely provoked by the question “What if an algorithm attempted to convert YouTube into Stan VanDerBeek’s utopian vision of a networked ‘culture intercom’?”, the AlgoCurator currently works against YouTube’s algorithms, which encourage posters to upload content intended for commercialization. “Intelligent” algorithms seek to tell us what’s important, but often simply popularize the sensational while burying quieter voices. The AlgoCurator deliberately works against these algorithms. Read more in the AlgoCurator FAQ.
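The anti-popularity selection described above can be illustrated with a small sketch. This is not the AlgoCurator’s actual code: the field names (`views`, `hours_since_upload`) and the view-count threshold are assumptions, and fetching metadata from YouTube is out of scope here. The sketch only shows the inversion of a popularity ranking, keeping near-invisible uploads and surfacing the least-watched, most recent ones first.

```python
def curate(videos, max_views=50):
    """Toy anti-popularity curator: from a list of video-metadata dicts,
    keep only little-watched uploads and order them by obscurity."""
    # Discard anything the popularity algorithms have already amplified.
    quiet = [v for v in videos if v["views"] <= max_views]
    # Fewest views first; among ties, the most recently uploaded first.
    return sorted(quiet, key=lambda v: (v["views"], v["hours_since_upload"]))
```

A conventional recommender would sort descending on views; flipping that sort order is the whole gesture of the sketch.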

The AlgoCurator made its live debut at ICLI in Porto in June 2018 with “Utopian Algorithm #1.” “Utopian Algorithm #1” was also used to curate the content in these videos.

Shows

Amy Alexander has performed PIGS at:

“People Blowing Things Up on YouTube,” the first composition developed for PIGS, is a rumination on the YouTube culture in which people perform explosions for the camera. The videos run along an uneasy continuum from inquisitive experimentation through toxic aggressiveness and beyond.

Percussive Image Gestural System (PIGS) is created by Amy Alexander. Software development by Amy Alexander and Curt Miller with contributions by Wojciech Kosma.
Audio software: Curt Miller
Research assistant: Doug Rosman

Percussive Image Gestural System (PIGS) development has been supported by the iotaCenter, University of California Institute of Research in the Arts (UCIRA), and UCSD Academic Senate. It has been performed at events in the US, Australia, Canada and Portugal.