I’d already posted the full-length studio rehearsal / improvised audiovisual animation, “June 8th, 2018,” that Curt Miller and I recorded last month. I came across the second run-through from that day: same set of clips, but we did two improvisations. I rather like how this one flows as a film (it’s slower paced, and you hear and see more of the people), so I’ve posted it too:
I’ve put together a new video discussing the Percussive Image Gestural System. Mostly I’m discussing / demonstrating the software from a real-time experimental animation / visual music perspective: I talk about the main ways the PIGS system implements its approach to “structured improvisation.”
Abstract: Amy Alexander and Curt Miller discuss mixed-modal improvisation with their custom integrated software systems PIGS (Alexander, visuals) and The Farm (Miller, sound). In this free-flowing discussion, Alexander and Miller trace historical visual, music, and programming practices, including abstract animation, graphic scores, and object-oriented programming. They describe how these trajectories feed into the development of PIGS, a system designed to facilitate improvisation by using drums and visual controllers to perform structured visuals. The artists then discuss the specificities of mixed-modal collaborative improvisation, including the impact of representational content (algorithmically curated YouTube videos) on their responses as improvisers. They review responses to PIGS performances to date and discuss future plans for new PIGS performance contexts. They conclude with a discussion of PIGS as audiovisual performance research and propose some ideas for the future role of frameless visuals in music.
Looking forward to doing some new PIGS performance and installation work with the AlgoCurator in the coming months.
Tomorrow evening, June 16th! The amazing Curt Miller and I will be performing PIGS (Percussive Image Gestural System) at the International Conference on Live Interfaces in Porto. It’ll be the debut of PIGS 2.0 and the AlgoCurator.
For those of you who have been wondering about the difference(s) between PIGS 2.0 and PIGS 1.0 — and those of you who haven’t — here’s a spiffy new PIGS / AlgoCurator FAQ.
I’m pleased as pizza to have curated this spring’s “Performing Code” exhibition featuring Shelly Knotts at gallery@calit2.
Now I’m especially ecstatic to be hosting the closing event, “Performing Code Live:”
Please join us Thursday June 7th for “Performing Code Live” at Calit2 Auditorium.
“Performing Code Live” is the closing event for the “Performing Code” gallery exhibition.
Shelly will be here live for this event. There’ll be a panel discussion on a hopefully enticing slew of topics encompassing collaborative improvisation, social hierarchies, networked collaboration as cyberfeminism, coding as performance, laptop ensembles and liveness.
Then we’ll have a live coding performance by ALGOBABEZ!
It’s free, and there’s a reception afterwards!
Performing Code Live
Featuring Shelly Knotts
Curated by Amy Alexander
Thursday, June 7, 2018
7:00 Intro from Amy Alexander and Presentation from Shelly Knotts
7:20 Panel with Amy Alexander, David Borgo, Shelly Knotts, Curt Miller, Suzanne Thorpe, Michael Trigilio, and Pinar Yoldas, followed by Q&A
7:40 ALGOBABEZ Performance
The amazing Curt Miller and I will be performing PIGS at the International Conference on Live Interfaces in Porto in June. Besides some improvements (hopefully) to the software and performance, it’ll be the debut of the AlgoCurator feature, which will attempt to curate an assemblage of global personal-narrative YouTube clips of the moment; the clips will form the basis of our audiovisual improvisation. This is fun to practice; looking forward to doing it for real in Porto!
PIGS news! Finally implemented something I’ve been thinking about for awhile…
As of April 2018, PIGS compositions can be created using a pseudo-artificial intelligence. A “curator” algorithm analyzes newly uploaded YouTube videos for specified characteristics. Loosely provoked by the question “What if an algorithm attempted to convert YouTube into Stan VanDerBeek’s utopian vision of a networked ‘culture intercom’?”, the algo-curator currently works against YouTube’s algorithms, which encourage posters to upload content intended for commercialization, in the fervent hope of finding “real life.”
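The post doesn’t detail how the curator actually works, but the general idea (score freshly uploaded clips and prefer recent, personal, non-commercial ones) can be sketched in a few lines. This is a purely illustrative toy, assuming made-up heuristics and field names; it is not the actual algo-curator and doesn’t touch the YouTube API:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical marker phrases -- stand-ins for whatever "specified
# characteristics" the real curator looks for.
COMMERCIAL_MARKERS = ("#ad", "sponsored", "official video")
PERSONAL_MARKERS = ("my day", "vlog", "diary", "i went")

def score_video(video, now=None):
    """Score a video-metadata dict; higher means more 'real life'."""
    now = now or datetime.now(timezone.utc)
    text = (video["title"] + " " + video.get("description", "")).lower()
    score = 0.0
    # Penalize obviously commercialized content once.
    if any(m in text for m in COMMERCIAL_MARKERS):
        score -= 2.0
    # Reward first-person, everyday-narrative cues.
    score += sum(1.0 for m in PERSONAL_MARKERS if m in text)
    # Reward very recent uploads ("clips of the moment").
    if now - video["published_at"] < timedelta(days=1):
        score += 1.0
    # Penalize virality, which suggests algorithm-driven commercial reach.
    if video.get("views", 0) > 10_000:
        score -= 1.0
    return score

def curate(videos, n=3):
    """Return the n highest-scoring clips."""
    return sorted(videos, key=score_video, reverse=True)[:n]
```

In this sketch the curator “works against” recommendation logic simply by inverting its usual signals: high view counts and promotional language lower a clip’s score instead of raising it.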
I hope to expand the algo-curator and use it for some other projects. I’ll explain a bit more about the initial implementation of the algo-curator another time. But for now… it’s a start! http://amy-alexander.com/pigs
New images posted to the Mary Hallock Greenewalt Visibility Project from the amazing Mary Hallock Greenewalt archive at the Historical Society of Pennsylvania. A little bit of everything this time, from pastels and paintings to lots of scientific drawings and calculations. As always, you can search the Flickr database of uploads here.
Über-excited to be curating “Performing Code,” an exhibition / performance series by Shelly Knotts (featuring appearances by ALGOBABEZ and OFFAL) this spring at Gallery@Calit2. The fun starts Thursday, April 5th, with “Performing Code Stream,” a live coding and OFFAL performance streamed from Melbourne, Australia. The performance is at 5 p.m., followed by a reception at 6 p.m. In June, Shelly will be live in San Diego for another performance and a panel discussion.