Sneak Peek: DeepReals

I’ve had a couple of requests to see one of my works-in-progress — my first foray into working with generative machine learning. So here it is, an alternative to DeepFakes: DeepReals.

The First Three Minutes
Every frame from the first three minutes of Christine Blasey Ford’s and Brett Kavanaugh’s testimony before the Senate Judiciary Committee, September 2018.

Also still working on the time-based online project I mentioned in my previous post. Since it runs in real time, that one’s a bit more involved! Hopefully will have that one in beta over the summer when I have some bigger chunks of time to work on tricky things!

Project and teaching updates (finally!)

My bad, I’ve been overly sporadic about my sporadic updates again! Been busy, but here’s a brief one:
Still working on the new online/installation project. Buzzwordy keywords: real-time video, social media, algorithmic subjectivity, computer vision. Hopefully I'll have the equipment soon to get a beta version online, but I'm mainly focusing on teaching until early June.

Teaching! This quarter I’m teaching a new “special topics” seminar course in computer vision/machine learning/algorithmic bias practice and critical contemporary issues. Also teaching one of our sections of “ICAM Senior Projects,” where I get to mentor some of our fabulous ICAM senior undergrads in their computing-in-the-arts graduation projects.

And of course, the Mary Hallock Greenewalt Visibility Project continues!

Quick bits — upcoming stuff

A quick update on what I’ve been up to during sabbatical:

Working on a new durational, real-time algorithmic, live streamed movie / installation project. Almost ready to debut; stay tuned!

Also, happy to report that “June 8th, 2018” (take 2), a real-time improvised short film produced within a studio rehearsal of a PIGS performance, has been selected for the SIGGRAPH Digital Arts Community’s online exhibition, “Urgency of Reality in a Hyper-Connected Age.” More details on that as I have ’em.

Also been doing some research and development on more things computer vision and machine learning related. Some of which goes into the upcoming durational movie, and some which doesn’t. Stay tuned!

New work in progress!

Sabbatical update: I’m making new work! Focusing on online and installation work once again; a few different projects:

Algorithmic. Video. Still Image. Computer vision. Border region. Global. Social media. Speculative futures. And presents.

Despite the string of buzzwords, those really are some of the topics I’m working on. Some of it follows on from the ideas I started dealing with in “Utopian Algorithm #1,” and some is quite a bit different.

I’ll be posting more as I go along, but if you’d like to know more, give me a ping!

“Googling Californias” is up on the site (five years late)

Back in 2013, Rick Silva invited me to make a project for the “W-E-S-T-E-R-N D-I-G-I-T-A-L” pavilion he was curating at The Wrong Biennale. “W-E-S-T-E-R-N D-I-G-I-T-A-L” being a pavilion featuring the work of west coast artists, I started thinking about what “west coast” means — and what “California” means. I decided to do something on the theme of Californias. I sent Rick the video and HTML links for the show at the time and did a news post about it here. But apparently I neglected to make “Googling Californias” a proper page on my site, which caused it to essentially drop off the face of the Internet after “The Wrong” ended. I just unearthed it again tonight. Thanks to Rick for inviting me to make something “wrong” on purpose!

So here it is, with its five-years-belated webpage: “Googling Californias (Half Truths for People on the Go.)” Video loop, 2013. (Original, theoretically better-quality web video here.)

Late at night, thoughts wander — and we find ourselves Googling Californias. Seems fitting — and all wrong: Google is itself the image of early 21st-century California technology, commerce, and power. It lives in California — and it doesn’t. The images Google offers up form a muddled patchwork of stereotypes and half-truths; but the awkward thing about half-truths is that they are half true. Like stereotypes of California itself: as awkwardly accurate as they are grotesque distortions. Sometimes you don’t find the California you were searching for. The system failed you — or you failed the system. Or maybe you weren’t looking for the right Californias. And as you travel between Californias, you remember, there’s yet another California beyond the borders of California. It doesn’t stop here. And it does.

New PIGS film: “June 8th – take 2”

[For those coming here anew: Here’s what PIGS is, and an intro to the AlgoCurator who selected the clips for “June 8th.” ]

I’d already posted the full-length studio rehearsal / improvised audiovisual animation, “June 8th, 2018,” that Curt Miller and I recorded last month. I just came across the second run-through from that day — same set of clips, but we did two improvisations. I rather like how this one flows as a film — slower paced, and you hear and see more of the people — so I’ve posted it too:

June 8th: Take 2 (full, uncut real-time animation) from Amy Alexander on Vimeo.

Also on YouTube … (be sure to set your YouTube quality to 1080p60.) Definitely better experienced with a good monitor and speakers than on a laptop (or *gasp* — a phone!)

New video: “Inside PIGS.”

I’ve put together a new video discussing the Percussive Image Gestural System. Mostly I’m discussing and demonstrating the software from a real-time experimental animation / visual music perspective: I talk about the main ways the PIGS system implements its approach to “structured improvisation.”

Inside PIGS: Amy Alexander discusses the Percussive Image Gestural System from Amy Alexander on Vimeo.

For a deeper dive into the historical/critical aspects of PIGS and collaborative audiovisual improvisation, check out the interview Curt Miller and I did a few weeks ago, “On PIGS.”

New PIGS text (interview) and videos!

Happy summer! I’ve got lots of new PIGS (Percussive Image Gestural System) stuff posted!

Videos: (all with Curt Miller):
The first three are the first videos I’ve been able to make with a satisfactory capture setup. (The frame rate isn’t quite up to snuff on the first one, so it looks jerkier than in “real life,” but the “June 8th, 2018” videos are full 60fps.)
“Utopian Algorithm #1” — PIGS (Percussive Image Gestural System) Studio Rehearsal/Demo, June 2018
“June 8th, 2018 – excerpt” — PIGS Real-time animation excerpt
“PIGS film! – June 8th, 2018” — The full uncut real-time animated studio performance — a recording turned “PIGS Film.”
Documentation video of PIGS performance at ICLI (International Conference on Live Interfaces), Porto.

Finally, we’ve done some substantial writing about PIGS, the colliding histories behind it, our responses to working in mixed-modal (audio/visual) improvisation, and our responses to improvising with algorithmically curated, near-real-time videos by real people on YouTube. Hope you’ll have a read!
“On PIGS:” Chapter-length interview with audiovisual developers/performers Amy Alexander and Curt Miller.

Abstract: Amy Alexander and Curt Miller discuss mixed-modal improvisation with their custom integrated software systems PIGS (Alexander, visuals) and The Farm (Miller, sound). In this free-flowing discussion, Alexander and Miller discuss historical visual, music, and programming practices including abstract animation, graphic scores, and object-oriented programming. They discuss how these trajectories feed into the development of PIGS, a system designed to facilitate improvisation by using drums and visual controllers to perform structured visuals. The artists then discuss the specificities of mixed-modal collaborative improvisation, including the impact of representational content (algorithmically curated YouTube videos) on their responses as improvisers. They review responses to PIGS performances to date and discuss future plans for new PIGS performance contexts. They conclude with a discussion of PIGS as audiovisual performance research and propose some ideas for the future role of frameless visuals in music ensemble performance.

Looking forward to doing some new PIGS performance and installation work with the AlgoCurator in the coming months.

Meanwhile, you can find the whole slew of past and present PIGS info at the usual place:

Performing Code Live @ UC San Diego Featuring Shelly Knotts

I’m pleased as pizza to have curated this spring’s “Performing Code” exhibition featuring Shelly Knotts at gallery@calit2.
Now I’m especially ecstatic to be hosting the closing event, “Performing Code Live:”
Please join us Thursday June 7th for “Performing Code Live” at Calit2 Auditorium.
“Performing Code Live” is the closing event for the “Performing Code” gallery exhibition.

Shelly will be here live for this event. There’ll be a panel discussion on a hopefully enticing slew of topics encompassing collaborative improvisation, social hierarchies, networked collaboration as cyberfeminism, coding as performance, laptop ensembles and liveness.

Then we’ll have a live coding performance by ALGOBABEZ!

It’s free, and there’s a reception afterwards!

Performing Code Live
Featuring Shelly Knotts
Curated by Amy Alexander
Thursday, June 7, 2018
Time: 7pm-9pm

7:00 Intro from Amy Alexander and Presentation from Shelly Knotts
7:20 Calit2 Auditorium Panel with Amy Alexander, David Borgo, Shelly Knotts, Curt Miller, Suzanne Thorpe, Michael Trigilio, Pinar Yoldas and Q&A
7:40 ALGOBABEZ Performance
8:10 Reception

ALGOBABEZ: Joanne Armitage and Shelly Knotts

More info:
Maps, Directions, etc.: