TL;DR: When artists and art historians are asked to engage with the tech industry, introspection happens.

The Austin-based AI conference, Time Machine, annually brings together thought leaders in tech, government, industry, business, research, and the arts to discuss how AI and smart technology are transforming their respective fields. Held over two days, November 13 and 14, the conference coincided with the run of artist Maria Antelman’s exhibition Mechanisms of Affection at the Visual Arts Center, curated by then-graduate student, curator, and writer Taylor Bradley (MA, 2012; Ph.D. in Art History, 2019). Bradley was invited to the conference to moderate the panel “Creativity At the Intersection of Humans & Machines,” which hosted artists Refik Anadol, Sougwen Chung, and Mike Tyka in a conversation about the role of AI in the future of visual art and material culture.

On the first day of the conference, Hyperallergic published a review of Mechanisms of Affection by art critic Lydia Pyne, in which Pyne channeled Bradley’s own ethos as a panel moderator and art historian interested in bringing critical focus to conversations about the use of, or reference to, technology in contemporary art, writing, “From satellites to eye exams, from microfilm to IBM mainframes, machines chronicle human experiences, and, at least nominally, tell us something about ourselves. Machines are mediators.” While much of the conference focused on the implementation of AI in industry, Bradley asked the artists on the panel to engage in a conversation about mediation and the broader implications of AI for our society and culture.

In a question posed to Tyka, Bradley asked about the social impact of using public data as it pertained to his AI Portraits series. Tyka responded:

The machine learning algorithm is a complicated way of doing regression. You correlate one thing with another, you have a bunch of data; and the system is very good at picking up any correlation it can find in order to satisfy the objective you’re giving it. And so, it is very tempting to think that because the algorithm is just a machine, that the data you’ve collected is presumably objective, that the result will be objective. And none of those things are really true. Data is not objective. You can get completely different results depending on how you collect the data, who collects the data, and you have to be extremely careful about not amplifying biases that are inherent to the data.

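Tyka’s description of machine learning as regression over collected data can be made concrete with a small, hypothetical sketch (not drawn from the panel or from Tyka’s own work, and with all numbers invented for illustration): fitting the same simple regression to a full dataset and to a selectively collected subset of it produces noticeably different results, the kind of collection-dependent outcome he warns about.

```python
# A minimal, hypothetical sketch of Tyka's point that the same regression can give
# "completely different results depending on how you collect the data."
# Every value and name here is invented for illustration.
import numpy as np

rng = np.random.default_rng(0)

# Toy population in which the true relationship is y = 2x plus noise.
x = rng.uniform(0, 10, size=10_000)
y = 2 * x + rng.normal(0, 1, size=10_000)

def fitted_slope(xs, ys):
    # Ordinary least-squares fit; returns the estimated slope.
    return np.polyfit(xs, ys, deg=1)[0]

# Data collected from the whole population recovers roughly the true slope (~2.0).
print("slope on the full sample:  ", fitted_slope(x, y))

# Data collected only where the outcome happens to be large -- a biased collection
# process -- yields a noticeably different slope from the very same algorithm.
selected = y > 15
print("slope on the biased sample:", fitted_slope(x[selected], y[selected]))
```
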
A conversation ensued among the artists about projects in which the inherent bias of data was laid bare, including Tyka’s own work and the work of the AI Now Institute. Chung’s responses to Bradley’s questions continually emphasized her interest in complicating authorship and in how authorship serves as a metaphor for the way these technologies are deployed in culture (e.g., Gmail auto-suggesting responses within email).
Bradley continued to reframe and contextualize conversations around AI in art when the Austin American-Statesman called on her to lend her expertise to a December piece about Austin artists working in the generative art space, noting that artists have “used algorithms and computers to generate art as early as the 1950s.”

Industries concerned with the technology landscape use the rhetoric of racing: to accelerate beyond, to be at the cutting edge, to speed ahead. Artists and curators serve a vital function in this ecosystem by reminding us that artificial intelligence, machine learning, or any other new technology is simply another iteration of technologies and systems that have come before, and that by critically engaging in a discourse on the mechanics of tech, we can see it more clearly, more thoughtfully, and with greater agency.

To watch the full panel conversation at Time Machine, click on the embedded video below.

Published: Jan. 30, 2020