
Automated Culture Symposium Monday, November 25, 2019


Automated Culture Symposium

Monday, November 25

Loop Project Space and Bar

23 Meyers Place, Melbourne 3000

9:45am-8:30pm

Sponsored by Culture, Media Economy, Liquid Architecture and the Automated Society Working Group (Monash)

Register HERE

9:45-10:00

Morning Coffee and Introduction

10:00-11:15

Panel 1: Automating Inequality

There has been growing concern about the impacts of automation on vulnerable communities, from the automation of traditional social services, such as Robodebt and the cashless welfare card, to the automation of moderation online. This panel brings together academics and industry professionals to discuss the effects of automation on marginalised communities. It aims to foreground how an automated culture may erase the voices of marginalised people and considers how we can address and redesign automated processes to build a more inclusive society. The panel will respond to a series of provocations on these themes.

Moderator: Verity Trott

Panelists:

1. Alex North

From Automated Payment Suspensions via text message to Robodebt notices, Australia’s poorest have been the guinea pigs of the neoliberal dismantling of the welfare state and the creation of what Virginia Eubanks terms the “Digital Poorhouse”. Alex North is the National Coordinator of the Australian Unemployed Workers’ Union. Besides coordinating the day-to-day operations of the union, he is currently researching and working with social security recipients affected by the ongoing privatisation, automation, and digitalisation of Australia’s social security system.

2. Cameo Dalley

As an anthropologist, I investigate ethnographically the ways in which the Cashless Debit Card (CDC) has been experienced in a small town in Western Australia. Though the CDC is fundamentally designed to facilitate data surveillance, my work has instead focused on the CDC’s unintended impacts, including the local privacy breaches that many users regularly encounter in order to manage its basic functions. I also invite consideration of how denoting people or communities as ‘vulnerable’ draws on a similar ameliorative logic to that which led to the inception of the CDC in the first instance.

3. Venessa Paech

Machine intelligences, including automation, are disrupting the ontology of our communities – online, offline and the spaces between. Venessa's work explores the rise of machine culture and how artificial intelligence is impacting the way we experience and manage intentional online communities. She is focused on the work of online community management, exploring how these socio-technical workers apply, retool and resist machine intelligences and logics in their practice.

4. Ben Eltham

Automated error as design feature rather than flaw: Looking ahead to the ongoing challenges posed by Robodebt.

11:15-11:25

Break

11:25-12:40

Panel 2: Algorithms in Media Production and Distribution

This panel will discuss and reflect on algorithms in media production and distribution. Drawing upon examples of the use of algorithms in different media sectors, including screen distribution, live streaming, and creative arts, the panel will address questions of power, subjectivity, and creativity, and, by doing so, also historicise the discussion of algorithms within longer histories of cultural curation.

Moderator: Nina Li

Panelists:

1. Ramon Lobato, RMIT

This paper explores the changing landscape of TV hardware and software, and its significance for debates about algorithmic media distribution. Unlike broadcast TV channels using public spectrum, smart TV platforms are unregulated in terms of how they display, organise and restrict content, and app stores within each platform are governed privately through terms of service agreements. This paper asks two questions: What do power and control look like in the context of a smart TV interface? And how do we locate these algorithmically curated environments within longer histories of screen distribution, promotion, and gatekeeping?

2. Sijun Shen, Monash

This presentation explores the role played by algorithms in shaping the popularity of eat-streaming videos on Chinese live-streaming platforms. It argues that eat streaming is both a symptom of algorithmic culture and its enactment: an attempt to overcome the boundaries of the subject by consuming everything, all at once.

3. Jon McCormack, SensiLab, Monash

Over the next decade it is highly likely that much of our cultural production will be outsourced to intelligent machines. Over the last few years, huge technical advances have allowed artificial intelligence (AI) software to generate music, visual art and text that mimics human-level capabilities, at least at a surface level. Can the next generation of software go deeper, both semantically and creatively? That remains an open question, but already there is a “machine aesthetic” that encapsulates the quirks and limitations of this new, non-anthropomorphic intelligence, something that represents collective human creativity rather than that of any single individual. What will be the consequences of this new machine aesthetic on cultural production, distribution and consumption if all are automated, connected and autonomous?

12:40-1:15

Lunch

1:15-1:45

Performance: Simulation 1.5 by Roslyn Helper

A meditation on paranormal emotional behaviours and the co-option of human agency through online choice architectures. The reverse engineering of the human, channelling shell memories from within digital databases, conjuring a technopsychosomatic mapping of queer love. A multi-voiced poem built from predictive text, neural network generated text and found text online.

1:45-2:00

Break

2:00-3:15

Panel 3: Surveillance, Vision, and Classification

Automated culture goes hand-in-hand with so-called surveillance capitalism: as we offload curation processes onto automated systems we become increasingly subject to their programmed priorities. This process invites a certain level of social de-skilling, in the sense that it tends to suppress the conditions for recognizing the shared or communal aspects of culture while privileging a commercialized version of individualized and customized taste culture. This panel brings together original perspectives on cascading logics of cultural automation, from automated image capture and classification to automated sense-making and the resulting forms of individualized targeting.

Panelists:

1. Thao Phan: Automating Desire

Ex Machina (2015) is a cultural text that both represents and exploits contemporary anxieties regarding surveillance capitalism and the modulation of identity in algorithmic culture. I argue that the film takes these anxieties to their furthest limits. It positions the most private and intimate parts of one’s identity (feelings, emotions, and latent sexual desires) as part of a technological system that can be deployed against us. Like all digital cinema, however, the film is enfolded into modes of production and distribution that rely on algorithmic techniques. It is a text that simultaneously represents, critiques, and enables forms of surveillance capitalism.

2. Nic Carah: Seeing Like a Brand

This talk draws on an automated image classification system developed to imitate the forms of monitoring that brands conduct on social media sites like Instagram. The system demonstrates the ways in which large user-generated image databases can be put to use by advertisers and marketers using machine learning systems.

3. Anthony McCosker: Automating Vision: Deepfakes and the paradox of the new camera consciousness

The same technology that has enabled machines to see like humans has opened a Pandora’s box of new forms of automated image production. In the process, machine vision has changed the landscape of visibility and visuality. I explore what generative adversarial networks have in store for us in the age of influencers and political mischief-making, and how we might intervene.

4. James Parker: The Law and Politics of Machine Listening

The UK company Audio Analytic claims they are ‘on a mission to give all machines a sense of hearing.’ They are not the only ones. Machine listening is an emerging field of power, the future scope and repercussions of which we hardly understand, and which urgently demands our attention. This paper presents some preliminary thoughts and research challenges in this respect, as I begin a new project in this area.

3:15-3:30

Break

3:30-4:00

Workshop: Patent Futures

The patents filed by big technology companies like Google, Amazon, Apple, and Facebook anticipate, shape, and stake a material claim on the future. This workshop will provide a platform for participants to collectively analyse components from these automated futures and to diagrammatically synthesize new imaginaries.

4:00-5:15

Panel 4: Digital Rights

As citizens the world over become increasingly surveilled, profiled, and datafied, we must interrogate our capacity to consent to these systems. How can we protect our privacy and our digital rights when technology companies are undermining our individual agency? What kinds of assumptions are automated systems making about complex, messy humans? How can we address and push back against these threats to our personhood?

1. Niels Wouters, Head of Research and Emerging Practice for Science Gallery Melbourne, and Research Fellow in the Interaction Design Lab at the University of Melbourne

2. Sam de Silva, Board Member for Digital Rights Watch

3. Kobi Leins, Senior Research Fellow in Digital Ethics at the University of Melbourne

4. Monique Mann

This contribution explores the role that “good data” practices have in the collection, aggregation and automated analysis of information by government and private actors.

5:15-5:30

Break: snacks

5:30-6:30

Keynote

“Mathematics is Ordinary: Culture, Computation, and Philological Power”

Ted Striphas - University of Colorado Boulder, USA | @striphas

This talk focuses on the figure of Moḥammed ibn-Mūsā al-Khwārizmī (c. 780-850 C.E./163-235 A.H.), purportedly the inventor of algebra and a key player in the popularization of the Indo-Arabic number system. Significantly, the word “algorithm” is said to derive from his name. Instead of accepting these claims—and thus the standard origin stories for algorithmic culture—at face value, the talk explores their emergence under shifting colonial regimes. It does so by examining the conditions under which al-Khwārizmī’s key writings, and the translations of those writings, were produced. The purpose of the talk is to surface some affinities between culture and mathematics, affinities that were not simply “lost in translation” but were unthinkable within the context of colonial-philological power. The talk excerpts material from Striphas’ upcoming book, Algorithmic Culture, and endeavors to provide deeper historical context for recent incidents of algorithmic discrimination.

6:30-7:00

Performance: NEXUS DESTINY 3.0

Inspired by the ideology of production immanent to everyday machines of work and leisure, this performance will re-enact searches for productivity-enhancing supplements and re-perform the desirous production of custom .wav files.

7:30

Book Launch: Eavesdropping, A Reader

James Parker and Joel Stern

