Eco Media Papers 

21 November 2022

IMAGE: Sam Leach, Polar Bear Stack, 2022, oil on linen, 50cm x 50cm
AEGIS is connected to eco-media at RMIT University. In 2022, eco_media focuses on media technologies and extraction, asking in particular how media technologies and industries are extracting and exploiting natural resources and the environment.

In 2022, papers were presented by AEGIS members Dr Sam Leach, Dr Pia Johnson, Dr Clare McCracken, and Dr Rebecca Najdowski.

Embracing the low resolution image
Pia Johnson and Clare McCracken 

As we ready ourselves for digital imaging futures, we must stop to consider their carbon footprint. High-fidelity image streaming services and cloud data storage, through to cryptocurrency and banking, collectively create 3.7 per cent of global greenhouse emissions (2019).[1] This paper critically reflects on a series of works created by photographer Pia Johnson and mixed-media artist Clare McCracken during the pandemic, with a particular focus on works that utilise lower-resolution technology. In doing so, the paper asks: would it be so bad if we lost resolution, making our files smaller and less environmentally impactful?

Encompassed inside the low-resolution image is the glitch, the disjointed slow connection, the noise of pixels and the blurred frame. It is within this loss of information, we argue, that the capacity for human intimacy emerges. Zooming into domestic settings of bedrooms, kitchens and the debris of life in lockdown, screensharing enables us to delve beyond our boundaries, inserting each other into once-private sanctuaries. The glitch, as a rupture through an image, a piece of missing information, provides opportunities for imagining a world beyond the image. Drawing on Winnicott’s object relations theory, Susan Best describes art as positioned on the cusp of the inner subjective world and the outer objective world.[2] It is in this liminal state that the low-resolution image affects both the making and the audience by bridging the gaps. Here we fill in the glitch or the blurred image with our own experiences, enabling a deeper connection between subject and object. We will argue that the resulting images not only require less travel (as we can work remotely across great distances), but also use less energy to process and less data to store, and are instantly accessible as low-res JPG or PNG files.

Through a series of artistic case studies, this presentation offers new ways to create and connect through the low-resolution image while lowering carbon emissions.


[1] Geneva Environment Network. 2021. “Data, Digital Technology, and the Environment”, 25 November.

[2] Best, Susan. 2007. “Rethinking Visual Pleasure, Aesthetics and Affect”. Theory & Psychology, Special Section, vol. 17 (4): 505-514. DOI: 10.1177/0959354307079295

What it is like to be a Polar Bear: how can AI be used in human/animal relations during the current extinction event?
Sam Leach

This presentation discusses an installation work exploring this question that I presented at Sullivan Strumpf gallery, Sydney, 2022. Viewers were invited to test themselves using AI to determine how closely they resembled a polar bear. A companion work used AI to give viewers a chance to see the world from a polar bear’s perspective by classifying objects in the field of view into categories that might be meaningful to a polar bear: food or mate.

AI is notorious for its tendency to entrench the prejudices and biases present in its datasets, as highlighted by Gebru et al. (2018), who point out that the biggest datasets are combed from the activity of billions of daily internet users, so whatever mistakes those users make are picked up and repeated. Because the way these systems make decisions is opaque to the end user, they tend to be viewed as authoritative. My project intentionally confused and mis-trained commonly available AI models into classifying everything as, to some degree, a polar bear.

Training a large AI model such as GPT-3 can release as much as three hundred tons of carbon, roughly the footprint of a rocket launch. In addition, the growing range of applications for AI means that processors are being installed in more and more devices, all of which consume resources, especially rare minerals, some of which are linked to incredibly destructive mining and exploitative labour practices. Recent developments in AI have turned to animal studies for alternative approaches to understanding cognition, learning and embodiment (Crosby et al. 2019).

There is an irony in animals being used as a “stepping stone” for the advancement of AI even as the systems that support and promote AI’s development exert pressure on non-human animals. I argue that AI development must consider non-human animal viewpoints and interests rather than continuing to exploit and diminish the non-human biological world.

Screening: Deep Learning the Climate Emergency
Rebecca Najdowski

Deep Learning the Climate Emergency uses photography and machine learning to rework the aesthetic of ecological collapse. The moving image artwork was made with generative adversarial networks (GANs) trained on image-based datasets depicting the effects of a heating planet: wildfire, bleached coral, drought, melting glaciers, and suns through smoke-filled skies. Refracted through the prism of machine learning, the video is a rendering of the shapes and textures of the climate emergency.

Acknowledgement of Country

AEGIS acknowledges the people of the Woi wurrung and Boon wurrung language groups of the Kulin Nation on whose unceded lands we work. We respectfully acknowledge their Elders, past and present. We also acknowledge the Traditional Custodians of the lands and waters across Australia and its Dreaming.