Night Mode, open Unity game world with web cameras, 2022. (test image)

Night Mode, open Unity game world with web cameras, 2022.

Night Mode, open Unity game world with web cameras, installed at the Rubenstein Arts Center, 2022. Documentation by Bree von Bradsky.

An automated camera floats through Night Mode in a celestial open game world, where a wooded field provides some ground. Cameras throughout the exhibition space feed images into the world, placing the installation space and its viewers inside. Before being rendered in the world, these livestreamed images are analyzed for their brightness and color values, compared to other images in the space, and composited over or subtracted from each other. The title, Night Mode, takes its name from the iPhone camera software that aggregates several images at differing exposures over time under lowlight conditions. Computational photography practices such as this once again call into question the representation of time and space in the photographic image. Moreover, the idealized lighting parameters for how a lowlight image “should look” form a new kind of exposure—one that, quite simply, is afraid of the dark.
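
The installation itself runs in Unity, but the analyze-compare-composite step can be sketched in Python. This is a minimal sketch assuming each feed arrives as a float RGB array scaled to 0 to 1; the function names, the brightness gap, and the specific over/subtract rules are my own illustrative choices, not the project's actual code.

```python
import numpy as np

# Rec. 709 luma weights, a standard proxy for perceived brightness.
REC709 = np.array([0.2126, 0.7152, 0.0722])

def brightness(frame: np.ndarray) -> float:
    """Mean luminance of an RGB frame with float values in 0..1."""
    return float(np.mean(frame @ REC709))

def composite(a: np.ndarray, b: np.ndarray, gap: float = 0.25) -> np.ndarray:
    """Compare two camera feeds by brightness, then layer or subtract.

    `gap` is an illustrative threshold, not a value from the installation.
    """
    if abs(brightness(a) - brightness(b)) > gap:
        # Feeds differ strongly: let the brighter pixels win (composite over).
        return np.maximum(a, b)
    # Feeds are similar: subtract one from the other, clamped to valid range.
    return np.clip(a - b, 0.0, 1.0)
```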

I made Night Mode for the Latency CMAC PhD exhibition at the Rubenstein Arts Center, February 24 – March 6, 2022. Night Mode is the beginning of a longer project exploring computational photography using open game worlds.

Night Mode, image taken from web camera, 2022.

Night Mode, image taken from web camera, 2022.

Night Mode, image taken from web camera, 2022.

Night Mode, image taken from web camera, 2022.

Night Mode, image taken from web camera, 2022.

Night Mode, image taken from web camera, 2022.

Night Mode installed

Night Mode, open Unity game world with web cameras, installed at the Rubenstein Arts Center, 2022. Photo by Bree von Bradsky.

Night Mode, open Unity game world with web cameras, installed at the Rubenstein Arts Center, 2022.

Night Mode, open Unity game world with web cameras, installed at the Rubenstein Arts Center, 2022. Photo by Bree von Bradsky.

Process

iPhone 11 night mode image at the Eno River, Durham, NC, 2021.

I began this work after taking a photo with my iPhone 11 at the Eno River one night. I wanted to test the camera under severe lowlight conditions. I was surprised by the photo’s grainy texture, almost as if it were film, and by what I think of as color leakages—where unexplained blotches of RGB color emerge.

iPhone night mode image zoomed in (displayed here on my desktop), 2021.

Artist Talk Notes | February 24th, 2022

Apple came out with “night mode” for iPhones in 2019; the feature uses new software and hardware capabilities to take brighter photos in lowlight conditions.

The iPhone 11 was the first iPhone to feature night mode, and with its wide-angle lens and accompanying larger sensor, the camera could “take in” more light.

When lowlight conditions are detected, machine learning algorithms analyze the light available and determine the number of images that should be taken over a period of time and aggregated into a single photo. 
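
Apple has not published this algorithm, so the following is only a toy stand-in for the aggregation step: `frames_for_scene` and its one-to-nine frame range are hypothetical, and the burst frames are assumed to be already aligned float RGB arrays.

```python
import numpy as np

def frames_for_scene(mean_luma: float, max_frames: int = 9) -> int:
    # Darker scene -> longer burst. A crude stand-in for the undisclosed
    # heuristics that pick night mode's capture duration.
    return max(1, round(max_frames * (1.0 - mean_luma)))

def aggregate_burst(frames: list[np.ndarray]) -> np.ndarray:
    # Averaging aligned lowlight frames brightens the scene and suppresses
    # per-frame sensor noise; real pipelines also align frames, reject
    # motion blur, and tone-map the result.
    return np.mean(np.stack(frames), axis=0)
```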

This method is similar to high dynamic range (HDR) techniques developed to photograph scenes with multiple lighting conditions, such as landscapes, by aggregating several images taken at various exposures. Gustave Le Gray used a version of this approach as early as the 1850s, compositing his seascapes from separate exposures because the range of luminosity between sky and sea was too wide to capture in one.
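
As a rough illustration of the principle (not Le Gray’s darkroom process, and not Apple’s pipeline), a toy exposure fusion in the spirit of Mertens et al. weights each pixel by how well exposed it is, so sky detail can come from a darker exposure and sea detail from a brighter one:

```python
import numpy as np

def fuse_exposures(exposures: list[np.ndarray], sigma: float = 0.2) -> np.ndarray:
    # Weight each pixel by its closeness to mid-gray (0.5): well-exposed
    # pixels dominate, while blown-out or crushed ones fade away.
    weights = [
        np.exp(-((img.mean(axis=-1) - 0.5) ** 2) / (2 * sigma**2))[..., None]
        for img in exposures
    ]
    total = np.sum(weights, axis=0) + 1e-8  # avoid division by zero
    return np.sum([w * img for w, img in zip(weights, exposures)], axis=0) / total
```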

I find these techniques a way of forcing two worlds into the same image, and I’m interested in exploring the aesthetics of not tying the separate light worlds together. In some images in Night Mode, for instance, pixels that surpass a light or dark threshold are rendered transparent.
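
A minimal sketch of that threshold effect, assuming a float RGB image in 0 to 1; the cutoff values here are illustrative, not the ones used in the installation:

```python
import numpy as np

def threshold_to_alpha(rgb: np.ndarray, dark: float = 0.1, light: float = 0.9) -> np.ndarray:
    # Pixels past either threshold turn fully transparent, leaving a hole
    # where the two light worlds refuse to join.
    luma = rgb @ np.array([0.2126, 0.7152, 0.0722])
    alpha = np.where((luma < dark) | (luma > light), 0.0, 1.0)
    return np.dstack([rgb, alpha])  # RGBA output
```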

Not only do night mode images composite different light worlds, they also composite different times into one image, once again calling into question the representation of time and space in the photographic image, which has never been of a single moment. I’m recalling Louis Daguerre’s 19th-century long exposure of a busy Paris street, one of the first photographs, which rendered only a single still man having his shoes shined. Computational photography today, however, leans toward compositing an abundance of information gathered over time into the photograph as present, instead of rendering that information as absent, like the crowded sidewalks and streets in Daguerre’s photograph.

In this effort to force information, Apple’s night mode sometimes leaks color blotches into the image, spots that appear too red, blue, or green. This is an effect I’ve exaggerated here.
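
One simple way to approximate that exaggeration (my own sketch, not the exact process used for these images) is to amplify each pixel’s deviation from its own gray average:

```python
import numpy as np

def exaggerate_color_leaks(rgb: np.ndarray, gain: float = 2.0) -> np.ndarray:
    # Push each channel away from the pixel's gray average, so faint
    # red, green, or blue casts bloom into visible blotches.
    gray = rgb.mean(axis=-1, keepdims=True)
    return np.clip(gray + gain * (rgb - gray), 0.0, 1.0)
```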

On a personal note, I grew up outside Chicago, not far from the hundreds of miles of soy and cornfields that make up the Midwestern landscape. If Le Gray’s project was to composite sky and sea, we in the Midwest must composite sky and GMO corn. Living in North Carolina, where the trees often block a clear view of the horizon, has made me wonder how trees are analogous to cuts in sight and in Enlightenment ways of knowing. Considering that Western art history’s linear perspective developed in deforested environments, I wonder how to recuperate forested landscapes as a way of seeing and knowing, perhaps as a confidently uncertain form of machine vision.