The saying goes that “a picture tells a thousand stories”. But nothing is ever that simple.
At the basic level, a camera simply records whatever is in its field of vision. But the choices made by people, such as who is shown and from what angle, very often determine whether or not the story being told with those images is accurate and honest.
However, even the choices made by humans after the fact don’t offer the full picture of how photographs can be prejudiced.
Sarah Lewis, an assistant professor at Harvard University whose work “looks at how the right to be recognized justly in a democracy has been tied to the impact of images and representation in the public realm”, says that racial and cultural biases can go beyond the selections made by editors. They are also embedded in the technology itself.
Photography is not just a system of calibrating light but a technology of subjective decisions. Light skin became the chemical baseline for film technology, serving the needs of its dominant target market. Developing color film, for example, initially relied on what was called a Shirley card. When you sent off your film to be developed, lab technicians would use the image of a white woman with brown hair named Shirley as the measuring stick against which they calibrated the colors.
Quality control meant ensuring that Shirley’s face looked good, and that baseline carried over into the color balancing of digital technology. In the mid-1990s, Kodak created a multiracial Shirley card featuring three women, one black, one white, and one Asian, and later added a Latina model, to help camera operators calibrate skin tones. But the new cards were never universally adopted, in part because they coincided with the rise of digital photography, so film emulsion technology continued to carry the social bias of earlier photographic conventions.
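To make the Shirley-card idea concrete, here is a minimal sketch, with entirely hypothetical numbers, of what single-reference color calibration looks like: per-channel gains are chosen so that one reference patch renders “correctly”, and every other tone in the scene simply inherits that correction.

```python
def calibration_gains(measured, target):
    """Per-channel (R, G, B) gains that map the measured reference
    patch to its intended target color."""
    return tuple(t / m for t, m in zip(target, measured))

def apply_gains(pixel, gains):
    """Apply the same gains to any pixel, clipping to the 8-bit range."""
    return tuple(min(255, round(c * g)) for c, g in zip(pixel, gains))

# Hypothetical values: a light-skin reference patch as captured vs. intended.
measured_reference = (228, 180, 160)
target_reference = (232, 190, 172)
gains = calibration_gains(measured_reference, target_reference)

# The reference patch now lands exactly on target...
print(apply_gains(measured_reference, gains))  # → (232, 190, 172)

# ...but a darker tone receives the same multiplicative correction,
# which was never tuned against it.
print(apply_gains((90, 60, 48), gains))  # → (92, 63, 52)
```

The point of the sketch is structural, not numerical: whatever tone serves as the calibration baseline defines what “looks right”, and everything else is corrected relative to it.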
Modern digital cameras, which are essentially light-gathering computers, have made photography more flexible and adaptable to different situations. But those computers are programmed by people, and digital photography still carries some of the “algorithmic bias” inherited from film.
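One way to see what “programmed by people” means in practice is a toy averaging light meter. This is a minimal sketch with hypothetical values, not any real camera’s algorithm: the meter scales exposure so the scene’s average luminance hits a fixed middle-gray target, and a bright background can drag a darker face into underexposure.

```python
# A common 8-bit stand-in for an 18% "middle gray" reflectance target.
MIDDLE_GRAY = 118

def auto_exposure_gain(luminances):
    """Gain that brings the average scene luminance to the meter's target."""
    mean = sum(luminances) / len(luminances)
    return MIDDLE_GRAY / mean

def expose(luminances, gain):
    """Scale every pixel by the chosen gain, clipping to the 8-bit range."""
    return [min(255, round(v * gain)) for v in luminances]

# Hypothetical scene: a bright background dominates the average, so the
# single gain the meter picks leaves the darker face dim and short on detail.
scene = [230, 230, 230, 230, 60, 55, 50]  # background pixels + a darker face
gain = auto_exposure_gain(scene)
print(expose(scene, gain))  # → [175, 175, 175, 175, 46, 42, 38]
```

A baseline like this is a human choice baked into the firmware, which is exactly the sense in which bias can live in the technology itself rather than only in the photographer.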
Digital photography has brought some advances. There are now dual skin-tone color-balancing capabilities, as well as image stabilization, which eliminates the natural shaking that occurs when we hold the camera by hand and reduces the need for a flash. Yet each solution creates other problems. Under artificial light, digital technology still struggles with darker skin. It is a merry-go-round of problems leading to solutions leading to problems.
You see it whenever dark skin is invisible to facial recognition software. The same technology that misrecognizes individuals is also used in services that inform loan decisions and job-candidate screening.
Lewis says some high-profile people in the film and television industry are working on the problem. Hopefully their solutions will soon be incorporated into the cameras most people use every day.
She closes her essay with this fitting thought:
“Race changed sight in America. This is what my grandfather knew. This is what we experience. There is no need for our photographic technology to foster it.”
Digital photography is just one, possibly less obvious, example of why we need a basic awareness of the algorithms running our digital devices, and especially an understanding that they are written by humans: people who weave their biases into the code, sometimes without realizing it, and sometimes on purpose.
The photo at the top shows a friend using her smartphone to capture the group. The selfie is just one capability that was far more difficult in the previous century, when we shot film.