In photography, there are two primary types of biases: the subjects we choose to capture and the images we ultimately decide to share. These two biases, while often intertwined, come from different motivations. The first—the choice of subject—is driven by a sense of ownership, the feeling that this moment, scene, or person is worth preserving. The second bias, in deciding what to share, is motivated by social expectations and the subtle (or not-so-subtle) anxiety about how our images will be judged.

The choice of what to photograph is rarely made in isolation; there’s often an audience in mind. So, even as we decide what to capture, the question of sharing starts to shape our choices. Sometimes, though, the final result surprises us, making us reconsider our initial intentions.

Ultimately, a sense of expectation influences what we share. We post what we think we’re supposed to post, rarely crossing the boundaries of who we’ve presented ourselves to be. This is true not only for individuals but for businesses too, especially in media. Sometimes this conformity is called a “style,” but it’s more than that. It’s a strict framework that confines the images and subjects we share to one narrow, specific aesthetic. While companies may formalize these rules with brand guidelines, individuals internalize them as self-imposed boundaries, creating a form of self-censorship. This subtle, often unconscious filtering shapes what we see online—and ultimately fuels the biases we encounter when we discuss AI bias.

A screenshot of a Google image search for “kitchen”

Let’s take kitchens as an example. The kitchens you find online are always clean, empty, fresh, and modern—often Western European in design. You won’t find images of your kitchen or mine, or those of everyday people in Asia, India, Africa, or South America. We might take photos of our kitchens for practical reasons, but we rarely post them online. Instead, what you see are the sleek kitchens of the celebrity class, straight from Architectural Digest or property listings. Always spotless, newly renovated, with high-end appliances, these kitchens define what we think of as “worthy” of sharing. And when you ask an AI to generate a kitchen image based on “publicly available content,” guess what you get?

This isn’t a matter of protection or advocacy. There’s no push for everyday kitchens to get their due representation, unlike efforts to diversify portrayals of CEOs or professionals. And because of that, this bias, along with many other “invisible” biases, will likely go unchallenged, making it harder to ever create a truly human-equivalent AI. AI will continue to mirror the narrow, polished view of the world that our curated choices project, missing out on the richness of real, everyday life.

Opening image: Photo by Scott Umstattd on Unsplash

~


Author: Paul Melcher

Paul Melcher is a highly influential and visionary leader in visual tech, with 20+ years of experience in licensing, tech innovation, and entrepreneurship. He is the Managing Director of MelcherSystem and has held executive roles at Corbis, Gamma Press, Stipple, and more. Melcher received a Digital Media Licensing Association Award and has been named among the “100 most influential individuals in American photography.”
