A.I. Is Changing How You See the World

By systematically distorting the images we see and create, A.I. is shifting perceptions and ultimately changing our relationship with the physical world

Credit: The Washington Post/Getty Images

Earlier this year, I hiked with my family to Claigan Coral Beach on the outer edge of the Isle of Skye in Scotland. We wanted to see the impressive stretch of coral we’d seen in Google searches for the area, but instead of the pristine white beach and turquoise sea from the internet, we found a rather grubby stretch of crushed coral. I felt similarly when I saw the fairy pools, the crystal-clear rock pools fed by small waterfalls. I used to love discovering these kinds of places, but seeing the photos online before we set off had killed any sense of wonder. It made being there feel strangely unreal, like returning to a place from your childhood only to find it smaller and shabbier than you remember.

Our impressions of the world are heavily influenced by the images we see online. Digital photography boomed with the popularization of smartphones in the early 2010s; today, Snapchat users create 3 billion snaps every day, and 3.5 billion Instagram posts are liked every day, many of them filtered and enhanced. Anyone with a smartphone can produce enhanced images that only professional photographers and retouchers could have achieved just a few years ago.

Yet photo manipulation at this scale means that rather than using photography to reflect the world around us, we are systematically distorting our record of the world. Photography increasingly serves to represent an idealized version of our lives, and this comes at a price. “We know that fake images can alter what we believe about the past, what we remember about the past and even our future intentions,” says Dr Kim Wade, a psychologist from the University of Warwick and a specialist in photography and memory. “The act of routinely enhancing images could change our perspective of the past and ultimately, our relationship with reality.”

While algorithms have always automatically enhanced the saturation and contrast of digital images, the introduction of machine learning has been a step change. Training on millions of images teaches camera phones to recognize what kind of scene they are photographing and which image characteristics are considered desirable. The Huawei P30 smartphone can recognize more than 1,500 scenes and automatically apply optimized settings. If the phone detects a portrait, it will smooth skin, blur the background, and apply more flattering lighting.
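
Huawei does not publish the details of this pipeline, but the general pattern it describes, classify the scene and then look up a tuned preset, can be sketched in a few lines. Everything in the sketch below (the labels, the presets, the stub classifier) is an illustrative assumption, not the phone’s actual implementation:

```python
# Rough sketch of the "recognize the scene, then apply a preset" pattern.
# The labels, presets, and classifier are illustrative stand-ins, not
# Huawei's actual models or settings.

from dataclasses import dataclass

@dataclass
class CameraSettings:
    saturation: float      # 1.0 = unchanged
    contrast: float
    skin_smoothing: bool
    background_blur: bool

# Hypothetical lookup table mapping a detected scene to tuned settings.
SCENE_PRESETS = {
    "portrait":  CameraSettings(1.0, 1.0, skin_smoothing=True,  background_blur=True),
    "landscape": CameraSettings(1.3, 1.2, skin_smoothing=False, background_blur=False),
    "food":      CameraSettings(1.4, 1.1, skin_smoothing=False, background_blur=False),
}

def classify_scene(image_pixels) -> str:
    """Stand-in for the on-device neural network that labels the scene.
    A real phone runs a trained classifier here; this stub returns a
    fixed label so the example is self-contained."""
    return "portrait"

def settings_for(image_pixels) -> CameraSettings:
    scene = classify_scene(image_pixels)
    # Fall back to neutral settings if the scene isn't recognized.
    return SCENE_PRESETS.get(scene, CameraSettings(1.0, 1.0, False, False))

print(settings_for(image_pixels=None))
```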

Most current smartphones will also apply high dynamic range (HDR) processing. If the phone detects a high-contrast scene, where shadows would otherwise be crushed or highlights blown out, it will take a burst of images at different exposures and combine these into one perfect shot.
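
Phone makers keep their exact HDR pipelines private, but the core idea of merging a bracket of differently exposed frames can be reproduced with off-the-shelf tools. Here is a minimal sketch using OpenCV’s Mertens exposure fusion; the file names are placeholders:

```python
import cv2
import numpy as np

# Load a bracketed burst: the same scene shot at different exposures.
exposures = [cv2.imread(p) for p in ["under.jpg", "normal.jpg", "over.jpg"]]

# Mertens exposure fusion blends the best-exposed regions of each frame
# without needing the camera's exposure times.
merge = cv2.createMergeMertens()
fused = merge.process(exposures)          # float image, roughly in [0, 1]

# Convert back to 8-bit and save the combined shot.
result = np.clip(fused * 255, 0, 255).astype("uint8")
cv2.imwrite("hdr_fused.jpg", result)
```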

A.I. is also starting to select images for us. The Google Pixel Top Shot function captures 90 images before and after the shutter is pressed, and selects the frame it determines to be the best, according to what it has learned from its training set. That might be the biggest smile, or the best classical composition, though the designers are usually guarded about what kind of training sets and algorithms they use to shape the images we take.
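
Google is guarded about how Top Shot actually ranks frames, so the following is only a toy illustration of the “score every frame, keep the best” idea, using sharpness and detected smiles as crude stand-ins for whatever the trained model looks for:

```python
import cv2

# Hypothetical frame scoring: combine image sharpness with a simple smile
# detector and keep the highest-scoring frame from the burst.
smile_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_smile.xml")

def score_frame(frame) -> float:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    sharpness = cv2.Laplacian(gray, cv2.CV_64F).var()   # blur -> low variance
    smiles = smile_cascade.detectMultiScale(gray, scaleFactor=1.7,
                                            minNeighbors=20)
    return sharpness + 500.0 * len(smiles)              # arbitrary weighting

def pick_top_shot(frames):
    return max(frames, key=score_frame)

# Usage: frames would be the burst captured around the shutter press.
# burst = [cv2.imread(p) for p in ["f01.jpg", "f02.jpg", "f03.jpg"]]
# best = pick_top_shot(burst)
```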

And then there are filters and apps, routinely used to make everything look more polished and impressive. Most phones come with standard filters, such as “vivid,” “dramatic,” and “noir,” and countless apps like Snapchat and FaceApp offer masks that smooth skin, make eyes bigger and chins smaller. A few taps can convincingly age or de-age portraits by decades.
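
A “vivid”-style filter of the kind bundled with most phones is, at its simplest, just a saturation and contrast boost. A rough sketch with the Pillow imaging library (the enhancement factors are made up for illustration):

```python
from PIL import Image, ImageEnhance

# A crude "vivid"-style filter: boost saturation and contrast slightly.
img = Image.open("photo.jpg")                   # placeholder file name
img = ImageEnhance.Color(img).enhance(1.4)      # more saturated colors
img = ImageEnhance.Contrast(img).enhance(1.15)  # punchier contrast
img.save("photo_vivid.jpg")
```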

“Our research has shown that people place too much faith in photos to be reliable representations of the past.”

Moving images are also becoming easier to manipulate. Marvel’s Ant-Man featured a de-aged Michael Douglas, and Netflix’s Orange Is the New Black featured actors playing their younger selves in flashbacks. Sophisticated and believable “deepfakes” of Mark Zuckerberg, Barack Obama, and Vladimir Putin have already been widely shared, while Stanford University has developed software that makes changing what people say in a video as easy as editing text. Even laypeople can use software like FakeApp to swap faces in videos, and — if the last decade is anything to go by — this technology will trickle down to our smartphones before too long.

Yet, however knowledgeable and discerning we think we are, research shows that we still have a strong bias toward believing what we see in images. “Our research has shown that people place too much faith in photos to be reliable representations of the past,” Wade says. “People tend to be overconfident in their ability to detect fake images, and have a bias towards believing that images are true if they’re unsure. Even people who regularly edit their own photos struggle to distinguish between real images and fakes.”

Being constantly exposed to exaggerated images can make real life seem underwhelming and disappointing. A 2017 study by the University of California found that the more time people spend on social media looking at other people’s seemingly perfect lives, the lower their satisfaction with their own lives, including a well-documented negative effect on young women from images depicting an unrealistic beauty ideal. And counterintuitively, a 2016 study by the University of Warwick found that labeling images showing an unrealistic beauty ideal as “manipulated” or “enhanced” made participants even more likely to want to change their appearance than unlabeled images did. Plastic surgeons report a 47% rise in requests for cosmetic procedures from patients who want to look more like their heavily edited and filtered selfies, a condition that has been labeled “Snapchat dysmorphia.”

“The standardization of aesthetics by way of tools [such as A.I. enhancements] is very dangerous,” says Constant Dullaart, a Dutch artist whose work deals with our new digital world. He is concerned that digital manipulation is being normalized, and that this sexist aesthetic is facilitated by for-profit corporations whose business model relies on maximizing engagement.

“Reality is a construct, and tons of people are subscribing to a reality I find oppressive,” says Dullaart. “This automated aesthetic reinforces it. The media is becoming so dominated by these kinds of images. What I’m most scared of is that children feel they need to comply with this aesthetic standard, body standard and behavioral standard.”

Photography has always been subjective, but it has also helped shape our understanding of the world around us. A shared reality is a profoundly important currency that allows us to communicate and create meaning, and if photography becomes less reliable as a medium, we lose an important common denominator.

“No matter the content, manipulated images chip away at our understanding of reality,” says Katy Cook, author of Ethical Threats and Emotional Unintelligence in the Tech Industry and founder and CEO of the Centre for Technology Awareness, a technology ethics consultancy. “Doctored images of politicians or world events not only threaten to confuse us on a systemic level, but are among the most serious threats to our democracy. To tell us that we can’t believe what we see before our eyes is deeply at odds with our evolution, where we rely on sight as one of the primary mechanisms for understanding the world around us.”

“When this basic function is compromised by deepfakes or manipulated images, our ability to agree on basic facts diminishes and our social cohesion suffers as a result,” she continues.

There are some signs of a backlash against this manufactured reality, from strong sales of analog cameras to women sharing photos of their #postpartumbody or telling the world to #effyourbeautystandards. Comedians like Celeste Barber mock “instaperfect” lives, parodying manipulated selfies and celebrity images.

And the popularity of the #nofilter hashtag, used to tag more than 250 million images on Instagram to date, shows a growing awareness that Cook says is the first step in changing our understanding. “Awareness — in this case, of our own vulnerability — from deepfakes to software that nudges us towards changing our appearance to the inching away from what is true and authentic (facts, a spherical earth, the circumference of our thighs). There’s no formula per se, but the desire to change, education, and practice generally form the next steps. But awareness is always first.”

SOURCE: https://onezero.medium.com/a-i-is-changing-how-you-see-the-world-965e347b04ee
