Justin McBrayer

Google Pixel 8: Cameras Now Shoot Fiction

If you've watched TV at all in the last couple of months, you've seen the ads for the newest Google smartphone, the Pixel 8. Most of the ads focus on new AI features that edit pictures (Magic Editor) and videos (Audio Magic Eraser). Here's how Google describes the editing features in its online store.

Check out the fine print: the new editing tool allows you to alter photos not so that they better match reality but so that they better match your vision. Cameras are increasingly becoming tools for creating fiction rather than tools for documenting reality.


The website provides an innocuous example of the Magic Editor in use. A woman poses on the beach, but she is off-center, the color is washed out, and there is a lifeguard and a lifeguard stand in the background. It's a perfect example of reality intruding on our "vision."

Never fear. The Magic Editor can get this all cleaned up. We can delete the aspects of reality we don't like (goodbye, lifeguard!), enhance our perspective on the world (like putting ourselves at the center), and add fictional elements that would make our world seem better (like richer color). The net result is a fictional image rather than an accurate one.

The TV commercials for the phone give even more dramatic examples of what this kind of AI editing looks like. In one, a father is out on the beach with his young child, tossing him in the air and catching him. He doesn't throw him very high. In fact, he throws him far lower than the photographer would like. And so the photographer uses the Magic Editor to cut the child out and paste him several times higher in the air. When the photo is done, the dad looks like some sort of super-dad, hurling his child into the sky.


In another, a mother is taking pictures of the Macy's Thanksgiving Day Parade. A group of professional dancers is doing a kick line in the middle of the street while a little girl watches from the sidelines. The subtext is that the little girl is the woman's daughter and that she desperately wishes she could be in the parade, too, right alongside the professionals. Easy work: the mother takes a picture of the scene, highlights her daughter, and drags her image into the street next to the dancers. The edited image makes it look like the daughter was part of the parade all along. You'd think it was pretty impressive if you saw the image on your Instagram feed.


All this goes for video editing, too. Audio Magic Eraser lets you strip out sounds that don't align with "your vision." Did you take a video during the game while the announcer was giving the score in the background? No problem: Audio Magic Eraser lets you select just the audio from the announcer and delete it from the final video file. You can get videos of landscapes without the sound of traffic, beaches without the sound of waves, and birthday parties without the sound of singing.

The advent of this level of editing technology, literally at your fingertips, is guaranteed to make the fake news problem even worse.


Yes, of course, editing photos and videos is a matter of degree. But the new AI editing tools go far beyond the minor tweaks that happen with filters and the like. Harmlessly adjusting the warmth of a color is a minor thing. But adding people to the frame or erasing them from it is a major thing. This year's Wedding Photo of the Year featured a bird that landed on a bride's head during the vows. That would be a lot less impressive if it had been produced with the Magic Editor.

It's also true that we've had the ability to manipulate photos and videos for a long time. Photoshop was developed in 1987, and its power has grown every year since. But the Pixel 8 is not just another small step down the editing continuum. It allows amateurs to alter media in radical ways in mere seconds. Up until now, those kinds of alterations were possible but required time, expertise, and special programs. Well-funded political campaigns could afford to hire experts to produce fakes, but your neighbor down the street couldn't. Now the playing field has been leveled.


The direct result is that everyone now has AI editing at their fingertips, no training required, AND strong incentives to create and post media that captures "our vision" of the world rather than the world as it really is.


The indirect results will be many and damning. First, we can expect that the mental health problems associated with social media use will increase. It was hard enough to keep up with the Joneses when they were posting real pictures (albeit ones that portrayed only a selected angle of their overall lives). It will be impossible to keep up with the Joneses when they start posting fictional pictures. In particular, Magic Eraser and other AI editing tools will make it easier than ever for teens to create unrealistic pictures of themselves and their social situations.

Second, we should anticipate an increase in fake news. It will be easier than ever to create altered photos and videos that stoke fears, incense voters, and otherwise manipulate people through propaganda. And note that these photos need not be negative--they might be images of hope or promise. For example, in 1993, a photographer released a photo of an Israeli boy and an Arab boy standing side by side and overlooking Jerusalem. The photo was staged, but it had the intended effect of making people feel hopeful about resolving the Israeli-Palestinian conflict. (See here for the image and an excellent story on art as propaganda.) AI editing will make photos like this commonplace in our media feeds.


Third, there will almost certainly be a decrease in social trust. I did my graduate work in Missouri, the "Show Me" state. Missourians are famously skeptical and reluctant to take your word for something. Instead, they want you to show them. But with the advent of AI editing, even showing them won't be good enough anymore. Our default is to trust our senses. When someone posts a picture, we are inclined to believe that it's showing us the world as it is. When someone posts a video, we naturally believe that it's an accurate rendition of what happened. A picture used to be good enough, even in Missouri.


But given the wide availability of convincing AI editing, it's less and less likely that a picture or video is an accurate facsimile of reality. And that means that we ought to trust images and videos less and less as time goes on. This decline in media trust is likely to be ubiquitous. Those on the left won't trust images from Republican politicians, videos of Hamas from the Israel Defense Forces, or media posts from their conservative neighbors. Those on the right won't believe videos of police brutality, images from Democratic super PACs, or media posts from their liberal friends.


Fourth, we should expect decreased accountability for those in power. In 2016, the world learned about the Access Hollywood tape, an audio recording of Donald Trump talking about how he used his position of power to abuse women. Though it didn't turn the election, the release of that tape had an effect on American voters. Even Trump supporters granted that the audio file was accurate and instead tried to explain away Mr. Trump's commentary as "locker room banter."

But with AI software, we can create deepfakes of scenes that never happened and edit images and videos in ways that make them look very different from reality. Nowadays, it's hard to imagine a photo or video that would cause a politician much panic. If the opposition releases an unflattering photo or a problematic video, just deny that it ever happened. After all, any kid with a Pixel 8 could have doctored the photo to make it look like that happened.


