October 2025
Perceivings
Alan Dean Foster

As Reality Fades into Myth

"Pictures don't lie."


Well, that’s another old-timey saying modern technology has shot down. Thanks to Artificial Intelligence, nowadays pictures can lie all the time, and the software that allows images to lie is getting better even as you read this.

For starters, spare some sympathy for those charged with judging what remains of photography contests. AI is steroids for imagery. That gorgeous picture of a sunset? Enhanced with a simple at-home app. Experienced judges can usually catch such alterations. What is harder to espy are the small changes. A drop of water added to the tip of a leaf. Someone’s eyes closed in a street scene when in actuality they were open. A blonde streak added to an actress’s hair. We used to look for such subtle changes in still pictures. Now anyone can do them in video.

The tech isn’t perfect. Not yet. But the apps and software necessary to fake reality are avalanching ahead. If you’re interested, one trick for identifying AI video is to look not at the main character or subject, but at the backgrounds. At small details. If there is any kind of writing on an advertisement, a street sign, in a store window, it’s often gibberish that’s not even recognizable as writing. If a scene is taking place in a city, look at the architecture. Does it fit the date, time, and locale? Is the weather consistent throughout the scene? If reflective surfaces are present (water, store windows, etc.), are reflections shown? Correct shadows? Does every living creature in the video have the correct number of limbs or digits (an ongoing issue with video AI)?

So, there remain subtle hints that something is Off and not entirely natural. The more elements present in the video (a street scene with many people, say, versus a portrait, or a herd of animals), the more likely it is that there will be mistakes a viewer can pick out.

But not for long.

It’s one thing for a commercial, public release like Veo3 (which may be obsolete by the time this column appears) to simulate reality with increasing verisimilitude. It is another to have a peek into what the real, unreleased advances are achieving. More than a year ago I was given a look at some of what a cutting-edge company working in the commercial field could do. Most mind-boggling was a longish video that began with a singer having touch-up applied by a makeup artist. The singer then rose from his chair, walked toward and past the “camera” and other people, into a studio where he sat down and waited to proceed. Everything about the singer and, critically, his immediate environment was quite real. I could not spot any giveaways.

The singer was Prince, who of course is deceased. The short video blew me away. David Bowie appeared in another, shorter video.

This is where the line “pictures don’t lie” dies a permanent death. It is also where I get worried. If someone decides to resurrect Marilyn Monroe or John Wayne for a commercial, or for an appearance in the background of a film, I can live with and appreciate that. What troubles me is when someone decides to make a video of Trump greeting Putin and instead of shaking hands, Putin slips something into Trump’s pocket. Or vice versa.

How is one to believe anything anymore, especially in the absence of first-person correlation? For that matter, first-person observation can now be effectively faked as well. Did country X really bomb a hospital in country Y? Here’s the video not only of the bombed-out hospital but of commentary from first-person witnesses. Except that while the witnesses may be real enough, the movements of their lips have been oh so slightly manipulated to match the invented testimony they are giving.

We are on a slippery slope, reality-wise. What happens when testimony excluded from a trial appears online? Except that the testimony as well as the testifier are entirely conjured by AI? We see videos of police chasing suspects all the time, but what if the entire sequence is created in something like Veo3?

Our current president is busy changing the name of the Department of Defense back to the Department of War when what we really need is a Department of Reality. A place where people can have video verified and be reassured that no, a bomb did not go off in the Supreme Court this morning (an easy enough video to make, but also thankfully one for which veracity is simple to check).

But what about one showing a school shooting in Iowa? How do you ensure that people get the truth? And if it is the truth, that it’s not modified or enhanced? I’m waiting for someone to redo Orson Welles’ 1938 War of the Worlds broadcast, only in something like Veo3. If radio could panic people back then, imagine what well-done AI video could do now.

I do hereby attest that this column and these thoughts were concocted entirely by a living human author, i.e., me. So nothing to worry about there.

Or is there?

Prescott resident Alan Dean Foster is the author of 130 books. Follow him at AlanDeanFoster.com.