In April 2022, a video clip circulated showing former First Lady and Secretary of State Hillary Rodham Clinton endorsing Florida Governor Ron DeSantis for the presidency. The idea of the leftist Democrat promoting the candidacy of a conservative Republican raised more than a few eyebrows. However, given the public profiles of Mrs. Clinton and Mr. DeSantis, the recording was rapidly proven false.
That clip was a “deepfake.”
My exposure to deepfakes had been superficial; I relegated them to the worlds of Hollywood and politics, like the clip cited above. Since I don’t spend much time in either world, I never paid much attention.
Moving Into the Schools
Then, I got my copy of Education Week, a paper that primarily carries news of interest to school administrators. An article about deepfakes caught my eye; it referred to a study by the Center for Democracy and Technology.
“Forty percent of students and 29 percent of teachers say they know of a ‘deepfake’ depicting individuals associated with their school being shared during the 2023-24 school year, according to the nationally representative survey of high school students, middle and high school teachers, and parents.
“The primary perpetrators, and victims, are students.”
No One Gets a “Time Out”
As I read, two thoughts raced to the fore.
First, American schools (public, private, and charter) have a rough total of fifty-eight million students and four million teachers. So, those forty and twenty-nine percent figures translate to a massive number; if they held nationwide, forty percent of fifty-eight million would be more than twenty-three million students. Moreover, these are not celebrities; they are “normal” people. They could be my children and grandchildren or those of our readers. A topic that formerly had no relationship to everyday life suddenly became real.
The second thought had little to do with the classroom. Early on, deepfakes were the province of professionals with expertise in computers and motion pictures. If a seventh-grade student can create a convincing deepfake, then the available software has made this a game that any keyboard puncher can play.
Thus, I decided to learn a little more about deepfakes.
An Unfamiliar New World
So, let’s start with the most basic question. What is a deepfake?
It is the use of Artificial Intelligence (AI) to create a photo, audio recording or video clip of an event that never happened. Business Insider devised an especially memorable five-word description: “fake content of real people.”
AI is very different from commonly used computer programs. Older technologies could manipulate images. Skilled operators could soften or enhance facial wrinkles, remove distracting items in a photo’s background, and brighten or dim colors. These elements might make a picture appear more or less attractive, but these were real photos of actual events. The same thing is true of most audio or video editing software.
A deepfake is very different. It can place people in locations they never inhabited, doing things they never did, saying things they never said, in the company of people they never met. The possibilities are endless, as are the deepfakes’ illegitimate purposes.
A Fast-Growing Dilemma
The problem is increasing at a geometric rate. An international business consultancy firm, Spiralytics, compiled some amazing statistics.
- “The number of online deepfake videos roughly doubles every six months.”
- “Social media users shared around 500,000 voice and video deepfakes in 2023.”
- “The entire process of creating a deepfaked photo or video can be as quick as 8 minutes.”
- “‘Face swaps’ are among the most common deepfake categories, with a 704% increase from the first to the latter half of 2023.”
- “71% of people aren’t aware of what deepfakes are; only one-third know of them.”
Moreover, the law on this topic is a messy patchwork of measures. The computer security firm Norton summarized the current state of things.
“Currently, there is no federal law governing deepfakes, though the creation of deepfakes is illegal in some US states, while distributing them is illegal in other states. Minnesota outlawed creating and distributing deepfakes that could be used to influence an election or as a form of nonconsensual pornography. Other states are considering similar laws, and the UK has included language around how deepfakes can and can’t be created in its forthcoming Online Safety Bill.” (Emphasis in the original).
Recognizing Deepfakes
That leaves us with two topics to round out this brief overview: how to recognize deepfakes and how to prevent you or your loved ones from becoming a victim of this growing threat.
The first rule is to consider the source. If, for instance, my Aunt Ida sends me a photo of a family event that I missed, I can place almost infinite trust in the picture’s accuracy. First, I have no doubts about her personal honesty. Second, even if she weren’t honest, she would have no reason to lie about a family reunion. Third, I know that computer graphics are not among her skills. On the other hand, if I get something over the Internet from someone using an obvious pseudonym, it is time to either disregard it or examine it more closely.
Most deepfakes, being artificial, share certain flaws. The raw material may be only a small photograph or a brief audio clip, which means that specific details often lack clarity.
Look closely at hair and eyes. Here, the old saying that “the eye is the window of the soul” holds true. In deepfakes, the movement of the eyes, which is almost constant in real life, is slow or even non-existent. Real hair moves with other bodily movements or even a quick rush of wind. Hair that appears “pasted down” is a common sign of a deepfake.
Clues You Can Use
The same idea applies to audio recordings. A deepfake of my voice could be constructed even if the person producing it has only a few seconds of me speaking. However, we all have vocal mannerisms: words we habitually mispronounce or usages specific to the places where we have lived.
For instance, President George W. Bush always pronounced the word nuclear as nu-cul-ler. Unless the deepfaker’s original audio clip contained that particular word, however, the resulting recording would have a more standard pronunciation, even though the vocal tone would mimic that of the former president.
Another frequent flaw is lighting. Everything in a genuine image shares the same sources of light; in a deepfake, the elements often do not. If the deepfaker uses an image taken outdoors at midday, transposing it to an event happening indoors at night makes it look out of place.
Many amateur deepfake videos contain sound that is out of sync with the action. A similar issue is the coordination between lip movement and spoken words. Again, these details are difficult to get right.
You Are On Your Own
Perhaps the most applicable rule of thumb is “if it looks fake, it probably is fake.”
Let’s conclude with a couple of ideas about ways to protect yourself, or those you love, from being a deepfake victim.
First, don’t plaster the Internet with pictures or videos of yourself. The more information the deepfakers have, the better their products will look. If you tell the world about your travels, share photos of yourself, and show off the vehicles you drive and even the food you eat, you make it that much easier to become a victim. Leave your “selfie stick” at home.
Of course, one can be a victim of a deepfake without actually being in it. Learn from those who “shared” an image on Facebook that later turned out to be unreal. A bit of innate skepticism goes a long way. Turn the doubt dial to “Max” when the possible deepfake conveys a message with which you agree. One aspect of our fallen human nature is our tendency to believe lies we wish were true.
One last bit of advice: never make a major decision based solely on one piece of evidence, no matter how convincing that piece may be. Holy Scripture (John 8:17 and 1 Timothy 5:19) admonishes us to doubt any accusation that is not substantiated by at least two witnesses. That is a very good test in the world of deepfakes as well.
Photo Credit: © Feng Yu- stock.adobe.com