The Camera Does Lie: The Danger Of Deepfakes

Lord, may Americans be a people committed to truth. Please open our eyes to see truth and see through deepfakes and other deception.

How many times have people told you that they will only believe something if they can see it? No one wants to be “had,” and wariness of being played for a fool can be a tough wall to surmount in a relationship. But what if you could not convince someone, or even yourself, of something you had seen with your own eyes?

Our attention can be distracted, and we can be made to think we see things we don’t actually see, at least in the heat of the moment. “Magic” shows built on illusion, for example, have tricked observers, often just for fun, for generations. Yet there has long been a sentiment that “the camera doesn’t lie”: if you looked closely enough at still pictures or film of an event, what really happened could be discerned.

But that’s not necessarily the case anymore. Technology, specifically artificial intelligence, has given rise to a sophisticated new digital tool: the “deepfake.” A blend of “deep learning” (an A.I. term) and “fake,” deepfakes are fabricated video footage of people doing or saying things that did not actually happen.

Some light-hearted uses of deepfakes have appeared in ads, like one from State Farm in which an ESPN analyst from the 1990s seems to make shockingly accurate predictions about 2020, and in movie “cameos,” like that of Princess Leia in Rogue One: A Star Wars Story. Those with a “glass half-full” outlook may well see much promise for good in this technology. Indeed, marketers are already eyeing its potentially powerful capabilities for connecting more effectively with individuals around the world.

But deepfakes have also begun to be used more provocatively. In the public sphere, for example, a PSA warning viewers not to trust everything they see on the internet was delivered by a deepfaked President Barack Obama.

The risks seem dangerously high. After all, Americans’ trust in government, media, and other institutions is already low, and the COVID pandemic and other events of the past year seem to have only shaken that trust further. We are already skeptical about what politicians are or are not actually saying to us, and what journalists are or are not actually reporting. What if we could not even trust what our eyes see or our ears hear to evaluate the messages presented to us?

Congress is stepping into these waters. Just this week the Senate Homeland Security Committee advanced bipartisan legislation that would direct the Department of Homeland Security to form a task force charged with reducing the spread of deepfakes through authentication and warning tools developed by content creators and industry leaders. One of the bill’s authors, Sen. Rob Portman (R-Ohio), declared, “Deepfakes represent a unique threat to our national security and our democracy.”

That’s a strong warning, and a telling one, because the dangers of deepfakes are certainly not limited to political spats or personal slander. How might markets react to a particularly convincing spoof? How might a world leader with weapons of mass destruction respond if deceived by a nefarious third party about an impending attack? The mere presence of deepfakes can cause us to question everything we see and lead to paralysis, or worse.

Constitutionally robust government and industry remedies can take time, but this matter is certainly worth praying about now. The technology can feel intimidating and overwhelming, but at the end of the day it is a tool in the hands of people. We need the hearts of the people of this nation to be committed to light and truth, and so we need the help of the Truth and the Light of the World.

(Aaron Mercer is a Contributing Writer with two decades of experience in Washington, D.C.’s public policy arena. Photo Credit: Possessed Photography/Unsplash).

Reprinted by permission. Ifapray.org
