Texas is the first state to criminalize altered video clips known as “deepfakes,” or footage manipulated using a form of artificial intelligence, according to the legal newspaper Texas Lawyer. The technology has also raised concerns among legal professionals about whether these videos might be used as phony evidence in legal matters.
Everyday people might have some experience with deepfakes as jokes, such as the humorous videos that, say, swap the face of actor Keanu Reeves in a scene from the 1999 film The Matrix with that of actor Nicolas Cage, according to Wired magazine.
However, the crux of a deepfake is that it’s a video showing someone saying or doing something that they did not in fact say or do, Sam Gregory, program director at the nonprofit WITNESS, told Wired. Since 1992, WITNESS has taught video basics and safe filming techniques to citizens, activists, journalists, lawyers, and others advocating for human rights.
“Deepfakes are the next generation of video and audio manipulation, and sometimes images,” Gregory told Wired. Beyond swapping faces, deepfakes can manipulate someone’s lips to sync them with a fake or real audio track, or move a person’s body in a way that the person didn’t move.
They also can remove items from footage, such as one car passing behind another, according to one example in a PBS NewsHour report earlier this year.
Beyond the Texas Law
Texas Senate Bill 751 specifically addressed deepfakes in terms of the state’s election code, saying that anyone who “fabricates a deceptive video with intent to influence the outcome of an election” can be charged with a class A misdemeanor, punishable by up to a year in a county jail and a fine of up to $4,000. ((Texas SB 751, 86th Legislature, 2019-2020 (Passed): Relating to the creation of a criminal offense for fabricating a deceptive video with intent to influence the outcome of an election. [SB751 2019 Detail]))
However, University of Texas School of Law professor Robert M. Chesney said the real danger of deepfakes goes beyond the political sphere. Someone could use a doctored video to harm another person’s professional life or reputation.
The legal profession has run across fraudulent audio and video in the past, but not to this “highly realistic” and “easily spread” degree, Chesney told Texas Lawyer.
“Certainly, it’s possible fraudulent content of this kind might one day have an unrecognized but serious impact on a jury,” he said, noting that those involved in “litigation, arbitration and other dispute resolution systems will face increasing challenges with respect to the authentication of evidence, including a growing need for forensic experts.” ((https://law.yale.edu/yls-today/yale-law-school-events/deep-fakes-looming-challenge-privacy-democracy-and-national-security-bobby-chesney-and-danielle))
‘It Does Increase the Pressure on Us’
The technology has not yet reached the point where anyone can seamlessly swap a face from one video to another, but even imperfect videos can cause harm, Gregory said. He predicted that higher-quality versions of such content will become more widespread as similar technology is commercialized, pointing to the in-app filters that already apply funny faces and accessories to photos.
He advised people to apply a filter of their own when watching online videos: skepticism.
“It does increase the pressure on us to recognize that photos and text are not necessarily trustworthy. We need to use our media literacy on them to assess where it came from. Is there corroboration?” Gregory said.