In case you have not heard, deepfake videos are multiplying online. The term "deepfake" was coined in late 2017 by a Reddit user of the same name. Although deepfakes originated in pornography, their use has spread well beyond the explicit. A deepfake generally refers to manipulated media in which a person in an image or video is swapped with another person's likeness. One recent example depicts Belgium's prime minister linking the coronavirus pandemic to climate change in a doctored recorded speech. Although some deepfakes are created for parody and entertainment, others are not: the creation of deepfakes has also led to bullying, harassment, and other criminal offenses.
One Pennsylvania mom has found herself in hot water over the creation of deepfake videos. In Bucks County, Pennsylvania, 50-year-old Raffaela Spone allegedly sent deepfake photos and videos of her teenage daughter's cheerleading rivals to their coaches in a bid to get the girls kicked off the Victory Vipers team. Spone has been charged with cyber harassment of a child and harassment for allegedly manipulating photos from the girls' social media accounts to make it appear as though they were drinking, smoking, and even nude.
Everyone should be aware of deepfake technology and its growing popularity. Deepfakes can be created with simple phone applications that allow photo, video, and audio editing. In the legal setting, deepfakes pose a serious threat to the authenticity of evidence, especially for family law practitioners. In an age when individuals document every aspect of their lives on social media, content from those platforms becomes a valuable source of evidence of a parent's transgressions, parenting style, and more. If that evidence becomes unreliable because of deepfake-style alterations, however, persuading a judge to admit it at trial may prove difficult.
The 2021 U.S. National Defense Authorization Act (NDAA) requires the Department of Homeland Security (DHS) to issue an annual report on deepfakes for the next five years. The growing problems deepfakes have created should not be ignored, and the annual reports should help identify possible detection and mitigation solutions. Several states have already enacted protections and prohibitions related to deepfakes. Still, the technology behind deepfakes has become so accurate and precise that detection is more difficult than one may think. Only time will tell how this exploding issue affects the legal landscape.