Encountering altered videos and photoshopped images is almost a rite of passage on the internet. It’s rare these days that you’d visit social media and not come across some form of edited media — whether that be a simple selfie with a filter, a highly embellished meme or a video edited to add a soundtrack or enhance certain elements.
But while some forms of media are obviously edited, other alterations may be harder to spot. You may have heard the term “deepfake” in recent years; it first emerged in 2017 to describe videos and images fabricated with deep learning algorithms to look convincingly real.
For example, take the moon disaster speech given by former president Richard Nixon after the Apollo 11 team crashed into the lunar surface. Just kidding — that never happened. But a hyper-realistic deepfake of Nixon paying tribute to a fallen Buzz Aldrin and Neil Armstrong appeared in the 2019 film In Event of Moon Disaster, which showcased just how convincingly real footage of the president could be altered.
Other current and former world leaders, such as John F. Kennedy, Barack Obama and Vladimir Putin, have also been the subjects of deepfake videos, in which they appear to say and do things that they never actually said or did. Though the rise of deepfakes in recent years has been discussed in popular media, the pool of academic literature on the topic remains relatively sparse.
But researchers have expressed concern that these doctored images and videos could present a growing security risk in the coming years. A report last week in Crime Science predicts that, out of a host of AI-powered technologies, deepfakes will pose the most serious security threat over the next 15 years.
“Humans have a strong tendency to believe their own eyes and ears,” the researchers concluded. So when the media we consume looks too good to be fake, it’s easy to fall victim to trickery. And the number of deepfakes online continues to grow, though not always in the places you might expect.
The term deepfake doesn’t refer to just any convincingly edited video or image. More specifically, it’s a portmanteau of “deep learning” and “fake,” describing media that relies on neural networks to alter audio and video.
The technology to create deepfakes has gotten easier to access over the years, with a handful of programs and websites cropping up that allow users to make their own, sometimes at a hefty price. Still, many of the deepfakes that populate various corners of the internet aren’t that convincing, says Giorgio Patrini. He’s the CEO and founder of Sensity, a company in Amsterdam that has been researching the spread of deepfakes since 2018.
Patrini says most of the deepfakes he’s come across are made with the same few open-source tools. “The reason is they are very easy to use and they are very well-maintained and known by the communities,” he adds. And most of the media his team finds “in the wild,” as Patrini puts it, uses the same few methods to alter digital footage.
Recently, Facebook announced the results of a competition in which experts built new algorithms to detect deepfakes; the winning model caught 82 percent of the AI-altered media it was shown. Some deepfakes can be created using methods that are still hard for current detection algorithms to spot, but Patrini says deepfake creators in the wild tend to use cheaper, simpler methods when making videos. The detection software we have now is actually pretty successful at sorting through the large swaths of media found online, he adds.
“I would say maybe 99 percent, or even more, of the deepfake videos that we find are … based on face swapping,” he says. “There are other ways to create fake videos, even changing the speech and lip movement, [or] changing the body movement.” But so far, those are not the most popular methods among deepfake connoisseurs, says Patrini, so current algorithms can still weed out much of the AI-altered content.
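For the face-swapping approach Patrini describes, most open-source tools follow the same basic recipe: train a single shared encoder on face crops of two people, along with one decoder per identity, then pair the encoder with the “wrong” decoder to perform the swap. Here’s a minimal sketch of that architecture in PyTorch; the layer sizes and names are illustrative, not taken from any particular tool:

```python
# A minimal sketch of the shared-encoder, two-decoder autoencoder behind
# classic face-swap deepfakes. Sizes and names are illustrative only.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Compresses a 64x64 RGB face crop into a shared latent code."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1),   # 64 -> 32
            nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1),  # 32 -> 16
            nn.ReLU(),
            nn.Flatten(),
            nn.Linear(64 * 16 * 16, latent_dim),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Reconstructs a face from the latent code; one decoder per identity."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 64 * 16 * 16)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1),  # 16 -> 32
            nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1),   # 32 -> 64
            nn.Sigmoid(),
        )

    def forward(self, z):
        return self.net(self.fc(z).view(-1, 64, 16, 16))

# One shared encoder, two identity-specific decoders.
encoder = Encoder()
decoder_a = Decoder()  # trained to reconstruct person A's faces
decoder_b = Decoder()  # trained to reconstruct person B's faces

# Training (not shown) teaches decoder_a to rebuild A and decoder_b to
# rebuild B from the shared code. The swap: encode a frame of person A,
# then decode with B's decoder, yielding B's face with A's expression.
face_of_a = torch.rand(1, 3, 64, 64)    # stand-in for a real face crop
swapped = decoder_b(encoder(face_of_a))  # the "deepfake" frame
print(swapped.shape)  # torch.Size([1, 3, 64, 64])
```

Real tools use far deeper networks and blend the generated face back into the original frame, but the shared-encoder swap is the core trick.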
And though face-swapping technology can be applied to literally any photo or video with a human face in it, deepfake creators seem to have an affinity for one type of media in particular: pornography. An overwhelming share of AI-altered videos are created to place one subject’s face onto the body of a porn star — a phenomenon that disproportionately targets women and hearkens back to the dark origins of deepfakes themselves.
In 2019, when Sensity released a report on the state of deepfakes under its former name, Deeptrace, the company detected 14,678 AI-altered videos online. Of those, 96 percent were used in pornographic content.
And the first deepfake videos, in fact, were made for the same reason. In 2017, users on Reddit started to post doctored videos of female celebrities whose faces were non-consensually swapped onto the bodies of porn stars. Reddit banned these explicit deepfakes in 2018, but reports show that other ethically problematic sites and apps soon popped up elsewhere.
“We haven’t gone very far from it,” Patrini says. Despite widespread media coverage of political deepfakes, pornographic edits have been the reigning form of AI-altered content to spread across the web. And so far, women are pretty much always the targets — Sensity’s 2019 report found that 100 percent of detected pornographic deepfakes featured female subjects.
Just two months ago, Sensity counted 49,081 deepfake videos online, a figure that suggests the numbers are doubling nearly every six months (the quick check after this paragraph walks through the arithmetic). Lately, Patrini says, they’ve observed an increase in videos targeting people who are popular internet personalities, or influencers, on YouTube, Instagram and Twitch.
“Maybe a year ago we saw that most of the content was featuring known celebrities that could be … from the entertainment industry,” he says. But deepfake creators are also targeting individuals, often women, who lead active lives online.
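That doubling estimate can be sanity-checked from the two counts in this story. The nine-month gap below is our own assumption about the time between Sensity’s 2019 report and the newer figure; the article itself doesn’t pin down exact dates:

```python
# Back-of-the-envelope check of the "doubling nearly every six months"
# claim. The ~9-month gap is an assumption, not a reported figure.
import math

count_2019 = 14_678   # videos detected in Sensity's 2019 report
count_later = 49_081  # videos counted roughly nine months later (assumed)
months_elapsed = 9

doublings = math.log2(count_later / count_2019)
print(f"{doublings:.2f} doublings, one every "
      f"{months_elapsed / doublings:.1f} months")
# -> 1.74 doublings, one every 5.2 months
```

That comes out to a doubling roughly every five months, consistent with (in fact slightly faster than) the trend Sensity describes.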
While AI-altered media might seem all bad, the technology itself isn’t inherently damaging. “For many people, deepfakes already have an intrinsically negative connotation,” Patrini says. But the technology behind it can be used for a host of creative projects — such as translation services or visual tricks in movies and TV shows.
Take the Nixon deepfake, for example. The directors didn’t present their creation to mislead viewers or make them think the history books got the Apollo 11 mission wrong. Rather, the film used an experimental new technology to showcase what an alternate historical timeline might have looked like, while educating viewers on how convincing deepfakes and video editing can be.
But that’s not to say deepfakes can’t mislead, nor that they aren’t already being used to carry out nefarious deeds. Besides the widespread use of non-consensual, doctored porn, Patrini says he’s also seen a rise in cases where deepfakes are used to impersonate someone trying to open a bank account or Bitcoin wallet. These processes can require video verification, and a deepfake can potentially fool those checks.
“With some sophistication, people can actually fake an ID and also fake how they appear on the video,” Patrini says. Sometimes that can mean opening accounts under a stranger’s name, or under a fake name attached to a persona that doesn’t exist. For now, Patrini says, this kind of trickery does not appear to be widespread — but it does represent a more sinister application for deepfakes.
And with the technology getting easier to access, it’s likely that the spread of deepfakes will continue. We can only hope people will choose to use them for good.