Imagine someone creating a deepfake video of you simply by stealing your Facebook profile pic. The bad guys don’t have their hands on that tech yet, but Samsung has figured out how to make it happen.
Software for creating deepfakes — fabricated clips that make people appear to do or say things they never did — usually requires large data sets of images to create a realistic forgery. Now Samsung has developed a new artificial intelligence system that can generate a fake clip from as little as a single photo.
The technology, of course, can be used for fun, like bringing a classic portrait to life. The Mona Lisa, which exists solely as a single still image, is animated in three different clips to demonstrate the new technology. A Samsung artificial intelligence lab in Russia developed the technology, which was detailed in a paper earlier this week.
Here’s the downside: These kinds of techniques and their rapid development also create risks of misinformation, election tampering and fraud, according to Hany Farid, a Dartmouth researcher who specializes in media forensics to root out deepfakes.
When even a crudely doctored video of US House Speaker Nancy Pelosi can go viral on social media, sophisticated deepfakes raise worries that mass deception will become easier, since they are far harder to debunk.
“Following the trend of the past year, this and related techniques require less and less data and are generating more and more sophisticated and compelling content,” Farid said. Even though Samsung’s process can create visual glitches, “these results are another step in the evolution of techniques … leading to the creation of multimedia content that will eventually be indistinguishable from the real thing.”
Like Photoshop for video on steroids, deepfake software produces forgeries by using machine learning to convincingly fabricate a moving, speaking human. Though computer manipulation of video has existed for decades, deepfake systems have made doctored clips not only easier to create but also harder to detect. Think of them as photo-realistic digital puppets.
Lots of deepfakes, like the one animating the Mona Lisa, are harmless fun. The technology has made possible an entire genre of memes, including one in which Nicolas Cage’s face is placed into movies and TV shows he wasn’t in. But deepfake technology can also be insidious, such as when it’s used to graft an unsuspecting person’s face into explicit adult movies, a technique sometimes used in revenge porn.
See the pics and read the rest here: https://www.cnet.com/news/samsung-ai-deepfake-can-fabricate-a-video-of-you-from-a-single-photo-mona-lisa-cheapfake-dumbfake/