In recent years, more and more deepfakes have spread across the internet — pieces of synthetic media that take an image or video and use someone else’s face or voice to fabricate a new, fake depiction of people or events. And chances are, you’ve already seen one and didn’t know it was a deepfake.
The very real-looking characteristics of deepfakes have allowed for many instances of misinformation, hoaxes, and fraud to disseminate online. In response, Intel announced a new technology called “FakeCatcher” to detect deepfake media with a 96% accuracy rate.
Deepfakes use impressive technology derived from machine learning and artificial intelligence to create scarily accurate impressions of celebrities and politicians doing and saying things they haven’t.
Existing detection technologies can take hours to debunk a deepfake, as they rely on deep learning to scan a video for signs of digital manipulation — often long after web surfers have already taken it at face value.
Also: How to spot a deepfake? One simple trick is all you need
Intel’s FakeCatcher can detect a deepfake in real-time by “assessing what makes us human – ‘blood flow’ in the pixels of a video,” according to a press release.
Intel says its technology can identify the subtle changes in color that occur in our veins when blood circulates through the body. These blood-flow signals are collected from across the face, and algorithms translate them into a verdict on whether a video is real or a deepfake.
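Intel hasn’t published FakeCatcher’s implementation, but the underlying idea — remote photoplethysmography (rPPG) — is well documented in the research literature. A minimal sketch of that general idea: average the green channel (where the blood-volume signal is strongest) over a face region in each frame, then look for a dominant periodic component at a plausible heart rate. The face box, frame size, and pulse frequency below are illustrative assumptions, not Intel’s method.

```python
import numpy as np

def extract_rppg_signal(frames, face_box):
    """Mean green-channel intensity inside the face region, per frame.
    The green channel carries the strongest blood-volume signal in rPPG work."""
    x0, y0, x1, y1 = face_box
    return np.array([f[y0:y1, x0:x1, 1].mean() for f in frames])

def dominant_frequency_hz(signal, fps):
    """Dominant frequency of the detrended signal, found via FFT."""
    s = signal - signal.mean()
    spectrum = np.abs(np.fft.rfft(s))
    spectrum[0] = 0.0  # ignore any residual DC component
    freqs = np.fft.rfftfreq(len(s), d=1.0 / fps)
    return freqs[spectrum.argmax()]

# Synthetic demo: 10 s of 30 fps "video" whose face region pulses at 1.2 Hz
# (~72 bpm). A real detector would compare the recovered signal against the
# spatial/temporal consistency expected of genuine blood flow.
fps, seconds = 30, 10
t = np.arange(fps * seconds) / fps
frames = np.zeros((fps * seconds, 64, 64, 3))
frames[:, :, :, 1] = 100 + 2.0 * np.sin(2 * np.pi * 1.2 * t)[:, None, None]

sig = extract_rppg_signal(frames, face_box=(8, 8, 56, 56))
print(dominant_frequency_hz(sig, fps))  # recovers the 1.2 Hz pulse
```

In a genuine video, this periodic signal is present and spatially consistent across the face; in synthetic footage, it tends to be absent or incoherent, which is the discrepancy a classifier can exploit.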
It’s becoming increasingly important to have software to help us identify deepfakes to avoid harmful consequences. Some deepfake videos and images are graphic in nature, and others perpetuate distrust in the media.
In the past, scammers have used deepfakes to pose as job seekers and gain access to sensitive company information. Deepfakes have also been used to impersonate prominent political figures making inflammatory statements.
Also: The next big security threat is staring us in the face. Tackling it is going to be tough
Although some movements and mannerisms in deepfakes give away their deceptive nature, most people mindlessly scroll through their Twitter feeds and don’t take the time to verify whether a video is real or fake.
And by the time a deepfake garners millions of shares, it’s far too late, as 63% of adults in the US admit that an altered video has confused them about current events, according to the Pew Research Center.