The face from the donor file is copied and pasted onto the source footage, where it mimics the emotions and motions of the source file.
Faces are generated by a neural network that learned, from a large dataset, to create new, unique faces.
The person in the source file mimics the emotions and motions of the donor file.
How do neural networks make deepfakes?
Before a neural network can make deepfakes, it needs to learn about the object it will manipulate. It therefore needs a dataset of that object; a bigger dataset generally means higher quality.
At the moment, pre-trained neural networks can be found and bought online. With these products, even less tech-savvy people can create deepfakes.
After training, the neural network 'knows' how to alter certain objects. When it is provided with donor footage, the motions can be copied and mimicked in the source footage.
The software alters the image or video by changing pixels within the source footage. While altering and adding pixels, artefacts can occur: traces that humans can sometimes detect.
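As a rough illustration (not any specific tool's code), the sketch below pastes a brighter "face" patch into a uniform grayscale frame and measures the hard colour transition this leaves at the patch boundary; the pixel values and sizes are made up for the example.

```python
import numpy as np

# Toy illustration: paste a patch with a different tone into a frame
# and find the hard colour transition it leaves at the boundary.
frame = np.full((8, 8), 100.0)      # authentic background pixels
frame[2:6, 2:6] = 140.0             # pasted region with a different tone

# Horizontal gradient: absolute difference between neighbouring pixels.
grad = np.abs(np.diff(frame, axis=1))

# Inside the untouched background and inside the pasted patch the
# gradient is 0; at the paste boundary it jumps to 40.
print(grad.max())                   # strongest transition in the frame
```

This is exactly the kind of sharp edge the 'colour transition' check described further below looks for.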
Some deepfake neural networks include a feature that fools detection software by disguising the deepfake artefacts within the image.
These techniques range from quite simple to technically advanced. The full images can be down-sampled, but so-called 'blackbox' and 'whitebox' attacks can also be used. These techniques make it harder for detection software to spot the deepfakes.
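To make the simplest of these concrete, here is a hedged sketch of how down-sampling can hide an artefact: averaging each 2x2 block of pixels smooths away the sharp pixel-level trace a detector might look for. The single bright "artefact" pixel is an assumption for the example.

```python
import numpy as np

# Toy sketch: down-sampling smooths fine pixel-level artefacts.
frame = np.full((4, 4), 100.0)
frame[1, 1] = 180.0                 # a sharp deepfake artefact

# 2x average pooling: each output pixel is the mean of a 2x2 block.
small = frame.reshape(2, 2, 2, 2).mean(axis=(1, 3))

print(frame.max() - frame.min())    # artefact contrast before: 80.0
print(small.max() - small.min())    # after down-sampling: 20.0
```

After pooling, the artefact's contrast with its surroundings drops from 80 to 20, making it much harder to spot.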
How does DuckDuckGoose detect deepfakes?
The DeepDetector pinpoints the pixels in the image that the software used to determine whether the analysed footage is a deepfake or authentic. This visualisation is called the 'activation map'. With it, users of the software can understand the conclusion and evaluate the software's decision themselves, maintaining their autonomy.
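A minimal sketch of the activation-map idea (this is a generic saliency illustration, not DeepDetector's actual algorithm): for a simple linear scorer, each pixel's contribution to the score is its value times its learned weight, so mapping those contributions shows which pixels drove the decision. The tiny 4x4 "image" and hand-set weights are assumptions for the example.

```python
import numpy as np

# Toy activation map for a linear scorer: per-pixel contribution
# to the "deepfake" score is pixel value times learned weight.
rng = np.random.default_rng(0)
pixels = rng.random((4, 4))         # a tiny 4x4 "image"
weights = np.zeros((4, 4))
weights[1:3, 1:3] = 1.0             # model attends to the centre region

score = float((pixels * weights).sum())     # the overall decision score
activation_map = np.abs(pixels * weights)   # per-pixel contribution

# Only the centre pixels light up, explaining where the score came from.
print(activation_map > 0)
```

The map localises the evidence: a user can see that only the centre region influenced the score, and judge for themselves whether that region looks manipulated.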
The DeepDetector is a neural network. It outputs probabilities indicating whether the input is deemed a deepfake or not. Higher percentages indicate that the software is more confident in its decision.
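How a raw network score becomes such a probability can be sketched with a generic sigmoid (an assumption for illustration, not DeepDetector's actual output layer):

```python
import math

# Generic sigmoid: squash an unbounded network score into 0..1.
def deepfake_probability(raw_score: float) -> float:
    return 1.0 / (1.0 + math.exp(-raw_score))

# A score of 0 means maximal uncertainty; larger magnitudes mean the
# network is more confident in its deepfake / authentic decision.
print(deepfake_probability(0.0))    # 0.5 -> undecided
print(deepfake_probability(4.0))    # ~0.98 -> confidently "deepfake"
```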
The DeepDetector is trained on a dataset of deepfake and authentic images. During training it learns to distinguish authentic images or videos from deepfaked ones. As new deepfake techniques are invented, we expand the datasets used to train the DeepDetector. This ensures the DeepDetector stays up to date with the latest advancements in deepfake generation.
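The training idea can be sketched with a toy binary classifier (an illustrative stand-in, not DeepDetector's real architecture or data): a logistic-regression model learns by gradient descent to separate two clusters of synthetic "authentic" and "deepfake" samples.

```python
import numpy as np

# Toy training loop: logistic regression on synthetic two-feature data.
rng = np.random.default_rng(1)
authentic = rng.normal(loc=-1.0, scale=0.5, size=(50, 2))   # label 0
deepfake = rng.normal(loc=+1.0, scale=0.5, size=(50, 2))    # label 1
X = np.vstack([authentic, deepfake])
y = np.array([0.0] * 50 + [1.0] * 50)

w = np.zeros(2)
b = 0.0
for _ in range(200):                         # gradient-descent steps
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # predicted probabilities
    w -= 0.1 * X.T @ (p - y) / len(y)       # logistic-loss gradient
    b -= 0.1 * (p - y).mean()

accuracy = ((p > 0.5) == (y == 1.0)).mean()
print(accuracy)                              # separable data -> near 1.0
```

Retraining on a larger dataset that includes samples from new generation techniques follows the same loop; only the data changes.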
Learn more about the DeepDetector here!
How to detect deepfakes
What can you do to protect yourself from being a victim of deepfakes or being influenced by them? Deepfakes are becoming better, and high-quality deepfakes are becoming easier and cheaper to make. It consequently becomes harder to spot deepfakes with the naked eye. However, at this moment you can still look for several typical graphic inconsistencies (called artefacts) in deepfakes. The typical artefacts to look for are listed below.
- Colour transition
In neural puppetry and faceswap deepfakes, the face is manipulated. Sometimes a hard colour transition can be seen on the edges of the area the deepfake software manipulated.
- Unnatural looking eyes
Eyes are rather hard to deepfake due to their complexity. Check whether the reflections in the eyes have the same angle. Does the person in the video blink? Are the irises of both eyes equally large?
Items like glasses might look convincing at first, but, when looking a bit closer, might actually contain some artefacts.
- Blurry and nonsensical context
StyleGAN deepfakes are good at making faces, but bad at creating context that makes sense. Check the background and clothing, for example, to see whether these make any sense.
What else can you do when you are in doubt?
Does the footage spark an emotional response in you? Disinformation often aims to influence your opinion, and it frequently does so by sparking negative emotions about specific people. If you react emotionally to the footage, stay calm and check the information more closely. By thinking critically, you can question the authenticity of the footage.
Use free online tools
There are several fact-checking websites, like Snopes and Bellingcat, that you can visit for free. For faceswap and neural puppetry deepfakes, a source file is needed to create the deepfake. If in doubt, you can do a reverse image or video search on the footage. And, of course, you can always use our online tool, the DeepDetector!