With technology developing at a rapid pace, it is becoming harder to locate the boundary between reality and illusion. As a result, it is also becoming easier to misuse technology for harmful activities, such as bullying. Get to know deepfakes and explore how to talk about them with children.
What is a deepfake?
Deepfakes use deep learning (a form of machine learning) to create pictures, videos, or audio recordings that fake events – this is also where the name comes from. They usually duplicate someone’s voice and/or facial features and paste them onto an existing recording or photograph. The result depicts people in situations they were never actually involved in; this is also why deepfakes can be – and often are – used against people as a means of mocking or even bullying.
Simple deepfakes can easily be made with mobile apps, such as FaceApp or FaceSwap, but more complex ones often require skill and proper technical equipment. That is how we end up with videos of Mark Zuckerberg speaking about stealing people’s data or Tom Cruise doing magic tricks. However convincing they may seem, these videos are not real – they are just well-made deepfakes.
The uses of deepfakes
At first, most deepfakes involved pornographic content, and this trend continues to this day. In 2019, the AI firm Deeptrace searched the internet for deepfake videos and found that 96% of those online were pornographic. The issue of non-consensual deepfake pornography has yet to be resolved legally. Deepfakes have also been weaponised to discredit people and harm their careers. But while the media commonly cover these situations, discussions of how deepfakes endanger children are rare.
With the development of apps that make deepfakes accessible to almost anyone, the synthesised videos have found their way into schools and become a vehicle for bullying. What does this look like? Imagine, for example, someone pasting the face of a shy girl into a music video of scantily dressed women dancing provocatively. While the creators of such videos may view them as an innocent source of entertainment, the resulting deepfakes can cause the targeted child to experience shame. The unpleasant experience can easily damage their relationship with school.
Still, deepfakes can also be used for good. For instance, they can allow children to become stars of their favourite shows, video games, and movies. The Dalí Museum in Florida has used deepfake technology to generate interactive videos of the Catalan artist. The deepfake Dalí can now greet visitors and even respond to them, making the museum more attractive to a young audience. Overall, as with other technologies, it would be short-sighted to judge deepfakes as purely evil without considering their possible benefits. Still, some steps need to be taken to prevent the malicious use of deepfakes against children.
How to talk about deepfakes with children
1. Watch some of the deepfake videos together and start a conversation.
Nowadays, children are often in much closer contact with technology than adults, so you may be surprised at how much they already know about the unreliability of internet content. Still, the first step in preventing deepfake misuse is to take the time to watch deepfake videos with your children and explore the issue together. Discuss why deepfake videos exist and what they can be used for. Share with them whether or not you enjoy the videos and why. Talk about responsibility and consent – explain why we should only use someone’s face or voice after getting their approval. Create a space for safe sharing, and tell your child that if they ever encounter deepfakes being used for bullying, they can let you know. Make sure that children know whom to contact in case of need – their family and the designated authorities at their school.
2. Try to spot the differences between deepfake and real videos together.
It is getting more challenging to identify a deepfake, but some attributes can still give the videos away. First off, look for unusual movements, such as unnatural blinking. Deepfake videos tend to have trouble replicating the more subtle physical details: the audio may not fully correspond with the movement of the person’s lips, and there may be glitching along the outline of the person’s face or close to the hairline. There are also common discrepancies in the lighting – the face can be lighter or darker than the body, there may be an unusual shadow, or each eye may reflect a different image. However, some of these glitches may soon become hard to notice as the technology continues to develop rapidly. Finally, look at the content. When the words coming from the person’s mouth are scandalous, hard to believe, or clearly framed to evoke an emotional reaction, there is a greater possibility that the video is a deepfake.
3. Talk to children about being online.
When kids put their images and videos on the internet, they may unknowingly share everything needed to create a potentially harmful deepfake video of them. In addition to looking at what your children share, talk to them about which social media or apps they use. While larger platforms may have more advanced privacy settings, some newer or less well-known social media apps or websites may not protect your child’s privacy. You can offer your kids alternatives, such as suggesting that they share their photos only in private group chats with family or close friends. Presenting possibilities instead of judgment, and offering substitutions rather than prohibitions, can make protecting your child easier.
4. Explore with your children and find preventative measures.
Finally, explore the internet with your children and get to know the world they may be joining. Respect your child’s privacy, but perhaps ask them to scroll through their social media with you so you can see what type of content they consume. If your children decide to use apps that let them create deepfakes, such as FaceApp or FaceSwap, try them together and use the app for entertainment, awareness, and education – not for mocking others. Despite the potential for misuse, deepfakes can be a great way to explain the possibilities of technology to children and perhaps to discuss hoaxes and fake news in more depth.
If you would like to take further preventative measures, try our security solutions for parents.
Blogs such as this one, ESET’s WeLiveSecurity and the industry-backed Internet Matters can offer great advice and support for parents who don’t know their Fortnite from their Minecraft. Also visit Digital Matters – a free online educational platform created to help teach children about online safety through a variety of interactive materials.