As privacy professionals, we often overlook an important target group when we talk about privacy and security: children and their parents. While we rely on GDPR legislation to ensure the proper protection of minors, implementing it brings its own challenges.
One of the biggest criticisms of the European GDPR legislation is that neighbouring countries have opted for different age limits for valid consent. Belgium has chosen 13 years, France 15, and the Netherlands and Germany 16. This can be confusing for parents and companies, who are not always sure how best to protect children's data.
Moreover, the current generation of young parents may not be well equipped to teach their children to deal consciously with privacy. Many parents drifted into social media themselves without giving it much thought. Most parents' accounts are public, and photos of their children are shared on a massive scale.
As a parent of young children myself, I have started giving presentations aimed at parents of children in primary school. In this article, I bring together three observations from the dozens of school presentations I have given.
Observation 1: The return of Snapchat and issues with incorrect dates of birth.
During this school year, I have heard from parents that Snapchat is now the most popular social media platform among children in primary school. Snapchat is used as a chat platform for one-on-one conversations as well as group chats. However, Snapchat itself has a minimum age limit (13 years) for creating an account.
Many parents create Snapchat accounts for their children and enter their own or a random date of birth when registering. This undermines the automatic content filtering and the shielding of features intended for under-18s. A well-intentioned action by parents can thus create privacy-related risks for their children.
Observation 2: How do we deal with TikTok?
Due to questions about this Chinese social media platform, several measures have been taken against the use of TikTok in professional environments.
However, the protection of children on TikTok is rarely mentioned in these discussions. The domestic version of TikTok, Douyin, which is only available in China, offers greater protection for young users, with hard time limits and stricter content settings.
In my opinion, our public broadcaster has an important role to play here, as it promotes TikTok to its young audience via Ketnet and the daily youth news programme Karrewiet. According to TikTok's terms of service, this audience is too young to open an account.
As a side note, we should certainly not underestimate the role of US commercial interests in this crusade against TikTok.
Observation 3: Social media platforms have launched additional features.
GDPR enforcement decisions in Europe against TikTok and Meta have led to the launch of measures and features such as Family Pairing, which links parents' and children's accounts.
Instagram, TikTok, and Snapchat have built-in restrictions on the content that minors see in their social media feeds. An important tip for parents and children: this filtering only works if the correct date of birth is used when creating an account.
Thinking about the Future?
As adults, we are already experimenting with and using the new generation of AI chatbots in our professional lives (yes, ChatGPT helps me correct grammar mistakes). However, these features are also being launched on the various social media platforms used by children. For example, Snapchat has opened its "My AI" chatbot to all users.
Parents need to keep pace with these developments and talk about them with their children. Teachers and schools may not always provide enough education on these subjects, as they cannot keep up with every new development. As a presenter, I have to update the slide deck for my school presentations every week.
As certain developments can no longer be banned, it is crucial for privacy professionals to provide clear explanations and interpretations to all target groups. We must prioritize educating parents about privacy as well.