In its binding decision, the European Data Protection Board (EDPB) analyzed the design practices implemented by TikTok in the context of two pop-up notifications that were shown to children aged 13-17: the Registration Pop-Up and the Video Posting Pop-Up. The analysis found that both pop-ups failed to present options to the user in an objective and neutral way. The EDPB’s decision was issued last month and covers TikTok’s processing activities between July and December of 2020.
“Social media companies have a responsibility to avoid presenting choices to users, especially children, in an unfair manner—particularly if that presentation can nudge people into making decisions that violate their privacy interests,” said Anu Talus, chair of the EDPB, in a statement. “Options related to privacy should be provided in an objective and neutral way, avoiding any kind of deceptive or manipulative language or design. With this decision, the EDPB once again makes it clear that digital players have to be extra careful and take all necessary measures to safeguard children’s data protection rights.”
TikTok’s Unfair Privacy Practices
In the Registration Pop-Up, children were nudged to opt for a public account by choosing the right-side button labelled “Skip,” a choice that had cascading effects on the child’s privacy on the platform, for example by making comments on video content created by children publicly accessible.
In the Video Posting Pop-Up, children were nudged to click “Post Now,” displayed as a bold, darker button on the right side, rather than the lighter “Cancel” button. Users who wished to make their post private first needed to select “Cancel” and then navigate to the privacy settings to switch to a “private account.” Users were therefore steered towards public-by-default settings, with TikTok making it harder for them to make choices that favored the protection of their personal data. Furthermore, the consequences of the different options were unclear, particularly to child users. The EDPB confirmed that controllers should not make it difficult for data subjects to adjust their privacy settings and limit the processing of their data.
The EDPB also found that, as a result of the practices in question, TikTok infringed the principle of fairness under the GDPR. Consequently, the EDPB instructed the Irish Data Protection Authority (IE DPA) to include, in its final decision, a finding of this additional infringement and to order TikTok to bring its processing into compliance with the GDPR by eliminating such design practices.
Troubling Lack of Protections
Another problem noted by the EDPB was a troubling lack of protections to keep strangers from accessing the accounts of some minors. TikTok’s ‘Family Pairing’ feature, which was also under scrutiny, was found to be faulty because it allowed adult users to link their accounts with those of minors aged 16 and above even though TikTok could not verify whether those adults were the child’s parents or guardians. This raised serious concerns about potential risks to children, as the non-child user gained the ability to enable Direct Messages on the child’s account. TikTok also failed to provide adequate transparency information to its young users, hindering their ability to fully comprehend the platform’s data processing practices.
In January, TikTok was fined €5 million ($5.4 million) by France’s data protection authority (CNIL) for failing to adequately inform users about how it uses cookies and for making it difficult to opt out.