TikTok, the popular video-sharing app, has been fined €345 million ($368 million) by the Irish Data Protection Commission (DPC) for violating the European Union’s General Data Protection Regulation (GDPR) in relation to its handling of children’s personal data.
The DPC, which is the lead regulator for TikTok in the EU, issued its decision on Friday after a lengthy investigation into how the platform processed children's personal data during the second half of 2020. The DPC found that TikTok had breached several articles of the GDPR, including:
- Failing to provide adequate transparency information to child users about how their data is processed and shared.
- Implementing “dark patterns” to nudge users towards choosing privacy-intrusive options during the registration process and when posting videos.
- Setting child users’ profiles to public by default, so that anyone could view their content, exposing them to potential risks such as contact from predators and bullying (see the illustrative sketch after this list).
- Allowing an adult user who was not verified as a parent or guardian to pair their account with a child’s account via the “Family Pairing” feature and weaken safety protections, such as direct messaging limits and content filtering.
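To make the “public by default” finding concrete: GDPR Article 25 requires “data protection by default,” meaning the most privacy-protective settings should apply automatically unless a user actively chooses otherwise. The sketch below is a minimal, purely hypothetical Python illustration of that principle; the field names, age thresholds and defaults are assumptions for explanation only and do not represent TikTok’s actual code or settings.

```python
from dataclasses import dataclass

# Hypothetical illustration of "privacy by default" (GDPR Art. 25):
# the most protective settings are applied automatically for child users,
# who may later opt in to less restrictive options. All names and
# thresholds here are assumptions, not TikTok's actual logic.

@dataclass
class AccountSettings:
    profile_public: bool           # can non-followers see the profile?
    comments_allowed_from: str     # "everyone", "friends", or "no_one"
    direct_messages_enabled: bool  # can other users send DMs?


def default_settings(age: int) -> AccountSettings:
    """Return privacy-protective defaults for child users (under 16)."""
    if age < 16:
        # Child account: private by default, comments limited to friends,
        # direct messaging off unless explicitly enabled later.
        return AccountSettings(
            profile_public=False,
            comments_allowed_from="friends",
            direct_messages_enabled=False,
        )
    # Adult account: less restrictive defaults, still changeable by the user.
    return AccountSettings(
        profile_public=True,
        comments_allowed_from="everyone",
        direct_messages_enabled=True,
    )


if __name__ == "__main__":
    print(default_settings(14))  # private, friends-only comments, no DMs
    print(default_settings(25))  # public defaults for adult users
```

The DPC’s finding was essentially that the real defaults ran in the opposite direction: child accounts started public, and users were nudged away from the protective options.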
The DPC ordered TikTok to bring its data processing practices into compliance with the GDPR within three months and to pay the fine within 28 days. The penalty is among the largest ever imposed under the GDPR, though it trails the €1.2 billion ($1.3 billion) fine the DPC levied against Meta in May 2023 and the €746 million ($888 million) fine imposed on Amazon by Luxembourg’s regulator in July 2021.
TikTok said it respectfully disagreed with several aspects of the decision, particularly the level of the fine, and that it was considering its next steps. The company also said it had made several changes to its app before and during the investigation to strengthen protections for teenagers, such as setting accounts for 13- to 15-year-olds to private by default and restricting who can comment on their videos.
The DPC’s decision is the latest in a series of regulatory actions against TikTok over its data privacy and security practices. In 2019, TikTok agreed to pay $5.7 million to settle allegations by the U.S. Federal Trade Commission that it violated the Children’s Online Privacy Protection Act (COPPA) by collecting personal information from children under 13 without parental consent. TikTok has also faced bans or restrictions in India, Pakistan, Indonesia and other countries over concerns about its content moderation and national security risks.
The DPC’s decision highlights the challenges that TikTok and other social media platforms face in complying with different data privacy laws around the world. While the EU has a comprehensive and strict data protection framework under the GDPR, which applies to any company that offers services or products to EU residents, the U.S. has a patchwork of federal and state laws that cover specific sectors or types of data.
For example, California was the first state to enact a comprehensive data privacy law, the California Consumer Privacy Act (CCPA), which gives consumers more control over their personal data and imposes obligations on businesses that collect or sell such data. Other states, such as Virginia, Colorado and Connecticut, have followed suit with similar laws. However, there is no federal law that regulates data privacy across all industries and states.
This creates uncertainty and inconsistency for both consumers and businesses, especially in the digital age where data flows across borders and jurisdictions. A federal data privacy law could harmonize the rules and standards for data protection in the U.S. and provide more clarity and certainty for all stakeholders. It could also facilitate cooperation and coordination with other countries and regions that have their own data privacy laws, such as the EU.
However, enacting a federal data privacy law is not an easy task, as it involves balancing various interests and perspectives from different stakeholders, such as consumers, businesses, regulators, lawmakers and civil society groups. Some of the key issues that need to be addressed include:
- The scope and definition of personal data and sensitive data
- The rights and obligations of consumers and businesses regarding data collection, use, sharing and deletion
- The enforcement mechanisms and remedies for data breaches or violations
- The exemptions or exceptions for certain sectors or purposes, such as health care, education or national security
- The compatibility or interoperability with other data privacy laws or frameworks
A federal data privacy law could benefit both consumers and businesses by enhancing trust and transparency in the digital economy. However, it also requires careful weighing of the costs and benefits of different approaches. Ultimately, the outcome will depend on how much weight lawmakers and the public place on data privacy as a fundamental right and a social good.