Data Protection Authorities Propose Enhanced Safeguards for Children Online

Data protection authorities have outlined a comprehensive set of recommendations aimed at strengthening the privacy and security of children navigating digital environments. These proposals, presented on the occasion of the International Day of Children's Rights, call for significant reforms to existing data protection laws and practices to better address the vulnerabilities faced by minors on the internet.

Focus on Stronger Safeguards in Digital Spaces

The independent supervisory bodies stress the urgent need to update the General Data Protection Regulation (GDPR) in light of the modern digital landscape that increasingly shapes children's lives. Their recommendations target online platforms, mobile applications, and social networks, where minors' personal data is frequently processed and sometimes exploited for commercial purposes.

Ban on Personalized Advertising and Profiling for Minors

Among the key proposals is a blanket ban on personalized advertising and profiling of anyone under the age of 18. The authorities emphasize that children require special protection, particularly given the complexity and potential risks of behavioral advertising and data-driven targeting on social media and other platforms. The proposed measures would require digital services to operate with privacy-friendly default settings and would remove the option for minors to consent to automated decision-making processes that lack adequate oversight.

Limiting Disclosure of Sensitive Data

The recommendations also call for stricter limits on the collection and processing of children's sensitive data, including data concerning health, religious beliefs, or political opinions. The intention is to prevent minors from inadvertently sharing information that could affect their privacy and digital footprint for years to come. By restricting the disclosure of such data, regulators aim to reduce the long-term risks of early online exposure.

Ensuring Confidentiality in Health and Counseling Services

Data protection authorities also propose that children be granted confidential access to health and counseling services from a certain age, without mandatory parental notification. This measure is intended to provide a safe space for minors to seek assistance, especially in sensitive circumstances, and to recognize their right to privacy as they mature. Concerns have been raised in the medical community regarding the permanent storage of sensitive diagnoses in electronic health records, and these guidelines aim to address such risks.

Restrictions on Automated Decision-Making

Another core aspect of the proposals is restricting fully automated decision-making processes that affect children. The authorities recommend that any significant decisions impacting minors, such as those made by online learning platforms or evaluation systems, be subject to human oversight to ensure fairness and accountability.

Addressing the Use of Children's Data in AI Training

The use of minors' data to train artificial intelligence systems is identified as a growing concern. Researchers have highlighted the difficulty of verifying users' true age on social media platforms, which can result in children's content being misclassified and used in AI model development. The data protection authorities urge companies to implement robust age verification and opt-out mechanisms, particularly for vulnerable groups, to prevent unauthorized use of minors' personal information.

Call for Legislative Action

Collectively, these ten recommendations are designed to systematically adapt the GDPR and related data protection frameworks to the unique needs of children in the digital era. The authorities advocate for clear legal prohibitions, enhanced default privacy settings, and stronger technical safeguards to support the data rights of minors. The objective is to create an online environment where children can participate safely, free from manipulative practices and excessive data collection.

Ongoing Challenges and Enforcement

Despite these efforts, challenges remain in ensuring that large technology companies comply with child-focused data protection requirements. Legal experts and consumer organizations have noted that opting out of data processing for AI training is often complex and opaque, especially for younger users. The authorities stress the importance of effective enforcement and the continued refinement of protective measures as digital technologies evolve.