1. Protection of minors in the digital environment and fundamental rights

The protection of minors in the digital environment is a multifaceted issue that may be examined from a range of disciplinary perspectives, including sociology, psychology, political science, and law. The issue has recently become the subject of renewed attention at the global level, particularly in light of the decision of the Australian government to prohibit minors under the age of sixteen from holding accounts on social networking platforms.

International human rights law recognises freedom of expression as a fundamental right of all individuals, including minors. In this respect, two key provisions merit consideration. First, Article 19(2) of the International Covenant on Civil and Political Rights (ICCPR) guarantees the right of every individual to freedom of expression, defined as the «freedom to seek, receive and impart information and ideas of all kinds, regardless of frontiers, either orally, in writing or in print, in the form of art, or through any other media of his choice». Second, Article 12 of the Convention on the Rights of the Child (UNCRC) requires States Parties to ensure the right of minors capable of forming their own views «to express those views freely in all matters affecting them», with due weight being given to their age and maturity.

In addition, Article 16(1) of the UNCRC recognises the right of the child to privacy, which must be protected against «arbitrary or unlawful interference with his or her privacy, family, home or correspondence», as well as against «unlawful attacks on his or her honour and reputation».

In light of this framework, the Australian model appears to adopt a particularly restrictive regulatory approach, insofar as it excludes minors below the age threshold of sixteen from holding accounts on the designated social media platforms. Such an approach effectively prevents these individuals from freely expressing their opinions online and, at the same time, from accessing information and forming their views through digital tools. Moreover, this regulatory strategy indirectly relieves parents and educators of their responsibilities to monitor and guide minors’ online activities, insofar as it replaces parental supervision with a blanket statutory prohibition.

By contrast, the European Union (EU) has placed at the centre of its regulatory action the development of a safe digital ecosystem in which minors can express themselves and can acquire the skills necessary for informed digital citizenship. This approach does not exempt parents from their educational responsibilities; rather, it is grounded in the adoption of privacy-by-design mechanisms, the use of responsible algorithms, and the development of age-appropriate user interfaces.

2. The Australian regulatory approach: the Online Safety Amendment (Social Media Minimum Age) Bill 2024

In 2022, the Online Safety Act 2021 (OS Act) entered into force, introducing a regulatory framework aimed at defining the responsibilities of Internet service providers (ISPs) with a view to promoting and strengthening online safety, particularly in relation to phenomena such as serious online abuse, cyberbullying, and the dissemination of harmful content. In continuity with this framework, two years later Australia adopted the Online Safety Amendment (Social Media Minimum Age) Bill 2024, whose minimum-age obligations took effect on 10 December 2025, with the objective of imposing mandatory age limits on access to certain digital platforms, primarily social media services such as TikTok, Instagram, Facebook, X, and Snapchat.

More generally, the legislation applies to designated platforms, ten in total, that satisfy three cumulative criteria: first, the service must allow end users to publish content; second, it must enable interaction between users; and third, its sole or a significant purpose must be to enable online social interaction among end users.

In light of these applicability criteria, while the list of platforms subject to age restrictions is not exhaustive, it is clear that certain categories of services fall outside the scope of the legislation. These include, in particular, online video games, such as Roblox and LEGO Play, as well as standalone messaging applications, including Messenger and WhatsApp. By contrast, messaging services that are accessed through social media accounts, such as Threads, and that incorporate features characteristic of social networking platforms, enabling forms of interaction that go beyond direct messaging, may fall within the scope of the age-based restrictions.

Within this framework, pursuant to Part 4A, Division 1, section 63D of the Online Safety Amendment (Social Media Minimum Age) Bill 2024, ISPs are required to adopt reasonable and effective measures to prevent minors below the minimum age from holding, accessing, or creating social media accounts. The legislation deliberately refrains from setting out an exhaustive list of measures that would be considered adequate and effective, in order to avoid imposing rigid compliance mechanisms that may become outdated in a fast-evolving digital environment. Failure to comply with these obligations may give rise to the imposition of administrative fines of up to approximately 50 million Australian dollars (equivalent to approximately 28.3 million euros). In addition, ISPs are required to implement age verification mechanisms capable of ensuring an adequate level of accuracy.

Compliance with these obligations is governed by a graduated model, commonly referred to as a cascading approach. This model prioritises the use of the least intrusive controls available, including the analysis of personal data already held within platform databases. Only where such controls indicate the need for more stringent verification, or where the precise determination of a user’s age is required, are platforms obliged to resort to AI-based age estimation systems, which combine facial analysis with probabilistic age assessment. These systems are designed to minimise the processing of personal data, in accordance with the applicable privacy protection framework, in particular under the Privacy Act 1988 and the OS Act.
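By way of illustration only, the cascading logic described above can be sketched as follows. The data fields, threshold handling, and estimator hook in this Python sketch are hypothetical and are not drawn from the legislation or from any platform’s actual implementation; the sketch merely shows the escalation from the least intrusive signals to AI-based estimation.

```python
from dataclasses import dataclass
from typing import Callable, Optional

# Hypothetical sketch of a "cascading" age-assurance flow: the least
# intrusive signals (data the platform already holds) are consulted first,
# and AI-based facial age estimation is invoked only when those signals
# are inconclusive.

MINIMUM_AGE = 16  # statutory threshold under the 2024 amendment


@dataclass
class AccountSignals:
    declared_age: Optional[int] = None                 # self-declared age, if any
    age_inferred_from_activity: Optional[int] = None   # e.g. behavioural inference


def low_intrusion_check(signals: AccountSignals) -> Optional[bool]:
    """Return True/False if existing platform data is conclusive, None otherwise."""
    for value in (signals.declared_age, signals.age_inferred_from_activity):
        if value is not None:
            return value >= MINIMUM_AGE
    return None  # inconclusive: escalate to the next tier


def verify_age(signals: AccountSignals,
               facial_age_estimator: Callable[[], int]) -> bool:
    """Cascading verification: escalate only when the lower tier is inconclusive."""
    result = low_intrusion_check(signals)
    if result is not None:
        return result
    # Tier 2 (hypothetical): probabilistic, AI-based facial age estimation,
    # used only where the data already held by the platform is insufficient.
    return facial_age_estimator() >= MINIMUM_AGE
```

In this sketch the estimator is invoked only at the second tier, mirroring the preference for the least data-intensive method.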

In order to strike a balance between the deployment of such tools and the protection of personal data, Part 4A, Division 3, section 63F of the Online Safety Amendment (Social Media Minimum Age) Bill 2024 prohibits platforms from collecting or retaining personal information, including age-related data, for purposes other than determining whether a user is subject to age-based access restrictions. The same provision further requires that such information be destroyed once it has been used for that purpose. Failure to comply with these obligations constitutes an interference with privacy under the Privacy Act 1988 and may give rise to the imposition of significant sanctions.
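A minimal sketch may help to visualise this purpose-limitation and destruction duty. The function and field names below are hypothetical; the point is simply that age-related information is used for the eligibility determination and then discarded rather than retained or repurposed.

```python
MINIMUM_AGE = 16


def check_restriction_and_destroy(age_evidence: dict) -> bool:
    """Use age-related data solely to determine whether the user is
    age-restricted, then destroy it (hypothetical illustration of the
    duty described in section 63F)."""
    try:
        return age_evidence.get("estimated_age", 0) >= MINIMUM_AGE
    finally:
        # The evidence must not be retained or reused for other purposes
        # (e.g. advertising or profiling); here it is simply cleared.
        age_evidence.clear()
```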

3. The EU approach: regulatory fragmentation and the role of soft law

While Australia has become the first country in the world to introduce a statutory ban on access to social media for individuals under the age of sixteen, the EU remains engaged in an ongoing debate on the most appropriate strategies for protecting minors from the risks associated with social networking platforms.

Unlike the Australian approach, which is characterised by a uniform and prohibitive regulatory model, the European framework continues to be marked by regulatory fragmentation. In the absence of a binding legal instrument specifically and exclusively dedicated to the digital protection of minors, the EU has primarily relied on a combination of sector-specific legislation and soft law instruments to address the issue.

In this context, on 26 November 2025 the European Parliament adopted a resolution calling for the establishment of a minimum age of sixteen for access to social media, video-sharing platforms, and artificial intelligence companions. The resolution, however, would allow minors between the ages of thirteen and sixteen to access such services subject to parental authorisation. This approach reflects a logic of shared family responsibility rather than an absolute prohibition, in contrast to the Australian model, which excludes minors under the age of sixteen from social networking platforms irrespective of parental consent (Part 4A, Division 1, section 63 of the Online Safety Amendment (Social Media Minimum Age) Bill 2024).

At present, the protection of minors’ rights in the European digital environment is ensured through a heterogeneous set of provisions dispersed across several EU legal instruments, including the Digital Services Act (DSA), the Audiovisual Media Services Directive (AVMSD), and the General Data Protection Regulation (GDPR).

Within this fragmented regulatory landscape, on 14 July 2025 the European Commission adopted its Guidelines on measures to ensure a high level of privacy, safety and security for minors online, with specific reference to Article 28(4) of the DSA. These Guidelines constitute a key soft law instrument aimed at enhancing the online protection of children and adolescents by providing a non-exhaustive catalogue of proportionate and appropriate measures designed to mitigate risks such as exposure to harmful content, cyberbullying, and exploitative commercial practices.

First, the Guidelines recommend that online platforms accessible to minors integrate high standards of privacy, security, and protection at the design stage, adopting a human-centric approach, as set out in Section 4 (“General Principles”), Article 17(c). These principles are further specified in Section 6.3.1, Article 57, which provides, among other measures, for the prohibition of downloading or capturing screenshots of contact, location, or account information shared by minors, as well as the deactivation of non-essential tracking functionalities, including geolocation, microphone access, contact synchronisation, and access to photos and the camera.

Second, pursuant to Section 6.1.4, Article 49, the Guidelines emphasise the need for effective, accurate, and reliable age-assurance methods, subject to regular oversight based on clear and publicly accessible metrics. Providers are required to carry out periodic assessments of the technical accuracy of such systems and to ensure their alignment with the most advanced technological standards. In this respect, age-assurance mechanisms must be robust, resistant to circumvention, and non-intrusive, in accordance with Article 28 of the DSA.

To this end, ISPs may collect only data that are strictly necessary for age-assurance purposes, and the use of such mechanisms for additional objectives, including identification, localisation, profiling, or tracking, is expressly prohibited. These processing activities must also comply with Article 12 of the GDPR, which imposes transparency obligations and requires that users be adequately informed of potential risks, while ensuring compliance with the principle of non-discrimination.

4. The Italian regulatory framework on minors’ access to social media

In the absence of a European framework establishing an age threshold for access to social media, several Member States of the EU are pursuing assertive national initiatives, drawing inspiration from the Australian experience. In this context, on 26 January 2026, the French lower house, the National Assembly, approved Bill no. 2107, tabled on 18 November 2025. The bill seeks to amend Law no. 2004-575 of 21 June 2004 on confidence in the digital economy by introducing a section specifically devoted to the online protection of minors. The proposal provides for a prohibition on access to social media for children under the age of fifteen. Should the text be definitively adopted by the French Senate, it will enter into force on 1 September 2026, thereby making France the second country worldwide, after Australia, to impose such a ban.

Italy is following a similar regulatory trajectory. As of 30 September 2025, Bill no. 1136 has been under consideration by the competent Committee of the Senate. Entitled «Provisions for the protection of minors in the digital environment», the bill seeks to delineate an adequate and effective regulatory framework for the protection of minors, aimed at maximising the opportunities offered to them by the digital space, while minimising the risks to which they are exposed online, without, however, hindering the country’s digital transformation process.

In this context, three main innovations introduced by Bill no. 1136 are worthy of note. First, Article 2 provides that «activating accounts on online social networks and video-sharing platforms is permitted for minors only after they have turned fifteen years of age». In the event of a violation, the contract is null and void pursuant to Article 3. A single exception is provided for under Article 3(1), pursuant to which «accounts already created and held by minors under the age of fifteen» remain valid only where the holder has already reached the age of fifteen at the time of the entry into force of the law.

Furthermore, Article 2(2) sets out the procedures through which platforms are obliged to verify users’ age. In line with the Australian approach, the Italian legislature refrains from predetermining specific technical methods for age verification. This choice reflects two considerations. First, it avoids locking rapidly evolving technologies into statutory law. Second, it assigns responsibility for detailed age-verification rules to the Italian Communications Regulatory Authority, in line with the applicable European sectoral framework. To this end, the bill expressly refers to the use of a «national mini-wallet» for age verification, situating it within the broader framework of the European Digital Identity (EUDI) Regulation, which seeks to establish a universal and secure European digital identity wallet.
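For illustration, a wallet-based check of this kind can be sketched as follows. The attestation format, field names, and signature helper are hypothetical and are not taken from the EUDI wallet or «mini-wallet» specifications; the sketch only conveys the underlying idea that the platform verifies a signed over-age claim without ever receiving the user’s date of birth.

```python
import json
from typing import Callable


def holder_is_over_threshold(attestation_json: str,
                             issuer_signature: bytes,
                             verify_signature: Callable[[bytes, bytes], bool],
                             threshold: int = 15) -> bool:
    """Accept the user only if a validly signed over-age claim is presented.

    Hypothetical sketch: the wallet discloses a claim such as
    {"age_over": 15}, signed by a trusted issuer, instead of the date of birth.
    """
    payload = attestation_json.encode("utf-8")
    if not verify_signature(payload, issuer_signature):
        return False  # reject attestations not signed by a trusted issuer
    claim = json.loads(attestation_json)
    return claim.get("age_over", 0) >= threshold
```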

Finally, Article 4 amends Article 2-quinquies of the Personal Data Protection Code by adjusting the statutory age threshold for digital majority. Under Article 8 of the GDPR, the age of digital majority is set at sixteen, without prejudice to the discretion granted to Member States to provide by law for a lower age, provided that it is not set below thirteen. In exercising this discretion, the Italian legal system has established, in Article 2-quinquies of the Personal Data Protection Code, that a minor may validly consent to the processing of personal data from the age of fourteen. The resulting framework thus gives rise to a two-tier age model, under which the applicable threshold varies according to the type of service: fifteen years for access to social networks and video-sharing platforms, and sixteen years for other online services.

5. Concluding remarks in the post-ban framework

Australia’s adoption of the Online Safety Amendment (Social Media Minimum Age) Bill 2024, despite strong opposition, has given rise to an intense debate extending beyond the national context to academic and institutional circles at the international level. Proponents of the introduction of a minimum age threshold of sixteen for access to social media primarily rely on neuropsychological considerations and on the need to safeguard minors’ mental health. By contrast, a wide range of scholars, civil society actors, youth advocacy groups, and non-governmental organisations (NGOs) in the technology sector have drawn attention to the risks associated with the general exclusion of minors from digital spaces, particularly with regard to access to information, social participation, and the development of digital skills.

This debate must also be situated within a broader trend of renewed regulatory attention to minors’ online activities, as evidenced by similar initiatives that have been introduced or discussed at the EU level and in countries such as the United Kingdom, Belgium, and China, where the so-called “Minor Mode” Regulation has been implemented.

From a comparative perspective, a generalised ban on access to social media risks imposing disproportionate restrictions on minors’ fundamental rights. This consideration calls for the prioritisation of regulatory models capable of reconciling protection, proportionality, and accountability on the part of the actors involved, while avoiding rigid solutions that, although motivated by legitimate protective objectives, may ultimately prove incompatible with the effective and holistic protection of minors in the digital environment.
