Disconnecting Australian Children from Social Media

Flush with last month's success in granting workers the "right to disconnect" from work communications after hours, the Australian government now plans to disconnect children from social media. On Tuesday, September 10, Prime Minister Anthony Albanese announced plans to introduce legislation to enforce a minimum age for social media access—likely around 14 to 16—before the end of 2024.

The law builds on Australia's leadership in online safety regulation (its 2021 Online Safety Act installed an eSafety Commissioner) and follows the parliamentary inquiry into "the influence and impacts of social media on Australian society" announced on May 10. Albanese is no fan of social media. The inquiry was announced just after an Australian court ruled on the limits of the eSafety Commissioner's power to compel X to take down, worldwide, content deemed harmful to Australians, and at the same time as Meta announced that it would not renew the agreements under which it compensated Australian news media outlets for lost online advertising revenues formerly used to subsidize news content creation.

While the proposed law is likely to prove politically popular in Australia, and internationally will please proponents of Jonathan Haidt’s call to ban children’s use of smartphones, significant concerns exist about its likely success and the feasibility of implementing it.

First, most social media platforms already have terms stating a minimum age for those signing up; for Meta, it is 13. However, enforcing compliance is difficult. Without access to reliable proof-of-age tools at sign-up and at every subsequent use, it is impossible to verify that the individual using the service is not a child. It has proven problematic in the UK for firms to verify that users of adult sites are adults, even with a suite of age-verification tools available: having banks and credit card issuers confirm age, matching the user against a pre-loaded photograph, and facial age estimation. And that's before we get to the matter of privacy or whether these tools are appropriate for people under 18. Australia, however, is undaunted: a tender has been announced for a trial to assess technological solutions—including AI—to the age-verification problem.

Second, obtaining cooperation from international platforms to comply with an Australia-specific law or technology will be difficult. These firms operate in multiple jurisdictions with different laws and provisions, and the costs of adjusting to local requirements are not trivial. As Meta and X have shown in their various encounters with Australian legislative provisions, these firms might not be legally accountable, could simply ignore the rules, or could choose not to offer their services in those jurisdictions. At best, the law will be impotent; at worst, adult users in Australia will be denied access to the platforms as well.

Third, the law presumes that children's use of social media platforms is more harmful than beneficial. It is far from clear that this is the case. Yes, online bullying and access to adult and other undesirable content are negatives, but access to safe content and the ability to engage with others online in a social context are also very valuable for some children. For marginalized and rural groups, social media interaction is vital: a modern equivalent of the telephones and citizens band radios used for these purposes in the past. And just imagine how even urban young people would have coped through the COVID-19 pandemic lockdowns without social media to connect with each other.

Fourth, as is illustrated by other online bans (e.g., content removal), the undesirable behavior does not go away; it simply shifts elsewhere. Mainstream media companies may comply with a ban, but informed individuals (and there are many younger than 14 out there) can simply move their activity to unsupervised spaces.

Finally, a ban removes the opportunity to use education constructively to teach children how to use social media responsibly and keep themselves safe online. It also takes responsibility for managing and educating children about their online experiences away from parents and places it in the hands of the state and social media companies.

Rather than racing to implement legislative bans, one wonders what could be achieved by putting more effort into researching social media's actual effects on all groups in society. Until such evidence exists, the risk is that regulation responds not to measured evidence of real harms but primarily to anecdotal accounts, uncertainties, prejudices, and biases. Sometimes, doing nothing and learning more before acting is better than acting too hastily. Bad regulation, once passed, is rarely repealed quickly.
