14 May, 2025
Under-16 social media ban: Three pain points as Australia grapples with how to implement it
Source: nzherald.co.nz
A cross-jurisdiction effort to chase Elon Musk, the US-based owner of X (formerly Twitter), for a fine of that size could be challenging. (There will be no punishment for children or adults who break the rules.)

Another is that Australia’s eSafety Commissioner Julie Inman Grant – who is charged with hashing out how the ban will work – earlier noted research by her office that found 84% of 8- to 12-year-old Australians are already on social media. So, as well as some kind of new system to verify the age of new users (which is currently taken on faith), the likes of Facebook and TikTok will also have to correctly identify all of the under-16s already on their platforms.

Previous Australian legislation has threatened Big Tech with tens of millions of dollars in fines for breaching harmful-content, fraud or cyber-security laws – or even jail time for social media CEOs – but there has yet to be a test case (although it could be argued that the threat of tougher penalties has seen Meta introduce tougher anti-scam measures in Australia ahead of New Zealand).

Which platforms will be covered?

It’s recently been reported that Australian government officials have, on the quiet, let Google know that its YouTube video-sharing platform will be exempt from the ban. It’s also been reported that gaming platforms will get a free pass, despite many now having a strong social element. Ditto for WhatsApp (which, like Facebook and Instagram, is owned by Meta).

Those are not unreasonable expectations, given Australia’s (now re-elected) Prime Minister Anthony Albanese said on November 29 last year: “The bill ensures that the law is responsive to the ever-evolving nature of technology, while enabling continued access to messaging, online gaming and services and apps that are primarily for the purposes of education and health support – like Google Classroom and YouTube.”

Around the same time, Communications Minister Michelle Rowland said it was “likely” YouTube would get an exemption.

And while the legislation doesn’t name any specific services, it does offer a reasonably tight definition of the type of platform restrictions should apply to, including a service that “has the sole or significant purpose to enable social interaction between two or more end-users”.

But a spokeswoman for Inman Grant (who has been on leave during the post-election lull, during which New Zealand’s U16 furore has broken out) said no decisions had been made about which platforms would be restricted, or not.

“The legislation requires the minister to seek the advice of the eSafety Commissioner and to have regard to that advice before making the rules specifying which services are covered,” the spokeswoman said.

“The eSafety Commissioner will provide independent advice to the minister on the proposed rules once the request for the advice is received.

“The minister has not yet sought this advice and accordingly, the eSafety Commission has not yet provided it.”

That minister won’t be Rowland. A post-election reshuffle – revealed yesterday – saw her named Attorney-General, while Sports Minister Anika Wells added Communications to her portfolios.

Some features could be banned, rather than whole apps

And there’s another twist. “When this law takes effect, there’s not going to be some switch that’s flipped off. Every user under 16 will not automatically have their apps disappear,” Inman Grant told NPR in December last year. “I’ve been having high-level discussions with social media companies.
And there’s the possibility that some of the social media functionality could be removed, rather than an entire app being blocked off, to ensure those dark patterns and addictive design features are addressed,” the eSafety Commissioner said. “And maybe when they turn 16, the full functionality of the social media app can be enabled – whether that’s the Snap Map [a Snapchat feature that allows you to track your friends’ movements, or lets them keep tabs on your location] or being able to post Reels [videos] on Instagram.”

How to tell someone’s age?

The U16 law puts an obligation on social media platforms to “take reasonable steps” to ensure age-restricted users do not use the platform. But it does not prescribe what “reasonable steps” are required to ensure compliance, nor does it set any technology or methodology requirements for age assurance.

Inman Grant’s eSafety office is responsible for developing guidance on the “reasonable steps” certain service providers will be required to take, the spokeswoman said.

Uploading a birth certificate, passport or driver’s licence is one way to establish age – especially if combined with footage from your laptop or smartphone camera. If you’ve ever had a Facebook account hijacked, you’ll know that Meta requires just that before it will offer support.

But privacy advocates questioned the desirability of sharing such documents en masse with Big Tech firms, which have often been in hot water over the way they treat data.

Albanese agreed, saying: “The bill makes clear that no Australian will be compelled to use government identification – including Digital ID – for age assurance on social media. Platforms must offer reasonable alternatives to users.”

“There are really only three ways you can verify someone’s age online, and that’s through ID, through behavioural signals or through biometrics – and all have privacy implications,” Inman Grant told NPR.

She said she had met with an age-assurance provider in Washington DC that used an artificial intelligence-based system that looked at hand movements, with a 99% success rate. “Say you do a peace sign, then a fist to the camera. It follows your hand movements. And medical research has shown based on your hand movement, it can identify your age,” Inman Grant said.

Does facial recognition work for age?

An issues paper released by Inman Grant’s office links to a summary of various facial recognition technology studies that says: “Despite advancements in machine learning [a form of artificial intelligence, or AI], a preliminary review of age estimation algorithms concludes their lack of suitability for restricted access systems ... one in every five subjects will be incorrectly classified as belonging to an age group that is not their own.”

The eSafety issues paper says: “Accuracy is strongly influenced by algorithm, sex, image quality, region-of-birth, age, and interactions between these factors. For example, some algorithms had higher error rates for people wearing glasses and error rates were almost always higher for female faces than for male faces.” There was “higher accuracy for faces categorised as Caucasian”.

Search for the best way continues

“The Department of Infrastructure, Transport, Regional Development, Communications and the Arts is responsible for the Age Assurance Technology Trial, which is currently considering the effectiveness of a range of technologies and their suitability in the Australian context,” the eSafety spokeswoman said.
“To inform this guidance, eSafety will soon commence a broad and meaningful consultation to ensure we hear the views and expertise of domestic and international stakeholders. We will be talking to industry, academics, advocates, rights groups and most importantly, children and young people themselves.”

In short, it’s still totally up in the air whether any age-verification technologies are effective or practical enough for eSafety to recommend them as the “reasonable steps” that social media firms must adopt to ensure under-16s do not access their platforms.

Postscript: A ‘pause’

The eSafety spokeswoman said her office “does not refer to the Social Media Minimum Age Bill as a ‘ban’”.

“It is more accurately described as a social media delay.

“This delay will help us protect young people’s health and wellbeing by keeping them from being exposed to harmful and deceptive forces online.

“It also allows us more time to equip young people with the digital literacy and skills they need to engage online safely.”

Social media firms have introduced more restrictions recently, with Meta expanding Instagram for Teens to cover under-16 users in New Zealand – including protections from being messaged by people who do not follow you, and various parental controls.

In a social media post, Inman Grant said the social media platforms’ various parental controls had been introduced in piecemeal fashion and were too difficult for parents to find out about and use.

Chris Keall is an Auckland-based member of the Herald’s business team. He joined the Herald in 2018 and is the technology editor and a senior business writer.