
How will Australia's under-16 social media ban work? We asked the law's enforcer

Julie Inman Grant, Australia's top internet regulator, will be enforcing one of the strictest social media crackdowns in the world.
Provided by the Australian eSafety Commission

Australia passed one of the strictest internet crackdowns in the world last month, banning children under 16 from being on social media or opening new accounts.

The law, which takes effect a year from now, holds social media companies responsible for verifying kids' ages. Noncompliance could trigger fines of up to nearly $50 million.

The law came over the objections of social media companies, which have criticized it as a form of free speech suppression. Tech companies have also argued that blocking kids from being on social media will drive them to darker, less regulated corners of the internet.

The law's passage comes as scrutiny intensifies in Washington over legislating online safety protections for children, with proposals under debate that would hold platforms responsible for exposing young users to dangerous, hateful or toxic content online.

From Silicon Valley to state capitals, all eyes are on how Australia's law will be implemented. The person tasked with enforcing it is Julie Inman Grant, Australia's eSafety Commissioner and the country's top internet regulator.

NPR spoke with Grant about what led to the social media ban, what enforcement looks like and how her agency plans to address the unintended consequences of barring kids under 16 from social media.

The conversation has been edited for brevity and clarity.

For our American audience, can you just explain what the eSafety Commission does? 

Grant: The eSafety Commission was set up nine years ago, in 2015, and was the first online safety regulator in the world. Part of our function is to provide research, prevention and education, and we've also got complaint schemes for kids who are being cyberbullied and for all Australians who've experienced image-based abuse, such as the non-consensual sharing of deepfakes and intimate images. And then we do a lot of work around assessing tech trends, becoming an anticipatory regulator so that as new technology paradigms shift and move our way, we're prepared to address them.

The new law draws the line at 16, meaning anyone under that age should not be able to access social media. Why 16?

Grant: We've set out arbitrary numbers for the age of a child for a long time. Many social media apps require users to be 13. But it really depends on the actual circumstances of the child. Do they have parental supervision? Do they have underlying mental health issues? What kind of content are they looking at, and for how long? So a whole range of things are important. The prime minister decided to go with 16, but there were other proposals for 14 or 15.

There are a lot of questions about how age verification will work. A proposal to require government-issued IDs was nixed over privacy concerns. Using facial recognition technology, or biometric scanning, has been discussed. How will these kinds of systems work?

Grant: There are really only three ways you can verify someone's age online, and that's through ID, through behavioral signals or through biometrics. And all have privacy implications. There was big concern with providing government ID. But there are digital identity providers, like one called Yoti, that can estimate someone's age using facial recognition technology.

But we do want to make sure there is not discrimination, or bias, and some of these technologies are less accurate depending on the kind of face being scanned. I met with an age assurance provider last week in Washington, D.C., who is using an AI-based system that looks at hand movements and has a 99% success rate.

Wait, what? Using hand movements to confirm someone's age? 

Grant: Yes. Say you do a peace sign and then a fist to the camera. It follows your hand movements. And medical research has shown that, based on your hand movements, it can identify your age. So there are some innovative solutions out there. But whatever social media companies end up using, it's going to be balanced against privacy, and it must ensure it does not undermine a user's security.

Research that's examined the link between social media use and teens' emotional states has come back mixed. There really is not a super clear causal link between greater use of social media and upticks in anxiety and depression among teens. So knowing this, isn't this law based on a false premise?  

Grant: For teens in marginalized communities, like the LGBTQA+ community, or teens with disabilities, or those who are neurodivergent, our own research has shown that online communities can provide a space for them to feel more at home — almost provide a lifeline — but also be places of hate. So both of these issues have been raised.

I think the genesis of this movement has been Jonathan Haidt, author of the book The Anxious Generation, and he even admits some of the research is mixed. And it's true that it is not necessarily causal. But in many circumstances, it's certainly correlational. And this law is focused on the addictive design features and dark patterns that emerge on social media platforms.

Now, messaging services and gaming apps will be exempt. The Minister for Communications will ultimately decide which platforms are in and which are out. And I will do my own separate analysis and make recommendations.

Companies, like TikTok, have said pushing under-16 teens away from established social media apps could make young people drift toward darker corners of the internet where there are no rules or safety measures in place. What's your response to that? 

Grant: I believe we should approach online safety the same way we approach water safety. And what I mean by that is: Decades ago, there were tragic backyard drownings in swimming pools. So Australia decided that all pools would be fenced, and that this would be backed by enforcement. But we don't try and fence the ocean, because that's futile. What we do is we teach our children to swim at the youngest age, just like we need to teach them digital literacy. We teach them to swim between the flags. We have lifeguards. We have shark nets where we know there are predators, and we teach them about rip [tides].

And you could use the analogy of the algorithmic rip. We want to keep them swimming between the flags where there is supervision, so they aren't going to the darker, murkier waters where there is no supervision. So I think that is a reasonable concern. And the reason I refer to this as a social media restriction rather than a total ban is that messaging and gaming sites and anything that delivers education or health care information, like community forums, will be exempted.

I talked to a 15-year-old in Australia who can't imagine living, or being social, without social media. What do you say to other teens who feel that way? 

Grant: I've been having high-level discussions with social media companies. And there's the possibility that some of the social media functionality could be removed, rather than an entire app being blocked off, to ensure those dark patterns and addictive design features are addressed. And maybe when they turn 16, the full functionality of the social media app can be enabled — whether that's the Snap Map, or being able to post Reels on Instagram.

When this law takes effect, on Dec. 10, 2025, there's not going to be some switch that's flipped off; users under 16 won't automatically have their apps disappear. The first thing we've tasked social media companies with doing is identifying all the under-16 users on their platforms. We did research in September of this year that found 84% of 8- to 12-year-olds are already on social media. And interestingly, we asked, "Were your parents or any adults aware that you were setting up these social media accounts early?" And 80% of them said yes. And in 90% of cases, it was parents who helped them set up their accounts. So I wouldn't say it's necessarily willful blindness, but, to date, social media companies may not even know exactly how many under-16 users are on their platforms.

The onus to date has been falling on the parents and the children themselves, and this law is the government making a very definitive statement and saying: We need to put the burden back on you, companies, just like we did with car manufacturers 60 years ago with seatbelts. And now there's so much lifesaving technology in our cars, like anti-lock brakes and airbags, that we take for granted. Back then, the car manufacturers pushed back, but now they compete on safety. This law is really aimed at making a normative change: the onus should fall on the platforms.

Copyright 2024 NPR

Bobby Allyn is a business reporter at NPR based in San Francisco. He covers technology and how Silicon Valley's largest companies are transforming how we live and reshaping society.