News Analysis | 21 Nov 2024

Social Media Ban: Platforms face $50M fines and will need to authenticate existing users, while the definition of affected platforms is filled with nuance

By Andrew Birmingham - Martech | Ecom | CX Editor

I'm 21, I promise. Platforms will have to authenticate anyone who might be under 16 or risk millions in fines.

The Government's new social media ban will almost certainly require young people, perhaps well into their 20s, to reauthenticate their accounts as social media firms and others caught up in the ban seek to manage their risk. And that risk comes with a hefty price: potential fines of $50m. Rather than prescribing a list of banned sites, the legislation sets out criteria to define what it calls 'age-restricted social media', which in turn creates some interesting nuance. For instance, Facebook will be banned for teens and kids, but Facebook Messenger is fine. Young gamers won't be happy, with Discord and Twitch both caught in the net. Snap will have to adjust its 14 to 40 pitch as it gets roped in, but WhatsApp gets a leave pass.

What you need to know

  • Facebook, X, YouTube, Snap, Pinterest, Tumblr, Roblox, Discord and Twitch are all casualties of the new social media ban, but WhatsApp and Minecraft are breathing a sigh of relief.
  • Platforms are firmly in the frame for fines that run as high as $50M.
  • Existing social account holders are affected too. Users well into their 20s could be forced to authenticate existing accounts as platforms seek to mitigate their risk and ensure the net doesn't let too many people through.
  • All of which adds to the importance of the Age Assurance Trial, now getting underway. 

The Bill puts the onus on social media platforms, not parents or young people, to take reasonable steps to ensure fundamental protections are in place

Michelle Rowland, Communications Minister

Platform owners, not parents or children, will bear the penalties under the new social media ban introduced into Parliament today as an amendment to the Online Safety Act. And those penalties are potentially severe, with platforms on the hook for up to AU$50 million.

What's more, many of the platforms will almost certainly need to authenticate existing users, not just new users, given the way the amendments are written.

That means hundreds of thousands of Australians will likely be asked to reauthenticate themselves across potentially all of their social media platforms.

The rationale for the ban is set out in a series of explanatory notes accompanying the legislation. "Until now, the incentive for social media companies has been to optimise user engagement and time spent on platforms. As previously articulated by the chief executive of a major video streaming service, 'we’re competing with sleep, on the margin'. While this impacts all users of social media, it is particularly detrimental to children and young people, who are generally more vulnerable to the harms associated with platforms. As a result, ‘digital natives’ are growing up in an online ecosystem where protections put in place by platforms have not kept pace with the harms."

Introducing the legislation for its first read in the House of Representatives, Communications Minister Michelle Rowland told Parliament, "The Bill puts the onus on social media platforms, not parents or young people, to take reasonable steps to ensure fundamental protections are in place. This is about protecting young people – not punishing or isolating them – and letting parents know we’re in their corner when it comes to supporting their children’s health and wellbeing."

She acknowledged that establishing a minimum age for young people to hold social media accounts was not the only approach that needed to be taken, "and we know that this measure will not be met with universal acceptance."

"But this is one step among many that a Government should take in the protection and not the isolation of young people. There is wide acknowledgment that something must be done, in the immediate term, to help prevent young teens and children from being exposed to streams of content – unfiltered and infinite," Rowland said.

The legislation has now been referred to the Senate Environment and Communications Legislation Committee for review. The Government is hoping to have the legislation passed before Parliament rises and has put 26 November down as the due date for the committee's report. However, the crossbench is reportedly looking to have the bill referred to a public inquiry, while the Greens want a February reporting date, claiming the bill "is rushed and reckless" and hasn't gone through enough genuine scrutiny.

Who gets poked

As to which platforms are affected, there’s no actual list, but rather a set of criteria that defines the concept of an ‘age-restricted social media platform.’ These include:

  • The main purpose, or at least a significant part of the purpose, of affected platforms is to help people socialise online
  • It helps people connect to each other
  • It lets people post content, which is how companies like Snap and YouTube get roped in
  • It's not principally about business, so a product like Salesforce's Slack is fine, but Discord may not be.

There are some other exemptions. Platforms that can’t be accessed from Australia are not covered by the changes, and the Government can explicitly exempt platforms if it chooses.

Nuance

As might be expected, the failure to prescribe a list throws up some unusual nuances. For instance, Facebook is definitely in the frame, but Facebook Messenger arguably is not, since it doesn't fit under the broad "social interaction" focus that includes connecting with larger groups or public sharing.

Likewise, Snap, which sees itself as a communications channel, is almost certainly in the frame, while WhatsApp is probably not, since it focuses on private messaging and group communication rather than broader online social interaction involving public or semi-public content sharing. (Snap may have a different view.)

Sites like Tumblr, X and Pinterest are also out of luck. Elon Musk will no doubt cope with his usual good grace.

A little further afield, Minecraft is likely safe. While it allows for some social interaction through things like its multiplayer servers, the focus is not solely or primarily on socialising for "social purposes" as defined in the Act. The status of Roblox, where short seller Hindenburg Research recently flagged issues around alleged sexual predation (claims Roblox disputes), is less clear cut. That uncertainty has less to do with the content than with the fact that Roblox is arguably a closer fit to the criteria than Minecraft.

Discord, beloved by gamers as a messaging platform, likely gets swept up in the net, likewise Amazon's Twitch.

Who needs to authenticate

A lot of people over 16 and perhaps well into their 20s are likely to get caught up in authentication.

The way the amendments are written, there is no exemption for existing users. The Act states providers of age-restricted social media platforms must take reasonable steps to prevent children who have not reached a minimum age from having accounts. The implication is clear: platforms will need to implement measures to authenticate both current and new users.

This is where the age assurance trial will become critical. Given the size of the fines, platforms could be forgiven for taking a risk-averse approach – for instance, insisting everyone under 25 authenticates to give themselves plenty of wriggle room.

