Social media ban: Porn age checks in play, gambling's status unclear, exec in charge says centralised age authentication 'possible', not ideal
The government has yet to decide whether to create a single centralised system that all social media platforms would need to integrate with, although that would be unusual according to Tony Allen, the executive leading the consortium overseeing the trial. Meanwhile a scientist tasked with assessing the rigour of the trial's results says stopping people under 18 from accessing porn is definitely in play, although there are question marks over gambling content. For its part, the government is waiting for the results of the trial before determining the design of any age assurance process, as well as policy questions such as whether every Australian with a social media account will need to authenticate – not just kids – and whether the ban will apply to the current accounts of children and teens on social media. The proposed ban comes in light of a parliamentary committee report revealing that over 90 per cent of Australian adolescents use social media daily. Artificial intelligence is expected to play a key role.
What you need to know:
- The Australian Government's Age Assurance Trial, ahead of a proposed social media ban for users under 16 years old, will also cover blocking under-18s from accessing porn, but the status of gambling content is unclear.
- Over 90 per cent of Australian adolescents use social media daily, with most starting before their 16th birthday.
- The proposed social media ban has widespread community support and new data from Ipsos reveals Australians have a much more negative view of the platforms than their peers overseas.
- UK based Age Check Certification Scheme (ACCS) will lead a consortium which also includes Australian firm KJR, and scientists including the UNSW AI Institute’s chief scientist (in a private capacity) who will assess age assurance technologies, focusing on privacy, security, and effectiveness.
- The project will run for eight months and evaluate various technology solutions to verify users' ages, meaning the method of implementing the social media ban will likely not be known when the legislation goes to Parliament.
- The government is expected to wait until after the trial before addressing how broadly the ban will be applied, the design of any age assurance system, and whether gambling content is covered by the ban. However, Communications Minister Michelle Rowland has ruled out forcing Australians to upload official credentials to comply with a ban. That does not rule out requiring Australians to authenticate via other means, such as facial recognition.
- Experts warn that current age verification technologies may struggle with accuracy, particularly for users aged 12 to 16.
- Execs working on the age assurance trial believe it will not apply to children who are currently registered, or to those who register before the ban is put in place.
- A recent Parliamentary report highlights the dual nature of social media's impact on youth, noting both benefits and significant mental health risks.
Typically, governments will say 'you're creating a risk, and we're going to legislate to say that we want you to address that risk in a certain way.' They usually leave it to them to work out how to do it, but they don't normally prescribe how.
The executive overseeing the Commonwealth's Age Assurance Trial for a potential social media ban says a single government-run authentication system isn’t the goal – but it’s within "the bounds of possibility."
The announcement of the successful consortium for the Age Assurance Tender coincides with the release of a Parliamentary Committee report yesterday that reveals how central social media already is in the lives of Australian children and young teens: more than 90 per cent of Australian adolescents use social media daily on platforms such as TikTok, Snapchat, and Instagram, and most first register in late primary school, years before their 16th birthday – the age beneath which the ban will operate.
The social media ban is the latest in a long and ongoing series of legislative and regulatory changes in Australia addressing the impact of digital platforms on the community, which includes a sweeping review of privacy policy, tackling misinformation, and addressing online scams.
The ban has bipartisan support, and widespread community backing. New data from Ipsos reveals why: Australians believe social media does more harm than good, bucking overwhelming global support for the technology.
The findings are contained in the 8th edition of Ipsos’ Global Trends report, titled "In search of a new consensus: from tension to intention". The report is the largest public survey in its history, with 50,000 people interviewed across 50 markets.
“Australians overwhelmingly believe social media does more harm than good, with data showing their opinions are in stark contrast to those in Asia and across the world,” according to the researchers. “More than half of Australians (54 per cent) said social media did not have a positive impact on people’s lives – just 38 per cent said it did.”
In developing guidance for the Commonwealth Government, the Australian Privacy Commissioner is believed to have leaned heavily on the EU Consent program, which was led by the executive in charge of the successful bidder for the Australian government’s age assurance trial.
Mi3 spoke with Tony Allen, executive director of the UK-based Age Check Certification Scheme last week, just minutes after the ACCS was announced as the lead company in the consortium behind the successful bid for the Government’s tender. Australian firm KJR is another member of the consortium.
Asked if the government might build a centralised Age Authentication platform for others to plug into, Allen said such an approach would be possible, "but it might be surprising."
More commonly, per Allen, “The organisation that creates the risk has the responsibility to be able to demonstrate how it is addressing that risk.”
“Typically, governments will say you're creating a risk, and we're going to legislate to say that we want you to address that risk in a certain way. They usually leave it to them to work out how to do it, but they don't normally prescribe how.”
Allen describes ACCS as an independent conformity assessment body that principally tests ID and Age Check all over the world.
“We're a UK-based company, we're accredited by the UK accreditation service and we've done a number of these projects for the European Union and in the UK.” Allen is also the technical editor for the international standard on age assurance, and he chairs the UK Government's expert panel on age restrictions.
ACCS’s role is to assess the potential technology, not to implement it, or to make policy choices about how a social media ban would work.
He calls it a technology readiness assessment.
Eight-month project
“Our brief is to test whether the age assurance solutions that could be deployed in Australia will work. We're looking at all different kinds. The project is an eight-month project between now and June 2025." That timing would take it beyond the timetable for the introduction of the legislation, and beyond the next election.
The trial will examine the full technology stack involved in any age authentication systems, he says. “We will do that at the website, at the app, at the App Store, on the browser, on the connection, device, mobile telephones etc.”
ACCS has also been asked to explore parental consent and parental control as a part of the evaluation.
“We will be going out to the marketplace and inviting participants, both age assurance providers and relying parties that could use the results, whether that be social media or whatever.
“We will be putting it [the tech] through its paces and see whether it works. We use a range of tools and techniques to do that, depending on the type of age assurance solution it is.”
Allen told Mi3 that the reasons for age restriction, or authentication triggers, are separate from the actual process of authenticating.
“It could be that they want to sign up for social media accounts, it could be they want to access pornography or start gambling or buy alcohol. Whatever it is, that's a policy matter for governments and for regulators, and they'll determine what that is going to be.”
“We are assessing the types of technologies used, the performance, the functionality, how privacy-preserving they are, how secure they are, how acceptable they are, and how they deliver a result.”
He said, “There are so many technical options out there. The challenge for us is to find a reliable, reproducible, repeatable way of testing those so that we can do that in a scientific way.”
“This project has got a lot of scientists involved in it, a lot of analysis involved in it, a lot of deployment and use of international standards to make sure that it's robust and can be relied upon."
Can kicker?
As to whether any system would need to authenticate everyone on a platform, not just children and teens, he said that is a policy decision for the Minister, although others Mi3 spoke with involved in the project say they did not believe that was the government’s intent.
Nor, they said, is current thinking to re-authenticate existing users. If that's the case, under-16s currently registered on social media platforms would get a leave pass, as would anyone who registers before the legislation passes, or even after it passes but before the technology is put in place.
That approach may have political ramifications, and the government is believed to be waiting until after the trial before addressing how broadly the ban will be applied, the design of any age assurance system, and whether gambling content (or other content) is covered by the ban. Communications Minister Michelle Rowland has made one concession to internal party critics – ruling out forcing Australians to upload credentials to social media sites to pass any age ban. She made the comments in a Party Room meeting in Parliament yesterday.
The Minister's office remained tightlipped yesterday when Mi3 asked for clarification on issues such as a gambling content ban, grandfathering the authentication for existing teen accounts, building a centralised authentication platform, or whether it would attempt to force all social media users to authenticate, not just teens and adolescents.
World watching
According to Julie Dawson, Chief Policy and Regulatory Officer at Yoti, one of the firms expected to bid for part of the stack, "Other governments are looking at how to check age for social media and introducing legislation for this."
For instance she said, "The UK Online Safety Act will require regulated companies to implement and enforce age limits and effective age checking. Europe’s Digital Services Act aims to create a safer digital space for all users. This includes requirements for online intermediaries and platforms to verify their users' ages." Plus, she noted, "The vast majority of US states are now in the process of enacting new age verification laws for social media and adult platforms."
Yoti, which works with social media organisations and others across the world, says it has completed over 700 million age estimation checks.
Its work with Meta for instance is demonstrative of the kind of approach open to the Federal Government. Dawson told Mi3, "Anyone who tries to edit their date of birth on Instagram from under 18 to 18 or over is asked to verify their age by either uploading their ID document or by using Yoti facial age estimation technology. This is to help users have an age-appropriate experience on the platform. This was first introduced in the US and is now available globally."
The company was also involved in the recent September update to Meta's Instagram and the introduction of teen accounts.
"Meta introduced Teen Accounts on Instagram for users under the age of 18. Teen Accounts have built-in protections including the ability to set daily usage limits, restrict access during certain hours and monitor their child’s interactions, such as the accounts they are messaging and the types of content they’re engaging with on the platform. We’re helping Instagram verify the age of users when users attempt to change their account date of birth from under to over 18 (which would allow them to lift the IG Teens account restrictions)."
Another key member of the ACCS consortium (and one of the scientists Allen referred to) is Toby Walsh, engaged as an independent expert to oversee the study and to assess whether the experiments are well designed and the results statistically significant.
Walsh is the chief scientist at UNSW AI Institute and the author of Faking It! AI in a human world and Machines Behaving Badly.
However, his role in the age assurance project is independent of his work for the university.
He told Mi3, “I think the study is going to be quite impactful, not just within Australia but more broadly because Australia is leading the way in respect of age limits around things like social media. It could set a very important precedent for a lot of other countries as well so it's important for us to get it right.”
Australia is seen as taking a regulatory lead on the issue of a social media ban, but it is not a lone ranger on the topic, says Walsh.
“The US has rules coming in on social media prohibitions. In the European Union, it's already illegal to process a child's data if they're under the age of 13 without their parent's consent.”
Age of AI
AI is likely to feature in any age authentication system.
According to Walsh, “That's one of the technologies that's going to be strongly looked at. Yes, it has various benefits if you can make it work. That's what the study will look at. How accurate can it be? Because you don't then have to worry about people registering with the government for some credentials, and you don't have to worry about privacy issues around those credentials.”
He acknowledged that systems are not very reliable at certain ages, because people’s faces change dramatically during their teens.
“The other interesting thing is we only have to decide, are you under 16 or if it’s pornography, are you under 18?
“It’s hard to tell a girl's age when they’re 12 or 13 because they have a big growth spurt; the systems might be quite unreliable [around] those ages. But that's not actually important. You just have to know if they are under 16 or over 16 and the accuracy might be much better if you are just deciding between those two."
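Walsh's point can be illustrated with a simple simulation. This is an illustration only, not part of the trial: it assumes a hypothetical facial age estimator whose output is the true age plus Gaussian noise with a standard deviation of two years, a made-up error figure chosen purely for demonstration. Even with that much error, the binary under/over-16 decision is close to perfect for users well away from the cut-off, while 15- and 16-year-olds sit inside the noise band and remain hard to classify.

```python
import random

random.seed(42)

NOISE_SD = 2.0   # assumed estimator error, in years (illustrative)
THRESHOLD = 16   # the proposed ban's age cut-off

def binary_accuracy(true_ages, trials=10_000):
    """Share of trials where the noisy estimate lands on the correct
    side of the under/over-16 threshold."""
    correct = 0
    for _ in range(trials):
        age = random.choice(true_ages)
        estimate = age + random.gauss(0, NOISE_SD)
        if (estimate >= THRESHOLD) == (age >= THRESHOLD):
            correct += 1
    return correct / trials

# Users far from the cut-off are classified correctly almost every time...
print(binary_accuracy([10, 11, 12, 20, 21, 22]))
# ...while 15- and 16-year-olds are misclassified far more often.
print(binary_accuracy([15, 16]))
```

The design point is that the system never needs to output an exact age, only which side of the threshold a user falls on, so estimation error only matters for users whose true age is within the noise band of the cut-off.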
Porn verification
Walsh also confirmed that the scope of the age assurance trial extends beyond social media, and also includes limiting access to adult content for people under 18.
“It’s up to the government how they use the results of the trial. But online access to pornography is in play currently.”
“I would expect the results of our study at the very least to inform what expectations people like the e-Safety Commissioner have of this industry sector.”
Indeed the industry is currently developing a code of conduct about adult content that is meant to add some heft to what are currently voluntary standards.
According to Online Safety.org.au, a steering group of industry associations representing the online industry – the Software Alliance (BSA), the Australian Mobile Telecommunications Association (AMTA), the Communications Alliance, the Consumer Electronics Suppliers Association (CESA), the Digital Industry Group Inc (DIGI), and the Interactive Games and Entertainment Association (IGEA) – “Industry is seeking views on the draft Consolidated Industry Codes of Practice for the Online Industry (Class 1C and Class 2 Material) under the Online Safety Act 2021, which seek to protect children from exposure to online pornography and other harmful content.”
The scale of social media usage by children in Australia was revealed yesterday with the publication of the final report of the Joint Select Committee on Social Media and Australian Society, titled Social Media – The Good, the Bad and the Ugly:
- Smartphone ownership: Almost 100 per cent of adult Australians own a smartphone, and by the time children reach 16–17 years, 100 per cent of them have a device that is not shared with others.
- Social media usage: It is estimated that 78.3 per cent of Australians use social media.
- Daily use: 93 per cent of Australian adolescents use social media daily, spending an average of two to three hours on platforms such as TikTok, Snapchat, and Instagram.
- Age of access: The typical age at which young people begin using social media (Instagram and Snapchat) is late primary school, either with or without their parents' permission.
- Platform usage by age: Young people aged 12–13 years use an average of three social media platforms, while those aged 14–17 years use four to five platforms.
- Gender differences: Social media use varies by gender, with young women aged 14–24 years spending more time on social media than young men of the same age.
- Risks of harmful content: Approximately 45 per cent of young Australians reported being treated in a nasty or hurtful way online.
- Exposure to negative content: Almost two-thirds of young Australians aged 14–17 years were exposed in the last year to negative content, such as drug taking, suicide or self-harm.
- Mental health impact: The impact of social media on mental health appears complex; while multiple participants acknowledged benefits, there are significant concerns about the decline in mental health among young Australians.
The core of this complexity is not simply technological, it is also in the fact that while there are many harms that can be caused by social media, it also brings significant benefits to users, and is integral to the ways people interact in the modern world, particularly young people.
Head or tails
The Parliamentary Committee report drilled down into the dual nature of social media usage.
According to the report, "The core of this complexity is not simply technological, it is also in the fact that while there are many harms that can be caused by social media, it also brings significant benefits to users, and is integral to the ways people interact in the modern world, particularly young people. This makes the task of regulating social media to promote and protect those benefits, while minimising harm, a complex policy problem to solve."
It described how social media platforms such as Instagram, Snapchat, and TikTok have become integral to the social lives of young Australians.
While the platforms offer avenues for self-expression, community building, and access to information, the committee’s findings highlight significant risks associated with their use. The inquiry emphasised that social media can exacerbate existing mental health issues, with evidence indicating a correlation between excessive screen time and an increase in anxiety, depression, and body image concerns among teens.
However, the report also identified how many young people utilise social media as a tool for connection and support, particularly during formative years that are often fraught with challenges. Data from UNICEF Australia found that 81 per cent of teenagers believe social media positively influences their lives, providing a critical outlet for expression and community. At the same time, it highlighted that nearly 60 per cent of parents express concern over their children's social media use, indicating a stark divergence in perspectives between generations.
The case for a social media ban
Advocates for the ban argue that the current age requirement of 13 is insufficient to protect young users from exposure to harmful content, cyberbullying, and predatory behaviours. Parents and experts alike argue that children often lack the developmental maturity to navigate the complexities of social media safely. The Heads Up Alliance described social media as the "tobacco of our time," calling for urgent regulatory measures to safeguard youth.
The committee reported that social media platforms frequently deploy algorithms designed to maximise user engagement, which can inadvertently amplify harmful content. Research was cited suggesting that social media behaviours can alter neural development in adolescents, raising concerns about the long-term implications for mental health. This insight has fuelled the argument that delaying access could provide young people more time to develop resilience and critical thinking skills necessary for navigating the digital world.
The counterpoint: risks of a blanket ban
Despite arguments in favour of protective measures, the blanket ban's efficacy remains in question. Critics, including the Butterfly Foundation, argue that such restrictions might not address the root causes of online harms, suggesting that education and digital literacy should take precedence over outright bans. They contend that young people are determined and tech-savvy, often finding ways to circumvent restrictions, which could push them toward less regulated online spaces.
Further, the complexity of social media use is highlighted by the committee’s findings that individual experiences vary significantly based on personal circumstances and the nature of content consumed. A nuanced approach that includes young people's voices in policy discussions is essential for crafting effective regulations. Many youth advocates emphasise that solutions should be co-designed with young people, respecting their autonomy while ensuring safety.
For policymakers, the challenge lies in balancing the benefits of social media with the need for safety. The committee’s recommendations urge the Australian Government to implement a statutory duty of care for digital platforms, mandating them to assess risks and mitigate harm. This approach acknowledges the responsibility that social media companies have in protecting their users, particularly vulnerable teens.
Parents, too, must navigate this evolving landscape. Many express a desire for clearer guidelines and tools to help them manage their children's social media use. The inquiry suggests that increasing parental awareness about the risks associated with social media can empower families to engage in more informed conversations about digital safety.