Posted 10/10/2024 10:41am

Pic: Midjourney

Editors' Note: Many Fast News images are stylised illustrations generated by Dall-E. Photorealism is not intended. View as early and evolving AI art!

hAIku

Meta takes a stand,
For youth safety in their hand,
Bill's fate in the sand.

Meta responds to SA Government's Children (Social Media Safety) Bill 2024, points to 'unintended consequences' for app providers

Meta has submitted feedback on the South Australian Government’s Children (Social Media Safety) Bill 2024, welcoming the opportunity to build on its "engagement with the Hon Robert French AC during his consultations to prepare the Report of the Independent Legal Examination into Banning Children’s Access to Social Media (French Report)".

In a blog post from Meta Policy ANZ, the tech giant says it shares the government's objective of ensuring young people have an age-appropriate online experience and points to its heavy investment in safety and security measures. Meta went on to outline its own efforts to address child safety on its platforms.

However, Meta has raised concerns about the Bill as currently drafted, stating it will be challenging to operationalise for both industry and the proposed regulator, and may not achieve the intended net improvement to online safety. The company suggests the Bill adopt a 'whole-of-ecosystem' approach that requires app store/OS level age verification and parental approval before a child downloads an app.

"A more effective and simple approach for the Bill would be to adopt a ‘whole-of-ecosystem’ approach that requires app store/OS level age verification and app stores to get a parent’s approval before their child downloads an app, allowing parents to oversee and approve their teen’s online activity in one place. Teens and parents already provide companies like Apple and Google with this information and these companies have already built systems for parental notification, review, and approval into their app stores. Legislation should provide an overarching framework to this existing practice, require app stores to verify age and then provide apps and developers with this information, which can then be used by app providers as part of their individual age assurance tools," read the blog post.

"An app store/OS-level solution would not exempt Meta and other app providers from implementing their own age assurance tools. Rather, it would be an important complement to these efforts, which recognises the technical limitations of age assurance technology, the practical realities of how young people and parents use apps, and preserves privacy by minimising data collection.

"We want to be clear that we are not recommending this approach in order to divest Meta of our responsibility to ensure safe and age appropriate experiences for teens across our services - a narrative that has gained momentum in some circles but is very much misguided. We make this recommendation based on our long experience in building online safety into our products and services."

Meta urged the SA Government, and other governments weighing similar legislation, to consider the insights of Australia's leading mental health organisations, which have called for evidence-based measures to improve the safety of social media platforms for young people, and to ensure a genuine multi-stakeholder approach that brings industry into the conversation on the most effective long-term solutions.

The tech giant highlighted areas in the draft bill that it believes would "lead to unintended negative consequences".

"To illustrate this point with a local example, at present, under the current language proposed, an app provider could potentially be liable if a 13 year-old viewed the well-known video of a tour with the Calypso Star Charters11 that was originally posted on their Facebook Page and was shared by a family member in a family group chat.

"It is challenging to see how restricting young people from enjoying the small business entrepreneurship and beauty of South Australia in this way, which social media enables, is within the intended scope of the Bill."

Meta claims to have invested over US$20 billion in safety and security since 2016, employing around 40,000 people in this area. The company utilises proactive detection technologies to identify and take action against problematic content. For categories such as terrorism, child exploitation, or suicide and self-injury, it says the proactive detection rate is over 95%.
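For context, Meta's transparency reporting defines the proactive rate as the share of actioned content the company found before any user reported it. A minimal sketch of that calculation, with invented figures:

```typescript
// Proactive rate as used in Meta's transparency reporting: the percentage of
// actioned content found by the company before any user reported it.
function proactiveRate(foundBeforeUserReport: number, totalActioned: number): number {
  if (totalActioned === 0) return 0; // avoid dividing by zero
  return (foundBeforeUserReport / totalActioned) * 100;
}

// Invented example: 970,000 of 1,000,000 actioned items found proactively.
console.log(`${proactiveRate(970_000, 1_000_000).toFixed(1)}%`); // "97.0%"
```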

To further support teens and families, Meta has developed safety tools including defaulting new teen accounts to private, turning off location sharing by default, and applying warning labels to sensitive content. The company recently launched Instagram Teen Accounts in Australia, the US, UK and Canada: teens under 18 are automatically placed into accounts with built-in protections, and those under 16 need a parent's permission to make those protections less strict.
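As a rough illustration of those defaults, the sketch below encodes them as a settings object. The field names are invented for this example and are not Instagram's actual configuration schema.

```typescript
// Hypothetical sketch of the Teen Account defaults described above.
// Field names are invented for illustration, not Instagram's real settings API.
type TeenSettings = {
  privateAccount: boolean;
  locationSharing: boolean;
  sensitiveContentWarnings: boolean;
};

const TEEN_DEFAULTS: TeenSettings = {
  privateAccount: true,           // new teen accounts default to private
  locationSharing: false,         // location sharing off by default
  sensitiveContentWarnings: true, // warning labels applied to sensitive content
};

// Per the article: 16-17s may relax the defaults themselves; under-16s
// need a parent's permission to make the protections less strict.
function canRelaxSettings(age: number, parentApproved: boolean): boolean {
  return age >= 16 || parentApproved;
}
```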

Meta has also invested in User Age Group APIs in the Meta Quest Store, which help developers understand how old their users are and tailor a more age-appropriate experience. This is part of Meta's ongoing commitment to creating a safer online environment for young users, aligning with the objectives of the Children (Social Media Safety) Bill 2024.
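To show how an app might consume such a coarse age signal, here is a hedged TypeScript sketch; `fetchUserAgeGroup` is an invented stand-in for this example, not the actual Meta Quest Platform SDK call.

```typescript
// Illustrative only: fetchUserAgeGroup() is an invented stand-in for the kind
// of coarse age-band signal an age group API returns.
type AgeGroup = "child" | "teen" | "adult";

async function fetchUserAgeGroup(userId: string): Promise<AgeGroup> {
  // Stubbed for the sketch; a real app would query the platform SDK here.
  return "teen";
}

// Tailoring the experience to an age band rather than an exact birthdate
// keeps data collection minimal, echoing the privacy point Meta makes above.
async function configureExperience(userId: string): Promise<void> {
  switch (await fetchUserAgeGroup(userId)) {
    case "child":
      console.log("Supervised mode: social features hidden.");
      break;
    case "teen":
      console.log("Teen defaults: private profile, limited messaging.");
      break;
    case "adult":
      console.log("Full experience enabled.");
      break;
  }
}

configureExperience("example-user").catch(console.error);
```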
