Ex-Facebook boss: Making algorithms ‘dumb’ a smarter approach than teen social media ban that lacks support programs – campaigners disagree, legal challenges either way
Regulating algorithms and forcing transparency about how they work – or simply making platforms render them “dumb” for teenagers – might be a more proportionate response than an outright ban by the Australian government, argues former Facebook ANZ managing director Liam Walsh. He warns that a lack of support infrastructure means kids face shock impacts if “suddenly their whole network – not only their social network, how they commune with others – is gone”. But campaigners and school principals say the harms go far beyond algorithms: attention has been gamified, platforms made addictive in the process, and young teens are simply not developmentally ready for the pitfalls of the anonymous digital world. Either way, what comes next – education and support infrastructure for teens, and society more broadly – will be critical, so that the ban comes with a plan. And either way, says Walsh, prepare for legal challenges.
What you need to know:
- The proposed ban on social media for teens has polarised industry and academia, with warnings aplenty that it could backfire.
- Ex-Facebook MD Liam Walsh argues that dumbing down algorithms, forcing algorithmic transparency through regulation, or removing algorithms altogether could actually be the solution – if the effects of algorithmically generated dopamine hits and attention-hogging dark patterns on teenage mental health are the primary problem.
- “If we took that out, how many problems do we have with social?”
- Walsh warns society has no structures in place to deal with the fallout that could land in nine months’ time if kids “lose their entire network”. He doubts teens will “suddenly start hanging out in the park and helping old ladies paint their fence”. And with either option, expect legal challenges from platforms, he says.
- Production company Finch’s Rob Galluzzo and Greg Attwells fully expect challenges from platforms – which they claim have told staff to “stonewall” 36 Months, the campaign they founded with Nova’s Michael ‘Wippa’ Wipfli to push for a social media ban for under-16s.
- Dumbing down algorithms won’t cut it, says Attwells. Keeping the regulation framed around health, not tech, and moving fast is key, they suggest – with more backer brands about to break cover.
- The next phase is designing the massive educational and societal infrastructure required to fill the looming gap.
You can make an algo dumb, you can make it absolutely useless. It can just be chronological, so that a teen is just seeing a chronology, and they're not being surfaced with information from Andrew Tate and all those weirdos ... There isn't a technical barrier to it – it's tedious for [the platforms] and it will suck for them, but they can do it. If we took that out, how many problems do we have with social?
Cause vs. effect
Facebook Australia and New Zealand’s former boss Liam Walsh thinks banning teens from social media risks using a warhead to crack a nut. He fears there will be fallout from taking away kids’ main means of communication and suggests first trying more targeted approaches.
“I don’t know if I’d say I’m against [a ban for under 16s]. But I think from a diagnostic perspective, everyone is broadly in the same camp: the problem stems from algos [algorithms] – and the consequences of the algos sending information to teens and everyone else of things that they might like,” says Walsh, who is currently chair of Civic Data, among other non-executive director roles.
“That seems to be the cause of most of the issues. I would rather we were focusing most of our efforts on fixing that problem before we go to bans.”
He thinks targeting algorithms with regulation could be less counterproductive than raising the age limit to 16 – and questions why it’s not already happening.
‘Stockholm syndrome’
“It’s odd that we don’t have transparency [on algorithms]. We’re like Stockholm syndrome on that one, because we're kind of used to the tech platforms saying no you can’t have it. But in other industries, we just regulate and say, ‘you have to show us’. Like food, you have to show us what's in the food. So we could do that,” says Walsh.
“Transparency would be good, but it's not amazing. But you can make an algo dumb, you can make it absolutely useless. It can just be chronological, so that, for example, a teen is just seeing a chronology, and they're not being surfaced with information from Andrew Tate and all those weirdos,” he adds.
“So you can make it dumb. That's literally a decision that a government or a platform can make. There isn't a technical barrier to it – it's tedious for them, and it will suck for them, but they can do it. If we took that out, how many problems do we have with social?”
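To make concrete what Walsh means by a “dumb”, chronological feed versus a recommendation-driven one, here is a minimal, purely illustrative sketch in Python. The Post fields and the predicted_engagement score are hypothetical stand-ins, not how any platform actually implements ranking:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Post:
    author: str
    created_at: datetime
    predicted_engagement: float  # hypothetical score a ranking model might assign

def ranked_feed(posts: list[Post]) -> list[Post]:
    # Engagement-optimised ordering: whatever is predicted to hold attention floats to the top.
    return sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)

def chronological_feed(posts: list[Post]) -> list[Post]:
    # "Dumb" ordering: newest posts from the user's own network first,
    # with no inference about what the teen "might like".
    return sorted(posts, key=lambda p: p.created_at, reverse=True)
```

The contrast underlines Walsh’s point that the switch is a sort-order decision rather than an engineering feat: the cost to platforms of a chronological feed is in engagement, not technical difficulty.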
‘Do both’
Erica Thomas, Principal at private girls school Kincoppal in Sydney's Rose Bay, has almost 1,000 pupils under her watch from kindergarten to year 12. She told Mi3 about the highly damaging effects of social media she now sees playing out every day.
Concentration levels are dropping with learning and results affected despite teachers’ best counter strategies. Meanwhile, girls are being conditioned to perfectionism from a young age and exposed to “toxic and harmful advice” daily, while boys are pushed increasingly violent content and AI bot-created “highly sexualised” stereotypes of women.
Thomas says the negative impacts have accelerated in the last five years and thinks Walsh’s suggestion of making algorithms “dumb” to reduce increasingly extreme content being pushed to kids is a good idea – but in conjunction with a raised lower age limit, not instead of it.
The platforms will challenge our government whatever we put through [algo change or ban]. But in addition to that, if we put through a ban, realistically that means someone like a 15-year-old, for example ... Suddenly their whole network – not only their social network, their network, how they commune with others – is gone. That's kind of a big deal.
Social challenge
Either approach – or both – will likely be rigorously contested by the platforms and an army of lawyers. But it could be that a sovereign government imposing a ban, even if it is subsequently challenged, is the fastest approach that does not rely on platforms making technical changes.
Walsh, who held senior positions with Facebook locally between 2011 and 2014, including as ANZ managing director, thinks government will have little doubt about what is coming. But he says the implications go beyond legal battles.
“The platforms will challenge our government whatever we put through. If we put through a ban they will challenge us. If we put through a requirement to change the algo, they are going to challenge us.
“But in addition to that, if we put through a ban, realistically that means someone like a 15-year-old, for example, would wake up on whatever day it starts and suddenly they've lost their social platforms – and that's not trivial,” says Walsh.
While many parents are deeply worried about the negative impacts of social media, “they are not the only stakeholders, the kids are stakeholders too”, says Walsh.
“What happens to that community? Suddenly their whole network – not only their social network, their network, how they commune with others – is gone. That's kind of a big deal.
“So what I'm worried about is, if we do this, then where do they go? Because the kids, the 15 year olds, aren't going to just be hanging out at the park and helping old ladies paint their fence.”
The issue here is not just the algorithm. [There is also] the gamification of interaction, on Snapchat for example with Snap streaks and Snap scores, so it's addictive – so they're on it for longer and longer, trying to get their scores up. The other thing that I think is potentially dangerous at a young age is creating an online persona and starting to share yourself with the world before you have the maturity to do so.
Beyond algorithms
“The issue here is not just the algorithm,” counters Greg Attwells, a partner at 36months.com.au, a campaign to raise the minimum age of social media access from 13 to 16 years old – i.e. by 36 months. Nor is he swayed by concerns from the tech sector about VPN workarounds and the like when it comes to enforcing social media age limits.
Attwells cites the bewildered response from high-profile New York University marketing professor Scott Galloway to the claimed technical limitations of introducing age bans: "What's more challenging – figuring out if someone's younger than 16, or building a global, real-time communication network that stores a near-infinite amount of text, video and audio, retrievable by billions of simultaneous users in milliseconds, with 24/7 uptime? ... The social media giants know where you are, what you're doing, how you're feeling and if you're experiencing suicidal ideation, but they can't figure out your age? You can't make this shit up."
Attwells says the technical capability is "100% there – they [the platforms] just need a stick in order to act, unfortunately, and that's what this is about."
On opening up and dumbing down the algorithms, he argues: “Yes, it plays a role in the information that it feeds young people, but there are two other things that I think are really harmful within this window of development [between the ages of 13 and 16].
“One is the gamification of interaction. So on Snapchat, for example, I'm building up Snap scores and Snap streaks. I send photos to my friends, and so it's addictive – so they're on it for longer and longer, trying to get their Snap streaks and their Snap scores up,” he adds.
“The other thing that I think is potentially dangerous at a young age is creating an online persona and starting to share yourself with the world before you have the maturity to do so.”
Hence one of 36 Months’ key taglines: ‘Take another 36 months to get to know yourself before the world does’. Better that, says Attwells, than 13-year-olds “putting themselves out there for anonymous commentary on who they are, and then watching the likes or the lack of likes and making judgments about themselves based on that”.
“It's a minefield for a young mind to have to deal with. So it’s not just an algo problem – there's the gamification of interaction, and putting yourself out there for anonymous online commentary before you have the maturity to be able to handle that,” he adds. “I think there's a bunch of issues there that make it pretty dangerous.”
In every school improvement plan, wellbeing is present as a key area of focus – and the tenets of wellbeing are things like connection, compassion, community, things like self-esteem, belonging, resilience. That's our horizon two brief: how can we work with parents and educators to better cultivate these between the ages of 13 and 16?
Post-ban plan?
If the government does follow through with a ban – with both major parties committed – the challenge is what comes next in terms of the supporting infrastructure to soften withdrawal effects.
“We're going to have to have education programs across our society,” acknowledges Kincoppal principal Erica Thomas. “We're going to need to work with our young people [so that they understand] we’re not banning them from the internet – there’ll be ways to communicate effectively outside of [social media].”
36 Months’ Attwells says that is the next phase of work.
“Horizon one for us was all about legislation, campaigning to change policy and raise the age. Horizon two is a hundred per cent more about education, so the movement isn't just about removing something, it's about filling that gap with meaningful, positive growth for young people. Intentionally preparing them for the digital world when they are ready gives the movement depth and a clearer, longer term vision,” he says.
“For example, in every school improvement plan, wellbeing is present as a key area of focus – and the tenets of wellbeing are things like connection, compassion, community, things like self-esteem, belonging, resilience.
“That's our horizon two brief: how can we work with parents and educators to better cultivate these between the ages of 13 and 16 so that 36 Months becomes a window of opportunity to show up with more intention and imagination to better prepare young people for the world that awaits.”
Liam Walsh and Greg Attwells were speaking as part of a five-strong podcast panel, alongside Kincoppal-Rose Bay Principal Erica Thomas (get her frontline view here); Katie Palmer-Rose, head of influence agency Kindred; and Rob Galluzzo, CEO of production company Finch and one of the founders of the 36 Months campaign pushing for the minimum age for social media access to be raised to 16. (Read Palmer-Rose and Galluzzo's take here.)