Deep Dive 25 Jun 2024 - 10 min read

The pro-consumer privacy lobby speaks - and why the Federal Government listens on privacy reform clampdowns for hashed emails, geolocation, loyalty data trading and new definitions of personal information

By Paul McIntyre & Brendan Coyne

Frank exchange of views: Choice's Kate Bower (left) and Salinger Privacy's Anna Johnston say just about everything the digital ad industry takes for granted is about to become illegal.

There’s little contention today that the pro-consumer privacy lobby is winning the war over industry on privacy reform – they’re informed on industry techniques, loaded with compelling consumer research and aligned entirely on the need for a clampdown on the collection and use of an individual’s online data trail. Former NSW Deputy Privacy Commissioner and Salinger Privacy boss Anna Johnston and Choice consumer data advocate Kate Bower unpack what they expect from the series of hard, industry-challenging privacy reforms due to land in parliament next month – less than six weeks away – and why. Just how deeply the $25bn-plus marketing supply chain and tens of thousands of practitioners will be impacted will become clear as the reforms are tabled in Federal Parliament. Johnston and Bower think the updated Act will go harder than anywhere else in the world. Hashed emails will be classified as personal information. Trading of geolocation data will be out. Trading of loyalty scheme data – the stuff that powers retail media and a vast targeting-attribution industry – will require companies to prove they have lawful consent, and they won’t be able to deny services to those that say no. But consent, says Johnston, is a very fragile thing – and companies might actually be best off concentrating on one of the legislation’s central tenets: fair and reasonable use of data. In other words, says Choice’s Bower, does what you are doing with customers’ data pass “the privacy pub test”? If it does, the very high consent threshold doesn’t apply. Right now, most are badly flunking the test.

What you need to know:

  • Industry-challenging privacy reforms land in parliament next month.
  • Consumer advocates have Canberra’s ear.
  • The definition of personal information is set to be massively expanded, restricting if not outlawing just about everything digital advertising does as standard.
  • Consent is no get-out-of-jail card.
  • Choice’s Kate Bower and Salinger Privacy’s Anna Johnston unpack what’s about to hit brands and the marketing supply chain – and how to prepare.
  • There’s more in the podcast on what’s affected – from geolocation to loyalty data trading to hashed identifiers. Get the full download here.

We've got AI companies training on people's photos and messages to friends that are now being used to train large language models with no opt out ... We've got increasing scam losses, increasing identity theft, increasing examples of deep fake sextortion ... and consumers are asking where is our protection here?

Kate Bower, Consumer Privacy Advocate, Choice

Injury time

Plenty of companies have been guilty of talking up their tech smarts via industry forums – knowing when shoppers are in store, when they are on the train home from work and what they listen to on the journey. Those firms are about to be in breach of the Privacy Act – if they are not already, reckon Salinger Privacy boss Anna Johnston and Choice consumer data advocate Kate Bower.

The two represent a hard truth hurtling towards an industry still stuck at the midpoint of Elisabeth Kübler-Ross’ five stages of grief. The digital ad industry is still trying to bargain with lawmakers, but the truth is that the pro-consumer privacy lobby is winning the war over industry on privacy reform – and in six weeks’ time, the 116 proposed Privacy Act reforms land in parliament. The industry has convinced itself that Australians want personalisation – and Bower and Johnston actually agree. Just not at the expense of privacy. Bower sets the scene for those perhaps too close to the tools – and it's ugly.

"For the consumer, they've got things like facial recognition technology when they go to the shops, when they go and see their favourite band at the stadium. They've got things like facial analysis in digital billboards that are targeting ads towards them.

"They've got perpetrators of family violence abusing people through smart fridges. They've got photos of people on the toilet taken by vacuum cleaners that end up online. They've got companies that are holding on to data for over a decade that ends up in a data breach. Some of their most intimate information, things that they put in fertility tracking apps or things that they access through mental health services, is being packaged and sold to advertisers. We've got sick patients who are being told that they have to join a loyalty program or pay to see a GP," says Bower. But she's not finished.

"We've got AI companies training on people's photos and messages to friends that are now being used to train large language models with no opt out. We've got increasing scam losses. We've got increasing identity theft. We've got increasing examples of deep fake sextortion. All of these things are happening online, and consumers are asking where is our protection here? Don't we have a law that should be stopping all of these things from happening to us? They feel like that they have no control,” she says.

“We don't have a Privacy Act that's fit for purpose and actually protects and promotes the privacy of Australians.”

Wrong questions

Industry has spent much time and effort defending the case for targeting and personalisation. IAB Australia enlisted PwC to come up with a figure of $94bn of economic benefits from digital advertising and has been banging the drum along with ADMA and others for industry to push back on proposals they see as “unworkable”.

Yet consumer studies consistently find a major mismatch between how the digital economy currently works and what consumers want. Bower says that’s because industry is asking the wrong questions – and points out that another of the big four consultants, Deloitte, is warning of the same consumer confusion and concern around data protection and broader privacy.

“The reality is our lives are online. So when the question is posed as, ‘would you like your digital life to be more personalised and more curated?’ Well, that sounds like a good thing, but I think what people don't really necessarily understand is the processes,” says Bower.

“The industry tries to make it difficult to understand what is actually involved in the data collection and the data sharing that goes into making that personalised experience.

“Quite frequently it's all or nothing. So either you access the service and you agree to the data collection and therefore you get the personalisation, or you don't have access to the service, and then you miss out on all of those day-to-day necessary interactions that you have. And I think the perfect example of this is what Meta has just done with training their AI.”

When people joined Facebook 16 or 17 years ago they were delighting in 'poking' their mates. Or their nan. They could not possibly have knowingly consented to having their posts mined to train a proprietary large language model.

Now they face a choice of either “downloading that information and deleting their accounts [or] lose all of those memories, all of those interactions,” says Bower.

“So it's really about the questions that you ask and how you frame that for consumers. Obviously, consumers want personalisation, but they also want privacy, and it shouldn't be a choice between the two.”

We’re angry, and what we have seen recently is that anger is now boiling over. The community's given up on companies protecting their privacy. They're turning to the government, and they're saying it is your job to protect us from these exploitative kind of data practices.

Anna Johnston, Principal, Salinger Privacy

Privacy paradox

The disconnect between people saying they are worried about privacy but then not doing much about it is the “privacy paradox,” per Salinger’s Anna Johnston.

“Sometimes that privacy paradox is used as a reason by industry or certain sectors to say, ‘this shows that people don't really care about privacy’ because they're not behaving in certain ways online.”

The reality, she suggests, is “there's a real lack of understanding about how data use and sharing works”, especially when risks are “time-shifted” as in the Meta training its LLM example.

“It's not that people don't care about their privacy. They've just given up trying to get privacy out of for-profit corporations in particular,” says Johnston.

But she thinks that sentiment may be shifting – and lawmakers are feeling the heat.

“We’re angry, and what we have seen recently is that anger is now boiling over. The community's given up on companies protecting their privacy. They're turning to the government, and they're saying it is your job to protect us from these exploitative kind of data practices.”

Johnston suggests consumer discontent is perhaps the highest she has seen in 25 years in the field. She says that when even Deloitte is highlighting consumer angst, industry should probably accept the game is up.

“The Deloitte research into personalisation, for example, said that 80 per cent of consumers can see some kind of value in online personalisation, but only 30 per cent of them are happy with their current experience, and only 20 per cent are happy with receiving personalised online advertising.”

She points out that personalisation is too vague a term. Most consumers, she says, are fine with content recommendations based on their TV streaming history. But that’s a little different to “covert, cross-device, cross-app, cross-brand tracking and then matching up data between unrelated companies to build up profiles on individuals and then target them with ads or show them different offers or exclude them from seeing offers or offer them different prices online”, says Johnston.

“If that's what you mean by personalisation, then consumers absolutely do not want that.”

She thinks wholesale application of those practices may soon be outlawed via the Privacy Act overhaul.

“I would hope that the law reform will go some way to stopping that kind of activity happening … There needs to be a value exchange for the consumer that needs to be made really clear to them in language they can understand, and they need to be opting-in to that. It can't be ‘this is just the nature of doing business’. That's really what needs to shift.”

What will be made crystal clear with this proposed change to the legislation is that if companies are using any kind of identifiers of individuals – or pseudonyms, like device IDs, cookies, hashed emails, whatever it is – to track and match up and profile and then act upon or target individuals in some way, that is personal information. That is a big shift.

Anna Johnston, Principal, Salinger Privacy

Everything is personal

Changes to the definition of what constitutes personal information will present perhaps the biggest challenge for the ad industry supply chain – because it brings just about every tool and identifier into scope. Choice’s Bower thinks IP addresses and device IDs will now count.

“That's going to be a pretty significant change for many marketers and for many businesses who may be using those [as a] proxy for someone's identity. Those types of practices will now not be allowed,” she says.

Plus, if a person could conceivably be identified or “singled out” by two or three different identifiers, “those types of information will now be covered by the Act”, says Bower. “So it is a pretty significant change.”

Salinger’s Johnston thinks many of the digital ad industry’s current standard practices are already covered by the Act, but loose legislation and a lack of enforcement have allowed companies to do largely what they like. She thinks that’s about to change.

“What will be made crystal clear with this proposed change to the legislation is that if companies are using any kind of identifiers of individuals – or pseudonyms, like device IDs, cookies, hashed emails, whatever it is – to track and match up and profile and then act upon or target individuals in some way, that is personal information,” she says.

The Attorney General has already confirmed that is the government’s position.

“So this is a big shift for companies, especially the advertising, media, marketing sector, to wrap their head around. No longer can you say ‘because we don't have the person's name, we don't know who they are, therefore it's not personal information, and therefore none of the privacy rules apply to us’. That argument just doesn't wash anymore.”

That places major question marks over things like lookalike audiences, pioneered by companies like Facebook and now copied by most major media firms, where companies use hashed email addresses to create target audience segments.

“The disclosure of that hashed email address can no longer be described as anonymised or de-identified to the point where you're arguing that the privacy rules don't apply. They do apply,” says Johnston. “They apply now, in my view, but the legislative change will make it crystal clear that they apply.”

As such, marketers, media companies and intermediaries “must be on really solid ground, knowing that they have the lawful authority to disclose the hashed email address to Meta in the first place, and Meta will have to have the lawful authority to collect it and to use it.”
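For readers less familiar with the mechanics Johnston is describing, here is a minimal, illustrative sketch – assuming the common industry convention (used for custom and lookalike audience uploads) of trimming, lowercasing and SHA-256-hashing an email address before it is shared; the function name and example address below are hypothetical. The point it demonstrates: hashing is deterministic, so a brand and a platform holding the same raw email independently derive the same value and can join their records on it – which is why, on Johnston’s argument, a hashed email pseudonymises rather than anonymises.

  import hashlib

  def hashed_email(email: str) -> str:
      """Normalise an email address, then hash it with SHA-256.

      Mirrors the common audience-matching convention: trim whitespace,
      lowercase, hash. The output is deterministic, so anyone holding the
      same raw address derives the same value - which is why a hashed email
      still singles out one individual rather than anonymising them.
      """
      normalised = email.strip().lower()
      return hashlib.sha256(normalised.encode("utf-8")).hexdigest()

  # A brand and an ad platform hashing the same (hypothetical) address
  # independently arrive at the same key, so their records can be joined on it.
  brand_side = hashed_email(" Jane.Citizen@example.com ")
  platform_side = hashed_email("jane.citizen@example.com")
  assert brand_side == platform_side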

Johnston suggests that Australia’s loyalty data industry – which underpins the booming retail media sector – must likewise be on solid legal ground before trading on that data.

“Companies sharing data between each other – so the airline, supermarket, bank and the insurance company customer enrichment [businesses] – they will need to have some lawful authority under the Australian Privacy Principles to disclose and collect and use [that data] between them.”

Yes means no

Gaining customer consent is seen by many as a sure-fire way to obtain lawful authority for things like loyalty programs and data sharing. But Johnston says it’s not that easy, because one of the proposed reforms is around defining what consent really means – and 'yes' may turn out to mean 'no'.

"The consent must be voluntary, informed, specific and current. It must be an unambiguous indication of the person's intentions," says Johnston.

“I usually describe that as the ‘Would you like fries with that’ question. I’ve ordered my burger and the cashier says ‘would you like fries with that?’. I am free to say yes or no, and I still get my burger. I’m making the decision right now, and it is really unambiguous.”

Plus, consent can’t be bundled up with other things.

"It can't be included in mandatory terms and conditions. It can't be a condition of getting the good or the service. I must, as a consumer, be as free to say no as I am free to say yes and still get the original service."

Which has some loyalty operators fretting about busted business models (Woolworths is seeking an exemption) – and has far-reaching implications. That kind of informed consent is “very difficult to get”, says Johnston, and it “must be as easy for people to withdraw consent at any point as it is to give it in the first place”.

Hence: “When you realise how fragile consent is as the foundation on which to rest your lawful authority to collect or use or disclose personal information, you realise you're much better off looking to the legislation to see what else is possible.”

Which is where the ‘fair and reasonable’ test comes into play.

I think this is going to really shift the onus onto the business to consider whether or not their data collection meets this 'privacy pub test', rather than being able to rely on, essentially, what we have now – [which is] a notice and consent ... Would I feel comfortable explaining the practices that I'm doing to my grandmother, and would they think, ‘Oh yeah, that sounds good to me’?

Kate Bower, Consumer Privacy Advocate, Choice

Privacy pub test

The government has agreed in principle that businesses must determine whether their use of personal data is both ‘fair’ and ‘reasonable’. This has some in the pro-business lobby worried.

The ‘fair and reasonable’ requirement will also trump consent. Which means, as Privacy Commissioner Carly Kind said last month, “organisations won’t be able to ‘consent out’ of the fair and reasonable requirement to justify their activities”. Kind underlined that this goes further than any other jurisdiction worldwide – and suggested Australia will be able to take the lead in reining in some of the global businesses that have made trillions from trading personal data.

Choice’s Bower says the simple way to think about fair and reasonable use of personal information is via a “privacy pub test”: does it seem fair and reasonable to the layperson – and, crucially, does that hold true all the way down the line for any further use of that personal data?

“If we go back to the Meta example, people might consider fair and reasonable data collection for accepting that service to be that we see some [relevant] ads; if I’ve joined a page about recipes I might see some ads about food-related things.

“What we have seen that expand to is this huge ecosystem of advertising and trading of data and now into machine learning and other things. So would the reasonable consumer consider that those further uses of that information are fair and reasonable? I think this is going to really shift the onus onto the business to consider whether or not their data collection meets this test, rather than being able to rely on, essentially, what we have now – [which is] a notice and consent.”

Bower thinks applying the pub test to digital advertising and marketing should ultimately “make it easier” for the industry to work out whether their specific practices are lawful rather than trying to grapple too deeply with specifics and semantics.

“Would I feel comfortable explaining the practices that I'm doing to my grandmother, and would they think, ‘Oh yeah, that sounds good to me’?" If not, it's probably going to be challenged – bringing the risk of class actions led by lawyers arguing the toss on what is fair. 

Going for brokers

Salinger’s Johnston says that the regulator will flesh out ‘fair and reasonable’ guidance over time. But she says the Attorney General has already given some clear indications of pub test failures.

“They gave the example of a weather app. A user might reasonably expect to share their location when they're using the app in order to get the benefit – what the weather forecast is.

“But the reuse or the sale of their geolocation data by the app to a data broker is unlikely to be considered fair and reasonable in the circumstances,” says Johnston. “Even if the consumer had said yes … it can still fail the privacy pub test.”

I hear adtech, media, marketing [people] say things like, ‘we know when you're in the supermarket’ ... and ‘we know when you walk into a physical store after we've shown you an ad for X product’ ... if that's not already in breach of the Privacy Act … then it likely soon will be.

Anna Johnston, Principal, Salinger Privacy

Guilty parties

Johnston thinks there are plenty of businesses that do not meet that test – some closer to home than others.

“I hear adtech, media, marketing [people] say things like, ‘we know when you're in the supermarket’, ‘we know that you're commuting to work [and] we know what you're listening to on the radio when you're doing that’ … and ‘we know when you walk into a physical store after we've shown you an ad for X product’.

“But we also know that consumers don't want that. So it might be really effective marketing, but the community attitudes research is telling us that consumers don't want that level of tracking, profiling and targeting, and if that's not already in breach of the Privacy Act … then it likely soon will be, if these reforms pass later this year, because it will fail that fair and reasonable test.”

She warns there are “fundamental changes coming for those industries that haven't yet woken up to the fact that consumers don't want their data to be exploited and reused for the benefit of [those] companies.

“Those companies that have realised that they are just the custodians of consumer data, that they need to respect and protect it and only use it in accordance with their customers’ clearly expressed wishes will be ahead of the game,” adds Johnston.

“But there are certainly organisations that have been the beneficiaries of the last 20 years of the data exploitation market – and that is the market that is going to change.”

Prepare now

The incoming changes, if poorly understood and acted upon, can “absolutely be a toxic risk” for businesses, says Johnston.

She advises firms to prepare now.

“The first thing, especially for companies in this [marketing, media and advertising] sector, is understanding the real reach of the data in scope … It's going to [cover] anything that can be related back to an individual who can be distinguished from other individuals and acted upon in some way. If you're doing anything like that, then just assume it's personal information, and therefore the privacy rules will apply,” says Johnston.

“The next tip is to understand what a valid consent looks like and what it is not. But also when it's needed and when it's not,” she adds. “It is not always needed. You don't need consent to do everything under the Act.”

Once firms have got to grips with those aspects, they should refresh their understanding of the 13 Australian Privacy Principles, work out whether they comply with them and then “add the ‘fair and reasonable’ lens over the top”, says Johnston. “Do you pass that privacy pub test?”

If not, firms risk $50m non-compliance fines – with that part of the legislation already passed – and open themselves up to class actions. Ultimately, says Johnston, “there will be product lines and there will be business processes that simply have to stop. And frankly, that is the point of the reforms.”

Choice’s Bower suggests some firms will have to rethink their business models – especially those in the data catch-and-sell market.

“If your product or service is actually just a front for collecting data and then on-selling it ... That's not what consumers want. I think [those] people need to get back to the basics of making your product or service good enough that people want to buy it. Don't try and think about ways to exploit them in order to make more profit,” she says.

“It's really about time that businesses started respecting privacy as a human right.”

There's more in the podcast. Get the full download here.

