News Plus | 18 Sep 2024 | 10 min read

Brands, media owners and loyalty operators face difficult, fast decisions on all automated decision-making as Privacy Act forces policy – maybe business model – rewrite

By Andrew Birmingham and Nadia Cameron

Dean Gerakiteys, Teresa Sperti, Chris Brinkworth, Monique Azzopardi, Sarla Fernando, Adam Krass, Lyndall Spooner, David Parsons, and Nicole Stephensen

"Every company is a software company," as Watts S. Humphrey once famously observed, but this is probably not what he had in mind. As brands bid to wring every ounce of profit from every dollar earned through the efficiency of automated decision-making that understands customers better than they understand themselves, they now face a new and potentially costly risk – penalties in the millions of dollars, and a regulatory regime that makes it easier for the market policemen to write you a ticket. What's more, it's not fully clear how broad the definition of what constitutes automated decision-making will be – which puts almost everyone on notice. Privacy specialists Mi3 spoke with stressed there's much more work to be done than simply writing a new privacy policy. For starters, you have to adhere to it. That might mean costly and complicated business and system process changes that move at the speed of compliance – but in a market powered by the accelerant of generative AI.

What you need to know:

  • It's time to take your privacy policy seriously. If you are using customer data to automate decisions, your customers need to be able to understand that from your privacy policy. If it's not clearly disclosed and plainly articulated, expect fines.
  • Those fines will come thicker and faster thanks to two new penalty tiers including "administrative" breaches with fines up to $330,000 that the regulators can levy without going to court.
  • There's also a mid-tier of fines of up to $3.3 million for breaches that don't rise to the worst-case outcomes.
  • Then there's the small matter of how customers react to all the new transparency, especially as they have new opportunities to pursue companies through a statutory tort – although there is no direct right of action, something privacy experts say previous governments have been loath to introduce.
  • Privacy consultants say brands they work with are already in breach of the old rules, let alone the new ones.
  • And don't fall into the trap of believing it's simply a case of updating the privacy policy: you also need to make sure that business processes reflect the policy, and that in turn could lead to system changes and additional costs.
  • None of which happens quickly, with UM's chief digital, data, and technology officer offering an example of how it took a retail client working with a TV company a year to bed down details even under the existing law.

Maybe you actually need to adjust a bunch of things around automated decision-making that your entire business model has been based on.

Chris Brinkworth, Managing Partner, Civic Data

Here’s what’s changing: if your software uses personal information to make significant decisions about customers, prospects or consumers without human intervention, you need to disclose that in your privacy policy. That’s the easy bit.

What's more complicated and fraught is that you also need to ensure that your business processes reflect what you say in your policy, which may require your organisation to change the way its systems work, and how you collaborate with partners whose own lawyers probably want a quick word, or maybe a very long one with lots of warrants and sub-clauses.

Pull on the thread and the whole damn rug starts to unravel.

The privacy law grants your customer a cause of action through a new statutory tort – basically, a valid basis to pursue legal proceedings over a specific set of facts or circumstances that may have caused them harm or injury – something governments have been loath to agree to in the past, say privacy experts.

Automated decision-making (ADM) was one of the 30-plus areas of privacy covered in the long-running consultation the government ran before the legislative update, on which it said its mind was made up and no further discussion was required. Many of the other items on the list have been pushed back to next year, but ADM made it through – which speaks to how it is viewed as a priority.

While definitions of the “significant harm” a decision might visit upon a customer can already be discerned from Australian case law, for now there is no detailed definition in the legislation of what constitutes an automated decision.

On the current reading of the legislation introduced to Parliament last week, it could be everything from those huge multimillion-dollar real-time decisioning ecosystems that CommBank, NAB, and ANZ are building on the back of Pega’s software, through to something as simple as a JavaScript tag on a web page that triggers a decision.
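To make that breadth concrete, here is a minimal, purely hypothetical sketch – every name, threshold and storage key below is invented, drawn neither from the legislation nor from any vendor – of the kind of lightweight, tag-level logic that could plausibly count as an automated decision under a broad reading:

```typescript
// Purely hypothetical sketch: a lightweight on-page "tag" that uses a stored
// profile to decide, without human involvement, what a visitor is shown.
// All names (visitorProfile, offer tiers, riskScore) are invented.

interface VisitorProfile {
  loyaltyTier: "none" | "member" | "president";
  riskScore: number; // 0..1, derived from historical behaviour
}

// The automated decision: no human reviews which branch a visitor lands in.
function decideOffer(profile: VisitorProfile): string {
  if (profile.riskScore > 0.7) return "standard-pricing"; // no discount
  return profile.loyaltyTier === "president" ? "vip-offer" : "intro-discount";
}

// Runs on page load – exactly the "simple script tag" scenario above.
const stored = localStorage.getItem("visitorProfile");
const profile: VisitorProfile = stored
  ? JSON.parse(stored)
  : { loyaltyTier: "none", riskScore: 0.5 };
document.body.dataset.offer = decideOffer(profile);
```

A few lines of client-side code, yet it uses personal information to make a decision with no human in the loop – which is why almost everyone is on notice.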

Industry leaders Mi3 spoke with offered a wide range of opinions on the practices – already common in business applications today – that could qualify as rendering significant harm: accepting or denying credit card applications, using loyalty programs to offer differential experiences (“Welcome to the President’s lounge, Madam”), accepting or rejecting a job application based on AI analysis of a CV, or surge pricing based on the user’s behavioural profile as defined by the data you hold, including how panicky they get when the battery on their phone turns from reassuring green to OMG-red.

None of these are specifically called out in the legislation. Instead, according to Chris Brinkworth, managing partner at Civic Data, the kinds of decisions that may affect the rights or interests of an individual include:

  • a decision made under a provision of an Act or a legislative instrument to grant, or to refuse to grant, a benefit to the individual;
  • a decision that affects the individual's rights under a contract, agreement or arrangement;
  • a decision that affects the individual's access to a significant service or support.

The new law won’t stop you from doing these things, and it doesn’t go as far as GDPR for instance, which lets consumers opt out. But it does require you to disclose that you use personal information from customers and consumers to make these decisions, and it does require you to do what you say.

The new law also makes it easier – and faster – to penalise organisations and individuals who breach provisions of the Act, thanks to new mid-level and administrative penalty tiers that sit alongside the existing penalties for serious breaches.

For general privacy interferences not deemed as serious as top-tier breaches, a corporation can be fined up to $3.3 million and an individual up to $660,000. Plus, there is now a category of administrative breaches carrying fines of up to $330,000 for corporations and $66,000 for individuals, says Brinkworth.

But he cautions the business impact extends beyond fines.

“If you have ingested the local bus timetable via an API and you have taken all that knowledge, and everything you know about the customer from the historical data – what time they go to work, where you live for example – and then you charge them a lot more than someone else, the users need to understand from your privacy policy that the data you have on them enables you to do it.”
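A rough, hypothetical sketch of the kind of logic Brinkworth is describing – field names and thresholds invented for illustration – shows how a public data source and personal information combine into a single automated pricing decision:

```typescript
// Hypothetical sketch of the scenario Brinkworth describes. Nothing here is
// drawn from a real system; the point is that public data (the bus timetable)
// and personal information feed one automated pricing decision.

interface CustomerSignals {
  usualCommuteHour: number; // inferred from historical location data
  batteryLevel: number;     // 0..1, read from the customer's device
}

function priceMultiplier(c: CustomerSignals, nextBusInMinutes: number): number {
  let multiplier = 1.0;
  // Public data: the alternative is a long wait, so demand is captive.
  if (nextBusInMinutes > 20) multiplier += 0.3;
  // Personal information: it's this customer's usual commute hour.
  if (new Date().getHours() === c.usualCommuteHour) multiplier += 0.2;
  // Personal information: a near-dead battery signals urgency.
  if (c.batteryLevel < 0.15) multiplier += 0.5;
  return multiplier;
}

// If pricing runs through logic like this, the privacy policy must disclose
// that personal information is feeding the automated decision.
console.log(priceMultiplier({ usualCommuteHour: 8, batteryLevel: 0.1 }, 25));
```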

Quite apart from a customer’s visceral reaction to that kind of disclosure, Brinkworth says it's not as simple as updating the privacy policy – it's more a case of companies starting to realise "maybe I actually need to adjust a bunch of things around automated decisions that my entire business model has been based on".

His take: "Businesses have two years to get ready for this, but it might mean rethinking some of your automated marketing and customer management practices. It's all about being transparent with your customers about how you're using their data to make decisions that affect them."

But there's a kicker.

"In 24 months, the latest tranche of changes may well be upon us and that will likely include and updated list of terms that would be "considered personal information"' .  

That expanded definition could suddenly bring other existing systems under regulatory scrutiny, requiring businesses to reassess – and potentially redesign – those ADM processes, and to be fully transparent about them too, he believes.

"For example, if the new definition includes information inferred about individuals (as some privacy regimes do), it could dramatically affect how businesses view their ADM practices, especially in marketing and personalisation."

Public concern

“There has been a lot of public concern around automated decision-making in various contexts," Clayton Utz special counsel, Monique Azzopardi, told attendees at a recent media event.

"This particular reform is targeted on transparency of use of automated decision making – where computers are using automated decision-making, and a person’s personal information is involved in those processes.

“Entities subject to the Privacy Act will need to disclose in their privacy policies where they do use automated decision-making and it meets the specified test – they have to make that transparent. It’s very much around making transparent the use of more emerging technologies.”

But privacy policy disclosures are just the start, says Teresa Sperti, founder of advisory firm Arktic Fox.

There are two material impacts for marketers, she says.

“The first, around automation, is that you will need to disclose to customers how you're using data and automated decisioning and how that might treat them differently as a result.”

The second material issue for marketers is that there is now a statutory tort – a legislated cause of action for a civil wrong – that could lead to class actions, for instance, if the OAIC finds serious harm has been done. For now, there is no direct right of action that would allow individuals to go straight to the courts, but that may come in the second tranche.

The reason this will be a problem is that, according to Sperti, “Many of the businesses I come into contact with are not compliant with today's Privacy Act.”

She says there is very little room left for delay, and she is keen to puncture the myth that this is just about updating privacy policies.

“It's not about rewriting your privacy policy, it's about enacting it across the organisation.”

This is not something marketers can leave to a Privacy Officer or an IT manager, Sperti underlines.

“They will only guide and help write the policy. The enacting happens within the business and that's the challenge. The question is, how do we change our business processes in order to deliver on that and ensure that we comply with privacy standards?”

Wide implications

According to Nicole Stephensen, managing director of Ground Up Privacy, there is potential for ADM to have adverse privacy impacts in a number of ways, for instance through erroneously made decisions, or a decision based on incorrect, out-of-date, or inferred personal information.

She stresses that the apparent ‘size’ or cost or visibility of the technology deployment is irrelevant.

“For example, a seemingly benign example of ADM might be the use of a chatbot to triage a person’s interactions with an organisation in relation to services, benefits or entitlements or even penalties and fines.”

It is increasingly an issue as automation is applied to things like making insurance claims, setting up bank accounts or credit facilities, accessing the right allied health services for your needs, or disputing a ticket or other bylaw enforcement, she says.

“Is this (the new law) a big deal? Yes. The new obligation in law adds to existing requirements to collect and handle personal information in accordance with the Australian Privacy Principles and acknowledges that there are unique risks to privacy associated with people not knowing that automated decisioning is being used."

The more things change…

Stephensen told Mi3 that one thing that hasn’t changed for marketers is the need to conduct their marketing and other business activities in line with the Australian Privacy Principles.

“If you do that you will be fine. If you don't conduct them in line with the Australian Privacy Principles, you could find yourself in strife."

The problem for marketers is that the loose, lazy days of paying lip service to privacy, or deprioritising it, are likely over – and the kind of strife they can now find themselves in has changed materially, she suggests.

“There are civil penalties that could apply in the marketing context, and that wasn't there before,” she said.

As to what marketers should do next, Stephensen recommends, as a first step, investigating whether the business genuinely has internal structures relating specifically to privacy and the protection of personal information. “Do they have their basic privacy governance and hygiene in place?”

The implications are far-reaching, she says.

“If you have the structure as an organisation, you then have the ability to inform all of the various different marketing projects, ad campaigns, etcetera that you intend to do. That’s true whether you're involving complex technologies or not – you will have the underpinning governance that should help you to do that safely. So getting your privacy house in order as an organisation is a top priority,” Stephensen continues.

“But then it will be really important to check to see that what you say and what you do in your privacy policy is, in fact, what you do. And if any of those things that you do could raise a privacy concern, it's worth going back and having a look and seeing if there's some other less privacy-invasive way to do that activity or to achieve that outcome.”

Every single marketer will find automated decision-making will probably decide who will or won't receive certain offers. The data they collect today will, if put into AI and automated decision systems, have an output that could impact the end consumer. And if the end consumer finds that is significantly affecting an individual's rights or interest, then they will be in breach.

Sarla Fernando, Director of Regulatory and Advocacy, ADMA

For Sarla Fernando, ADMA director of regulatory and advocacy, ADM data notices are just the start of what’s coming.

She describes it as a reform that cements why marketers need to start interrogating their data practices and collection not just for today, but in preparation for the future. Why? Because the definitions the Government has agreed to in principle that are on their way – like changing personal data from ‘about’ to ‘relates to’, or a shake-up of what’s considered sensitive information – all have a bearing on what’s collected and made transparent as this first tranche is enacted.

“Every single marketer will find automated decision-making will probably decide who will or won't receive certain offers, what pricing people might get, and what area will receive certain offers. The data they collect today will, if put into AI and automated decision systems, have an output that could impact the end consumer. And if the end consumer finds that is significantly affecting an individual's rights or interest, then they will be in breach,” Fernando says.

“What does that mean? That means a marketer today has to start thinking about their future campaigns and future use of the data they collect for their campaigns. There is that possibility anywhere down the line the data that's collected is then used to create insights – and don’t forget, the creation of insights will be considered to be a collection [of data]. Marketers at this stage might think collection is when they get the data from a consumer," she told Mi3.

"But the Act is going to clarify that inferred data, and data used to create insights, will be considered a collection. So if the data they collect in the traditional form today is used to create insights that will then be used in automated decision-making and have a significant effect on an individual's right or interest somewhere down the line, the marketer needs to ensure that is both put in their notice upfront, and that it's clear how data is going to be used in that form."

“They can't just say it may be used in ADM; they’re going to have to state how that could be used. They're also going to have to think about whether or not consent for that data can be removed and withdrawn just as easily as it is obtained. And they're going to have to think about whether or not that can be considered to be ‘fair and reasonable’. It’s a significant shift.”
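One way to picture Fernando's point that insights are themselves a "collection" is as a derivation step: the moment raw data you already hold is turned into a new attribute about a person, that attribute is itself personal information. A hypothetical sketch, with invented field names and segments:

```typescript
// Hypothetical sketch, with invented field names: deriving an insight from
// data already held creates new personal information – itself a "collection"
// that the upfront notice has to cover, on Fernando's reading.

interface PurchaseEvent {
  item: string;
  timestamp: string;
}

interface InferredInsight {
  segment: string;       // the newly created personal information
  derivedFrom: string[]; // provenance: which raw data produced it
}

function inferSegment(purchases: PurchaseEvent[]): InferredInsight {
  const babyItems = purchases.filter(p => p.item === "nappies").length;
  return {
    segment: babyItems > 2 ? "young-family" : "general",
    derivedFrom: ["purchase history"],
  };
}

// The derived "young-family" label never came from the consumer directly,
// but it would be treated as collected personal information all the same.
console.log(inferSegment([{ item: "nappies", timestamp: "2024-09-01" }]));
```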

Who cares?

“The thing is, until you know better, you don't care. I think our generation [X] will be the last generation that can compare to a time when our data was not always automatically given. Anyone who has followed us doesn’t know any better – until they see the alternative, which I think these laws will start to show.

“The automated decision-making changes are not a small step for the actual operational application a marketing department is going to have to follow through on in implementing privacy reform.

“You cannot prepare your privacy policy to set out the types of personal information used in the types of decisions made by systems without first understanding: what is personal information? Secondly, how do you plan to use it in the future? Which categories of personal information are sensitive? And what consent will be required, and how does that have to be given? The wording seems simple, but the actual process to get the policy written will require a fundamental shift in how marketers are thinking. It requires them to move according to the way the reform is going in the second tranche.”

Working with a retail client, it took close to a year to finalise with their legal team [how it would work with a TV network] in terms of how we renegotiate our needs, how we look at data enrichment for that client so we have a better understanding of their sponsorships and their customers when we're working with one of the TV networks.

Adam Krass, Chief Digital, Data and Technology Officer, UM

Systemic impacts

Adam Krass, UM’s Chief Digital, Data and Technology Officer, says his company has already started work with clients on the impact of the privacy law changes on the systems that manage affected business processes.

“Where we're speaking to our clients about automated decision-making regulations we’re specifically looking at – for example – what that means in terms of how we set up the CDP, or how we set up their warehouse.”

One of the implications of the new rules is the need to understand the impact of data collaboration with partners from the perspective of the audience.

“When we start to talk about all those new tools like AI that sits on top of data warehouses and the CDP … what does that mean in terms of how do we actually go about building insights on our customers? How do we then start to build our personalisation tactics off the back of that – which is already difficult in the current regulation.”

He's not kidding. By way of example, he cites a client trying to work out how it would do more data-driven TV advertising.

“We’re working with one of our retail clients at the moment and it took close to a year to finalise [the approach to the network] with their legal team, in terms of how we renegotiate our needs, how we look at data enrichment for that client so we have a better understanding of their sponsorships and their customers when we're working with one of the TV networks. And they start to build the product around that, how they communicate with those audiences, and how they follow that journey and onto their own side from a personalisation perspective," says Krass.

“The automated decision-making regulation is going to add on a whole new legal element that sits on top of that. What does that mean from a legal perspective, [as well as] the amount of work for the people that sit in all of those meetings and the costs to the business to try and build our client-facing solutions and customisations?”

Applicability

Richard Taylor, managing director of Digital Balance, a Melbourne-based DX agency, sees wide applicability across sectors given the existing prevalence of automated decisioning. "Health, finance, insurance, even retail and ecommerce. That includes things like loyalty rewards or access to exclusive offers."

The business process implications of the Act are wide-reaching. "Can you explain to your customers how the decision was made and what data was used to make that decision? For example, they could specifically ask: what was it about my loan history that led you to that decision?"
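Answering that kind of question means keeping a record of exactly which inputs fed each automated decision. Here is a minimal, hypothetical sketch of such an audit trail – the types and reason codes are invented, not drawn from any product mentioned in this piece:

```typescript
// Hypothetical sketch of a decision audit trail. The goal is to be able to
// answer "what was it about my history that led to this decision?"

interface DecisionRecord {
  customerId: string;
  decision: "approve" | "decline";
  inputsUsed: Record<string, unknown>; // the exact data the decision saw
  reasons: string[];                   // human-readable reason codes
  decidedAt: string;                   // ISO timestamp
}

const auditLog: DecisionRecord[] = [];

function recordDecision(record: Omit<DecisionRecord, "decidedAt">): void {
  auditLog.push({ ...record, decidedAt: new Date().toISOString() });
}

// Example: a declined loan application, logged with the inputs that drove it.
recordDecision({
  customerId: "cust-123",
  decision: "decline",
  inputsUsed: { missedRepayments: 3, creditUtilisation: 0.92 },
  reasons: ["3 missed repayments in last 12 months", "utilisation above 90%"],
});
```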

Taylor also sees a resource crunch, not only for brands but for regulators. "It's going to be a lot easier to make a complaint, but at the same time the Privacy Commissioner has said their budgets have been reduced and their staffing has been reduced."

For now, Taylor is cautious about the kind of work Digital Balance does around automated decisioning, preferring to focus on practical implementation matters for clients rather than matters of policy. "We want to stay away from those systems. We could say, yes, you can use this engine – like Bupa was using Pega and Tealium combined to make the [decisioning] engines. We help them implement those systems, but we don't actually help make the rules for those systems."

The average person still has no idea what automated decision-making is; they think it's just technology making life simpler and easier for them ... It's going to take a show like Sunrise or Today Tonight to explain this to the average consumer.

Lyndall Spooner, CEO at Fifth Dimension Research & Consulting

Loyalty programs

David Parsons, CEO of customer loyalty program specialists Ellipsis & Co, thinks more transparency through privacy policies and terms and conditions can only be a good thing for loyalty program providers seeking to maintain trust and integrity.

He claims many operators have already been proactive in ensuring the increasingly sophisticated technology systems driving their programs keep pace with changing and pending privacy legislation, as well as customer expectations.

The other benefit in their corner, he says, is the clear understanding that already exists around the value exchange between customer and loyalty program provider: Personal data equals better promotions, offers and personalisation.

“It's not about if we should insure this person or not, or lend money to them or not,” he tells Mi3. “A lot of brands are already talking to members about personalised offers and signing up to propositions that give them richer offers based on who they are and what their preferences are.”

Which is why Parsons says he’d be surprised to see negative fallout from having to be more transparent about the way data is used within the construct of these programs and their underlying technology systems.

“We’re fortunate as an industry, provided we can continue to convince our members it's in their best interest we know more about them so we deliver better offers, experiences and treatments to them. The fact we're using more advanced technology and their data to do that should hopefully be acceptable to them as opposed to it being alarming for them,” Parsons says.

“Again, if it's indeed true, which I think it is, that most of the application of those systems are for better experience, better marketing, better personalisation, then if you get that right, you're actually working in the customer's best interest anyway.”

Do consumers care?

Either way, the million-dollar question is whether consumers will, firstly, pay attention and, secondly, even care.

“The leap of faith is the consumer in general understands there is a fair amount of science and technology that goes into being able to operate those programs at scale. What we’re doing is admitting that in the privacy policy,” Parsons says. 

Lyndall Spooner, CEO at Fifth Dimension Research & Consulting, believes most average Australians still lack an understanding of how their personal data is collected and used – and younger generations like Gen Z possibly don’t care, given their heavy reliance on technology.

She points to how much damage digital manipulation and tech can do to the brain, citing research into the negative impact on consumers’ self-efficacy and their ability to solve problems on their own as a case in point.

“The average consumer still has no idea what they're actually agreeing to, and I think this is where it's still lacking transparency to the average person,” Spooner says.

Too many businesses think purely about getting consent, rather than the ethical question of whether consumers actually understand what they’re agreeing to, Spooner continues.

“We think we are becoming more transparent, and getting the right permissions by asking people to click on 10 things instead of agreeing to one thing? The average person still has no idea what automated decision-making is; they think it's just technology making life simpler and easier for them,” she adds.

“But they're going to become more educated as we go through this, and as the mainstream press starts to pick it up, because that's what it's going to take – literally a show like Sunrise or Today Tonight – to explain this to the average consumer.”

Renewed scrutiny

According to Clayton Utz partner, Dean Gerakiteys, “By implementing legislation like this – whether it's the tort, or other aspects of the reforms – and having Australian end-users be very conscious now of their individual rights, it’s educating them to start asking questions they might not have been asking before.”

He said that where the law firm’s clients are active in customer-facing industries, “their teams are going to be concerned about what their customers are concerned about.”

“The more their customers know they're getting these individual rights, or are getting things close to what they're seeing overseas, and the more they think there are significant reforms, that changes that discourse.”

That will force businesses to start thinking about the implications ahead of time. “Not just because they're going to be law, but because they're good business," he said. “That always seems to drive behaviours, even before legislative change.” 
