News Plus | 11 Mar 2024

Be worried: Australian privacy commissioner on what brands need to do next on personal information handling, fair data use and breach response as the AI threat looms

By Nadia Cameron - Editor - Marketing | Associate Publisher

There’s ever-growing consumer concern around the way personal information is handled off the back of Australia’s high-profile data breaches at Optus, Medibank and Latitude, as well as AI use, says former Privacy Commissioner and now Information Commissioner, Angelene Falk – and brands should be worried. Prevention is better than cure when it comes to meeting consumer demands and forthcoming privacy legislative change, and Falk is urging businesses to ensure data breach plans are not gathering dust but are operationally ready. Where businesses also need to do better, Falk says, is around the human intersection with systems and information. That’s especially the case with more AI on its way, consuming ever more data sets from all edges of the internet, often without the ethical check of human oversight.

What you need to know:

  • Australia’s former Privacy Commissioner and now Information Commissioner, Angelene Falk, warned brands at the recent ADIA Leaders Forum in Sydney against complacency in the face of the protracted but still forthcoming changes to the Privacy Act.
  • Among the most notable changes recapped by Falk are the shortening of Notifiable Data Breaches (NDB) scheme reporting requirements to 72 hours; the inclusion of businesses with annual turnover under $3m; the introduction of a ‘fair and reasonable’ test for the use of personal data; and a swathe of new penalties at the OAIC’s fingertips.
  • In the face of these changes, growing data breaches and heightened consumer concern over how their information is being used, the privacy commissioner urged the adoption of privacy by design and practical, usable data breach and communications plans.
  • The privacy commissioner also touched on the importance of not just relying on technology, but also investing in education and training for the humans in the mix.
  • Growing use of AI also has consumers worried about their personal information. While highlighting a number of ways the current privacy law and forthcoming Privacy Act reforms will help, Falk admitted there’s a deficit given how rapidly Gen AI in particular is scraping data from all edges of the internet.

Speaking at last week’s ADIA Leaders Forum in Sydney, Australia’s former Privacy Commissioner and now Information Commissioner, Angelene Falk, took attendees through the recommendations for updating Australia’s Privacy Act, and what businesses should be doing to prepare for their impending rollout.

It’s an onerous to-do list if you haven’t started building privacy by design into your organisation yet. In all, 38 of the 116 recommendations released last year to overhaul the Privacy Act have been agreed to in principle by the Federal Government. Yet more than four years on from commencing the process, protracted consultation continues on how to execute such changes through legislation and put them into practice.

That shouldn’t be causing lethargy across businesses, however, and Falk was quick to flag the need to get your ducks in a row now.

Asked during the ADIA event what companies should concentrate on, the Office of the Australian Information Commissioner chief said prevention is better than cure and having a plan in the event of a data breach occurring is non-negotiable – something she is concerned businesses are lagging on.

“I’d be making sure if you’re not already building privacy by design and assessment for any handling of personal information, that you build that in as part of risk assessment right now. Think of it as the person on the street test,” Falk told attendees.

“There’s an old example I use from Facebook, where they were manipulating user feeds to see if it affected people’s emotional responses. In the privacy policy ... they thought they had consent to do that. There was big community backlash about it – and rightly so. I can see from people’s expressions in this room that it’s an unfair way of handling information and could incur risk of harm to individuals.

“Taking that lens to your work and doing that now is key.”

In the OAIC’s most recent longitudinal consumer survey, a strong call for businesses to do more to protect personal information was evident, with 92 per cent of respondents agreeing more needed to be done. In addition, 89 per cent of the community want the Government to pass more legislation to make this happen.

In response, Falk noted the Government has agreed in principle to new requirements for fair and reasonable handling of personal information, which have the potential to transform the way targeting and other forms of personalised marketing are conducted. It’s also one of the reasons more proactive, consent-based approaches to capturing data are increasingly being promoted across the marketing and media industry.

The second must Falk outlined is having a data breach response plan in place. A significant change coming with the overhauled Privacy Act is a shortening of Notifiable Data Breaches (NDB) scheme reporting requirements. Instead of up to 30 days, companies will need to report within 72 hours of becoming aware a breach has happened.

“Ensure that plan is not gathering dust in the bottom of a drawer but has been operationalised,” Falk advised. “The 72-hour notification requirement means you’ll also need to have good detection systems in place so you know when something has occurred and can take action, notify the individuals and, importantly, put in place those remediation steps to protect the individual.”

Beyond the tech, the privacy commissioner said a big area businesses need to do better in is training individuals and recognising the humans in the mix.

“What’s consistent is the intersection with human beings. It’s really important to not just have technology safeguards in place around data, but also training and education for your people to make sure they can identify tricks of malicious actors,” Falk said. “We’re seeing so much more identity fraud as there’s a lot of information on the dark web after the massive data breaches we’ve seen.

“This allows malicious actors to impersonate legitimate parties more easily – they can masquerade as a CEO or head of finance and ask for information to be sent through.”

Falk further stressed the need to think about the supply chain and whether multiple parties are holding the data. Take the example of last year’s Dymocks loyalty program data breach, which saw customer records exposed on the dark web. In this instance, an internal investigation concluded the breach was caused by unauthorised access to systems run by the retailer’s new loyalty provider to temporarily store records while an update was occurring.

The third piece is readiness to respond in terms of external communications and assisting individuals to navigate the impact of a data breach.

“Identity fraud is really commonplace and it takes people days, weeks or months to re-establish identity through a new passport, a new driver’s licence,” Falk said. “Having in place arrangements with ID Care, a not-for-profit you can contract with to assist people to re-establish identity, is one way; and make sure you have something prominent on your website, like a phone line, to get information when a breach occurs.”

Under amendments to the privacy laws made in December 2022 as a direct consequence of the Optus breach, organisations are now required not only to send notices to affected individuals, but to be very specific about what information has been compromised – for example, specifying if it’s a tax file number, email, postal address or other identifiable information. They are also required to set out steps people can take to protect themselves from downstream harm, such as changing passwords.

“This is again about putting individuals at the centre – both in preventing data breaches and then protecting them in your response,” Falk said.

Growing deficiencies around AI use

The importance of recognising humans again came up when Falk was asked about the Government’s inquiries into responsible AI practices, which took shape via the discussion paper, Safe and Responsible AI in Australia, released last year. In its interim response, the Government flagged Privacy Act reform as one of several mechanisms available to address known harms with AI, along with other existing regulatory frameworks, such as the Online Safety Act 2021.

Falk pointed out the Privacy Act already applies to AI in that any personal information use is regulated.

“But it’s fair to say the law is finding itself being applied in ways it wouldn’t have been expected to a couple of years ago,” she said. “Think about Gen AI – often, it’s brought together through data scraping techniques from the Internet. If that’s sensitive information about health, for example, it should be collected with consent. There’s a deficit there we can already see.

“When deploying AI within organisations, you raise the ethical issues. We see the change in the Privacy Act to have a ‘fair and reasonable’ test will go to some of those ethical questions: They’ll require us to ask the question of whether this will do harm; if this is a proportionate use of information, and if there’s a less intrusive way to operate. Also thinking about that in terms of the use of children’s data.”

A couple of aspects of the Privacy Act reform go directly to the AI issue. The first, Falk noted, is that privacy policies would need to be more transparent about the use of AI or automated decision making. The second is that individuals would have the right to request information on how automated decisions are made where a decision has had a substantial effect on them, such as in a legal context.

“That starts to give the requirement for explainability a legal foundation,” Falk said. “Saying it’s part of the black box will no longer be accepted. It also brings to the fore the issue of human oversight. We know from our privacy survey the level of trust with using AI is really low in Australia – only 15 per cent of respondents were comfortable with business using AI to make decisions about them.

“But interestingly, the proportion or level of comfort increases if there are additional protections, like human oversight.”

Recapping what the Privacy Act overhaul is set to bring

There are several big-ticket items set to be introduced with the Privacy Act overhaul that Falk highlighted and that are worth repeating. A big one is the growing call for the 97 per cent of Australian businesses with annual turnover under $3 million to be captured within new legislative obligations around data and privacy – something Falk is a big supporter of. This is despite concerns around the additional compliance burden this might place on smaller businesses – something a couple of small business attendees raised at the ADIA event.

Another seismic shift is introducing a ‘fair and reasonable’ test for how personal information is handled. “In my regulatory experience, that for the first time expressly requires organisations to put the interests of the individual at the heart of [data] handling practices and [consider] if they’re likely to cause harm. And if so, [action] less privacy intrusive ways to proceed,” Falk said.

Another big change already mentioned is the shortening of Notifiable Data Breaches (NDB) scheme reporting timeframes. Additional consumer rights, meanwhile, include the right to have personal information deleted and to ask a company where the personal information it uses came from. The new-look code also proposes minimum and maximum data retention timeframes, with provisions drawing attention to destroying personal information when it’s no longer needed.

Finally, a range of new penalties is set to be made available to the OAIC. For example, failure to have a compliant privacy policy could see a business issued with an infringement notice and fined. At a mid-tier level, less serious or one-off interferences with privacy that warrant a civil penalty will be penalised accordingly; then there are top-tier penalties and court action for the most egregious conduct.

Meanwhile, the incoming Privacy Act overhaul opens up greater potential for class actions.

Right now, the OAIC is in the midst of conducting major investigations into the data breaches that occurred at Optus, Medibank and Latitude. It also launched proceedings last November in the Federal Court against Australian Clinical Labs after the pathology provider took months to notify authorities of a data breach that saw private data from at least 23,000 patients exposed on the dark web.

- For more of Mi3's in-depth coverage on the proposed Privacy Act reforms, check out our full review here.

- And check out our latest report into the Consumer Policy Research Centre's response to how consumers really feel about marketing and advertising using their data here. 

 
