By Johanna Leggatt
Recent controversies such as the Cambridge Analytica scandal and the management of the Australian Government’s My Health Record database have again highlighted concerns about our data privacy. As privacy laws around the world evolve, we asked members of the MLS community about the challenges faced by businesses, governments and consumers.
There is nothing like a mass data breach to focus people’s attention on how personal information is stored, used and potentially harvested for nefarious purposes.
A recent high-profile example of this was the Cambridge Analytica scandal, in which the data analytics firm used personal information from more than 50 million Facebook profiles to target US voters with political advertisements during the 2016 presidential election campaign.
The public blow-back was fierce and immediate.
MLS alumna Carolyn Lidgerwood (BA(Hons) 1989, LLB(Hons) 1991), Head of Privacy at mining firm Rio Tinto, says the backlash to scandals like this shows that people do care about their data.
“People often say that the younger generation don’t care about privacy because they share so much data online, and I think that is really naïve,” she says.
“People share when they think they have control. When people suddenly realise they are not in control, then they are outraged.”
MLS Associate Professor in Health Law and Regulation Dr Mark Taylor says the Cambridge Analytica exposé highlights a weak point in data privacy laws.
“Privacy law at the moment is focused on regulation of data as it relates to identifiable individuals and isn’t so good at regulation of data when it relates to groups,” he says.
“So with Cambridge Analytica, they generated model profiles using Facebook data put together with other data from a personality quiz, and it was those profiles that could then be used to micro-target political campaign adverts.”
As Taylor points out, simply deleting the individual data used to construct the profiles did not prevent those profiles being used.
“It seems a bit odd that if the data relates to one person we are interested in the harms that may follow the use of that data, but if it relates to more than one person we don’t seem as well placed to regulate it,” he says.
Just as the law must keep pace with the way technology is manipulated, so too must companies ensure they are handling customer and staff data sensitively.
Lidgerwood has worked in data privacy law across a range of sectors. In her role at Rio Tinto, she is responsible for managing the global data privacy compliance program across all operations — from conducting privacy impact assessments when new technologies involving personal data are introduced, to ensuring adequate contractual protections are in place when outsourcing data processing.
“A lot of my job is making sure our global program is kept up to date because privacy laws are evolving around the world so quickly,” Lidgerwood says.
“We also deal with a lot of personal data internally, as we have more than 50,000 staff across six continents, as well as data from our customers and suppliers.”
Despite the variety of international privacy laws that Lidgerwood is juggling, the same core principles of data protection underpin many countries’ legal frameworks.
“It means that the core data privacy obligations of companies, whether they’re in Australia, Canada, Singapore or Europe, do not change that much,” she says.
“Those core obligations include that companies should only process personal data for limited and specific purposes, and those purposes need to be clear, notified and, where necessary, consented to.
“You only collect what you need and only for specified purposes. You keep the data secure, you keep it up to date and accurate, and you’re open about what you’re doing with it.
“You need to recognise there is a balance between allowing companies to use data for their own legitimate business purposes and respecting the privacy rights of the person.”
Of course, some companies are more proactive about procuring individuals’ consent to data use than others, and Taylor argues there is work to be done in developing robust concepts of “meaningful consent” in an online environment.
“We have allowed the concept of consent to be debased in a way,” he says.
“We are allowing groups and individuals to get away with claiming valid consent to uses of data when the individuals about whom the data relates have no real understanding of what clicking the button implies for them.
“So old concepts of consent are struggling to be effective in a digital world.”
This is why Taylor would like to see the old consent model – the tick-the-terms-and-conditions box – replaced with “dynamic consent”.
“The model where the information is provided upfront and you give consent – and the organisation never comes back to you and checks if you comprehend what you have agreed to – needs to change,” he says.
“We need to move to dynamic consent, where the responsibility to provide information is effectively discharged through ongoing communication and dialogue.
“Some individuals can then go back, if they wish, and find out more granular information and continue to express preferences in relation to their data.”
This desire to dynamically control the dissemination of our data is at the heart of the controversy surrounding the roll-out of the My Health Record system in Australia.
My Health Record was originally designed as an ‘opt-in’ system and there was widespread alarm when it was announced that the system would be changed and Australians’ medical information would be uploaded unless they opted out.
Lidgerwood isn’t surprised it has raised concerns.
“The main concern seems to be data security, and by that I mean not the security at the centre – we have all been assured the main system will be very secure – but the security of all the medical clinics around the country that access it. How good is their security?
“In a company, your weak spot is often when you outsource to external providers because you don’t have that same control over their systems as you do over your own.
“This is why we seek a range of independent certifications and attestations about our service providers’ data security so we can have confidence in them.”
Taylor says that an ‘opt-out’ health system, such as My Health Record, cannot be built around the legal concept of consent.
“When it’s an opt-out system, you need to defend that system on the grounds of public interest,” he says.
“So you can only use the system in ways that individuals have reason to accept and expect within that public interest.”
That is not to suggest that My Health Record is without considerable potential benefit.
“Nothing is entirely risk-free,” Taylor says.
“I’m not suggesting that you don’t try and make the database as secure as you can, but I think, increasingly, researchers in the health space are questioning whether we should be offering people guarantees of anonymity when they take part in health research.
“The reason is that sometimes the good-faith assurances given are undone by technological advances in the future, and people can be reidentified in data sets they thought were anonymous.”
For example, in late 2016 University of Melbourne researchers at the School of Computing and Information Systems were able to re-identify patients in historical health records from the Australian Medicare Benefits Schedule and the Pharmaceutical Benefits Scheme, highlighting the tension between data sharing and privacy.
The records, which were released by the Federal Government as part of its policy on accessible public data, were de-identified, but the researchers found they could re-identify patients through a process of linking unencrypted parts of the record with known information about the individual.
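The linkage technique the researchers used can be illustrated with a small sketch. The datasets, field names and the target’s details below are entirely invented for illustration – they are not the actual MBS/PBS records – but they show the general idea: quasi-identifiers left in a “de-identified” release (such as birth year, state and procedure dates) are matched against facts already known about a person, and a unique match re-identifies them.

```python
# Hypothetical illustration of a linkage re-identification attack.
# All data here is invented; real attacks work the same way at scale.

# A "de-identified" release: names removed, but quasi-identifiers remain.
deidentified = [
    {"record": "A", "birth_year": 1972, "state": "VIC",
     "procedure_dates": {"2014-03-02", "2015-11-19"}},
    {"record": "B", "birth_year": 1972, "state": "VIC",
     "procedure_dates": {"2013-06-07"}},
    {"record": "C", "birth_year": 1985, "state": "NSW",
     "procedure_dates": {"2014-03-02"}},
]

# Facts an attacker already knows about a target person, e.g. from
# news reports or social media: birth year, state, one surgery date.
known = {"birth_year": 1972, "state": "VIC", "date": "2014-03-02"}

# Link the auxiliary knowledge against the release.
matches = [
    r for r in deidentified
    if r["birth_year"] == known["birth_year"]
    and r["state"] == known["state"]
    and known["date"] in r["procedure_dates"]
]

# A unique match re-identifies the person, and the attacker now learns
# their entire history in the dataset (every other procedure date).
if len(matches) == 1:
    print("Re-identified record:", matches[0]["record"])
```

Even a handful of innocuous-looking attributes can single out one record among millions, which is why assurances of absolute anonymity are increasingly questioned.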
“People are beginning to question whether you can claim absolute anonymity,” Taylor says.
“A more accurate assurance may be to state that every measure will be taken to ensure privacy.”
The introduction in May this year of the General Data Protection Regulation (GDPR) in the European Union significantly strengthened the requirement for valid consent from consumers, according to Lidgerwood.
“Europe has gone well past the days of implied consent,” she says.
“It’s all about express consent – that is, freely given, unambiguous and informed. You need to show that you have explained to people exactly how the data will be processed.”
Taylor agrees that the GDPR has significantly strengthened the requirements for valid consent.
“The GDPR raises the hurdle in a number of ways, which does make it more difficult at times to be confident that one has obtained valid consent. But the GDPR also recognises this, by acknowledging that there are a number of other legal bases for processing data besides consent.”
“For example, the Health Research Authority in England has advised health researchers not to rely upon consent as the lawful basis for processing under data protection law – although consent may still be required for other legal and ethical reasons. Instead, universities and other public authorities are advised to rely on the fact that the processing of personal data for research is a ‘task in the public interest’.”
Taylor says even when other legal bases are relied upon, responsibilities in relation to transparency do not go away.
“So even if you’re not going to ask [for consent] you still need to tell [consumers], and I think that is a really important lesson for when you’re not relying on consent as your legal basis for processing data.
“Data holders still have a responsibility to engage with consumers, to tell them how their data is being used and to give them the right to object if they wish.”
Lidgerwood says those other legal bases are particularly important when it comes to employees, as it is very difficult to rely on employee consent in the EU, where regulators take the view that employee consent is rarely ‘freely given’.
“But whether reliance is on consent or on another ground like ‘legitimate interests’, it is critical that companies make it clear to people when they’re collecting their data and why they’re doing so.
“At the end of the day, it’s about being accountable and all the international privacy laws come back to that principle of transparency.”
Consumers, too, have a role to play.
“I think social expectations have a bit of work to do here in catching up,” Taylor says.
“Take, for example, the names and addresses of friends and family that are shared by individuals who agree to an app being installed on their phone that has access to their contacts.
“You can discuss the legal rights and wrongs of that, but one shouldn’t lose sight of the fact that the person who has installed the app has responsibilities here too.
“I don’t think we should rely on the law to do it all.
“So, as well as building governance-oversight mechanisms to promote accountability, transparency and trust, we must collectively develop the social norms of a healthy digital society.”
Banner image credit: LuckyStep/Shutterstock.com
This article originally appeared in MLS News, Issue 20, November 2018