Legal design for digital citizens

The creep of technology in our lives brings with it many ramifications and it’s precisely these developments – good and bad – that a team of Melbourne Law School academics is investigating.

By Johanna Leggatt

The newly formed Digital Citizens Research Network at Melbourne Law School brings together researchers with expertise across a range of legal areas – from consumer and competition law to banking and finance, health and environmental law – all of whom are studying the impact of technology and innovation on their field.

The group is focused on the role of law in keeping pace with rapid digital disruption, working across a range of ongoing research projects, from artificial intelligence and human-centric software design to the automation of traditional legal services.

“We formed the network because we realised that there are many Melbourne Law School academics engaging with new technology, and this network was an opportunity to bring those academics together to support research across disciplines and research fields,” says MLS Professor Jeannie Paterson, co-convenor of the Network.

“We’re really interested in that fundamental question: what does this technology mean for citizens’ rights, and what are the obligations of the state in responding to these issues?

“Forming a network puts us in a much better place to make recommendations, influence policy and speak up to demand digital equity.”

Professor Jeannie Paterson. Image: supplied.

The last point is a salient one: despite advances in technology, Paterson argues, some people remain on the margins, locked out of the digital revolution.

“My research, working with the Melbourne Social Equity Institute, is concerned with the experiences of people who are poor, people living in remote areas, some older people and some in new communities who may have less access to technology and less voice in the automation of decisions that affect them,” she says.

“Take, for example, the notion of automating Centrelink payments.

“Sometimes those affected are the people given the least opportunity to challenge those decisions, so they risk being excluded in the virtual world.”

Paterson says academics and researchers, therefore, have a fundamentally important role to play in “bringing accountability into the equation”.

“We can work with engineers, we can work with programmers and health professionals to say that these technologies offer many positive opportunities, but they need to be accountable to the public.

“Often technological innovation happens in large corporations far away in California or China, and there is a real risk that if governments and the public don’t start thinking about what we want from that technology and what rights we think are important to be protected then the regulatory horse will bolt.”

The field of law itself is not immune to technological upheaval, and advances such as natural language processing and e-discovery tools are enabling lawyers to work smarter and more efficiently.

Gary Cazalet. Image: supplied.

In fact, many MLS students are playing an active role in this area.

Gary Cazalet, senior lecturer and co-convenor of the Digital Citizens Research Network, leads the Law Apps subject, now in its fourth year and so oversubscribed that there is always a waiting list.

The students partner with not-for-profit or government organisations and, using a technology platform called Neota Logic, design and build web-based applications that provide tailored legal information.

“They build applications that are live and used by consumers now,” Cazalet says.

For example, MLS students built the ‘My Rights’ application for young people’s legal centre Youthlaw, which informs users of their legal rights based on their age.

“Responding to questions on the site, people will get tailored information that is relevant to their circumstances,” Cazalet says.

“There is this disruptive element in that it is already changing the way people access legal advice.

“The site is providing access to legal information for someone in their own home, with no appointment, and at a time when funding for legal aid has been reduced. It increases access to justice for many people.”

Large and medium-sized firms are already using these kinds of platforms, Cazalet says, and putting what is known as ‘human-centred design’ at the heart of what they do.

“Law is changing radically as a profession and I think it is an exciting change.

“The legal [profession] is redesigning how it communicates so it is more client-focused, including designing platforms with the end-user in mind.

“It’s the kind of design used by Google and Apple, and now the legal sector is doing it too.”

The more technology influences the law, the more questions arise about the judicial process.

As Associate Professor Jason Bosland points out, machines and algorithms are being trialled within legal settings and this has huge implications for defamation and open justice.

“Where you have algorithms that produce material that may be defamatory, then who is responsible for that defamation?” he says.

“Google is a prime example. Their search results page is produced by an algorithm, but does this make them a publisher, and therefore capable of defaming?”

The principle of open justice is also threatened by the emergence of virtual courts.

“If you have a virtual court what happens to the open justice principle?” Bosland says.

“How will the public access that court and what mechanisms will be in place to ensure public accountability of what goes on?”

Associate Professor Jason Bosland. Image: supplied.

Bosland has plans to research the area of “robotized justice”, in which an algorithm decides certain elements of a case or a person’s likelihood of re-offending.

“This may remove elements of human bias, but it may also embed bias in other parts of the algorithm, where it will be difficult for people to identify,” he says.

“That’s something I am keen to look into next.”

Bosland says judicial attitudes towards technology can also challenge the concept of open justice.

“If you have judges being concerned about private information being published in the digital sphere, this might influence the way a judge decides cases in relation to open justice,” he says.

“It may be that judges are more likely to grant suppression orders or not release documents to non-parties.

“The decision-making of judges is likely to be impacted by concerns about technology and social media.”

Bosland, Cazalet and Paterson’s ideas, as well as those of their colleagues and contributors in the Digital Citizens Research Network, will be presented at a conference in July 2019 at Melbourne Law School.

The Digital Citizens Conference will bring together researchers in law, engineering, computer science and public policy to consider how to design for the future and ensure citizens’ voices are at the forefront of technological advances.

“I think what we will see is more collaborative research coming out of this network,” Paterson says.

“We think our network is different from anything else because we’re not just interested in the technology per se, but its effect on the values that are important to our community.

“After all, technology really impacts on how we understand ourselves and our interactions with each other, and with industry and government.”

Banner image credit: Marius Masalar/Unsplash.

This article originally appeared in MLS News, Issue 20, November 2018