Criminal Law

Decoding automated decision-making in the justice system

By Randy O’Donnell, AdvocateDaily.com Associate Editor

A day-long conference on the use of automated decision-making (ADM) in the Canadian criminal justice system helped build on the critical work of developing a legislative and regulatory framework around the technology, Toronto criminal lawyer Jill Presser tells AdvocateDaily.com.

Presser, principal of Presser Barristers, was one of the organizers of the conference, held at the Law Society of Ontario’s Queen Street West office. It was presented by the Law Commission of Ontario, the Citizen Lab, the International Human Rights Program at the University of Toronto, and the Criminal Lawyers’ Association (CLA).

ADM may include the use of automated tools such as algorithms, machine learning, and artificial intelligence (AI) systems in policing and in the courts, she says.

Currently, there is no legal framework in Canada to address these technologies in the criminal justice system, or how they may impact constitutional rights and civil liberties, says Presser, who is also co-chair of the CLA’s criminal law technology committee.

“Ideally, all of this would get legislated and regulated ahead of time, and we’d never have to litigate any of it, unlike the Americans, who are way ahead of us in the use of ADM tools. That’s part of the reason why I think this forum was so important,” she says.

“The main impetus for this conference came from the Law Commission of Ontario, which is a law reform body, because they’re looking at the kinds of reforms that can help us avoid the possible legal and civil liberty violations we are worried about.”

The conference featured presentations and workshops on topics that address those concerns, including:

  • understanding automated decision-making for lawyers: code and data
  • the American experience
  • current and potential uses of AI in criminal law in Canada
  • automated decision-making in the criminal justice system
  • analysis and discussion of legal rights and system implications: cases, due process and lived experience in access to justice
  • predictive policing
  • litigating algorithms: disclosure, design, and due process

Presser was part of a four-person panel that led the discussion on legal rights and system implications. The panel explored conceptual concerns such as accountability, reliability, transparency, due process, and equality, she says.

Presser says one such concern is that biases are often built into these algorithmic tools, disproportionately affecting historically disadvantaged and vulnerable groups and exacerbating existing inequality.

She says predictive policing is among the most widely used algorithmic tools in the U.S. The tool uses artificial intelligence to analyze large amounts of demographic and other crime-related data.

Presser explains that when enough information across a wide geographic space is entered, the tool generates predictions about where and when a type of crime is likely to occur. This enables police to anticipate incidents and deploy resources to prevent them, she says.

“The problem is that we know there is already a socio-economic and racial component to who gets policed in North America, so the poorer you are, the more racialized you are, the more likely you are to be over-policed in your community,” Presser says.

“If you take the data of an already over-policed, over-represented, vulnerable, equity-seeking group, and put that into a predictive policing algorithm, what it spits out is to go and police those communities even more. What you end up with is a self-perpetuating worsening of an existing dynamic that is neither socially desirable nor just.”
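To illustrate the feedback loop Presser describes, here is a minimal toy simulation (a hypothetical sketch, not any real predictive-policing system or dataset): two neighbourhoods with identical underlying crime rates start with unequal recorded histories, patrols are allocated in proportion to those records, and the recorded gap never corrects itself.

```python
# Toy simulation of the feedback loop described above.
# Hypothetical illustration only -- not any real policing algorithm or data.
import random

random.seed(1)

# Two neighbourhoods with the SAME true incident rate, but A starts
# with more recorded incidents (i.e., it is already over-policed).
recorded = {"A": 120, "B": 60}
TRUE_RATE = 0.5        # identical underlying rate in both areas
TOTAL_PATROLS = 100

for year in range(1, 6):
    total = sum(recorded.values())
    for area in recorded:
        # "Predictive" allocation: patrols proportional to recorded history.
        patrols = round(TOTAL_PATROLS * recorded[area] / total)
        # More patrols mean more incidents observed and recorded.
        recorded[area] += sum(random.random() < TRUE_RATE for _ in range(patrols))
    print(f"year {year}: recorded = {recorded}")

# Even though the true rates are equal, area A's recorded count -- and
# therefore its patrol share -- stays roughly double area B's, and the
# absolute gap widens every year: the bias in the input data is
# reproduced and entrenched by the tool's own output.
```

The point of the sketch is only that proportional allocation preserves whatever disparity the historical records contain; a real system would be far more complex, but the dynamic is the one Presser describes.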

The group also discussed the use of automated surveillance tools such as public cameras, drones, and facial recognition technology, she says.

Ontario drivers encounter such cameras every day: photos of their licence plates are taken automatically and compared against an existing police database of plates.

Similarly, the Toronto Police Service (TPS) is conducting aerial surveillance with drones.

“The thing about drone surveillance is that it is able to surveil in a way that humans never could. The machines don’t get tired. They don’t have to go off duty. They don’t require a lunch break or sleep. Surveillance by a drone can be 24-7, forever,” Presser says.

“There is a kind of persistence and omnipresence of surveillance when you have automation that was never available before, and that speaks to privacy issues.”

While facial recognition technology is not yet widely used in Canada, it is in the U.S. and the United Kingdom, she says.

It is an area that gives Presser a great deal of concern since the technology has not been shown to be statistically reliable. She says it has also been demonstrated to be more error-prone in recognizing persons of colour and women.

“There are a couple of high-profile cases in the U.S. where police executed high-risk, guns-drawn, SWAT-teams-surrounding-the-house arrests based on facial recognition, and it was quite simply the wrong person,” she says.

“I think it is really concerning that the state is building these huge databases of photos of people’s faces by capturing images of us everywhere as we go about our daily lives — in the park, in the library, on the street — and I think it really infringes on the concept of anonymity.

“There are many who say that if you are out in public, how can you have a reasonable expectation of privacy? The fact is, when we are out in the community, we expect anonymity. The concept of privacy as anonymity is really an important one, and facial recognition tools threaten to destroy that,” Presser says.

“Privacy is a precursor right. We need privacy because it’s foundational in order to enjoy other freedoms such as the freedom of religion, expression and association. Many of our human rights are predicated on the ability to enjoy those freedoms based on anonymity.”

The group also discussed gaps in the criminal justice system surrounding the use of ADM, including the issue of funding, which Presser says is particularly worrisome from an equality perspective.

“If defence lawyers are going to be representing their clients and litigating against algorithmic decision-makers, they are going to have to fight for disclosure of code, hire experts to review the code, and then bring constitutional or other challenges to court to litigate the code. In order to do that, it will take funding, and that funding envelope doesn’t currently exist anywhere,” she says.

“We don’t have those tools on stream in the justice system, and the legal-aid envelope is not that well-funded at the best of times, and isn’t getting any bigger.

"So funding is going to be a really big issue. Without it, it will exacerbate the inequalities that already exist in the justice system if wealthy people who get caught can afford to litigate algorithms, and the disadvantaged, who get arrested in greater numbers, can’t,” Presser says.

The group also covered what kinds of litigation strategies will be available as ADM tools come on stream in the justice system. They examined Charter rights, natural justice, and extraordinary remedies that might be useful, she says.

They also looked at whether there are concerns that can’t be addressed through litigation, and whether those point to a need for law reform, Presser says.

“I’m a litigator, and I love to litigate, and as these tools come into use by the Toronto Police Service, I’ll be happy to don my robe and ride into court to fight them tooth and nail,” she says.

“But, when my better angels guide me, I recognize that litigation actually represents a failure of the system in this regard. We have the challenges of these brave new technologies. They do present opportunities for good because they are efficient and can create greater safety and reliability in our communities if programmed properly,” Presser says.

“The challenge of them is to build a legislative and regulatory framework from the ground up so that we can avoid people being subjected to these technologies in ways that violate their rights.”
