
Do androids dream of judicial review? Challenges to automated decision-making processes in the public sector

While the idea of significant decisions being made with no human involvement was once the preserve of science fiction, automated decision-making processes (ADMPs) are increasingly being adopted by public bodies. In this article, we consider when and how an ADMP’s decision can be challenged by way of judicial review. Given a number of recent high-profile cases, this is likely to be a rapidly developing area of public law for which public authorities should be prepared.

ADMPs are not new – but are expanding

ADMPs make decisions with no human oversight, using methods ranging from simple algorithms to artificial intelligence. While not yet as complex (or consequential) as Terminator’s Skynet or I, Robot’s V.I.K.I., ADMPs are used by public authorities for purposes including the following (a hypothetical sketch of a simple rule-based system appears after the list):

  • risk-based verification processes in relation to benefit applications;
  • recidivism prediction tools;
  • checks undertaken by the Department for Work and Pensions or HM Revenue and Customs; and
  • calculations of social security benefits.
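
At the simpler end of that spectrum, an ADMP may be nothing more than a fixed set of coded rules. The sketch below is a purely hypothetical illustration of how a rule-based risk-verification check of the kind used in benefit processing might look – the field names and thresholds are invented, not drawn from any real system:

```python
from dataclasses import dataclass

@dataclass
class Application:
    declared_income: float      # fortnightly income declared by the applicant
    income_on_record: float     # income already held on government records
    previous_overpayments: int  # number of prior overpayment events

def risk_band(app: Application) -> str:
    """A hypothetical rule-based ADMP: classify an application's risk with
    no human involvement. All thresholds are invented for illustration."""
    discrepancy = abs(app.declared_income - app.income_on_record)
    if discrepancy > 500 or app.previous_overpayments >= 2:
        return "high"    # routed to enhanced (slower) verification
    if discrepancy > 100:
        return "medium"  # subject to additional documentary checks
    return "low"         # fast-tracked with no further checks

print(risk_band(Application(800.0, 1_400.0, 0)))  # prints "high"
```

Even a system this simple embeds policy choices – which factors are considered, where the thresholds sit and what follows from each band – and, as discussed below, it is in those choices that grounds of challenge may arise.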

Government departments have been using forms of ADMPs for over 20 years (the British Government uses over 40 types of ADMP, according to the Public Law Project), and their use is expanding. Reasons for this include ADMPs’ promise to be quicker and more cost-effective, to reduce (at least theoretically) the risk of human error and, as the technology develops, to address increasingly sophisticated decisions.

However, when ADMPs go wrong, for example through flaws or bias, the impact can be wide-ranging. A single issue in an ADMP can lead to consistently poor outcomes being delivered over time to a large population. The ongoing Australian Royal Commission into the “Robodebt” debt recovery scandal provides a good example of the magnitude of the issues that can arise out of ADMPs.
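
By way of illustration, the central flaw reported in the Robodebt scheme was “income averaging”: annual income data from the tax office was spread evenly across the year’s 26 fortnights and compared with the income a person had actually reported each fortnight, so that anyone with irregular earnings could be pursued for a debt they never owed. The sketch below is a deliberately simplified, hypothetical reconstruction of that logic – the figures, function names and 50% rate are invented for illustration:

```python
FORTNIGHTS_PER_YEAR = 26

def averaged_fortnightly_income(annual_income: float) -> float:
    """Spread annual income evenly across all 26 fortnights."""
    return annual_income / FORTNIGHTS_PER_YEAR

def raise_debt(reported: list[float], annual_income: float,
               rate: float = 0.5) -> float:
    """Raise a 'debt' wherever the averaged figure exceeds what the person
    actually reported. The 50% rate is a stand-in for the real scheme's
    far more complex benefit-reduction rules."""
    average = averaged_fortnightly_income(annual_income)
    return sum(max(0.0, average - actual) * rate for actual in reported)

# Someone who earned 2,000 a fortnight for six months of work, then
# correctly reported zero income while on benefits for the rest of the
# year, is treated as though they earned 1,000 in every fortnight - and
# is issued a "debt" for the fortnights in which they earned nothing.
reported = [2_000.0] * 13 + [0.0] * 13
print(raise_debt(reported, annual_income=26_000.0))  # 6500.0, a phantom debt
```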

We have already started seeing judicial review in this space

In the UK, judicial review has been threatened in response to the algorithm used to generate A-level results and to challenge an algorithm used by the Home Office for processing visa applications; in both instances the systems were dropped before the cases could be heard. Other cases, discussed below, have proceeded to the courts, with mixed results.

What kinds of “decision” can be challenged? 

The first issue to determine is whether there is even a “decision” capable of being challenged and, if so, the stage at which the ADMP sits in the decision-making process – for example, whether it is used:

  • across the whole decision-making process or just as part of it; 
  • to make a final decision, to recommend a decision or to make a preliminary recommendation that is then reviewed by a human decision-maker; or
  • to guide a decision-maker through relevant facts and provide support systems, including to sort, filter or arrange information.

In Australia, the Full Court of the Federal Court, in Pintarich v Deputy Commissioner of Taxation, considered whether a letter issued by the Australian Taxation Office (ATO) amounted to a “decision” of the ATO, given that it had been issued solely by a computer-generated system without human review:

  • A majority concluded that no decision had been made, given that a decision required both “a mental process of reaching a conclusion and an objective manifestation of that conclusion”.
  • In a dissenting judgment, however, Kerr J reasoned that a close assessment was required of the circumstances of the conduct, including whether these were within the “normal practices of the agency” and whether the overt act “would be understood by the world at large as being a decision”. Kerr J observed that automated systems are increasingly being used by government departments for bulk decisions, and that the legal concept of what constitutes a “decision” cannot be static and must comprehend that technology has altered how decisions are made.

While there is no similar UK authority, and there is much to be said in favour of the dissenting judgment, this case demonstrates the complexities surrounding the role that an ADMP plays in the decision-making process: issues which the courts will need to grapple with in the coming years.

In the UK, most claimants have proceeded with ADMP claims on the basis that the relevant decision was capable of challenge. Public authorities would be wise to assume that ADMP decisions may be open to challenge in a number of respects, including:

  • the decision to use the ADMP in the first place (such as if the system chosen uses biased data or if there were design faults with the underlying algorithm);
  • the reliance on the output of the ADMP; or
  • the use of the ADMP in general (as was the case in R (Bridges) v Chief Constable of South Wales Police, discussed below).

Is a ground of judicial review available?

A number of factors will be significant when determining whether a decision is judicially reviewable and, if it is, what grounds of review are available:

  • Irrationality or fettering discretion – ADMP decisions may be challenged on the basis of “irrationality” arising from the particular factors that a system takes into account, the data it relies on and the system’s internal logic. Further, where the decision-maker is required to exercise a discretion, the rigidity of the ADMP’s design may open the decision up to challenge on the basis that the discretion has been unlawfully fettered. In these cases, it will be critical for the Court to have a clear understanding of how the system works, how it has been trained and what assumptions it relies on.

  • Material mistake of fact – A decision-maker must correctly identify the existence (or non-existence) of particular facts before exercising their power. This is another area in which some features of ADMPs – such as the accuracy of the data fed into the system, or the logic of the system’s decision tree – may present grounds for challenge. For example, in the “Robodebt” case, the Australian Government conceded that alleged debts were not validly claimed because the information before the decision-maker was not capable of satisfying the decision-maker that debts were owed under the relevant legislative provisions.

  • The duty to give reasons – As observed by the Public Law Project, as with many forms of AI, some ADMP systems operate as “black boxes”, meaning that the rationale for the decision made is opaque or invisible because the outputs cannot be predicted from the inputs. This could provide grounds for challenge where a duty to give reasons for a decision applies and cannot easily be satisfied technologically (a hypothetical sketch of a reason-recording design appears after this list).

  • Equality Act and discrimination – ADMPs may also give rise to issues under the Equality Act 2010. For example, in R (Bridges) v Chief Constable of South Wales Police, the police’s use of automated facial recognition in a surveillance system, which monitored and matched images of the public against individuals on a police watch list, was challenged on the grounds of discrimination, both in relation to the system’s use in general and to specific instances of its use. The Court of Appeal found that the police had not provided sufficient guidance on when the technology could be used or done enough to ensure that the technology had no racial or gender bias, and that the “human failsafe” measure aimed at determining whether to act on the software’s decision was not enough to fulfil the non-delegable Public Sector Equality Duty.

    More broadly, ADMPs could in some circumstances be challenged on the grounds that they breach duties not to discriminate under the Equality Act. An example of an ADMP producing discriminatory outcomes is the Dutch Data Protection Authority’s finding that the Dutch Government’s use of algorithms in relation to benefit fraud was discriminatory, after 26,000 parents had been wrongly accused of fraud in child benefit applications.

    The Public Law Project also recently sent the UK Home Office a pre-action letter in relation to its use of algorithms in determining whether to open ‘sham marriage’ investigations, alleging that this ADMP discriminates against individuals on nationality grounds and that the Home Office has failed to discharge its Public Sector Equality Duty.

  • Other rights under legislation – The use of ADMPs may also engage issues under the Human Rights Act 1998 or the Data Protection Act 2018. In R (Bridges) v Chief Constable of South Wales Police, the Court acknowledged that the automated nature of the decision-making process did not have the effect of displacing statutory protections under the Data Protection Act 2018. As the use of ADMPs increases, we may increasingly see specific statutory provision being made for their use, which may include bespoke provisions for appeal or review of any such use (or its consequences in individual cases).
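
On the duty to give reasons in particular, whether an ADMP can explain itself is partly an engineering choice. A rule-based system can be built to record a human-readable reason for every rule it applies, in a way that many machine-learning models cannot. The sketch below is a hypothetical illustration of such a reason-recording design – the threshold and wording are invented, not taken from any real system:

```python
from dataclasses import dataclass, field

@dataclass
class ReasonedDecision:
    outcome: str
    reasons: list[str] = field(default_factory=list)

def assess_claim(declared: float, on_record: float) -> ReasonedDecision:
    """A hypothetical ADMP that logs a reason for each rule it applies,
    so the authority can later explain the decision it has made."""
    decision = ReasonedDecision(outcome="approved")
    discrepancy = abs(declared - on_record)
    if discrepancy > 250:
        decision.outcome = "referred"
        decision.reasons.append(
            f"Declared income differs from records by {discrepancy:.2f}, "
            "which exceeds the 250.00 referral threshold."
        )
    else:
        decision.reasons.append(
            f"Declared income is within {discrepancy:.2f} of records; "
            "no referral threshold was met."
        )
    return decision

result = assess_claim(declared=900.0, on_record=1_200.0)
print(result.outcome)             # "referred"
print(*result.reasons, sep="\n")  # the audit trail behind the outcome
```

By contrast, where a trained model’s output cannot be traced back to its inputs, the authority may be unable to give meaningful reasons at all – the “black box” problem the Public Law Project describes.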

Looking ahead

Although the use of ADMPs is unlikely to diminish as a result of the possibility of judicial review, public bodies should carefully consider the possibility of such challenges when setting up ADMPs. Given the number of high-profile issues that have arisen in the short period in which ADMPs have been used so far, this is likely to be a rapidly developing area of public law. Being prepared for such challenges and being ready to justify the approach taken will be key.
