
UK’s HMRC ordered to provide details regarding use of AI in tax decision-making

Two and a half years ago, while people were still hearing about ChatGPT for the first time, we were considering whether automated decision-making processes can be challenged by way of judicial review. Returning to the theme of automated decision-making in the public sector, a recent case addressed the question: if an official is making a decision for a public body and decides to use generative AI, can people find out? From a recent decision in the First-tier Tribunal, it looks like the answer is yes.

Background

In December 2023, Tom Elsbury, the founder of a business specialising in R&D tax relief, submitted a Freedom of Information request to HMRC regarding details of its use of large language models and GenAI. 

HMRC responded that, although it held the information requested, it was withholding it on the basis that releasing it would prejudice “the assessment or collection of any tax or duty or of any imposition of a similar nature” (engaging an exemption under s31(1)(d) Freedom of Information Act 2000). Elsbury complained to the ICO about HMRC’s response. At this point, HMRC changed the exemption on which it relied, saying that even confirming or denying that it held the information requested would prejudice the assessment or collection of taxes (engaging s31(3) Freedom of Information Act 2000).

The ICO investigated, concluding in November 2024 that HMRC had been justified in neither confirming nor denying that it held the information. 

FTT decision 

Elsbury challenged this decision and the issue ended up in the First-tier Tribunal. In his submissions, Elsbury noted that HMRC correspondence had “tell-tale signs of AI usage such as American spellings and use of the ‘em-dash’ punctuation mark” – points commonly raised in relation to AI-generated writing.

The Tribunal found there were errors of law in the ICO’s conclusions but, more interestingly, went on to consider the balance of public interest in relation to whether HMRC should disclose the information. On this topic, the parties agreed that there is public interest in (amongst other factors) “knowing about the usage of AI, owing to its potential to pose a high risk to individuals and their rights and freedoms”. Weighing up the other arguments made by the parties, the Tribunal found Elsbury’s arguments on public interest to be more persuasive, commenting that they had “considerable force”. In doing so, it found that HMRC had “not satisfied the requirement for transparency and accountability so as to facilitate public debate on the matter of HMRC’s use (or not) of AI and LLMs in respect of R&D tax relief”.

The Tribunal also agreed that “transparency on HMRC’s part is particularly important when AI’s role in decision-making is a pressing concern globally”.

Key takeaways 

Although First-tier Tribunal decisions do not set legally binding precedent, businesses and public bodies should take stock of the issues raised by the FTT and consider the following:

  • If making decisions in any capacity, remember that it can be relatively obvious to recipients when GenAI has been used – carefully review any text to ensure that it reads as though it was written in your organisation’s style.

  • If making decisions in the exercise of public functions, be mindful that you may have to disclose any use of AI in those decisions in response to Freedom of Information requests.

  • If reviewing a decision from a public body and it looks as though AI may have played a part in making it, consider whether it is worth asking the public body to provide details of its use of AI.

In relation to the final two takeaways, it is worth noting that the treatment of Freedom of Information requests is highly fact-specific, particularly given the range of exemptions which might apply and factors which might be weighed in a public interest test. 

This is unlikely to be the last case about AI use in public decision-making, and the outcome of this case itself suggests that there will be more to come. 

Tags

tax, hmrc, ai