Artificial Intelligence and Tax: Legal Trends Every Business Should Know

Written by Steve Tetley, Tax Advisor at rradar
The increasing influence of artificial intelligence in our lives is impossible to deny.
For many businesses, early adoption of AI has proved beneficial, automating the many repetitive tasks that make up everyday life in an organisation. Some have also been asking themselves what else AI can do, and thoughts have turned to whether it can make light work of interaction with the tax system.
However, recent developments in the borderland between tax and AI have shown that perhaps we should pause and take stock of where we are and where we’re going.
From the taxpayer’s perspective
Although AI can be a good tool when it comes to automating mundane tasks, there are huge caveats around using it for anything that requires accuracy, legal or otherwise. Such is the case with tax. Recent cases before the courts and tribunals have highlighted the dangers that arise when taxpayers represent themselves and base their case on AI-generated case law. British law relies on past cases to inform, at least in part, current judgments.
That’s all well and good when the argument is put forward by trained legal professionals with an almost encyclopaedic knowledge of past cases. However, what happens when the taxpayer – for whatever reason – represents themselves? With no legal expertise, and no ability to sift the cases that might support their argument, they may turn to AI to give them what they’re looking for. Provided the cases the AI finds and presents are valid and genuine, all should be well.
However, in more than a few instances, those cases aren’t genuine. The AI has invented an entire case, and for a taxpayer to rely on that invention – called a “hallucination” by those in the AI sector – will lead to almost certain loss. One such case was HMRC v Marc Gunnarsson at the Upper Tribunal.
During the COVID-19 pandemic, the government, keen to protect the economy, produced a number of schemes to try and support businesses, including the Self-Employment Income Support Scheme (SEISS).
In December 2021, after the most serious phase of the pandemic had subsided, Mr Gunnarsson received an HMRC assessment relating to the recovery of two payments that he had received under SEISS during 2020/21, totalling nearly £13,000.
Following those payments, HMRC realised that he wasn’t entitled to either of them because, at the times they were made, he wasn’t a self-employed individual. They therefore took steps to recover the payments. The dispute eventually came before the Upper Tribunal, which hears tax appeals.
What lifts this case out of the ordinary is the use by Mr Gunnarsson of an AI to assist with his research into - and presentation of - his case.
Although the Upper Tribunal found against Mr Gunnarsson overall, it did add a postscript to the judgment in which it addressed his use of AI.
The postscript described how Mr Gunnarsson, an unrepresented litigant in person, had filed submissions containing references to three previous decisions purportedly made by the First-tier Tribunal, which were said to support his interpretation of the legislation.
HMRC searched public and internal databases to check those previous decisions but found that they didn’t exist.
When he was informed of this, Mr Gunnarsson accepted that he had used online AI software to help him with his written submissions. He amended his argument, removing references to the non-existent decisions.
Updated guidance for judicial office holders (April 2025) says:
“AI chatbots are now being used by unrepresented litigants. They may be the only source of advice or assistance some litigants receive. Litigants rarely have the skills independently to verify legal information provided by AI chatbots and may not be aware that they are prone to error. If it appears an AI chatbot may have been used to prepare submissions or other documents, it is appropriate to inquire about this, ask what checks for accuracy have been undertaken (if any), and inform the litigant that they are responsible for what they put to the court/tribunal.”
Although what Mr Gunnarsson had done was improper, the tribunal took an understanding stance:
“We do not consider the Respondent to be highly culpable because he is not legally trained or qualified, not subject to the same duties as a regulated lawyer or other professional representative and may not have understood that the information and submissions presented were not simply unreliable but fictitious. He was under time pressure given his other competing responsibilities and doing his best as a lay litigant seeking to assist the UT by preparing written submissions.”
In short, Mr Gunnarsson did not set out to deceive anyone and lacked the experience to recognise that the information he had obtained was incorrect.
Of course, Mr Gunnarsson isn’t alone; there will be many thousands like him who, believing that proper legal advice is beyond their reach, have decided that AI somehow levels the playing field in their favour. By the time they realise their error, it’s too late for them, and legal professionals will have been put to a considerable amount of extra work uncovering and rectifying the errors.
From HMRC’s perspective
HMRC faces a monumental task in gathering, processing, sifting and resolving millions upon millions of taxpayer records. It makes sense, therefore, for an organisation of its size to investigate the potential of using AI to automate the drudgery. That HMRC has been doing this for some time is a matter of fact, but what’s less obvious is what they’re doing and how.
In its publications, HMRC has stated an intention to use and invest further in AI to help customers meet their tax obligations. The areas HMRC intends to focus on are:
- Using AI as a digital tool to assist its advisers
- Developing AI-powered digital assistants to help customers navigate HMRC’s services
- Using AI to achieve its tax compliance objectives, which is expected to include identifying fraudulent documents during compliance activities
Another area in which HMRC uses analytical technology is the collation of information from multiple sources to identify possible cases of tax evasion and avoidance. It uses a system called HMRC Connect, an AI-powered platform that collects data from a wide variety of sources to build up a picture of a taxpayer’s finances.
With Connect, HMRC can analyse that data at scale: the system searches for patterns, compares information against other data sources and develops its predictive ability.
Connect’s sources are surprisingly numerous and varied, including the Land Registry, the DVLA, UK and foreign banks, eBay, Etsy and Airbnb, social media, credit agencies, flight sales and passenger information, and UK Border Agency records. The list grows month on month.
A recent tribunal case may well bring some much-needed transparency to the way HMRC goes about using AI to automate many of its routine procedures.
Thomas Elsbury owned a business helping clients with their tax claims – in this case, Research and Development Tax Credits. In December 2023, he sent a request to HMRC under the Freedom of Information Act 2000. He wanted to know about HMRC’s use of large language models and generative AI such as ChatGPT.
In January 2024, HMRC responded, confirming that it held the information requested but was withholding it under Section 31(1)(d) of the Freedom of Information Act, which permits information to be withheld “if its disclosure under this Act would, or would be likely to, prejudice…the assessment or collection of any tax or duty or of any imposition of a similar nature”.
In February 2024, Mr Elsbury asked HMRC to carry out an internal review, but it wasn’t until May (following a complaint to the Information Commissioner’s Office) that he was informed of the outcome of the review – essentially a restatement of their original position.
HMRC went on to explain that it was relying on the “neither confirm nor deny” exclusion in Section 31(3) of the Freedom of Information Act. In November 2024, the Information Commissioner concluded that HMRC confirming or denying whether it held the requested information would assist those intent on defrauding the system, which would in turn prejudice the collection of tax.
Mr Elsbury was understandably unhappy about this, and in December 2024 he sent a Notice of Appeal to the Tribunal, challenging the Decision Notice.
The tribunal found in his favour and ordered HMRC to hand over information about its use of AI when dealing with research and development claims.
The Future
Businesses that use AI must expect that government agencies will do likewise; the scale of their workload will force their hand. According to the OECD, seven out of ten tax authorities across the world already use AI in their operations, and that proportion will almost certainly increase.
However, what matters is the transparency those agencies adopt when answering queries from the public. Given the very high stakes attached to tax matters – monetary, personal or otherwise – and the much-publicised propensity of AI to make mistakes (“hallucinations”), the potential for disruption is considerable whenever it’s suspected that such a mistake has affected a person’s tax affairs. Businesses and individuals therefore have a keen interest in how this topic develops.
Human tax expertise adds value and avoids problems
Many businesses are struggling to stay ahead of the ever-changing world of taxation law and its associated administrative burden. Nobody likes paperwork, even the simplest – and taxation is often anything but simple – so anything that promises to reduce it will appeal to hard-pressed businesses.
The ideal choice would be to use an experienced tax specialist, but for whatever reason, some businesses may not go down that route and may instead look to online tax-filing solutions advertised as AI-powered. The problem is that there is really no way of knowing whether such a solution is fit for purpose until it’s too late.
Companies often claim to offer an AI-driven solution that automates the capture of financial data, picks up anomalies and guarantees regulatory compliance. Throw in phrases like “machine learning algorithms” and it is clear why many businesses can be seduced into thinking that AI can solve all their problems – identifying savings, avoiding penalties and generating audit trails.
However, as we’ve seen, AI is by no means foolproof. It’s also worth bearing in mind that confidential information is often entered into AI tax-filing systems, and that information could end up being used as training data.
The role of tax advisers
On both sides of the argument – those challenging HMRC’s decisions as well as those concerned about the standard of communications from HMRC – these recent cases have underlined the role of the human: tax advisers who bring experience to bear when reviewing AI output. That knowledge could help avoid cases like Mr Gunnarsson’s, and could enable those who receive correspondence from HMRC to analyse it, identify whether it contains accurate information – and know what to do if it doesn’t.
About the author
Steve Tetley is part of the Business Crime and Regulation team at rradar and is a specialist Tax Advisor. Steve’s career spans over 30 years, most of it within HMRC, before he joined rradar in 2021. He has experience of almost every form of tax dispute, ranging from investigation settlements and alternative dispute resolution to attendance at tax tribunals and large-scale Code of Practice 8 and 9 fraud enquiries.
Please note
This article is for general guidance only and aims to provide general information on a relevant topic in a concise form. This article should not be regarded as legal advice in relation to a particular circumstance. Action should not be taken without obtaining specific legal advice.