17 April 2025

Technology in disputes: the double-edged sword of justice?


This article first appeared in the April 2025 issue of PLC Magazine. 

The Ministry of Justice (MoJ) is rethinking how it perceives the accuracy and credibility of digital evidence in the criminal justice system. On 21 January 2025, the MoJ launched a consultation on the use in criminal proceedings of evidence that is generated by software (the consultation), noting that the use of computer evidence has proliferated in many prosecutions, ranging from complex fraud to high-volume prosecutions such as driving offences. As part of any reform, the MoJ has stated that it wants to future-proof its approach, given what it describes as the swiftly evolving landscape of technology, including the increasing use of AI.

Currently, there is a rebuttable common law presumption that the computer was operating correctly at the material time (see “A Brief History”). The burden of challenging the accuracy of the digital evidence falls on the party that is seeking to undermine it. That party is often the one least familiar with the technology in question, so the presumption has always risked creating an imbalance of power between the parties, at the outset of criminal proceedings, on the critical matter of evidence.



A BRIEF HISTORY

At one time, neither the criminal nor the civil justice system trusted computers to an evidential standard without verification, as shown by:

  • Section 69(1)(b) of the Police and Criminal Evidence Act 1984 (section 69), which provided that statements in documents produced by computers were not admissible as evidence of any fact stated unless it was shown that at all material times the computer was operating properly.
  • The Civil Evidence Act 1968 (1968 Act), which provided that computer evidence would be admissible in court only if the computer had been operating properly and the information put in evidence reproduced information already supplied to the computer.

However, as the use of technology became commonplace, it seems that with familiarity came trust. Section 69 was repealed in 1999 and replaced with the rebuttable common law presumption that the computer was operating correctly at the material time. The equivalent provision in the 1968 Act had already been repealed in 1995 but, unlike in the criminal justice system, no presumption was ever introduced in its place.



Shaken trust

In at least one notable instance, trust in computers turned out to have been catastrophically misplaced. Between 1999 and 2015, reportedly more than 900 subpostmasters were convicted of theft, fraud and false accounting based on evidence from the Horizon computer software, which turned out not to be operating properly. By way of comparison, in the 11 years before that, there were only 90 convictions, the vast majority of which were made after Horizon came into use in 1996. The convictions made from 1999 onwards were quashed when the Post Office (Horizon System) Offences Act 2024 passed into law on 24 May 2024.

In response to what has been widely accepted as one of the largest miscarriages of justice in modern English history, the consultation aims to rethink the presumption, which is arguably unsuitable for the present technological era.

The MoJ’s proposals

The proposals in the consultation involve maintaining the existing presumption, but drawing a distinction between:

  • Evidence that is merely captured or recorded by a device; for example, digital communications between people, digital photographs and video footage, and mobile phone extraction reports. This type of evidence would continue to be subject to the existing presumption or a revised version of it.
  • Evidence that is generated by software, including AI and algorithms. This would fall outside the scope of the presumption (or its revised version), so that proof of proper functioning may be required. 

While the proposals appear to be a step in the right direction, the assumption that there are still certain types of computer evidence that can be relied on without scrutiny is perhaps misplaced: instances of deepfake frauds continue to increase and the complete extraction of data from mobile phones is hampered by privacy and security tools such as encryption and remote-wipe capabilities.

Working with technology

Tools that provide encryption and anonymity, which are designed to protect users’ privacy rights and shield them from cyber crime, are also used by criminals to protect themselves from conviction. While encryption is a useful tool in the right hands, its use can also make vital computer evidence harder to reach, or incomplete and harder to analyse.

The importance of the proper functioning of technology is not unique to the justice system: bug bounty platforms that enable ethical security hackers, commonly referred to as white hats, to help businesses identify and fix bugs in their software for a bounty, are reportedly already a billion-dollar market and experiencing steep growth. Initiatives such as bug bounty platforms are critical in helping businesses to protect themselves from cyber crime and feel confident about the way that their technology is functioning.

Immunefi, a bug bounty platform, has recently launched, in partnership with the London Chamber of Arbitration and Mediation, a set of blockchain-expedited arbitration rules that protect the anonymity of white hats, even as between the white hats and their counterparties in a dispute as to the correct level of bounty payable (https://lcam.org.uk/wp-content/uploads/2025/01/BlockchainExpedited-Arbitration-Rules_09_12_2024.pdf).

Similarly, in the crypto technology sector, the courts have neatly sidestepped the anonymity of fraud defendants by allowing service of proceedings and, crucially, freezing orders and injunctions on anonymous defendants through WhatsApp or non-fungible token (Jones v Persons Unknown [2022] EWHC 2543; D’Aloia v Persons Unknown and others [2022] EWHC 1723). In these civil cases, the struggle for justice came not from the anonymity of the defendants so much as from the difficulty involved in tracing the assets and establishing a legal right to the return of those assets once they had been dissipated. Justice demanded that the courts keep pace with technology and place a degree of trust in its functioning in order to put it to use in favour of a wronged party.

Beyond the obstacle of anonymity, the English legal system has shown itself willing to adapt to complex areas of law to account for fast-developing technology, all in the name of justice. In Tulip Trading v Bitcoin Association for BSV and others, the Court of Appeal opened the door to the imposition of fiduciary duties on software developers that maintain cryptocurrencies on behalf of users and owners, potentially offering victims of crypto theft or fraud a route to recovery ([2023] EWCA Civ 83).

Non-binary approach

It is clear that, in order to fight crime effectively and deliver justice in a rapidly developing technology-centric world, the justice system needs to continue to find ways of keeping up with, if not staying ahead of, technological advances. In the civil courts, it has long been an expectation that parties use computer-assisted review or technology-assisted review (TAR) to meet the overriding objective of enabling the court to deal with cases justly and at proportionate cost. Often, the volume of material disclosed by the parties to a civil dispute is so vast that disclosure without the assistance of technology is largely unfeasible. This means that parties need to understand the technology that they are using in order to adequately assess its effectiveness as well as its credibility.

TAR is only just beginning to be used in criminal fraud prosecutions, where the same issue arises in terms of the volumes of evidence engaged. The Solicitor General has recently stated that the Serious Fraud Office plans to use TAR in more cases going forward, following well-documented problems with disclosure in its complex criminal investigations, which have sometimes been so acute that they have led to collapsed prosecutions (https://questions-statements.parliament.uk/written-questions/detail/2025-01-27/26405).

Data plays a vital role in the justice system, so the integrity of that data, particularly when used in evidence, is pivotal in achieving justice. However, a legal principle that assumes that data which is generated by computers or software is accurate is not reflective of reality. Trust in technology does not have to be binary; technology should be used to further the interests of justice, but its output should be scrutinised. The scrutiny that is applied need not take the approach of leaving no stone unturned, but rather should adopt an approach that is informed, responsive and intelligent. Computers are, after all, programmed by humans, some of whom make honest mistakes and some of whom have self-interested agendas, sometimes both. In the same way that the courts do not assume that a witness is telling the truth, it should not be assumed that technology is infallible.

The civil courts did not adopt the presumption that the criminal justice system applies and has so badly suffered from, or any equivalent presumption. This may be at least part of the reason why the properties of Horizon were only properly investigated and argued when the Post Office’s criminal prosecutions were litigated in the civil courts, ultimately leading to the exposure of serious flaws.

Impact

The MoJ’s proposals have not yet been fleshed out, but they ask participants to consider what the impact would be if the presumption no longer applied to certain types of computer evidence. In theory, the removal of the presumption in the criminal context could complicate prosecutions, particularly in the areas of white-collar crime and fraud, where digital evidence often plays a crucial role and involves considerable volumes of data. Prosecutors may need to invest more resources in proving the integrity of digital evidence, which would potentially affect the efficiency and success rate of prosecutions. In practice, however, the reality may turn out to be no less effective at combatting fraud than civil fraud proceedings.

As for future-proofing, there continues to be uncertainty as to the global regulatory landscape in relation to AI. Competing priorities are dividing jurisdictions even as to the fundamental principles to be applied to its use. The US and the UK recently refused to sign the international AI Action Statement, which is aimed at promoting AI accessibility while ensuring that its development is transparent, safe, secure and trustworthy (www.bbc.co.uk/news/articles/c8edn0n58gwo; https://gbc1.net/index.php/2025/02/12/joint-statement-at-the-ai-action-summit-2025-in-paris/). In that context, a procedural approach that assumes as little as possible about the fair, effective and transparent functioning of technology is likely to be a judicious one.

The consultation is at www.gov.uk/government/calls-for-evidence/use-of-evidence-generated-by-software-in-criminal-proceedings
