Think twice before you take legal advice from AI

AI can help make legal support accessible, but it can also be dangerous. Could a regulated AI tool akin to the NHS 111 service be the answer? Richard Atkinson thinks so

Following recent cases involving the use of AI, the High Court has warned lawyers that relying on fake AI-generated citations, even unknowingly, could expose them to serious professional consequences and the risk of a wasted costs order. This warning points to the dangers of unregulated AI in the legal sector. It can cause serious harm and lead to grave consequences, including criminal sanctions, both for lawyers and for the people who seek justice.

But that doesn’t mean there is no room for AI to assist the legal sector, and in particular the public, who should not be left exposed when they use AI to understand and resolve their legal issues.

A legal version of NHS 111 could help bring justice for all

That’s why the Law Society of England and Wales is asking the government to create a new AI-powered tool to help people understand their legal issues and find the best way to address them. It would be a simple-to-use, government-backed tool, like the online NHS 111 service, that guides people to the right support on common legal problems such as divorce, employment, housing and wills. Like health and education, justice is a public service, and as such it should not be the privilege of those with means alone. Justice should be easily accessible to all.

A cost-benefit analysis found that such a tool could save the justice system around £72m over five years. A small academic team, working with the Solicitors Regulation Authority, built a first working prototype focused on employment law. Legal experts who tested it found it highly accurate at identifying the relevant legal issues and giving useful advice.

Such a tool could make a real difference to members of the public who have legal issues and find it difficult to access justice. Research shows people are confused about where to start: one in two people look for legal help online first, but without human oversight or regulation this can lead them to wrong or even dangerous legal information.

Bad AI advice has real-life consequences

Online dispute resolution providers say a “one-stop shop” is badly needed. The service would also provide information about legal aid and funding for legal costs, as well as signposting people to local solicitors. Right now, there’s no clear way to complain if you’re misled by common generative AI tools offering legal support: they are not regulated, and consumers are not protected.

Whether in the hands of lawyers doing their work or of citizens trying to understand their rights, being misled by AI can be detrimental. Even though professionals have a built-in responsibility to check their work, in both cases trust in justice is undermined. AI tools should support and ensure fairness, not add to the confusion.

While legal professionals lead the way in using AI tools ethically and responsibly, the Law Society, as part of its newly published report “21st century justice”, is asking the government to build an online solutions explorer and regulate AI properly. The aim is to modernise the system and level the playing field for anyone seeking justice. To reach a solution that serves the interests of the public, the profession and, above all, justice itself, everyone needs to be consulted: lawyers, technology firms and the public. We must all work together to future-proof justice by using innovation and all available tools safely.

Richard Atkinson is president of the Law Society of England and Wales
