
Would you trust an AI to handle your divorce?


AI could sweep away the lower end of the legal market, but it’s no match for a top divorce lawyer, says Ayesha Vardag

AI is the buzzword of the moment. It’s in our phones, our homes, our courtrooms – and now, it seems, it’s coming for our professions. Even mine.

As someone who has spent a career at the sharp end of family law, handling some of the most complex and high-value divorces in Britain and internationally, I don’t approach AI with fear. I approach it with fascination. I’ve tested it, challenged it, even tried to trip it up. And yes, it’s clever – sometimes startlingly so. But it’s not ready to take over the kind of work I do. Not unless you’re comfortable entrusting your financial future, your children and your reputation to a system that can only tell you what’s already been said and not what’s possible.

Recently, my husband and our CEO, Stephen Bence, ran a fictitious but realistic, sophisticated query through one of the leading AI platforms. The result was mixed. It grasped the general principles but missed key legal subtleties – the kind of details that, in our world, can determine the outcome of a case and profoundly affect a client’s future. Specifically, it failed to consider an important alternative jurisdiction that could have dramatically shifted the financial outcome for the client.

In another instance, it overlooked the relevance of a lengthy period of cohabitation prior to marriage – a factor that can significantly affect the application of the sharing principle in English law. These aren’t minor oversights. They’re the kinds of errors that can cost people millions. 

And that’s the crux of it: there is no doubt that AI could sweep away the lower end of the legal market – the low-hanging fruit. The cookie-cutter divorces, the £99 fill-in-the-form-and-hope-for-the-best brigade – all of that is ripe for automation. And perhaps that’s no bad thing. But once you enter the realm of strategic litigation, emotional complexity and bespoke legal advice, AI begins to falter.

And here’s why: even at its best, AI codifies the present. It is, at its core, a distillation of existing knowledge. It can summarise precedent, synthesise commentary and regurgitate the consensus. But cutting-edge law is not about repeating what’s already known. It’s about pushing boundaries – identifying gaps, exploiting overlaps and crafting new arguments that shift the legal landscape. That’s especially true in English common law, which evolves through judicial interpretation and creative legal reasoning. The best lawyers don’t just apply the law; they shape it. My own ground-breaking Radmacher case made pre-nuptial agreements binding in England, even though all the prior case law said that pre-nups were unenforceable. AI – at least at present – is nowhere near being able to craft the arguments I ran in that case, arguments that led to a change in the law.

AI, for all its sophistication, cannot yet do that. It cannot intuit the emotional undercurrents of a case, or anticipate how a judge might respond to a novel argument. It cannot weigh the strategic value of a risk, or decide when to push and when to settle. It cannot read a room – or a client.

Ethical implications

More concerning, however, are the legal and ethical implications of AI’s growing presence in our field.

If AI is offering regulated legal advice, and some tools certainly appear to be, then we’re entering murky territory. These systems are not solicitors. They are not regulated by the Solicitors Regulation Authority. They carry no professional indemnity insurance. So, when they get it wrong, and they will, who bears the liability? Whom do you sue when your case collapses and your financial future is compromised? OpenAI? Microsoft? Your broadband provider?

The Solicitors Regulation Authority needs to step in. We’re seeing tools that walk and talk like lawyers, but without any of the accountability. If AI starts drafting pleadings, negotiating settlements or filing applications, it could be deemed to have conduct of litigation – which, for an unqualified entity, is a criminal offence.

And what of confidentiality?

Imagine a husband feeding sensitive documents into an AI tool to “get advice” – and that data being absorbed into a model that’s accessible to the other side. That’s not just unethical; it could be a breach of the implied undertaking, a serious violation of court rules. Confidentiality is the bedrock of our profession. And AI, for all its sophistication, does not truly comprehend that.

So yes, AI is coming. It’s already here. But it’s not ready to replace us. Not yet.

For now, it’s a tool – useful for the basics, dangerous for the rest. The legal profession must tread carefully. We need regulation, oversight, and above all, clarity. Because when it comes to justice, “good enough” just doesn’t cut it.
