Lucio vs. CoCounsel for Litigators
Lucio Team

The comparison between AI tools is often framed too narrowly. For litigators, the real question is not who wins on a simple query or a small set of documents. It is who performs when the work starts to look like litigation: large bundles, complicated facts, issue-spotting, and iterative thinking.
That is where Lucio shines. It is designed around litigator workflows rather than around a narrow research-only use case. In practice, the difference becomes visible when users test the products on thousand-page bundles or across 20-plus documents, not when they compare them on five files or a single prompt. This is exactly why litigators who have run careful, rigorous comparisons keep coming back to us.
But What About Westlaw?
Everyone’s had this thought: Don’t tools with access to Westlaw have a fundamental edge?
Of course, Westlaw remains an important database. But litigators do not do their work from one database alone. They search broadly. They use Google. They look at BAILII, Judiciary UK, government and tribunal materials, and practice-specific sources that sit outside the traditional legal research platforms.
Lucio is built for that broader reality. It is able to work across web research and multiple source types, including sources that matter materially to UK litigators and barristers.
Add in the fact that Lucio curates custom repositories for UK barristers, and the tool becomes practically indispensable.
A barrister who put two different tools through their paces on live cases put it this way:
“Initially, I honestly wasn’t convinced Lucio was any better than {a tool that has WestLaw}. It was only once we started running it on real case papers that the difference became obvious. On smaller bundles and straightforward research, {a tool that has WestLaw} works fine. But it struggles to pull in other sources and give you a genuinely holistic note. Once we moved to large, complex papers, Lucio pulled ahead decisively. Its custom library curation turned out to be incredibly helpful in keeping our resources updated. For serious work, Lucio is the clear winner.”
— Feedback from a barrister after using multiple AI tools
What a Firm Found After Months of Testing
What we are most proud of is that our strongest endorsements came from people who approached the comparison sceptically. Here’s what a litigation firm had to say after spending months evaluating AI products, expecting CoCounsel to win:
“We spent months vetting 5 AI products, finally narrowing down to CoCounsel and Lucio (yes, we looked at Harvey too).
I expected CoCounsel to win, primarily among the litigators, due to its access to Westlaw databases. However, we were all overwhelmed by Lucio's functionality, especially the litigators. And their solution for accessing Westlaw or Lexis databases was not an impediment.
Lucio put 2 people in our office for more than 2 full weeks, training us, running real-time searches, and helping us understand their product. We ran them side by side with CoCounsel, in many instances, doing identical searches, research and drafting in both programs. Lucio outperformed CoCounsel on all fronts.”
— Feedback from a litigation firm (Stubbs Alderton) after a side-by-side evaluation of leading legal AI tools
Ask the People Who Know
In the end, litigators do not need another impressive demo. They need a tool they can trust when the work is messy, urgent, and intellectually demanding.
For chambers and firms that want to hear directly from practitioners who have tested Lucio against other tools in real workflows, we are happy to make introductions. We can connect you with a barrister from a top civil/commercial set, or with a litigation team that ran a detailed evaluation over several weeks, so you can hear about their experience first-hand.