If you’ve spent time exploring legal AI tools, you’ll likely have heard of “retrieval-augmented generation” (or more likely its acronym, “RAG”). It’s not a new idea, but it has become a cornerstone of how most sophisticated AI systems deliver reliable results, and it will continue to be important in the development of ‘AI Agents’. In this article, we unpack how RAG works, why it matters in a legal context, and how it can help reduce the risk of “hallucinations” that undermine confidence in AI outputs.
Blind spots and knowledge gaps
Many lawyers are sceptical of AI tools. They need solutions they can trust, and trust is something generative AI (GenAI) tools have struggled to deliver reliably: there have been very public examples of inaccurate citations, outdated laws, and fabricated judgments.