In their quest to integrate and market generative AI-powered technologies, legal tech companies have often cited a process they claim keeps such hallucinations at bay: retrieval-augmented generation (RAG).

RAG shows up in press releases, at trade shows, and in many product demos as a solution to LLMs’ hallucination problem.
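At its core, RAG means fetching relevant source documents at query time and handing them to the model alongside the question, so that answers are grounded in real text rather than in the model’s memory alone. As a rough orientation only, here is a minimal sketch of that retrieve-then-generate loop, assuming a toy keyword-overlap retriever and a stub standing in for the model call; the function names and the corpus are illustrative, not any vendor’s actual API.

```python
# Minimal RAG sketch: a toy keyword-overlap retriever plus a stubbed model
# call. Everything here is illustrative, not any vendor's implementation.

def retrieve(query: str, documents: list[str], k: int = 2) -> list[str]:
    """Rank documents by how many words they share with the query; keep top k."""
    query_words = set(query.lower().split())
    ranked = sorted(
        documents,
        key=lambda doc: len(query_words & set(doc.lower().split())),
        reverse=True,
    )
    return ranked[:k]

def generate(prompt: str) -> str:
    """Stand-in for an LLM call; a real system would query a model here."""
    return f"[model output conditioned on a {len(prompt)}-character prompt]"

def answer(query: str, documents: list[str]) -> str:
    """The RAG loop: retrieve sources, then generate an answer from them."""
    context = "\n".join(retrieve(query, documents))
    prompt = (
        f"Answer using only these sources:\n{context}\n\n"
        f"Question: {query}"
    )
    return generate(prompt)

corpus = [
    "Case A held that the limitations period for such claims is two years.",
    "Case B concerned maritime jurisdiction, not limitations periods.",
]
print(answer("What is the limitations period?", corpus))
```

The marketing claim rests on the `answer` step: the model is instructed to respond from retrieved text rather than from its training data alone. Whether that works in practice still depends on the retriever surfacing the right passages and on the model actually deferring to them.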