In their quest to integrate and market generative AI-powered technologies, legal tech companies have often cited a process they claim keeps hallucinations at bay: retrieval-augmented generation (RAG).

RAG shows up in press releases, at trade shows, and in many product demos as a solution to the hallucination problem of large language models (LLMs).