Pet rocks, Cabbage Patch Kids, Tickle Me Elmo, and the latest holiday gift craze. Some things explode in the media and quickly flame out. Is all the talk about ChatGPT taking (and passing) bar exams, writing demand letters, and drafting sections of briefs a similar shooting star? Or is there substance? And if so, where? And what are the ethical limits as applied to dispute resolution?

Lawyers thinking about AI initially focus on computers' ability to perform routine litigation tasks more efficiently, such as reviewing, analyzing, and organizing emails and large volumes of data into manageable lists and bytes. eDiscovery is a problem that technology created, so it makes sense to enlist technology to solve it. Keyword searches and de-duplication do help separate the wheat from the chaff, but there are limits to relying on algorithms to present decision-makers with the "key" documents. It is not just that the coder who writes the algorithms can bring his or her biases or (in)experience to the task. Rue the day, already here, when the computer writes the code itself. Even the most ardent supporters of AI recognize that AI efficiencies do not replace human judgment and experience.
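To make the point concrete, here is a minimal sketch (using hypothetical documents and search terms, not any vendor's actual eDiscovery tooling) of how hash-based de-duplication and keyword filtering can winnow a document set before human review:

```python
# Illustrative only: toy documents and keywords chosen for the example.
import hashlib

documents = [
    "Please review the merger agreement draft.",
    "Please review the merger agreement draft.",   # exact duplicate
    "Lunch schedule for the office party.",
    "Privileged: merger negotiation strategy memo.",
]

# De-duplication: identical texts hash to the same digest and are kept once.
seen, unique_docs = set(), []
for doc in documents:
    digest = hashlib.sha256(doc.encode()).hexdigest()
    if digest not in seen:
        seen.add(digest)
        unique_docs.append(doc)

# Keyword search: keep only documents containing a review term.
keywords = {"merger", "agreement"}
hits = [d for d in unique_docs if any(k in d.lower() for k in keywords)]

print(len(documents), len(unique_docs), len(hits))  # 4 3 2
```

Note the limits the sketch makes visible: hashing catches only byte-for-byte duplicates (a lightly edited copy survives), and literal keyword matching misses paraphrases and euphemisms, which is precisely why algorithmic culling cannot substitute for human judgment.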