In the first article in this series, we assessed several methods of technology assisted review of documents in civil litigation and discussed when they should be considered for use. In this article, we explain how to work with opposing counsel to achieve a positive result that your court will accept. In the third and final article in this series next month, we will explore the best way to make technology assisted review a cost-efficient process.

While no lawyer or company should jump into technology assisted review (TAR) of electronic data without first conducting a comprehensive assessment of the case, once the decision has been made to use it, the challenge becomes working with opposing counsel to ensure that the forum court accepts the results and that discovery motion practice is minimized. The best way to use TAR effectively is to confer with your opponent early in the case and make sure he or she understands its nature and cost. The earlier you raise proportionality concerns and come to at least nominal agreement on what is truly at issue in the case, the better equipped you will be to use TAR successfully in civil litigation.

First, you will need to engage your adversary when deciding both the scope of the documents to which TAR will be applied and the subset of documents used to create the initial sample, or “seed,” set. Even in the seminal Moore v. Publicis Groupe SA case, which involved numerous disputes concerning TAR, counsel for plaintiffs and defendants eventually agreed upon the custodians whose electronic data should be searched. Judge Andrew Peck, author of numerous articles on the uses of TAR, urged the parties to discuss the full impact of its use. He also explained that, if TAR were going to be used, defendants' counsel would “have to give [their] seed set, including the seed documents marked as nonresponsive[,] to the plaintiff's counsel so they can” challenge the methodology used. Given Judge Peck's view of the value of transparency, you need to be prepared to share the documents you initially mark as responsive as well as those you judge nonresponsive.

Second, you and your opponent need to discuss where to place the cut-off for responsiveness. Most TAR methods rank documents on a scale of likelihood of responsiveness, often from 1 to 100. The goal is to identify as many responsive documents as possible while reviewing as few nonresponsive documents as you can. Recall is the fraction of all responsive documents that the review identifies, while precision is the fraction of the documents identified that are actually responsive. Do you agree to produce all documents ranked 70 or higher? 80? The software will recommend a cut-off score, but your adversary is unlikely to be as comfortable with the technology. Overall, the parties have a much better chance of agreeing on the cut-off if they agree on the factual matters at issue in the litigation. Sharing examples of responsive and nonresponsive documents identified in a refined seed set, along with their respective responsiveness scores, will help opposing counsel understand the process and agree to an appropriate cut-off point.
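For readers who want to see the trade-off in concrete terms, the short Python sketch below illustrates how recall and precision move as the cut-off changes. The scores and responsiveness labels are invented for the example and do not come from any particular TAR product.

```python
# Illustrative only: invented responsiveness scores (1-100) paired with the
# reviewer's true responsive/nonresponsive calls from a validation sample.
scored_docs = [
    (95, True), (88, True), (82, False), (76, True), (71, False),
    (64, True), (58, False), (43, False), (30, True), (12, False),
]

def recall_and_precision(docs, cutoff):
    """Recall: share of all responsive documents captured at or above the cut-off.
    Precision: share of documents at or above the cut-off that are responsive."""
    produced = [responsive for score, responsive in docs if score >= cutoff]
    total_responsive = sum(responsive for _, responsive in docs)
    recall = sum(produced) / total_responsive if total_responsive else 0.0
    precision = sum(produced) / len(produced) if produced else 0.0
    return recall, precision

for cutoff in (70, 80):
    r, p = recall_and_precision(scored_docs, cutoff)
    print(f"cut-off {cutoff}: recall {r:.0%}, precision {p:.0%}")
```

Lowering the cut-off captures more of the responsive material (higher recall) but sweeps in more nonresponsive documents (lower precision), which is precisely the trade-off the parties are negotiating.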

Third, you will need to discuss validation of results with your adversary. Some forms of validation include independent testing, but the most commonly used method is sampling — i.e., taking a cross-section of the documents reviewed and checking the software's decisions on responsiveness and nonresponsiveness. Using sampling and sharing its results with opposing counsel is a more collaborative process than setting arbitrary recall and precision rates. While you will need to negotiate the percentage of documents to be checked via sampling, you avoid the possibility of agreeing to impractical recall and precision numbers.
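As a rough illustration of how a sampling check might be scripted, the sketch below draws a random cross-section of already-reviewed documents and measures how often a human reviewer disagrees with the software's call. The document IDs, the sample rate, and the function name are hypothetical choices made for the example, not features of any specific review platform.

```python
import random

# Illustrative only: each reviewed document is assumed to carry the software's
# responsiveness call and the human quality-control call for comparison.
reviewed = [
    ("DOC-001", True, True),
    ("DOC-002", False, False),
    ("DOC-003", True, False),   # software called it responsive; the reviewer disagreed
    ("DOC-004", False, False),
    ("DOC-005", True, True),
]

def sample_and_check(docs, sample_rate=0.05, seed=1):
    """Draw a random cross-section and report how often the human call
    differs from the software's call (the sample_rate here is hypothetical)."""
    rng = random.Random(seed)
    sample_size = max(1, int(len(docs) * sample_rate))
    sample = rng.sample(docs, sample_size)
    disagreements = [doc_id for doc_id, software, human in sample if software != human]
    return sample_size, len(disagreements) / sample_size, disagreements

size, error_rate, flagged = sample_and_check(reviewed, sample_rate=0.4)
print(f"checked {size} documents; disagreement rate {error_rate:.0%}; flagged: {flagged}")
```

The resulting disagreement rate, and the sampled documents behind it, are the kind of concrete output that can be shared with opposing counsel instead of bargaining over abstract recall and precision targets.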

Transparency and collaboration may strike you as difficult in the confines of civil litigation, but engaging opposing counsel on the use of TAR is the most effective way to realize a quality, large-scale electronic document production approved by the court. The final article in this series will focus on achieving cost effectiveness using TAR.
