There has been a lot of buzz about new technologies that can do document review with little to no human involvement. Some believe that technology-only reviews are better and faster than attorney-only reviews. But, if done correctly, attorney-led reviews can both streamline the review and teach the legal team the case in real time in ways that technology alone just can't.

While it is true that “traditional” attorney-only reviews can be overly expensive and not always accurate, new technologies that promise to lower costs also have problems of their own. Legal standards for defending the results produced by these predictive coding tools have not yet been developed. And no one is sure what level of outside testimony will be needed to prove the reliability of productions from tool-only reviews.

Attorney-led e-discovery practices can cut costs and increase efficiencies by sidestepping these problems. Knowledgeable e-discovery practitioners deliver document review value by taking the best practices of “eyes on” attorney review and then using available tools to enhance the process. This type of review is highly specialized, and some firms have created their own in-house document review centers to make sure that clients get the maximum value for their document review dollars. Specific document review savings strategies used by this approach include:

  • Targeted data collections
  • “Right job” e-discovery tool matching
  • Leveraging law firm e-discovery investments
  • Creating case-specific “eyes on” review rules
  • Established quality control processes

The Right Tool for the Job

Predictive coding tools offer a foothold into the ever-growing mountain of ESI. These tools let attorneys familiar with the case work with a subset of documents to “teach” the system important core case terms and concepts. The systems go through a number of iterative steps to learn the kinds of documents that are most relevant and responsive in the case. Once this learning has occurred, the tools can search through the entire data set to find the documents that meet the parameters the system has learned.

In the right circumstances, predictive coding can drive the most relevant documents in a production to the top of the pile, or it can push the most likely “junk” to the bottom of the review stack. As a result, a case with a large document population and a fast discovery schedule is most likely to benefit from the use of these tools.
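For readers curious about the mechanics, the “teach and rank” workflow described above is, at its core, a supervised text-classification problem. The Python sketch below is only an illustration of that pattern: it assumes the scikit-learn library is available, the documents and labels are hypothetical placeholders, and it does not represent any vendor's actual algorithm. A simple classifier is trained on an attorney-coded seed set, and the unreviewed population is then scored and sorted so the documents most likely to be responsive rise to the top of the review queue.

    # A minimal sketch of the supervised-learning loop behind predictive coding.
    # scikit-learn is assumed to be installed; the documents and labels below are
    # hypothetical placeholders, not any vendor's actual implementation.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression

    # Seed set: documents an attorney has already coded as responsive (1) or not (0).
    seed_texts = [
        "email re: pipe contract pricing and delivery terms",
        "company picnic sign-up sheet",
    ]
    seed_labels = [1, 0]

    # Turn the attorney-coded examples into term-weight vectors and fit a classifier.
    vectorizer = TfidfVectorizer()
    model = LogisticRegression()
    model.fit(vectorizer.fit_transform(seed_texts), seed_labels)

    # Score the unreviewed population and rank it so the documents most likely
    # to be responsive rise to the top of the review queue.
    unreviewed = [
        "draft supply agreement for pipe fittings",
        "cafeteria menu for the week of June 4",
    ]
    scores = model.predict_proba(vectorizer.transform(unreviewed))[:, 1]
    for score, doc in sorted(zip(scores, unreviewed), reverse=True):
        print(f"{score:.2f}  {doc}")

Commercial predictive coding tools layer iterative rounds of attorney feedback, richer document features and statistical validation on top of this basic loop, but the ranking principle is the same.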

But early use also has revealed some of the current tools' limitations. System learning doesn't work as well in cases with rolling production schedules. If the system based its initial learning on one kind of dataset, it may have to relearn core concepts when introduced to a different kind of dataset, for example, when moving from accounting data to marketing data. This can force a reexamination of already-reviewed data and increase costs.

And no one is sure what admissibility standards will govern when the parties haven't reached agreement on the use of predictive coding. Though judicial commentators encourage the use of analytic tools, they also recognize that the results the tools produce must be defensible. In Victor Stanley, Inc. v. Creative Pipe, Inc., 250 F.R.D. 251 (D. Md. 2008), Judge Grimm addressed the level of proof needed when parties disagree on the adequacy of the search method used. He said that, “[s]hould their selection be challenged by their adversary, and the court be called upon to make a ruling, then [the offering party] should expect to support their position with affidavits or other equivalent information from persons with the requisite qualifications and experience, based on sufficient facts or data and using reliable principles or methodology.”

Keeping human eyes involved in the review process boosts the functionality of current e-discovery tools. This common-sense approach streamlines the review process, minimizes the risk of production of privileged and non-relevant documents and yields defensible results.

Value-based Document Review Best Practices

While e-discovery tools continue to develop, in-house counsel need solutions to get value from their document review expenses now.

Practical strategies for cutting costs and increasing the efficiency of document reviews include:

  • Targeted data collections. Getting to the documents that matter faster means doing the legwork up front to identify the most responsive data sources during collection. Less data collected means less data to review.
  • “Right job” tool matching. Recognize that e-discovery will never be a one-size-fits-all proposition. Work with counsel who understand the range of available tools and review strategies, and who can then choose the document review process best suited to the case and data involved.
  • Leveraging law firm e-discovery investments. Firms continue to compete to provide outstanding e-discovery services. Firms that have made an investment in a dedicated e-discovery practice often offer a variety of review resources at very competitive prices.
  • Creation of case-specific, “eyes on” review rules. Reviews can be streamlined to meet specific criteria. Customized reviews let reviewers quickly separate relevant from non-relevant data while segregating privileged information and keeping it from inadvertent production.
  • Established quality control processes. Effective human quality control measures are the bedrock of both traditional and emerging technology document review methods. Find providers with the QC data to back up their reliability and accuracy claims, and then test them with real-world questions based on the data at issue in your case.
