A multi-part series on the fundamentals eDiscovery practitioners need to know about document review planning and execution
In “The Main Event,” we discussed the costs and significance of review, as well as the question of what gets reviewed. In “For What It Gets Reviewed,” we discussed the range of determinations that you might want reviewers to be making. In “Who Does the Reviewing,” we discussed your options for review staffing. In “Workflow Design Considerations,” we discussed document flow and tagging palette considerations. In “More Workflow Design Considerations,” we discussed batch creation and tracking, reporting, and documentation considerations. In this final Part, we review fundamentals of quality control.
The final and most important fundamental of review to understand is quality control. No matter what you’re reviewing, what you’re reviewing it for, who’s reviewing it, or how you’re reviewing it, you will need to take proactive steps to ensure the overall quality and consistency of that work. Perfection isn’t possible (and isn’t required), but reasonable efforts to meet your obligations of completeness, accuracy, and privilege protection are both.
The Sedona Conference’s Best Practices Commentary on the Use of Search and Information Retrieval Methods in E-Discovery describes a persistent myth in eDiscovery:
It is not possible to discuss this issue without noting that there appears to be a myth that manual review by humans of large amounts of information is as accurate and complete as possible – perhaps even perfect – and constitutes the gold standard by which all searches should be measured.
The reality is quite different from this myth. In reality, even the best reviewers make numerous mistakes due to simple human fallibility, and reviewers frequently come to different conclusions regarding questions of relevance, privilege, and more. Studies have shown surprisingly low consistency between the independent results of equivalent review teams (“Assessor Overlap”).
Because of this reality, it is critical that your document review project include some steps to ensure an acceptable minimum level of quality, consistency, and completeness.
The most traditional method of quality control is second-level (or second-pass) review. In this method, some portion of the material reviewed by first-level (or first-pass) review is re-reviewed by more senior reviewers to check the accuracy and consistency of the work. The volume re-reviewed and the focus of that re-review can vary widely depending on the needs of the project.
In some projects, you may establish more than two levels of review. For example, you might add a third level in which members of the case team re-review certain materials prior to production.
The other traditional quality control method is targeted searching. Targeted searching is the practice of running searches against the reviewed materials for key terms that would likely indicate clear relevance, irrelevance, or privilege and then double-checking that the results are coded correctly. For example, you might search for key attorneys’ names and email addresses and double-check the privilege tagging applied to the results.
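To make that concrete, here is a minimal sketch in Python of what such a targeted-search check might look like conceptually. The attorney addresses, document records, and tag values are purely hypothetical; in practice, you would run this kind of search directly within your review platform rather than in standalone code.

```python
# A minimal, hypothetical sketch of a targeted-search QC check: flag documents
# that hit on key attorneys' email addresses but carry no privilege tag.
# The attorney addresses and document records below are illustrative only.

KEY_ATTORNEY_ADDRESSES = {"jdoe@lawfirm.com", "asmith@lawfirm.com"}  # assumed examples

documents = [
    {"id": "DOC-001", "text": "Please advise on the claim. -- jdoe@lawfirm.com", "privilege_tag": None},
    {"id": "DOC-002", "text": "Lunch order for the review team", "privilege_tag": None},
    {"id": "DOC-003", "text": "From: asmith@lawfirm.com re: litigation strategy", "privilege_tag": "Privileged"},
]

def flag_for_privilege_recheck(docs, addresses):
    """Return IDs of documents that contain a key attorney address but are not tagged privileged."""
    flagged = []
    for doc in docs:
        text = doc["text"].lower()
        if any(address in text for address in addresses) and not doc["privilege_tag"]:
            flagged.append(doc["id"])
    return flagged

print(flag_for_privilege_recheck(documents, KEY_ATTORNEY_ADDRESSES))  # ['DOC-001']
```

In this hypothetical, DOC-001 hits on a key attorney's address but carries no privilege tag, so it would be routed for a second look.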
A third quality control method is sampling, which broadly comes in two categories: judgmental sampling and formal sampling. Judgmental sampling is the informal process of looking at some randomly selected materials to get an anecdotal sense of what they contain. Re-reviewing a random 10% of first-level work in second-level review, or running the targeted searches described above, are examples of judgmental sampling. The goal of these efforts is to get an impression and make an intuitive assessment rather than to take a specific measurement.
Formal sampling is just the opposite: you review a specified number of randomly selected documents with the goal of taking a defined measurement of a specified statistical strength. Typically, that measurement is taken either to test classifiers or to estimate prevalence.
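As a simple illustration of that "defined measurement" idea, the following Python sketch estimates prevalence from a hypothetical random sample using the normal approximation to the binomial. The sample size, responsiveness count, and confidence level are illustrative assumptions, not recommendations for any particular project.

```python
import math

def estimate_prevalence(sample_size: int, responsive_in_sample: int,
                        z: float = 1.96) -> tuple[float, float]:
    """Estimate prevalence and its margin of error from a simple random sample,
    using the normal approximation to the binomial (reasonable for large samples)."""
    p = responsive_in_sample / sample_size                 # point estimate
    margin = z * math.sqrt(p * (1 - p) / sample_size)      # ~95% margin at z = 1.96
    return p, margin

# Hypothetical example: 1,500 randomly selected documents reviewed, 120 coded responsive.
point, margin = estimate_prevalence(1500, 120)
print(f"Estimated prevalence: {point:.1%} +/- {margin:.1%} (approx. 95% confidence)")
```

In this hypothetical, reviewing a random sample of 1,500 documents and finding 120 responsive yields an estimated prevalence of about 8%, plus or minus roughly 1.4 percentage points.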
For a deeper dive on these two sampling techniques, please see our series, practice guides, or webinar on Estimating Prevalence and Testing Classifiers.
Regardless of the specific quality control methods you choose to employ on your project, it is critical that effective feedback loops are established. In most document review projects, you will be engaged in ongoing quality control throughout first-level review, giving you the opportunity not just to catch and correct errors, but to identify issues and address them with first-level reviewers to improve the rest of their work. Effective feedback loops make this possible.
The most important feedback loop is the one between the review managers and the reviewers. This loop is about feeding the insights gleaned from quality control efforts back to the reviewers through additional instruction and clarification. For larger projects, it is common to hold weekly review team meetings to go over issues and answer questions. It is also common to have one-on-one sessions with individual reviewers identified as needing additional guidance, and it is a good idea to maintain a shared list of reviewer questions and review manager answers for everyone's reference. It is equally important to have a feedback loop between the case team and the review managers, so that review managers can request guidance and clarification as needed to share with their team, and so that the case team can share any issues they have identified during any quality control review they have performed.
As we conclude this series, it’s worth emphasizing the particular importance of engaging in quality control for the purpose of preventing the inadvertent disclosure of privileged materials. As we discussed previously, Federal Rule of Evidence 502(b) establishes that inadvertent disclosures can lead to privilege waiver if reasonable steps to prevent the disclosure weren’t taken.
The Committee’s Explanatory Note on Rule of Evidence 502 makes clear that “reasonable steps” is a case-by-case determination that can depend on factors such as the total number of documents to be reviewed, the time constraints for production, how records were managed, what tools were used, and more. Consequently, taking steps to ensure the quality of your privilege review approach is at least as important as what approach you take:
The implementation of the methodology selected should be tested for quality assurance; and the party selecting the methodology must be prepared to explain the rationale for the method chosen to the court, demonstrate that it is appropriate for the task, and show that it was properly implemented. [emphasis added]
Victor Stanley Inc. v. Creative Pipe Inc., 250 F.R.D. 251, 262 (D. Md. 2008).
For Assistance or More Information
Xact Data Discovery (XDD) is a leading international provider of eDiscovery, data management and managed review services for law firms and corporations. XDD helps clients optimize their eDiscovery matters by orchestrating precision communication between people, processes, technology and data. XDD services include forensics, eDiscovery processing, Relativity hosting and managed review.
XDD offers exceptional customer service with a commitment to responsive, transparent and timely communication to ensure clients remain informed throughout the entire discovery life cycle. At XDD, communication is everything – because you need to know. Engage with XDD; we’re ready to listen.
Whether you prefer email, text or carrier pigeons, we’re always available.
Discovery starts with listening.