Every evolution of ediscovery has launched a new set of exciting technologies, efficient workflows and savvy experts. We have come a long way from the old “check the box” view of discovery in which the main measure of a project’s success was whether it avoided adverse consequences like sanctions. Skilled professionals today know how
to plan a project strategically, cull enormous amounts of data into manageable sets, leverage machine learning and other computer analytics and use statistics to prove that the needed documents were indeed found. Yet those skills need to be translated into the language of business intelligence (BI); otherwise, they are like a tree falling in the woods with no one around to hear it.
How can ediscovery professionals prove the value of their contributions if their clients or organizations do not have a clear understanding of what they do? Terms like targeted collection
and technology-assisted review are not often meaningful to those who do not focus on ediscovery. Luckily, ediscovery professionals can use metrics to tell success stories for the strategies they
employ. Metrics show where inefficiencies exist, which can then be corrected. This in turn boosts the number of dollars ediscovery professionals can prove they are saving the organization.
Regardless of familiarity with ediscovery, all business managers understand and appreciate dollars saved.
As cost management analysis is increasingly emphasized in all parts of organizations, legal departments are also focusing on BI
tools and reports to manage their caseload and spending.
Ediscovery BI can be especially important because ediscovery represents such a significant portion of legal budgets. Metrics analysis proves the value of efficient ediscovery and also provides insight into opportunities to save more through effective data management.
There are many ways to mix and match data points to create BI that will be useful to and appreciated by a specific organization. Areas of focus may include how and when documents were excluded from the ediscovery process, comparing a selected process to other possible outcomes and analysis of data footprint management. Each of these areas can be analyzed for individual projects or across many projects.
Culling is hardly a novel approach to ediscovery – in fact, it is a necessary one – but the approaches of organizations and their counsel vary drastically. Consider the different stages at which data may be excluded from the steps of ediscovery: preservation, collection, processing, early data assessment, additional searching and finally review. The general rule is that discovery will be less expensive if extraneous data is removed earlier in the process.
Skillful yet defensible exclusion of data is a “win” for ediscovery professionals. Metrics on savings related to excluded data reflect when and how an organization is most successful at reducing ediscovery spend.
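The general rule that earlier exclusion is cheaper can be made concrete with a simple model. The sketch below uses hypothetical per-GB unit costs (actual rates vary by provider and fee arrangement): a gigabyte excluded before processing avoids every downstream charge, while a gigabyte excluded only before review still incurs processing and hosting costs.

```python
# Sketch of stage-based culling savings. The per-GB unit costs below are
# hypothetical, chosen only to illustrate the arithmetic.
STAGE_COST_PER_GB = {
    "processing": 25.0,
    "hosting": 15.0,      # hosting over the life of the matter
    "review": 1200.0,     # review is typically the dominant cost
}

def downstream_savings(gb_excluded, stage_excluded_before):
    """Dollars saved by excluding data before a given stage."""
    stages = list(STAGE_COST_PER_GB)
    idx = stages.index(stage_excluded_before)
    return gb_excluded * sum(STAGE_COST_PER_GB[s] for s in stages[idx:])

# Excluding 100 GB before processing avoids every downstream cost;
# excluding the same 100 GB only before review saves less overall.
print(downstream_savings(100, "processing"))  # 124000.0
print(downstream_savings(100, "review"))      # 120000.0
```

Tracking which stage exclusions occur at, matter over matter, shows where an organization's culling practices leave money on the table.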
Consider a project where the review portion cost $15,700. Without context, this cost provides no insight into how well the
review was run. In contrast, understanding that this project involved custodial interviews that led to a targeted collection and that technology-assisted review (TAR) was used to limit review provides context on the value the discovery team provided to the organization. This project involved a fixed fee-per-document reviewed, so we can calculate both the savings and the spend on review:
That looks like a successful project!
We can do the same analysis for every component of the project, the project overall and multiple projects taken together. The following are the estimated savings for those excluded documents for the entire project, including collection, processing, hosting, project management and litigation support, review, review quality control and supervision:
This analysis demonstrates the effectiveness of targeting the collection and using TAR to reduce the costs of the project. While these techniques may not be the right fit for every case, the metrics they provide on potential savings are very helpful in future decisions and provide instructive BI about when and how to use them.
The art of ediscovery involves picking the right workflow for each case. This requires understanding the costs and benefits of the workflows. BI can demonstrate how successful an approach has been as well as inform future decisions about similar workflows.
For example, many projects now use analytics to put as many responsive documents into human review as possible while avoiding the review of nonresponsive documents (this is known as TAR 2 or continuous active learning). In a TAR 2 project, analytics predict which unreviewed documents are most likely to be responsive, and those documents are batched for human review. As humans then tag the documents responsive or nonresponsive, the system continues to learn and refine results to better predict which documents should be reviewed. The result of this workflow is full review of the required responsive documents and a remaining set of unreviewed documents that are predicted nonresponsive.
To demonstrate the usefulness of this approach, we first use metrics to analyze how well the process increases the responsiveness of documents reviewed compared to the overall set of data. Here, where the overall set of data was only 25 percent responsive, the TAR 2 review successfully focused the review on the responsive documents, and only a small percentage (13 percent) of nonresponsive documents were reviewed.
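The enrichment figures above can be computed directly. The percentages (25 percent base responsiveness, 13 percent nonresponsive documents reviewed) come from the example in the text; the absolute document counts below are hypothetical.

```python
# Metrics from the example: 25% base responsiveness in the population,
# 13% of reviewed documents nonresponsive. Absolute counts hypothetical.
population = 100_000               # hypothetical total documents
responsive_in_population = 25_000  # 25% base responsiveness

reviewed = 26_000                  # hypothetical TAR 2 review set
nonresponsive_reviewed = 3_380     # 13% of the reviewed set

base_rate = responsive_in_population / population
review_precision = 1 - nonresponsive_reviewed / reviewed

print(f"Base responsiveness: {base_rate:.0%}")         # 25%
print(f"Review precision:    {review_precision:.0%}")  # 87%
print(f"Enrichment factor:   {review_precision / base_rate:.1f}x")
```

An enrichment factor well above 1.0 is the headline number: each hour of reviewer time under TAR 2 touched roughly three and a half times as many responsive documents as it would have in an unculled set.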
In addition, we can show actual dollars saved from this TAR 2 review workflow as compared to a linear review:
In another case, we identified a modified workflow that used TAR 2 and where certain collections of documents could be marked responsive without needing to go through the TAR 2 review. This created large additional savings and allowed the project to finish in a shorter amount of time, as shown in this analysis:
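A side-by-side comparison of linear review against a TAR 2 workflow can capture both the dollar and timing impact. All rates, throughputs, and counts in the sketch below are hypothetical assumptions, not figures from the cases described above.

```python
# Hypothetical linear-vs-TAR-2 comparison; rate, throughput, and counts
# are illustrative assumptions only.
RATE_PER_DOC = 0.50           # hypothetical fixed fee per document
DOCS_PER_REVIEWER_DAY = 400   # hypothetical reviewer throughput

def review_cost_and_days(docs_to_review, reviewers=10):
    cost = docs_to_review * RATE_PER_DOC
    days = docs_to_review / (DOCS_PER_REVIEWER_DAY * reviewers)
    return cost, days

linear = review_cost_and_days(100_000)  # review everything
tar2 = review_cost_and_days(26_000)     # only predicted-responsive docs

print(f"Linear: ${linear[0]:,.0f} over {linear[1]:.1f} review days")
print(f"TAR 2:  ${tar2[0]:,.0f} over {tar2[1]:.1f} review days")
print(f"Saved:  ${linear[0] - tar2[0]:,.0f}")
```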
While law department managers appreciate hearing that a skilled workflow made a project more efficient, this type of analysis clearly and meaningfully illustrates the impact in terms of dollars and timing for the case.
Ediscovery professionals can leverage BI to optimally manage their data footprint and reduce costs. Often multiple copies of data reside on the IT infrastructure due to multiple environments, retained copies of source data or as a side effect of data workflows. With intelligence about where all the data resides, ediscovery professionals can clean up unneeded copies of data. Some ediscovery service providers now offer cost models and tools that empower users to save money by managing their own data. A managed services subscription consumption model allows providers to deliver intelligence in near real-time, which empowers ediscovery professionals to monitor their data and make business decisions accordingly.
Without BI on data footprint, most ediscovery professionals are in the dark about what copies of data can be removed to create cost savings. With BI, a clear picture emerges:
Here, the Original Backup, Forensic and FTP may be duplicative copies of the data that could be discarded. Other silos may also contain data that becomes eligible for deletion as the project takes its course.
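A footprint report of this kind reduces to a simple inventory. The silo names below follow the example in the text; the volumes and the monthly hosting rate are hypothetical.

```python
# Sketch of a data footprint report. Silo names follow the example in
# the text; volumes and the hosting rate are hypothetical.
HOSTING_PER_GB_MONTH = 10.0  # hypothetical monthly hosting rate

footprint = {  # silo -> (size in GB, potentially duplicative copy?)
    "Original Backup": (500, True),
    "Forensic": (500, True),
    "FTP": (500, True),
    "Processing": (350, False),
    "Review Platform": (120, False),
}

reclaimable_gb = sum(gb for gb, dup in footprint.values() if dup)
monthly_savings = reclaimable_gb * HOSTING_PER_GB_MONTH

print(f"Reclaimable: {reclaimable_gb} GB")           # 1500 GB
print(f"Monthly savings: ${monthly_savings:,.2f}")   # $15,000.00
```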
BI on data footprint can comprehensively illustrate all volumes and flavors of an organization’s data across every matter in each environment. From there, ediscovery professionals can collaborate with their service providers and use the intelligence to optimize workflows, tool stacks and consumption model architecture.
Better data management lowers infrastructure burdens on the provider, which allows the provider to reduce prices for its clients. In addition, BI promotes project consultation between provider and client that not only creates immediate savings on current projects but also yields workflow improvements that can pay dividends on future projects.
While organizations are most interested in efficiency, discovery workflows also need to be defensible. Metrics are often key to establishing processes that are not only efficient but proven to meet the project’s goals.
To obtain these metrics and evaluate the effectiveness of workflow strategies, discovery professionals often must rely on a critical tool: sampling. Sampling involves a very simple concept: that “a few” can adequately represent “the many.” Sampling has been deemed a sufficient measure in a variety of different circumstances, such as measuring patient health or determining how voters are leaning on a particular candidate or ballot measure.
A properly designed sample of electronically stored information can likewise be sufficient for discovery purposes. In particular, sampling can be used to depict information from a broader set of data among dozens or hundreds of custodians spanning a range of years. Properly sampled data, after it is reviewed, can provide
discovery teams with intelligence on any number of issues relating to a document population. This raw data can be distilled into metrics that can be used to determine appropriate workflows.
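The statistical machinery behind "a few can represent the many" is the standard sample-size formula for estimating a proportion. Notably, the required sample size does not meaningfully depend on population size for large populations; the confidence level and margin of error below are common defaults, not requirements from the text.

```python
import math

# Standard sample-size formula for estimating a proportion (e.g. the
# responsiveness rate of a document population).
def sample_size(confidence_z=1.96, margin_of_error=0.05, p=0.5):
    """Documents to review to estimate a rate within +/- margin_of_error
    at the given confidence level (z = 1.96 for 95% confidence).
    p = 0.5 is the conservative, worst-case assumption."""
    return math.ceil(confidence_z**2 * p * (1 - p) / margin_of_error**2)

# ~385 documents suffice at 95% confidence, +/- 5%, whether the
# population holds ten thousand documents or ten million.
print(sample_size())                      # 385
print(sample_size(margin_of_error=0.02))  # 2401
```

This is why sampling scales so well for discovery: tightening the margin of error costs more review, but growing the underlying population costs essentially nothing.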
Use of BI should be an ongoing, iterative and collaborative process of leveraging critical information to mitigate costs and maximize
savings throughout ediscovery projects. Ediscovery providers should determine what metrics are available to them and how to turn those metrics into BI that will effectively illustrate the value they create. They should also determine how to use BI to identify new areas to further drive down their ediscovery costs.
The more metrics available for analyses, the better the BI – as long as it is being used correctly. While proper metrics can create useful analysis to demonstrate the success of strategies and amount of savings in specific projects, the best analysis can also capture the big picture and lead to improved organizational strategy. If the right metrics are tracked, recorded and compiled for analysis across variables in all of an organization’s matters, BI can provide insights about the cost effectiveness of ediscovery professionals and their work, allow comparison of different strategies used in different matters and facilitate identification of areas for further development and savings.
In summary, BI tools offer an important opportunity to promote the valuable contributions by skilled ediscovery professionals in
a way that is easily appreciated by their organizations and clients. The ability to do ediscovery well is important, but just as important is the ability to communicate the value of this job done well. BI is a necessary tool for ediscovery and for communicating its results, and for finding ways to do it even better.
The author thanks Brian Cunningham, Driven, Inc. Chief Financial Officer, for his contribution on Data Footprint analysis. The author also thanks Phil Favro, Driven Inc.
Consultant, for his contribution on how to use Metrics for Defensibility.
Originally published on April 25, 2018 online at http://epubs.iltanet.org/i/973671-lps18/0?_ga=2.113238151.224216392.1524493055-536794961.1520434322
This post cannot be reprinted or reused without ILTA consent.