Technology Assisted Review: Trick or Treat? Ralph Losey, Esq., Jackson Lewis

Similar presentations
Critical Reading Strategies: Overview of Research Process

Brendan McGivern Partner White & Case LLP May 20, 2009 US – Continued Suspension and the Deference Standard BIICL - Ninth Annual WTO Conference Panel 4:
Global Congress Global Leadership Vision for Project Management.
1 Evaluation Rong Jin. 2 Evaluation  Evaluation is key to building effective and efficient search engines usually carried out in controlled experiments.
Webinar Sponsorship Partner. Jason Velasco Jason Velasco is an electronic discovery industry veteran with more than 15 years of experience in electronic.
Privileged & Confidential Ready or Not, Here it Comes…. Why It’s Likely that You’ll be Employing Predictive Coding Technology In 2013 Texas Lawyer In-House.
The Use of Alternative Dispute Resolution in Bankruptcy Proceedings * *Portions reprinted by permission of JAMS.
Cache La Poudre Feeds, LLC v. Land O’Lakes, Inc.  Motion Hearing before a Magistrate Judge in Federal Court  District of Colorado  Decided in 2007.
Litigation and Alternatives for Settling Civil Disputes CHAPTER FIVE.
Cluster Evaluation Metrics that can be used to evaluate the quality of a set of document clusters.
1 Best Practices in Legal Holds Effectively Managing the e-Discovery Process and Associated Costs.
1. 2 Slide Materials  To download a complimentary copy of today’s materials, please follow these instructions: 1.Go to krollontrack.com/events 2.Click.
The E-Discovery Games: A Closer Look at Technology Assisted Document Review David D. Lewis, Ph.D., Information Retrieval Consultant Kara M. Kirkeby, Esq.,
Technology-Assisted Review Can be More Effective and More Efficient Than Exhaustive Manual Review Gordon V. Cormack University of Waterloo
Evaluating Search Engine
INFO 624 Week 3 Retrieval System Evaluation
©2007 H5 Simultaneous Achievement of high Precision and high Recall through Socio-Technical Information Retrieval Robert S. Bauer, Teresa Jade
Software Engineering Code Of Ethics And Professional Practice
By Saurabh Sardesai October 2014.
WXGB6106 INFORMATION RETRIEVAL Week 3 RETRIEVAL EVALUATION.
ISP 433/633 Week 6 IR Evaluation. Why Evaluate? Determine if the system is desirable Make comparative assessments.
Critical Appraisal of an Article by Dr. I. Selvaraj B. SC. ,M. B. B. S
RANDOM SAMPLING PRACTICAL APPLICATIONS IN eDISCOVERY.
Search and Retrieval: Relevance and Evaluation Prof. Marti Hearst SIMS 202, Lecture 20.
Get Off of My I-Cloud: Role of Technology in Construction Practice Sanjay Kurian, Esq. Trent Walton, CTO U.S. Legal Support.
TAR, CAR, Predictive Coding, Integrated Analytics What Does It All Mean? Huron Legal provides advisory and business services to assist law departments.
ZHRC/HTI Financial Management Training
Technology-Assisted Review Solutions Predictive Coding Language-Based Analytics SM (LBA) © 2013 RenewData.
1 Evaluating Model Performance Lantz Ch 10 Wk 5, Part 2 Right – Graphing is often used to evaluate results from different variations of an algorithm. Depending.
INFO 4307/6307 Comparative Evaluation of Machine Learning Models Guest Lecture by Stephen Purpura November 16, 2010.
1. 2 IMPORTANCE OF MANAGEMENT Some organizations have begun to ask their contractors to provide only project managers who have been certified as professionals.
©2008 Srikanth Kallurkar, Quantum Leap Innovations, Inc. All rights reserved. Apollo – Automated Content Management System Srikanth Kallurkar Quantum Leap.
Analysis 360: Blurring the line between EDA and PC Andrea Gibson, Product Director, Kroll Ontrack March 27, 2014.
George M. Aloth. Definitions  E-Discovery = The collection, preparation, review and production of electronic documents in litigation discovery. This.
Presented by Rebecca Shwayri. Introduction to Predictive Coding and its benefits How can records managers use Predictive Coding Predictive Coding in Action.
Copyright © 2005, The Sedona Conference ®
Auditing: The Art and Science of Assurance Engagements
Copyright © 2007 Pearson Education Canada 1 Chapter 14: Completing the Tests in the Sales and Collection Cycle: Accounts Receivable.
Audit Sampling: An Overview and Application to Tests of Controls
Principles 1-7 for Electronic Document Production Maryanne Post.
Jimmy Coleman.  The Sedona Conference  The Electronic Discovery Reference Model Project  The Federal Judicial.
The Challenge of Rule 26(f) Magistrate Judge Craig B. Shaffer July 15, 2011.
TOP TEN LIST OF COACHING BELIEFS CURRICULUM 511 DR. PECK BY: HALI PLUMMER.
WIRED Week 3 Syllabus Update (next week) Readings Overview - Quick Review of Last Week’s IR Models (if time) - Evaluating IR Systems - Understanding Queries.
Conducting Modern Investigations Analytics & Predictive Coding for Investigations & Regulatory Matters.
LITERACY COACHES WRITING 1. 2 OUTCOMES Literacy Coaches will:  become familiar with the CC Literacy Writing Standards 1,2, 10 (range of writing)  have.
Joint Meeting eDiscovery Joint Session with the Association of Records Managers and Administrators, NYC Chapter (ARMA), Paralegals and Litigation Support.
Session 6 ERM Case Law: The Annual MER Update of the Latest News, Trends, & Issues Hon. John M. Facciola United States District Court, District of Columbia.
Chapter 8 Evaluating Search Engine. Evaluation n Evaluation is key to building effective and efficient search engines  Measurement usually carried out.
1 CS 430: Information Discovery Sample Midterm Examination Notes on the Solutions.
Records Management for Paper and ESI Document Retention Policies addressing creation, management and disposition Minimize the risk and exposure Information.
ASSISTING YOUR ADVISERS WITH THE PERFECT FILE Phil Broadbent.
Performance Measures. Why to Conduct Performance Evaluation? 2 n Evaluation is the key to building effective & efficient IR (information retrieval) systems.
Defensible Quality Control for E-Discovery Geoff Black and Albert Barsocchini.
Ch 8.2: Improvements on the Euler Method Consider the initial value problem y' = f (t, y), y(t 0 ) = y 0, with solution  (t). For many problems, Euler’s.
Chapter. 3: Retrieval Evaluation 1/2/2016Dr. Almetwally Mostafa 1.
Ethical Considerations in Dispute Resolution Practice Thursday, October 29, 2015 Kimberlee Kovach Kovach Dispute Resolution
Identifying “Best Bet” Web Search Results by Mining Past User Behavior Author: Eugene Agichtein, Zijian Zheng (Microsoft Research) Source: KDD2006 Reporter:
MSA Orientation – v203a 1 What’s RIGHT with the CMMI?!? Pat O’Toole
The School Council President - tips to increase your effectiveness.
Knowledge and Information Retrieval Dr Nicholas Gibbins 32/4037.
What is Legal Analytics?
Women in Products Liability 2016 Annual Regional CLE November 3, 2016
Analogizing and Distinguishing Cases
Speakers: Ian Campbell, Claire Hass,
Bennett B. Borden Conor R. Crowley Wendy Butler Curtis
The Team Players and Play in a Complex Document Review Project: Past, Present and Future Ralph C. Losey, Jackson Lewis Principal and National e-Discovery.
I. Hypothetical A. Case Background: Relator, a former employee of Defendant, brought a case in federal court against a large defense contractor (Relator’s.
Jonathan Elsas LTI Student Research Symposium Sept. 14, 2007
Presentation transcript:

Technology Assisted Review: Trick or Treat? Ralph Losey, Esq., Jackson Lewis

Ralph Losey, Esq.
- Partner, National e-Discovery Counsel, Jackson Lewis
- Adjunct Professor of Law, University of Florida
- Active member, The Sedona Conference
- Author of numerous books and law review articles on e-discovery
- Founder, Electronic Discovery Best Practices (EDBP.com)
- Lawyer, writer, predictive coding search designer, and trainer behind the e-Discovery Team blog (e-discoveryteam.com)
- Co-founder, with son Adam Losey, of IT-Lex.org, a non-profit educational organization for law students and young lawyers

Discussion Overview
- What is Technology Assisted Review (TAR), aka Computer Assisted Review (CAR)?
- Document Evaluation
- Putting TAR into Practice
- Conclusion

What is Technology Assisted Review?

Why Discuss Alternative Document Review Solutions?
Document review is routinely the most expensive part of the discovery process. Saving time and reducing costs will result in satisfied clients.
Traditional/Linear Paper-Based Document Review > Online Review > Technology Assisted Review

Bobbing for Apples: Defining an Effective Search
Information retrieval effectiveness can be evaluated with metrics:
- Precision: fraction of relevant documents within the retrieved results – a measure of exactness
- Recall: fraction of the total relevant documents that were retrieved – a measure of completeness
- F-Measure: harmonic mean of precision and recall
(Diagram: "Hot" vs. "Not" documents within the full collection)

Scenario 1) Perfect recall; low precision

Scenario 2) Low recall; perfect precision

Scenario 3) Arguably good recall and precision
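The three scenarios above are just different trade-offs of the same two ratios. A minimal sketch of the metrics, computing precision, recall, and F-measure from sets of document IDs (the IDs and sets here are hypothetical, not from the presentation):

```python
def precision_recall_f1(retrieved, relevant):
    """Return (precision, recall, F-measure) for two sets of document IDs."""
    hits = len(retrieved & relevant)                         # relevant docs actually retrieved
    precision = hits / len(retrieved) if retrieved else 0.0  # exactness
    recall = hits / len(relevant) if relevant else 0.0       # completeness
    f1 = 2 * precision * recall / (precision + recall) if hits else 0.0  # harmonic mean
    return precision, recall, f1

# Scenario 1: every relevant ("Hot") document retrieved, plus many "Not" documents
p, r, f = precision_recall_f1(retrieved={1, 2, 3, 4, 5, 6, 7, 8}, relevant={1, 2})
print(round(p, 2), round(r, 2), round(f, 2))  # 0.25 1.0 0.4
```

Because F is a harmonic mean, it punishes the lopsided scenarios 1 and 2; only a search that balances recall and precision (scenario 3) scores well.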

Key Word Search ("Go fish!")
- Key word searches are used throughout discovery
- However, they are not particularly effective
  - Blair and Maron: lawyers believed their keyword searches had retrieved 75% of the relevant documents, when only about 20% had been retrieved
- It is very difficult to craft a key word search that isn't under-inclusive or over-inclusive
- Key word search should be viewed as one component of a hybrid multimodal search strategy

Where are we?

What Is Technology Assisted Review (TAR)?

Classification Effectiveness
- Any binary classification can be summarized in a 2x2 table
- Test on a sample of n documents for which we know the answer
  - A + B + D + E = n

Classification Effectiveness
- Recall = A / (A + D)
  - Proportion of interesting stuff that the classifier actually found
- High recall is of interest to both the producing and the receiving party

Classification Effectiveness
- Precision = A / (A + B)
- High precision is of particular interest to the producing party: cost reduction!
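The two slide formulas are one-liners in code. The letter-to-cell mapping below (A = responsive and retrieved, B = non-responsive but retrieved, D = responsive but missed, E = non-responsive and correctly excluded) is implied by the formulas rather than stated on the slides, so treat it as an assumption; the counts are hypothetical:

```python
def recall(A, D):
    """Recall = A / (A + D): share of responsive documents the classifier found."""
    return A / (A + D)

def precision(A, B):
    """Precision = A / (A + B): share of retrieved documents that are responsive."""
    return A / (A + B)

# Hypothetical 2x2 counts for a sample of n = 370 reviewed documents
A, B, D, E = 60, 14, 20, 276
assert A + B + D + E == 370
print(f"recall = {recall(A, D):.0%}, precision = {precision(A, B):.0%}")  # recall = 75%, precision = 81%
```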

Sampling and Quality Control
How precise were you in culling your bag of 10,000 documents?
- We want to know effectiveness without manually reviewing everything, so:
  - Randomly sample the documents
  - Manually classify the sample
  - Estimate effectiveness on the full set based on the sample
- Sampling is well understood
  - Common in expert testimony in a range of disciplines
Example: sample size = 370 (confidence level: 95%; confidence interval: ±5); estimated precision: 81%
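The slide's numbers (a sample of 370 from 10,000 documents at a 95% confidence level and ±5% confidence interval) are consistent with the standard Cochran sample-size formula plus a finite-population correction. A sketch of that calculation (the function is illustrative, not from the presentation):

```python
import math

def sample_size(population, z=1.96, margin=0.05, p=0.5):
    """Cochran's sample-size formula with finite-population correction.

    z: z-score for the confidence level (1.96 for 95%)
    margin: half-width of the confidence interval (0.05 for +/-5%)
    p: assumed proportion; 0.5 maximizes the required sample size
    """
    n0 = z ** 2 * p * (1 - p) / margin ** 2           # infinite-population size (~384)
    return math.ceil(n0 / (1 + (n0 - 1) / population))

print(sample_size(10_000))  # 370
```

Without the finite-population correction the answer is the familiar 384; correcting for a 10,000-document collection brings it down to 370, matching the slide.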

TREC 2011
- Annual event examining document review methods
"[T]he results show that the technology-assisted review efforts of several participants achieve recall scores that are about as high as might reasonably be measured using current evaluation methodologies. These efforts require human review of only a fraction of the entire collection, with the consequence that they are far more cost-effective than manual review."
– Overview of the TREC 2011 Legal Track

Putting TAR into Practice

TAR or CAR? A Multimodal Process
Must… have… humans!

The Judiciary’s Stance
- Da Silva Moore v. Publicis Groupe
  - Court okayed the parties’ agreement to use TAR; the parties disputed the implementation protocol (3.3 million documents)
- Kleen Products v. Packaging Corp. of Am.
  - Plaintiffs abandoned arguments in favor of TAR and moved forward with Boolean search
- Global Aerospace Inc. v. Landow Aviation, L.P.
  - Court blessed defendant’s use of TAR over plaintiff’s objections (2 million documents)
- In re Actos (Pioglitazone) Products Liability Litigation
  - Court affirmatively approved the use of TAR for review and production
- EORHB, Inc., et al. v. HOA Holdings, LLC
  - Court ordered the parties to use TAR and share a common e-discovery provider

TAR/CAR: Tricks & Treats
Tricks:
- Must address risks associated with seed set disclosure
- Must have the nuanced expert judgment of experienced attorneys
- Must have validation and QC steps to ensure accuracy
Treats:
- TAR can reduce time spent on review and administration
- TAR can reduce the number of documents reviewed, depending on the solution and strategy
- TAR can increase the accuracy and consistency of category decisions (vs. unaided human review)
- TAR can identify the most important documents more quickly

TAR Accuracy
- TAR must be as accurate as a traditional review
- Studies show that computer-aided review is as effective as manual review (if not more so)
- Remember: the court standard is reasonableness, not perfection:
"[T]he idea is not to make it perfect, it’s not going to be perfect. The idea is to make it significantly better than the alternative without as much cost."
– U.S. Magistrate Judge Andrew Peck in Da Silva Moore

Conclusion

Parting Thoughts
- Automated review technology helps lawyers focus on resolution – not discovery – through available metrics
  - It complements human review, but will not replace the need for skillful human analysis and advocacy
- Search adequacy is defined in terms of reasonableness, not whether all relevant documents were found
- TAR can be a treat, but only when implemented correctly
  - Reconsider, but do not abandon, the role of:
    - Concept search
    - Keyword search
    - Attorney review

Q & A
