Bennett B. Borden Conor R. Crowley Wendy Butler Curtis


Technology Assisted Review: A Kissing Cousin to Autocategorization? February 27, 2013
Bennett B. Borden, Conor R. Crowley, Wendy Butler Curtis

Agenda
- How is TAR similar to auto-classification?
- How does TAR differ from auto-classification?
- Why do lawyers disagree over the use of TAR?

Why? Managing information tells us WHAT we have and WHERE it is, so we can:
- Be better prepared to respond to litigation
- Reduce production costs during discovery
- Comply with legislative and regulatory mandates
- Increase staff productivity
- Improve client service
- Reduce records storage costs
- Reduce liability insurance premiums

Routine Tasks?
- Retention & disposition
- Legal holds / protective orders
- File transfers / lateral movement of attorneys
- KM / precedents
- BI

Existing Tools Are Not Enough
- No one content repository meets all needs: DMS, RMS, litigation support, extranets, portals, and more
- The nightmare of email: attorneys live in Outlook, but Outlook is not designed to be a records management solution
- Archives are not designed to easily address information lifecycle management

From Reactive to Proactive: using what we've learned in eDiscovery to govern information better

#1 Problem: Too much ungoverned information

Data Volumes Continue to Grow. Analyst estimates and research show that:
- 1,200 exabytes of new data will be generated each year
- Enterprise data will grow 650% in the next five years
- 80% of this data will be unstructured, generated from a variety of sources such as blogs, web content, and email
- 70% of this data is stale after ninety days

Unknown Information Has No Value

How Did We Get Here?
- Years of information overload build-up
- Many different systems used, from email to file shares, from portals to content management systems
- No one ever cleans up when they leave
- Challenges in getting buy-in from firm management

Gaps Between Expectations & Practice

Humans Aren't Good at Classification: manual categorization has an average accuracy of only 60%

Using Predictive Technologies Software “that use[s] sophisticated algorithms to enable the computer to determine relevance, based on interaction with (i.e., training by) a human reviewer.” - Da Silva Moore v. Publicis, 2012 U.S. Dist. LEXIS 23350 (S.D.N.Y. Feb. 12, 2012) (Peck, J.).

How Does it Work?
1. Index content: file systems, email archives, ECM systems, RM repositories, SharePoint
2. Categorize: automatic or manual, or a learning system trained by example
3. Apply policies: rules-based, multi-action policies (copy, move, delete, lock, shortcut)

Merges categorization methods
Manual categorization:
- Great for small data sets
- Creates the best "training" data sets
Rules-based:
- Great for eliminating the obvious stuff
- Powerful when content has good metadata
- Can be used to enhance "training" data
Supervised learning:
- Locates errors in rules and manual categorization
- Offers the highest levels of precision and recall
- Is not dependent on metadata

Supervised Learning
1. Humans categorize a sample of content
2. The "training" algorithm is run against each category
3. The computer "suggests" additional content for each category
4. Reviewers confirm or correct the "suggested" content
5. The remaining content is auto-categorized
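The train/suggest/review loop on this slide can be sketched with a tiny bag-of-words Naive Bayes scorer. This is a minimal, self-contained illustration of supervised learning in general, not any particular TAR product; the sample documents, labels, and function names are invented:

```python
import math
from collections import Counter

def train(seed_set):
    """Build per-label word counts from a human-coded seed set.

    seed_set: list of (text, label) pairs, label in {"relevant", "not_relevant"}.
    """
    words = {"relevant": Counter(), "not_relevant": Counter()}
    docs = Counter()
    for text, label in seed_set:
        docs[label] += 1
        words[label].update(text.lower().split())
    return words, docs

def relevance_score(text, model):
    """Return P(relevant | text) under Naive Bayes with add-one smoothing."""
    words, docs = model
    vocab = set(words["relevant"]) | set(words["not_relevant"])
    # Start from the log prior odds of the two labels
    log_odds = math.log(docs["relevant"] / docs["not_relevant"])
    for w in text.lower().split():
        p_rel = (words["relevant"][w] + 1) / (sum(words["relevant"].values()) + len(vocab))
        p_not = (words["not_relevant"][w] + 1) / (sum(words["not_relevant"].values()) + len(vocab))
        log_odds += math.log(p_rel / p_not)
    return 1 / (1 + math.exp(-log_odds))  # sigmoid -> probability

# Step 1: human reviewers code a small seed set (invented examples)
seed = [("contract breach damages claim", "relevant"),
        ("merger agreement breach notice", "relevant"),
        ("cafeteria lunch menu friday", "not_relevant"),
        ("holiday party schedule reminder", "not_relevant")]
model = train(seed)
# Steps 2-3: the trained model scores uncoded documents; high scorers go
# to reviewers for confirmation (steps 4-5).
print(relevance_score("breach of contract damages", model))
print(relevance_score("friday lunch reminder", model))
```

Real TAR tools use richer features and iterative retraining, but the core mechanic is the same: human coding decisions become training signal, and the machine generalizes from them.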

How it is Being Used
- Data remediation
- Classification in repositories
- Classification upon creation

The IGRM (Information Governance Reference Model)

The Business Knows the Value “The line of business has an interest in information proportional to its value, the degree to which it helps drive the profit or purpose of the enterprise itself. Once that value expires, the business quickly loses interest in managing it, cleaning it up, or paying for it to be stored.”

Legal and RIM "…it is the legal department's responsibility to define what to put on hold and what and when to collect data for discovery. Likewise, it is RIM's responsibility to ensure that regulatory obligations for information are met including what to retain and archive for how long…"

Expectations of RIM
√ Establish the foundation of good RIM policy: many have formal policies, but only 43% are confident that their retention schedule is legally credible.
√ Manage information consistently: 83% are unable even to locate hardcopy records when needed. What about managing them?

The Role of IT "IT stores and secures information under their management. Of course their focus is efficiency and they're typically under huge pressure to increase efficiency and lower cost… …IT doesn't know and can't speak to what information has value or what duties apply to specific information."

TAR Case Law

Open the Kimono Wide: Da Silva Moore
Da Silva Moore v. Publicis Groupe et al., 2012 WL 607412 (S.D.N.Y. Feb. 24, 2012), adopted, 2012 WL 1446534 (S.D.N.Y. Apr. 26, 2012)
- Regarded as the first published judicial opinion approving the use of predictive coding as a valid and defensible method of identifying relevant documents for production
- Predicates its approval of predictive coding on the great transparency offered by the producing party as to its relevance determinations, and on the requesting party's ability to refine those decisions

Major Aspects of the Da Silva Moore Protocol
- The requesting party is entitled to suggest key search terms to segregate documents used to create the seed set.
- The producing party shall provide "All of the documents that are reviewed as a function of the seed set, whether [they] are ultimately coded as relevant or irrelevant, aside from privilege…."
- The producing party will disclose all relevance coding for its seed set.
- The parties shall meet and confer about any documents in the seed set that the requesting party believes were incorrectly coded.
- The requesting party shall have similar input at each wave of the iterative "training" process.

Potential Limiting Factors for Da Silva Moore
- As an employment dispute, the plaintiff ex-employees were familiar with the defendant's internal "jargon" and information repositories
- Both sides employed e-discovery consultants and were well funded
- Both sides agreed in general principle to the validity of predictive coding as a culling tool (but disagreed on its implementation)
- Predictive coding was implemented through an extremely detailed protocol

Kleen Products, LLC, et al. v. Packaging Corp. of Amer., et al., Case No. 1:10-cv-05711, Document #412 (N.D. Ill. 2012)
- Plaintiffs moved to compel, seeking to require a redo of search and production using predictive coding.
- Defendants had used sophisticated iterative Boolean keyword searches with QC sampling.
- Plaintiffs argued defendants' approach would capture only 25% of responsive documents, while predictive coding would find 75%.
- Two days of evidentiary hearings and numerous other conferences followed.

Kleen Products (continued)
- No ruling, but Judge Nolan indicated her preference for Sedona Principle 6: responding parties are best situated to evaluate the procedures, methodologies, and techniques appropriate for preserving and producing their own electronically stored information.
- Settled by accepting defendants' methods of search until requests served after Oct. 1, 2013

Global Aerospace, Inc. v. Landow Aviation, L.P., et al., Case No. CL 61040 (Va. Cir. Ct. April 23, 2012)
- Second court to permit the use of predictive coding
- Plaintiffs objected to the use of predictive coding, arguing that it is not as effective as human review.
- During the hearing, the judge recognized that the producing party selects the review methodology. The receiving party can then raise issues if it does not get what it thinks it should have in the litigation, as in any other discovery scenario.
- The order allowed plaintiffs to later object to "the completeness of the contents of the production or the ongoing use of predictive coding."

In Re: Actos (Pioglitazone) Products Liability Litigation (W.D. La. July 27, 2012)
- The plaintiffs allege that Actos, a prescription drug for the treatment of type 2 diabetes, increases the risk of bladder cancer in patients.
- On July 27, 2012, the Western District of Louisiana entered a Case Management Order outlining the electronically stored information (ESI) protocol the parties must follow during discovery.
- The court outlines a "Search Methodology Proof of Concept" to examine the performance of the defendant's e-discovery provider's predictive coding tool for the review and production of documents in this matter.
- The parties have agreed to collect email documents from four of 29 custodians named by defendant Takeda Pharmaceuticals. These four custodians, added to a set of regulatory documents, will be the "sample collection population."

In Re: Actos (continued)
- A 500-document random "control set" will be created from the culled collection.
- Three "experts" nominated by each side will jointly review the control set. Plaintiff's "expert" reviewers are required to sign a non-disclosure agreement.
- The CMO demands a high degree of cooperation.
- Defendant's experts are permitted to pre-screen and review the control document set for privileged material, and will either remove or redact privileged documents before the control set is reviewed by the panel of experts.
- The parties' experts will work collaboratively to determine the relevance of the non-privileged and privilege-redacted documents.

In Re: Actos (continued)
- Following review of the control set, the experts will review random sample training sets of 40 documents each, selected by the system using an active learning approach. This process will continue until a "stable" training status is reached.
- The parties will meet and confer regarding which relevance score will provide a cutoff yielding a proportionate set of documents to be manually reviewed by Takeda for production. All documents in the sample collection population above the agreed-upon relevance score will be reviewed by Takeda.
- The CMO provides for meet-and-confers throughout the process, including a post-predictive-coding sampling meet and confer to "finalize the search methodology on a going forward basis."
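The negotiated cutoff step can be illustrated in miniature: given a scored control set with known relevance calls, pick the highest score threshold that still meets a target recall. This is a hypothetical sketch of the general idea, not the actual Actos methodology; the control-set data, target recall, and function name are all invented:

```python
def choose_cutoff(control_set, target_recall=0.75):
    """Return the highest score cutoff whose estimated recall on the
    control set meets the target, i.e. the smallest review set that
    still captures the agreed share of relevant documents.

    control_set: list of (score, is_relevant) pairs from a reviewed sample.
    """
    total_relevant = sum(1 for _, rel in control_set if rel)
    # Walk candidate cutoffs from highest to lowest; recall only grows
    # as the cutoff drops, so the first hit is the highest valid cutoff.
    for cutoff in sorted({s for s, _ in control_set}, reverse=True):
        captured = sum(1 for s, rel in control_set if rel and s >= cutoff)
        if captured / total_relevant >= target_recall:
            return cutoff
    return min(s for s, _ in control_set)

# Invented control-set scores from a reviewed random sample
control = [(0.9, True), (0.8, True), (0.7, False),
           (0.6, True), (0.2, False), (0.1, True)]
print(choose_cutoff(control))  # documents scoring >= this go to manual review
```

In practice the cutoff also has to survive a proportionality negotiation: a lower cutoff raises recall but inflates the set Takeda must review manually.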

EORHB, Inc., et al. v. HOA Holdings, LLC, C.A. No. 7409-VCL (Del. Ch. Oct. 15, 2012)
- Complex multimillion-dollar commercial indemnity dispute involving the sale of Hooters
- "Why don't you all talk about a scheduling order for the litigation on the counterclaims. This seems to me to be an ideal non-expedited case in which the parties would benefit from using predictive coding. I would like you all, if you do not want to use predictive coding, to show cause why this is not a case where predictive coding is the way to go."

Gabriel Tech. Corp., et al. v. Qualcomm Inc., 2013 WL 410103 (S.D. Cal. Feb. 1, 2013)
- Complex patent dispute
- Defendants won on summary judgment and sought attorneys' fees:
  - $10,244,053 for Cooley LLP attorneys
  - $391,928.91 for document review by Black Letter Discovery, Inc.
  - $2,829,349.10 for fees associated with the document review algorithm generated by H5
- The court awarded all fees sought and specifically found "Cooley's decision to undertake a more efficient and less time-consuming method of document review to be reasonable under the circumstances."

Why Do Lawyers Disagree Over the Use of TAR?
- They don't understand it
- Concerns about efficacy
- Concerns about cost and changes to the law firm billing model
- The level of disclosure required

Potential Disclosures
- Size of the document corpus
- Size of the seed set
- Seed set selection method
- Contents of the seed set
- The expert selected to review the seed set
- Methodology for analysis of the seed set documents deemed relevant
- Sampling methodology with respect to documents deemed relevant/non-relevant by TAR
- A sample of documents deemed not relevant by TAR
- Levels of precision and recall achieved, and the confidence level for the precision/recall metrics
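The last disclosure item, precision and recall with a confidence statement, reduces to simple arithmetic on a reviewed validation sample. A minimal sketch (the counts are invented; the Wilson score interval is one standard way to put a confidence bound on a proportion estimated from a sample):

```python
import math

def precision_recall(tp, fp, fn):
    """Precision: share of produced documents that are truly relevant.
    Recall: share of truly relevant documents that were produced."""
    return tp / (tp + fp), tp / (tp + fn)

def wilson_interval(successes, n, z=1.96):
    """95% Wilson score interval for a proportion estimated from n samples."""
    p = successes / n
    denom = 1 + z * z / n
    center = (p + z * z / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return center - half, center + half

# Invented validation-sample counts: 80 true positives, 20 false
# positives, and 40 relevant documents the TAR tool missed.
precision, recall = precision_recall(tp=80, fp=20, fn=40)
low, high = wilson_interval(80, 100)  # bound on the precision estimate
print(precision, recall)
print(low, high)
```

Disclosing the interval, not just the point estimate, is what lets the requesting party judge whether the sample was large enough to support the claimed recall.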

Thank you!
Bennett B. Borden, bborden@williamsmullen.com
Conor R. Crowley, ccrowley@crowleylawoffice.com
Wendy Butler Curtis, wcurtis@orrick.com