Legal Tech Hong Kong, March 2013
Richard Williams, Electronic Discovery Partner, Deloitte China
Topics to discuss today
- State of play
- Methodology and practical experience dealing with relevant legislation
- eDiscovery readiness and IT integration lifecycle
- Timeline and ability to deliver
State of play – challenges
Current experience and state of play in the market for moving data cross border – what we see as data processors:
- Japan
- Korea
- Hong Kong
- China
- Singapore
- Malaysia
- Vietnam
Challenges arising from cross border discovery
- Everything always needs to be ready for review yesterday.
- Multiple jurisdictions mean multiple laws, cultures, languages and geographical challenges.
- The need to create multiple review databases risks a loss of efficiency for the review.
Working to find solutions to cross border discovery
- Globally integrated data processor with common standards for methodology, tools and data handling.
- Ability to scale in any territory to meet demand.
- Deep understanding and appreciation of local culture and language.
- Seamlessly integrate with legal teams and provide advisory services in relation to the handling of data and local procedures.
- Bespoke solutions.
- Risk management focus to reduce risk for clients, legal teams and advisors/data processors.
Methodology
Segregation of documents and data
The universe of documents and files is segregated into four categories of relevance:
- Relevant, protected documents
- Relevant, not protected documents
- Not relevant, not protected documents
- Not relevant, protected documents
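A minimal sketch of this segregation, assuming each document has already been tagged with hypothetical relevant and protected flags (the field names are illustrative, not any specific platform's schema):

```python
# Hypothetical sketch: segregating a document universe into the four
# relevance/protection categories described on this slide.
from dataclasses import dataclass

@dataclass
class Document:
    doc_id: str
    relevant: bool    # responsive to the issues at hand
    protected: bool   # flagged under State Secret / commercially sensitive rules

def segregate(universe):
    """Split the document universe into the four categories."""
    buckets = {
        "relevant_protected": [],
        "relevant_not_protected": [],
        "not_relevant_not_protected": [],
        "not_relevant_protected": [],
    }
    for doc in universe:
        if doc.relevant and doc.protected:
            buckets["relevant_protected"].append(doc)
        elif doc.relevant:
            buckets["relevant_not_protected"].append(doc)
        elif doc.protected:
            buckets["not_relevant_protected"].append(doc)
        else:
            buckets["not_relevant_not_protected"].append(doc)
    return buckets
```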
Framework (diagram): Data Universe – Client – Deloitte – MNC legal team – PRC legal team.
How eDiscovery readiness and IT integration can help
eDiscovery readiness & IT integration
- Know your data universe and plan future technology upgrades.
- Understand the cross-border implications of hosting and data servers.
- ERP and financial system implementation.
- Requirements to flag data-protected documents.
Assisted Review – a view on reducing timelines
Assisted review to reduce time and increase accuracy
The changing face of e-Discovery:
- With increasing data volumes, data protection legislation emerging or being enforced in more locations, and cross-border matters becoming the norm, there is pressure to develop a more efficient review methodology.
- Clients want and need to exert greater control over review costs.
The catalyst – Assisted Review:
- Provides a cheaper, quicker option for document review that does not rely on a people-intensive first-tier review exercise.
- Is already generating significant interest among law firms.
- Leverages advances in technology.
- As clients learn of assisted review and other changes in the marketplace, they will begin to realise they can exert greater control over their eDiscovery review.
From a law firm perspective, it has the potential to allow smaller firms to compete, as smaller teams of experienced lawyers can now do what used to require large numbers. So how can we assist clients with these issues?
Assisted review – Definition
Assisted review is a review methodology that takes the expertise of human reviewers and amplifies their decisions using advanced search and classification technology:
- Automating the document review process with human guidance.
- Experts train the system and their efforts are amplified.
- Several search technologies can classify documents (LSI, NLP, Bayesian).
- Computer-assisted review efficiency and quality are driven by, and measured using, a statistics-based approach.
Draw attention to two parts of the definition: the expertise of human reviewers is amplified using advanced search and classification technology – this uses Relativity Analytics, which you will have heard of and some may have used (consider going into a little detail on Relativity Analytics); and automating all or part of the review – it can be applied at different stages depending on how you want to use it (refer back to the process).
The second point is the critical success factor. Consider the traditional review model: keywords, then a group of people drafted in for a tier 1 review, trained in how to use the tool and in the key aspects of the case. This can be a large team, depending on the volume of documents, and reviewers can spend much of their time reviewing not relevant documents just to find the (typically small) number of relevant documents, which then go to tier 2 – typically the lead investigators. Tier 1 is usually the largest contributor to the high cost of document review.
Assisted review turns this model on its head. It removes the need for the large manual tier 1 review and instead uses experienced case personnel – trained and knowledgeable lawyers and/or investigators – to train the system via samples, so that their decisions are applied across the document population: their efforts are amplified. But the reviewers do need to be knowledgeable – any wrong or inconsistent decisions here will be amplified across the remaining documents in scope for review (refer back to people).
There are several schools of thought on the search technologies used to do this; Relativity happens to use LSI, but the market agrees it is not so much about which technology is used as about how it is applied, the process that is followed, and the importance of quality, which is monitored via statistically based reports (process again).
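As a rough conceptual illustration only (not Relativity Analytics' actual implementation), an LSI-style pipeline in Python could be trained on an expert-coded sample and then project those decisions onto the remaining population; the labels, component count and function name below are assumptions:

```python
# Conceptual sketch: amplify expert reviewer decisions with an LSI-style
# classifier. Illustrative only; not how Relativity Analytics is implemented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

def train_and_categorise(sample_texts, sample_labels, remaining_texts):
    """Train on the expert-coded sample, then categorise the remaining documents."""
    model = make_pipeline(
        TfidfVectorizer(stop_words="english"),  # documents -> weighted term vectors
        TruncatedSVD(n_components=100),         # LSI-style concept space (assumes a large vocabulary)
        LogisticRegression(max_iter=1000),      # learns Relevant (1) vs Not Relevant (0)
    )
    model.fit(sample_texts, sample_labels)      # the expert-coded seed sample
    return model.predict(remaining_texts)       # decisions "amplified" across the population
```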
Assisted review – An iterative process
Create random sample set → Review sample set → Validate results → Categorise document universe.
Here is the process; blue parts are people and green parts are technology.
- Green – the system creates a random sample set, typically a manageable number of documents, so there is no commitment to days and days of review.
- Blue – the sample set is reviewed by a case expert. Precise and accurate coding is needed here; feedback from kCura on how other clients are using this puts inconsistent coding at No. 1 on their list of reasons why Assisted Review has not been as successful as hoped. If two documents with similar content are coded differently, this will confuse the system. Accurate and clean coding at this stage has been shown to deliver good results.
- Green – the system takes the learning from the coded sample and uses it to categorise the remaining documents based on similarity of document content (using the analytics capability in Relativity). Documents are categorised as Relevant or Not Relevant where possible; it may not be possible to categorise every document, as the sample may not be representative of the entire population. (Note the term "categorised" rather than "coded".)
- Blue – validate results, again done by a case expert, who reviews a sample of the categorised documents. This can be done blind, so the reviewer does not necessarily know what the system has decided. If the reviewer disagrees with the system, this counts as an "overturn". After validation you get a report of the number of overturns; if, for example, the reviewer overturn rate is less than 5%, you can say you are 95% confident in the system's categorisation. Continue the process until the overturn rate meets the success criteria; each overturned document again affects the accuracy of the system's categorisation.
So this is the process in principle, but can it be applied…
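A minimal sketch of the validation step under the assumptions on this slide (blind re-review of a sample, overturns counted, a 5% overturn rate as the example success criterion); the predicted_label attribute, sample size and function names are all hypothetical:

```python
# Hypothetical sketch of the validation loop: sample the system's categorisations,
# have the case expert re-review them blind, and compute the overturn rate.
import random

def overturn_rate(categorised_docs, expert_review, sample_size=200):
    """Fraction of sampled documents where the expert disagrees with the system."""
    sample = random.sample(categorised_docs, min(sample_size, len(categorised_docs)))
    overturns = sum(1 for doc in sample if expert_review(doc) != doc.predicted_label)
    return overturns / len(sample)

def validation_passes(categorised_docs, expert_review, threshold=0.05):
    """Example success criterion from the slide: overturn rate below 5%."""
    return overturn_rate(categorised_docs, expert_review) < threshold
```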
Further applications of assisted review – Leveraging the power of Analytics
- Prioritisation – review the documents more likely to be Relevant first and the Not Relevant documents last.
- Culling/Filtering – only review the documents more likely to be Relevant; the assumption is that the remaining documents are the least Relevant and more likely to be Not Relevant.
- Quality control – quality control the work product of the review team by sampling documents tagged as Relevant and/or Not Relevant.
- Tier 2 review – document clustering can assist the tier 2 team in their assessment of, and approach to, the second-tier review process.
By now you should have a fair understanding of Assisted Review, so when can you use it? You can use it to help prioritise documents for review, as a form of culling/filtering, or as a form of quality control on the review team's work product. Today I'm going to talk through how we used Assisted Review for Project Gotham.
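A brief, hypothetical sketch of the first two applications, assuming each document carries a relevance_score (for example a classifier probability; not part of any named product):

```python
# Illustrative only: prioritisation and culling based on an assumed
# per-document relevance_score attribute.
def prioritise_for_review(documents):
    """Order the review queue from most to least likely Relevant."""
    return sorted(documents, key=lambda doc: doc.relevance_score, reverse=True)

def cull(documents, cutoff=0.5):
    """Culling/filtering: keep only documents scoring at or above the cutoff."""
    return [doc for doc in documents if doc.relevance_score >= cutoff]
```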
Assisted review – Key benefits
- Potential to significantly reduce the time and cost of review.
- Allows clients to assess the merits of their case far more quickly.
- Potential for enhanced quality of document review.
- Enhanced coding consistency.
- Improved and cost-effective QC processes.
Looking at our Project Gotham example, we have already seen potential time and cost savings. The new CPR rules on cost management, which come in next April, will be another factor driving the use of this technology to keep costs down. On coding consistency, it is worth pointing out that, presented with the same scenario (the same document with the same seed set of documents), the system will always make the same decision – something that is missing from human review, where different reviewers on different days make different review decisions.
Case Studies
Case Study 1 – RTO company responding to a regulatory matter in a foreign jurisdiction
- Computers collected in various locations in the PRC.
- All data centrally located in a Shanghai data centre.
- Data processed onsite in Shanghai by the eDiscovery team.
- Keywords applied to the data to identify documents responsive to the relevant issues at hand.
- eReview undertaken by a team of reviewers (a combined Deloitte/law firm team) who physically reside in the PRC, using a standard first- and second-tier review.
- Data identified as relevant reviewed by a PRC law firm with the specific scope of flagging documents that may trigger State Secret or commercially sensitive legislation.
- Documents flagged as not responsive to the above legislation are provided back to the law firm/client in the PRC for further action.
Case Study 2 – Multinational pharmaceutical company responding to litigation
- Computers collected in various locations in the PRC.
- All data centrally located in a Shanghai data centre.
- Data processed onsite in Shanghai by the eDiscovery team.
- A list of keywords specifically formulated to identify documents that may trigger State Secret or Commercially Sensitive (SS/CS) legislation; documents flagged as responsive to SS/CS legislation are isolated.
- Keywords applied to the data to identify documents responsive to the relevant issues at hand, yielding four groups: SS/CS and not relevant (SS/CS+NR), SS/CS and relevant (SS/CS+Rel), not relevant (NR) and relevant (Rel).
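A minimal sketch of this two-pass keyword approach; both keyword lists, the substring matching and the document fields are illustrative assumptions rather than the methodology actually used on the engagement:

```python
# Hypothetical sketch: one keyword list flags potential State Secret /
# Commercially Sensitive (SS/CS) content, another flags documents responsive
# to the issues at hand; documents fall into the four groups from the slide.
def hits(text, keywords):
    text = text.lower()
    return any(keyword.lower() in text for keyword in keywords)

def partition(documents, sscs_keywords, issue_keywords):
    """Split documents into SS/CS+Rel, SS/CS+NR, Rel and NR groups."""
    groups = {"SS/CS+Rel": [], "SS/CS+NR": [], "Rel": [], "NR": []}
    for doc in documents:
        protected = hits(doc.text, sscs_keywords)
        relevant = hits(doc.text, issue_keywords)
        key = ("SS/CS+" if protected else "") + ("Rel" if relevant else "NR")
        groups[key].append(doc)
    return groups
```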
Final words
- Focus on reducing overall risk relating to data mobility by leveraging in-country platforms.
- Global interconnectedness is the key to ensuring consistency of approach and keeping abreast of a rapidly changing environment.
- Working in an advisory capacity with our clients and legal teams.