Monday, April 28, 2014

E-Discovery Costs vs. Disseminating Justice – What’s Important?

In e-Discovery, courts, attorneys, e-Discovery consultants, and other industry veterans actively debate proportionality and predictive coding as the principal mechanisms for reducing e-Discovery costs. First, Rule 26 of the Federal Rules of Civil Procedure (FRCP), “Duty to Disclose; General Provisions Governing Discovery,” covers matters relating to initial disclosures, timing, scope and limits, pretrial disclosures, the parties’ conference, sanctions, and more. In other words, the legislative intent behind Rule 26 is to ensure and streamline the governance of discovery.
Second, e-Discovery costs can easily escalate into millions of dollars. On average, a gigabyte (GB) contains roughly 15,000 documents, so an average collection of 50 GB holds 750,000 documents that must be sifted for details relevant to the specifics of the case and its defensibility. In cost terms, reviewing those documents at up to $2 per document could run $1.5 million! Even if 60% were culled using technology-assisted review (TAR), costs would still be as high as $600,000. E-Discovery budget calculators can be found here.
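The arithmetic above can be sketched in a few lines (a minimal illustration; the constants and function name are my own, taken from the figures quoted in this post):

```python
DOCS_PER_GB = 15_000   # assumed average documents per gigabyte (figure from this post)
COST_PER_DOC = 2.00    # assumed high-end review cost per document, in USD

def estimate_review_cost(gigabytes: float, cull_rate: float = 0.0) -> float:
    """Estimate linear review cost after culling a fraction of the collection."""
    documents = gigabytes * DOCS_PER_GB
    remaining = documents * (1.0 - cull_rate)
    return remaining * COST_PER_DOC

print(estimate_review_cost(50))        # 750,000 documents, full review: 1500000.0
print(estimate_review_cost(50, 0.60))  # after a 60% TAR cull: roughly 600,000
```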
Here’s the catch! Those 750,000 documents are culled down in order to identify potentially relevant documents. The traditional e-Discovery approach is to process all data to TIFF or native format for full linear review, whereas the newer method entails indexing, culling, a legal first-pass review, and only then processing data for review. With the advent of ‘Big Data,’ the industry introduced technology-assisted review (TAR), or predictive coding, as a tool for handling e-Discovery in an efficient, cost-effective manner.
Statistics plays a pivotal role in TAR, and courts have endorsed the usage of TAR in one way or another. However, there may be pitfalls, as I explained in one of my earlier posts relating to the limitations of precision and recall in TAR.

Has our justice system become dependent on technology?

Technology is great; however, it must be used strictly as a tool in aid of the due process of law. As an attorney, I would argue against our justice system’s growing dependence on technology. There are other ways to reduce costs, such as global talent acquisition, outsourcing, dual-shoring, and offshoring, and numerous law firms and corporations that have adopted such business models document an additional 60% reduction in e-Discovery costs. While reducing e-Discovery costs is essential, the opportunity cost may undermine defensibility.


e-Discovery and | cloud computing
New Jersey, USA | Lahore, PAK | Dubai, UAE
www.claydesk.com
(855) – 833 – 7775
(703) – 646 - 3043

Tuesday, April 22, 2014

7 Things E-Discovery Auditors Must Do

The U.S. Federal Rules of Civil Procedure (FRCP) require organizations to be able to respond in a legally defensible manner to discovery requests. Moreover, as organizations expand globally, they need to be ready at all times to provide information that could be requested as evidence in a legal proceeding. Internal or external auditors are in the best position to recommend policies and best practices that prepare organizations to respond to a data discovery request. The auditors must:

  1. Determine the effectiveness of the e-Discovery communication plan
  2. Document the IT environment
  3. Regularly review backup, retention, and data destruction policies
  4. Review compliance with document destruction procedures, when a litigation hold is issued
  5. Document the steps that will be taken to respond to e-discovery requests
  6. During litigation, determine whether employees are preserving the integrity of relevant material
  7. Review existing backup controls, reports, and inventories of media stored off site

Failing to prepare for an e-discovery request can result in sanctions. Organizations need to have a litigation readiness policy and plan in place to deal effectively with lawsuits. Auditors play a pivotal role in managing litigation risks and can help organizations take a proactive approach to e-Discovery by recommending strategies that address key data preservation, storage, destruction, and recovery concerns. Microsoft SharePoint 2013 and M-Files, for instance, offer e-Discovery and content management solutions that cater to these needs.




Friday, April 18, 2014

Corporate Social Responsibility in E-Discovery Industry

Corporate Social Responsibility (CSR) is a management concept in which companies integrate social and environmental concerns into their business operations and interactions with their stakeholders. The basic definition, according to Wikipedia:

“Corporate social responsibility is a form of corporate self-regulation integrated into a business model. CSR policy functions as a built-in, self-regulating mechanism whereby a business monitors and ensures its active compliance with the spirit of the law, ethical standards, and international norms”

CSR is best incorporated with the “Triple Bottom Line” (TBL) approach, which is essentially an accounting framework incorporating three dimensions of performance: financial, social, and environmental.

The triple bottom line measures the company’s economic value; its “people account,” which measures the company’s degree of social responsibility; and its “planet account,” which measures the company’s environmental responsibility. While awareness of CSR within the e-Discovery industry may be widespread, only a handful of companies have actually developed and adopted CSR programs.

Adopting the mindset of a good corporate citizen, we at ClayDesk have initiated a CSR program and are embedding CSR practices in our business. The foremost areas of focus for our CSR initiatives are the promotion of legal education, e-Discovery law, pro bono legal work, student sponsorships, and steps toward a paperless (go-green) environment. These steps will bring about positive change and improve the quality of life of members of society.

Some of the core CSR issues relate to environmental management, eco-efficiency, responsible sourcing, stakeholder engagement, labor standards and working conditions, employee and community relations, social equity, gender balance, human rights, good governance, and anti-corruption measures. Denmark, for instance, has a CSR law in place that mandates companies to report their CSR initiatives. Beyond charity and sponsorships, the CSR concept gives companies the opportunity to become socially and ethically responsible corporate citizens.

Friday, April 11, 2014

When Should E-Discovery Vendors Be Disqualified? Gordon v. Kaleida Health

Generally speaking, courts have inherent authority to disqualify parties, representatives, and consultants from participating in litigation.  Attorneys, expert witnesses, and litigation consultants may face disqualification motions in the event of a conflict of interest. With the rapid expansion of the eDiscovery industry, however, a new question has arisen: If an eDiscovery vendor has a potential conflict of interest, when should it be disqualified?  What standard should apply?

To put the problem in perspective, imagine that you manage discovery at a law firm representing the defendant in a contentious wage and hour dispute, and you recently hired an eDiscovery vendor to assist you in scanning and coding your client’s documents, at a cost of $50,000.  Two months later, you receive notice from your vendor that the plaintiff’s counsel has requested its services in connection with the same case.  How would you react?  Would you expect a court to disqualify the vendor if it accepted the engagement?  This scenario occurred in Gordon v. Kaleida Health, resulting in the first judicial order squarely addressing vendor disqualification.  The Kaleida Health court ultimately denied the defendant’s motion to disqualify, allowing the vendor to continue participating in the case.

Discussion of Gordon v. Kaleida Health

Kaleida Health arose out of a now commonplace dispute between a hospital and its hourly employees under the Fair Labor Standards Act (“FLSA”). The plaintiffs, a group of hourly employees, sued the defendant, Kaleida Health, a regional hospital system, claiming they were not paid for work time during meal breaks, shift preparation, and required training, in violation of FLSA.

Kaleida Health’s attorneys, Nixon Peabody, LLP (“Nixon”), hired D4 Discovery (“D4”), an eDiscovery vendor, to scan and code documents for use in the litigation. In connection with the work, Nixon and D4 executed a confidentiality agreement. D4 was to “objectively code” the documents using categories based on characteristics of the document, such as the author and the type of document. The coded documents would then be used by Nixon in preparing for upcoming depositions.

Two months later, plaintiffs’ counsel, Thomas & Solomon, LLP (“Thomas”), requested D4 to provide ESI consulting services to it in connection with the same case. D4 notified Nixon, who promptly objected based on the scanning and coding services D4 provided the defendant during the litigation. D4 then provided assurances that Kaleida Health’s documents would not be used in consulting the plaintiffs and that an entirely different group of employees would work with the plaintiffs’ counsel. Nixon, on behalf of Kaleida Health, persisted in its objection to D4 working for the plaintiffs and ultimately filed a motion to disqualify the vendor.

Magistrate Judge Foschio’s analysis began by outlining the standard governing the disqualification of experts and consultants.  According to the court, the entity sought to be disqualified must be an expert or a consultant, defined as a “‘source of information and opinions in technical, scientific, medical or other fields of knowledge’” or “one who gives professional advice or services” in that field. After the moving party makes this initial showing, it must meet two further requirements.  First, the party’s counsel must have had an “‘objectively reasonable’ belief that a confidential relationship existed with the expert or consultant.” Second, the moving party must also show “that . . . confidential information was ‘actually disclosed’ to the expert or consultant.”

Applying this standard, Judge Foschio ultimately found that because the scanning and objective coding services performed by D4 did not require specialized knowledge or skill and were of a “clerical nature,” D4 was not an “expert” or “consultant.” Further, the court determined that the defendant failed to prove that it provided confidential information to D4 because it did not show “any direct connection between the scanning and coding work . . . and Defendants’ production of [its] ESI.”
Rejecting Kaleida Health’s argument, the court declined to apply to D4 and other eDiscovery vendors the presumption of confidential communications, imputation of shared confidences, and vicarious disqualification applicable in the context of attorney disqualification when a party “switches sides.” The court— as an alternative basis to its finding that D4 did not act as an expert or consultant—held that disqualification was improper because no “prior confidential relationship” existed between Kaleida Health and D4.

Because Kaleida Health represents the first significant attempt at exploring the issues surrounding vendor disqualification, whether later courts should follow Kaleida Health’s lead in exclusively applying the disqualification rules for experts and consultants to vendors becomes the main issue in its wake.  To come to a conclusion on this point, one must first explore the different schemes that courts may apply when considering disqualification.

The above excerpt is part of an article originally written by Michael A. Cottone, a candidate for Doctor of Jurisprudence at The University of Tennessee College of Law, May 2014.


Monday, April 7, 2014

The trade-off between ‘Recall’ and ‘Precision’ in predictive coding (part 2 of 2)

This is the second part of a two-part series on information retrieval using predictive coding analysis, detailing the trade-off between Recall and Precision. For part 1 of 2, click here.
To clarify further:
Precision (P) is the fraction of retrieved documents that are relevant, where Precision = (number of relevant items retrieved / number of retrieved items) = P(relevant | retrieved)
Recall (R) is the fraction of relevant documents that are retrieved, where Recall = (number of relevant items retrieved / number of relevant items) = P(retrieved | relevant)
Recall and Precision are inversely related. A solid criticism of these two metrics is their subjectivity: a record that is relevant to one person may not be relevant to another.
So how do you gain optimal values for Recall and Precision in a TAR platform?
Let’s consider a simple scenario:
• A database contains 80 records on a particular topic
• A search was conducted on that topic and 60 records were retrieved.
• Of the 60 records retrieved, 45 were relevant.
Calculate the precision and recall.
Solution:
Using the following designations:
• A = Number of relevant records retrieved,
• B = Number of relevant records not retrieved, and
• C = Number of irrelevant records retrieved.
In this example A = 45, B = 35 (80-45) and C = 15 (60-45).
Recall = 45 / (45 + 35) = 45/80 ≈ 56%
Precision = 45 / (45 + 15) = 45/60 = 75%
So, essentially, the optimal result of high Recall with high Precision is difficult to achieve.
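The worked example above can be checked in code (a minimal sketch; the function name is illustrative):

```python
def recall_precision(relevant_total: int, retrieved: int, relevant_retrieved: int):
    """Return (recall, precision) as fractions."""
    recall = relevant_retrieved / relevant_total   # A / (A + B)
    precision = relevant_retrieved / retrieved     # A / (A + C)
    return recall, precision

r, p = recall_precision(relevant_total=80, retrieved=60, relevant_retrieved=45)
print(f"Recall = {r:.0%}, Precision = {p:.0%}")  # Recall = 56%, Precision = 75%
```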
According to Cambridge University Press:
“The advantage of having the two numbers for precision and recall is that one is more important than the other in many circumstances. Typical web surfers would like every result on the first page to be relevant (high precision) but have not the slightest interest in knowing let alone looking at every document that is relevant. In contrast, various professional searchers such as paralegals and intelligence analysts are very concerned with trying to get as high recall as possible, and will tolerate fairly low precision results in order to get it. Individuals searching their hard disks are also often interested in high recall searches. Nevertheless, the two quantities clearly trade off against one another: you can always get a recall of 1 (but very low precision) by retrieving all documents for all queries! Recall is a non-decreasing function of the number of documents retrieved. On the other hand, in a good system, precision usually decreases as the number of documents retrieved is increased”

For part 1 of 2, click here.


Saturday, April 5, 2014

The trade-off between ‘Recall’ and ‘Precision’ in predictive coding (part 1 of 2)

This is a two-part series of posts relating to information retrieval using predictive coding analysis, detailing the trade-off between Recall and Precision.

Predictive Coding, sometimes referred to as ‘Technology Assisted Review’ (TAR), is basically the integration of technology into the human document review process. The two-fold benefit of using TAR is speeding up the review process and reducing costs. Sophisticated algorithms are utilized to produce a relevant set of documents.

The underlying process in TAR is based on statistics. A sample set of documents (the seed set) is coded by subject matter experts, acting as the primary reference data that teaches the TAR engine to recognize relevant patterns in the larger data set. In simple terms, a ‘data sample’ is created based on a chosen sampling strategy, such as random, stratified, or systematic sampling. Remember, it is critical that seed sets are prepared by subject matter experts. Based on the seed set, the algorithm in the TAR platform starts assigning predictions to the documents in the database. Through an iterative process, adjustments can be made on the fly to reach the desired objectives. The two important metrics used to measure the efficacy of TAR are:
  1. Recall
  2. Precision
Recall is the fraction of the relevant documents that are successfully retrieved, whereas Precision is the fraction of retrieved documents that are relevant. If the computer, in trying to identify relevant documents, identifies a set of 100,000 documents, and after human review 75,000 of the 100,000 are found to be relevant, the precision of that set is 75%. Now consider a population of 200,000 documents in which 30,000 are selected for review as the result of TAR. If 20,000 of those 30,000 are ultimately found to be responsive, the selected set has a precision of roughly 67% (20,000 / 30,000). If another 5,000 relevant documents are later found among the remaining 170,000 that were not selected for review, the set selected for review has a recall of 80% (20,000 / 25,000).
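Under the same definitions, the 200,000-document scenario works out as follows (a sketch; the variable names are illustrative):

```python
reviewed = 30_000     # documents selected for review by TAR
responsive = 20_000   # responsive documents found within the reviewed set
missed = 5_000        # relevant documents later found outside the reviewed set

precision = responsive / reviewed             # 20,000 / 30,000
recall = responsive / (responsive + missed)   # 20,000 / 25,000
print(f"Precision = {precision:.1%}, Recall = {recall:.0%}")  # Precision = 66.7%, Recall = 80%
```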

 To be continued....

 
 

Syed Raza
CEO, ClayDesk