Writings

August 7, 2023

Books On Law and Technology:

E-DISCOVERY FOR EVERYONE, Ralph Losey; Foreword by Judge Paul Grimm (ABA 2017) (Click here for my video intro to this book);

PERSPECTIVES ON PREDICTIVE CODING  And Other Advanced Search Methods for the Legal Practitioner, Editors: Jason R. Baron, Ralph C. Losey, Michael Berman; Foreword by Judge Andrew Peck (ABA 2016-2017).

Adventures in Electronic Discovery (West, Spring 2011); 

Electronic Discovery: New Ideas, Trends, Case Law, and Practices (West 2010); 

Introduction to e-Discovery, (ABA 2009); 

e-Discovery: Current Trends and Cases (ABA 2008); 

Your Cyber Rights and Responsibilities: The Law of the Internet, Chapter 3 of Que’s Special Edition Using the Internet (Macmillan 3rd Ed. 1996).

Law Review Articles and Other Writings on Law and Technology and Insurance:


TAR Course: 9th Class

March 17, 2017

Ninth Class: GIGO, QC, SME, Method, Software

At this point in the Course we have already covered five of the nine insights. In this class we will cover the remaining four: GIGO & QC (Garbage In, Garbage Out; Quality Control); SME (Subject Matter Expert); Method (for electronic document review); and Software (for electronic document review). The last three – SME, Method and Software – are all parts of Quality Control. This again is very important to understand in order to pass this Course.

GIGO & QC – Garbage In, Garbage Out & Quality Control

Garbage In, Garbage Out is one of the oldest sayings in the computer world. You put garbage into the computer and it will spit it back at you in spades. It is almost as true today as it was in the early days of computing when it was first popularized. Smart technology that recognizes and corrects for some mistakes has tempered GIGO somewhat, but it still remains a controlling principle of computer usage.


The GIGO Wikipedia entry explains that:

GIGO in the field of computer science or information and communications technology refers to the fact that computers, since they operate by logical processes, will unquestioningly process unintended, even nonsensical, input data (“garbage in”) and produce undesired, often nonsensical, output (“garbage out”). … It was popular in the early days of computing, but applies even more today, when powerful computers can produce large amounts of erroneous information in a short time.

Wikipedia also points out an interesting newer expansion of the GIGO acronym, Garbage In, Gospel Out:

It is a sardonic comment on the tendency to put excessive trust in “computerized” data, and on the propensity for individuals to blindly accept what the computer says.

Now as to our insight: GIGO in electronic document review, especially review using predictive coding, is largely the result of human error on the part of the Subject Matter Expert. Of course, garbage can also be created by poor methods, where too many mistakes are made, and by poor software. But to really mess things up, you need a clueless SME. The same factors also create garbage (poor results) with any other document review technique. When the subject matter expert is not good, when he or she does not have a good grasp of what is relevant, and what is important for the case, then all methods fail. Keywords and active machine learning both depend on reliable attorney expertise. Quality control literally must start at the top of any electronic document review project. It must start with the SME.


If your attorney expert, your SME, has no clue, then the input coming from their head is essentially garbage. With that kind of bad input, you will inevitably get bad output. This happens with all uses of a computer, but especially when using predictive coding. The computer learns what you teach it. Teach it garbage and that is what it will learn. It will hit a target all right, just not the right target. Documents will be produced, just not the ones needed to resolve the disputed issues. A poor SME makes too many mistakes and misses too many relevant documents because they do not know what is relevant and what is not.
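To make the GIGO point concrete, here is a minimal sketch, using a toy corpus and a generic open-source classifier (scikit-learn), of how a model trained on a weak SME’s mistaken relevance calls simply reproduces those mistakes. The vocabularies, labels and model are invented assumptions for illustration only; this is not the method or software taught in this Course.

```python
# Illustration only: a toy text classifier showing how a weak SME's garbage
# training labels teach the machine to miss a whole category of relevant
# documents. The vocabularies, corpus and model are invented assumptions.
import random
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import precision_score, recall_score

random.seed(1)
TOPIC_A = ["merger", "acquisition", "valuation", "synergy"]   # relevant, obvious
TOPIC_B = ["rebate", "kickback", "discount", "quota"]         # relevant, but missed by the weak SME
NOISE = ["lunch", "football", "birthday", "parking", "memo"]

def make_corpus(n):
    docs, truth, kinds = [], [], []
    for i in range(n):
        kind = i % 3  # rotate: topic A, topic B, irrelevant
        words = TOPIC_A if kind == 0 else TOPIC_B if kind == 1 else []
        docs.append(" ".join(random.choices(words + NOISE, k=15)))
        truth.append(1 if kind < 2 else 0)
        kinds.append(kind)
    return docs, truth, kinds

train_docs, train_truth, train_kinds = make_corpus(300)
test_docs, test_truth, _ = make_corpus(300)

vec = TfidfVectorizer()
X_train, X_test = vec.fit_transform(train_docs), vec.transform(test_docs)

def evaluate(train_labels):
    preds = LogisticRegression(max_iter=1000).fit(X_train, train_labels).predict(X_test)
    return recall_score(test_truth, preds), precision_score(test_truth, preds)

# Weak SME: every topic-B training document is mistakenly coded irrelevant.
weak_labels = [0 if kind == 1 else label for kind, label in zip(train_kinds, train_truth)]

print("good SME -> recall %.2f, precision %.2f" % evaluate(train_truth))
print("weak SME -> recall %.2f, precision %.2f" % evaluate(weak_labels))
```

In this toy setup, the model trained on correct labels finds both relevant topics, while the model trained on the weak SME’s labels reliably misses the second topic – exactly the missed-target problem described above.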

A smart AI can correct for some human errors (perfection is not required). The algorithms can correct for some mistakes in consistency by an SME, and the rest of the review team, but not that many. In machine learning for document review the legal review robot starts as a blank slate with no knowledge of the law or the case. It depends on the SME to teach it. Someday that may change. We may see smart robots that know the law and relevance, but we are not even near there yet. For now our robots are more like small children. They only know what you tell them, but they can spot inconsistencies in your message and they never forget.

Subject Matter Expert – SME

The predictive coding method can fail spectacularly with a poor expert, but so can keyword search. The converse of both propositions is also true. In all legal document review projects the SME needs to be an expert in scope of relevance, what is permitted discovery, what is relevant and what is not, what is important and what is not. They need to know the legal rules governing relevance backwards and forwards. They also need to have a clear understanding of the probative value of evidence in legal proceedings. This is what allows an attorney to know the scope of discoverable information.


If the attorney in charge does not understand the scope of discoverable information, does not understand probative value, then the odds of finding the documents important to a case are significantly diminished. You could look at a document with high probative value and not even know that it is relevant. This is exactly the concern of many requesting parties, that the responding party’s attorney will not understand relevance and discoverability the same way they do. That is why the first step in my recommended work flow is to Talk, which I also call Relevance Dialogues.

The kind of ESI communication with opposing counsel that is needed is not whining accusations or aggressive posturing. I will go into good talk versus bad talk in some detail when I explain the first step of our eight-step method. The point of the talking that should begin any document review project is to get a common understanding of the scope of discoverable information. What is the exact scope of the request for production? Don’t agree the scope is proportionate? That’s fine. Agree to disagree and Talk some more, this time to the judge.

We have seen firsthand in the TREC experiments the damage that can be done by a poor SME and no judge to keep them in line. Frankly, it has been something of a shock, or wake up call, as to the dangers of poor SME relevance calling. Most of the time I am quite lucky in my firm of super-specialists (all we do is employment law matters) to have terrific SMEs. But I have been a lawyer for a long time. I have seen some real losers in this capacity in the past 36 years. I myself have been a poor SME in some of the 2015 TREC experiments. An example that comes to mind is when I had to be the SME on the subject of CAPTCHA in a collection of forum messages by hackers. It ended up being on-the-job training. I saw for myself how little I could do to guide the project. Weak SMEs make bad leaders in the world of technology and law.


There are two basic ways that discovery SMEs fail. First, there are the kind who do not really know what they are talking about. They do not have expertise in the subject matter of the case, or, let’s be charitable, their expertise is insufficient. A bullshit artist makes a terrible SME. They may fool the client (and they often do), but they do not fool the judge or any real experts. The second kind of weak SME has some expertise, but lacks experience. In my old firm we used to call them baby lawyers. They have knowledge, but not wisdom. They lack the practical experience and skills that can only come from grappling with these relevance issues in many cases.

That is one reason why boutique law firms like my own do so well in today’s competitive environment. They have the knowledge and the wisdom that comes from specialization. They have seen it before and know what to do.

An SME with poor expertise has a very difficult time knowing if a document is relevant or not. For instance, a person not living in Florida might have a very different understanding than a Floridian of what non-native plants and animals threaten the Florida ecosystem. This was Topic 408 in the TREC 2016 Total Recall Track. A native Floridian is in a better position to know the important invasive species, even ones like vines that have been in the state for over a hundred years. A non-expert with only limited information may not know, for instance, that Kudzu vines are an invasive plant from Japan and China. (They are also rumored to be the home of small, vicious Kudzu monkeys!) What is known for sure is that Kudzu, Pueraria montana, smothers all other vegetation around it, including tall trees. A native Floridian hates Kudzu as much as they love Manatees.


A person who has just visited Florida a few times would not know what a big deal Kudzu was in Florida during the Jeb Bush administration, especially in Northern Florida. (It still is.) They have probably never heard of it at all. They could see an email with the term and have no idea what the email meant. It is obvious the native SME would know more, and thus be better positioned than a fake SME, to determine Jeb Bush email relevance to non-native plants and animals that threaten the Florida ecosystem. By the way, all native Floridians especially hate pythons, and a python eating one of our gators is an abomination.


Expertise is obviously needed for anyone to be a subject matter expert and know the difference between relevant and irrelevant. But there is more to it than information and knowledge. It also takes experience. It takes an attorney who has handled these kinds of cases many times before. Preferably they have tried a case like the one you are working on. They have seen the impact of this kind of evidence on judge and jury. An attorney with both theoretical knowledge and practical experience makes the best SME. Your ability to contribute subject matter expertise is limited when you have no practical experience. You might think certain ESI is helpful, when in fact it is not; it has only weak probative value. A document might technically be relevant, but an SME without experience and wisdom may not know that it is practically irrelevant anyway.

It goes without saying that any SME needs a good review team to back them up, to properly, consistently implement their decisions. In order for good leadership to be effective, there must also be good project management. Although this insight discussion features the role of the SME member of the review team, that is only because the importance of the SME was recently emphasized to us in our TREC research. In actuality all team members are important, not just the input from the top. Project management is critical, which is an insight already well-known to us and, we think, the entire industry.

Corrupt SMEs


Beware evil SMEs

Of course, no SME can be effective, no matter what their knowledge and experience, if they are not fair and honest. The SME must impartially seek and produce documents that are both pro and con. This is an ethics issue in all types of document review, not just predictive coding. In my experience corrupt SMEs are rare. But it does happen occasionally, especially when a corrupt client pressures their all too dependent attorneys. It helps to know the reputation for honesty of your opposing counsel. See: Five Tips to Avoid Costly Mistakes in Electronic Document Review – Part 2 that contains my YouTube video, E-DISCOVERY ETHICS (below).

Also see: Lawyers Behaving Badly: Understanding Unprofessional Conduct in e-Discovery, 60 Mercer L. Rev. 983 (Spring 2009); Mancia v. Mayflower Begins a Pilgrimage to the New World of Cooperation, 10 Sedona Conf. J. 377 (2009 Supp.).

If I were a lawyer behaving badly in electronic document review, like, for instance, the Qualcomm lawyers who hid thousands of highly relevant emails from Broadcom, I would not use predictive coding. If I wanted to avoid finding evidence harmful to my case, I would use negotiated keyword search, the Go Fish kind. See Part Four of this article.


I would also use linear review and throw an army of document review attorneys at it, with no one really knowing what the other was doing (or coding). I would subtly encourage project mismanagement. I would not pay attention. I would not supervise the rest of the team. I would not involve an AI entity, i.e., active machine learning. I would also not use an attorney with search expertise, nor would I use a national e-discovery vendor. I would throw a novice at the task and use a local or start-up vendor who would just do what they were told and not ask too many questions.

A corrupt hide-the-ball attorney would not want to use a predictive coding method like ours. They would not want relevant documents produced or logged in a way that discloses the training documents they used. This is true in any continuous training process, not just ours. We do not produce irrelevant documents; the law prevents that and protects our client’s privacy rights. But we do produce relevant documents, usually in phases, so you can see what the training documents are.

A Darth Vader type hide-the-ball attorney would also want to avoid using a small, specialized, well-managed team of contract review lawyers to assist on a predictive coding review project. They would instead want to work with a large, distant army of contract lawyers. A small team of contract review attorneys cannot be brought into the con, especially if they are working for a good vendor. Even if you handicap them with a bad SME, and poor methods and software, they may still find a few of the damaging documents you do not want to produce. They may ask questions when they learn their coding has been changed from relevant to irrelevant. I am waiting for the next Qualcomm or Victor Stanley type case where a contract review lawyer blows the whistle on corrupt review practices. Qualcomm Inc. v. Broadcom Corp., No. 05-CV-1958-B(BLM) Doc. 593 (S.D. Cal. Aug. 6, 2007) (one honest low-level engineer testifying at trial blew the whistle on Qualcomm’s massive fraud to hide critical email evidence). I have heard stories from contract review attorneys, but the law provides them too little protection, and so far at least, they remain behind the scenes with their horror stories.

One protection against both a corrupt SME and an SME with too little expertise and experience is for the SME to be the attorney in charge of the trial of the case, or at least one who works closely with that attorney so as to get their input when needed. The job of the SME is to know relevance. In the law that means you must know how the ultimate arbiter of relevance will rule – the judge assigned to your case. They determine truth. An SME’s own personal opinion is important, but ultimately of secondary importance to that of the judge. For that reason a good SME will often err on the side of over-expansive relevance because they know from history that is what the judge is likely to allow in this type of case.

This is a key point. The judges, not the attorneys, ultimately decide on close relevance and related discoverability issues. The head trial attorney interfaces with the judge and opposing counsel, and should have the best handle on what is or is not relevant or discoverable. A good SME can predict the judge’s rulings and, even if not perfect, can gain the judicial guidance needed in an efficient manner.

If the judge detects unethical conduct by the attorneys before them, including the attorney signing the Rule 26(g) response, they can and should respond harshly to punish the attorneys. See, e.g.: Victor Stanley, Inc. v. Creative Pipe, Inc., 269 F.R.D. 497, 506 (D. Md. 2010). The Darth Vaders of the world can be defeated. I have done it many times with the help of the presiding judge. You can too. You can win even if they personally attack both you and the judge. Been through that too.

Three Kinds of SMEs: Best, Average & Bad

When your project has a good SME, one with both high knowledge levels and experience, with wisdom from having been there before, and knowing the judge’s views, then your review project is likely to succeed. That means you can attain both high recall of the relevant documents and also high precision. You do not waste much time looking at irrelevant documents.

When an SME has only medium expertise or experience, or both, then the expert tends to err on the side of over-inclusion. They tend to call grey area documents relevant because they do not know that they are unimportant. They may also not understand the new Federal Rules of Civil Procedure governing discoverability. Since they do not know, they err on the side of inclusion. True experts know, and so tend to be more precise than rookies. Medium level SMEs may, with diligence, also attain high recall, but it takes them longer to get there, and the precision is poor. That means wasted money reviewing documents of no value to the case, documents of only marginal relevance that would not survive any rational scrutiny under Rule 26(b)(1).

When the SME lacks knowledge and wisdom, then both recall and precision can be poor, even if the software and methods are otherwise excellent. A bad SME can ruin everything. They may miss most of the relevant documents and end up producing garbage without even knowing it. That is the fault of the person in charge of relevance, the SME, not the fault of predictive coding, nor the fault of the rest of the e-discovery review team.
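For readers who want the arithmetic behind recall and precision, here is a small worked sketch; the counts are invented for illustration only.

```python
# Hypothetical review counts, invented for illustration.
true_positives = 8_000    # relevant documents the review found and coded relevant
false_negatives = 2_000   # relevant documents the review missed
false_positives = 4_000   # irrelevant documents mistakenly coded relevant

recall = true_positives / (true_positives + false_negatives)      # 8,000 / 10,000 = 80%
precision = true_positives / (true_positives + false_positives)   # 8,000 / 12,000 = ~67%

print(f"recall = {recall:.0%}, precision = {precision:.0%}")
```

A weak SME typically drags both numbers down at once: relevant documents are missed, lowering recall, and reviewer time is wasted on marginal documents, lowering precision.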


If the SME assigned to a document review project, especially a project using active machine learning, is a high-quality SME, then they will have a clear grasp of relevance. They will know what types of documents the review team is looking for. They will understand the probative value of certain kinds of documents in this particular case. Their judgments on Rule 26(b)(1) criteria as to discoverability will be consistent, well-balanced and in accord with those of the governing judge. They will instruct the whole team, including the machine, on what is truly relevant, on what is discoverable and what is not. With this kind of top SME, if the software, methods, including project management, and the rest of the review team are also good, then high recall and precision are very likely.

If the SME is just average, and is not sure about many grey area documents, then they will not have a clear grasp of relevance. It will be foggy at best. They will not know what types of documents the review team is looking for. SMEs like this think that any arrow that hits a target is relevant, not knowing that only the red circle in the center is truly relevant. They will not understand the probative value of certain kinds of documents in this particular case. Their judgments on Rule 26(b)(1) criteria as to discoverability will not be perfectly consistent, will end up either too broad or too narrow, and may not be in accord with those of the governing judge. They will instruct the whole team, including the machine, on what might be relevant and discoverable in an unfocused, vague, and somewhat inconsistent manner. With this kind of SME, if the software and methods, including project management, and the rest of the review team are also good, and everyone is very diligent, high recall is still possible, but high precision is unlikely. Still, the project will be unnecessarily expensive.

The bad SME has multiple possible targets in mind. They just search without really knowing what they are looking for. They will instruct the whole team, including the machine, on what might be relevant and discoverable in a confused, constantly shifting and often contradictory manner. Their obtuse explanations of relevance have little to do with the law or the case at hand. They probably have a very poor grasp of Rule 26(b)(1) of the Federal Rules of Civil Procedure. Their judgments on 26(b)(1) criteria as to discoverability, if any, will be inconsistent, imbalanced and sometimes irrational. This kind of SME probably does not even know the judge’s name, much less the history of their relevance rulings in this type of case. With this kind of SME, even if the software and methods are otherwise good, there is little chance that high recall or precision will be attained. An SME like this does not know when their search arrow has hit the center of the target. In fact, it may hit the wrong target entirely.


A document review project governed by a bad SME runs a high risk of having to be redone because important information is missed. That can be a very costly disaster. Worse, a document important to the producing party’s case can be missed and the case lost because of that error. In any event, the recall and precision will both be low. The costs will be high. The project will be confused and inefficient. Projects like this are hard to manage, no matter how good the rest of the team. In projects like this there is also a high risk that privileged documents will accidentally be produced. (There is always some risk of this in today’s high volume ESI world, even with a top-notch SME and review team. A Rule 502(d) Order should always be entered for the protection of all parties.)

Method and Software

The SME and his or her implementing team is just one part of the quality triangle. The other two are Method of electronic document review and Software used for electronic document review.


Obviously the e-Discovery Team takes Method very seriously. That is one reason we are constantly tinkering with and improving our methods. We released the breakthrough Predictive Coding 3.0 last year, following 2015 TREC research, and this year, after TREC 2016, we released version 4.0. You could fairly say we are obsessed with the topic. We also focus on the importance of good project management and communications. No matter how good your SME, and how good your software, if your methods are poor, so too will be your results in most of your projects. How you go about a document review project, how you manage it, is all-important to the quality of the end product, the production.

The same holds true for software. For instance, if your software does not have active machine learning capacities, then it cannot do predictive coding. The method is beyond the reach of the software. End of story. The most popular software in the world right now for document review does not have that capacity. Hopefully that will change soon and I can stop talking around it.
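For readers who have never seen what an active machine learning loop looks like in the most generic terms, here is a bare-bones uncertainty-sampling sketch on placeholder data. It is an assumption-laden illustration of the general technique only; it is not the Predictive Coding 4.0 method taught in this Course and not any vendor’s product.

```python
# Generic active learning loop (uncertainty sampling) -- illustration only.
# The synthetic data, model, batch size and number of rounds are assumed placeholders.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y_truth = make_classification(n_samples=5_000, n_features=50,
                                 weights=[0.9], random_state=0)

# Seed set: pretend the SME has already coded 25 relevant and 25 irrelevant documents.
labeled = list(np.where(y_truth == 1)[0][:25]) + list(np.where(y_truth == 0)[0][:25])
unlabeled = [i for i in range(len(X)) if i not in set(labeled)]

for round_num in range(1, 6):
    model = LogisticRegression(max_iter=1000).fit(X[labeled], y_truth[labeled])
    probs = model.predict_proba(X[unlabeled])[:, 1]
    # Select the 20 documents the model is least certain about for SME review.
    uncertain = np.argsort(np.abs(probs - 0.5))[:20]
    newly_coded = [unlabeled[i] for i in uncertain]
    labeled.extend(newly_coded)                     # in real life the SME codes these
    unlabeled = [i for i in unlabeled if i not in set(newly_coded)]
    print(f"round {round_num}: {len(labeled)} training documents coded")
```

The key capability is the middle step: the software scores the unreviewed documents and routes the ones it is least sure about back to the human trainer, learning from each new round of coding. Software without that loop cannot do predictive coding.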

Even among the software that has active machine learning, some are better than others. It is not my job to rank and compare software. I do not go around asking for demos and the opportunity to test other software. I am too busy for that. Everyone knows that I currently prefer to use EDR. It is the software by Kroll Ontrack that I use every day. I am not paid to endorse them and I do not. (Unlike almost every other e-discovery commentator out there, no vendors pay me a dime.) I just share my current preference and pass along cost-savings to my clients.

I will just mention that the only other e-discovery vendor to participate with us at TREC is Catalyst. As most of my readers know, I am a fan of the founder and CEO, John Tredennick. There are several other vendors with good software too. Look around and be skeptical. But whatever you do, be sure the software you use is good. Even a great carpenter with the wrong tools cannot build a good house.

One thing I have found, and it is just plain common sense, is that with good software and good methods, including good project management, you can overcome many weaknesses in SMEs, except for dishonesty or repeated gross negligence. The same holds true for all three corners of the quality triangle. Strength in one can, to a certain extent, make up for weaknesses in another.

Go on to Class Ten.

Or pause to do this suggested “homework” assignment for further study and analysis.

SUPPLEMENTAL READING:  Find and read Losey’s articles on Information, Knowledge and Wisdom. Consider what this means for the Law.

Read up on the mentioned Qualcomm case. It is, for instance, discussed in the e-Discovery Team Training program. Also read Losey’s Lawyers Behaving Badly and Mancia v. Mayflower Begins a Pilgrimage to the New World of Cooperation. But do not stop there, also read what other attorneys and judges have said about lawyer ethics in e-discovery.

For fun, research and find out more about the history of the phrase GIGO among computer users. It has been around for as long as I can remember. Background reading on the origin of the term SME is also interesting and may add to your perspective. Note how in software development there are also what are called “domain” experts, explained in Wikipedia thusly:

A domain expert is a person with special knowledge or skills in a particular area of endeavor. (An accountant is an expert in the domain of accountancy, for example.)

Unfortunately, many legal document review software developers created predictive coding software, and methods for using it, without input from qualified legal domain experts. The result is what we call the first two versions of predictive coding software, still in use today. See Class One, which discusses Predictive Coding 1.0 and 2.0.

EXERCISES: Do you know the primary statistical fallacies underlying the early versions of predictive coding that software developers imposed upon the profession, what we call versions 1.0 and 2.0? Hint – it has to do with the widespread failure of non-legal domain experts, mainly software SMEs, to understand legal relevance, trial preparation and prevalence, not to mention the temperament and time value of senior legal SMEs. Look for articles criticizing predictive coding and consider how this came to pass and what we can do about it now.
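As a nudge toward the prevalence part of that exercise, here is a back-of-the-envelope calculation; the collection size, prevalence and sample size are assumed numbers for illustration only.

```python
# Illustration only: why a random control sample is a weak yardstick when
# relevant documents are rare. All numbers below are assumptions.
collection_size = 1_000_000
prevalence = 0.01            # assume only 1% of the collection is relevant
sample_size = 1_500          # a typical-sounding random control set

expected_relevant_in_sample = sample_size * prevalence   # about 15 documents
print(f"Expected relevant documents in the sample: {expected_relevant_in_sample:.0f}")

# Any recall measurement built on roughly 15 examples swings wildly: one or two
# coding errors by the SME, or a little bad luck in the draw, moves the
# estimate by ten points or more.
found_in_sample = 12
measured_recall = found_in_sample / expected_relevant_in_sample
print(f"If the review surfaces {found_in_sample} of them, measured recall = {measured_recall:.0%}")
```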

Related to this question, do you know the primary reason that many law firms who tried these old machine training methods quickly gave them up? Hint – it has to do with SMEs, good and bad. Aside from use of the latest methods taught here, consider what other methods could be used to minimize the amount of SME time required. Hint – the answer has to do with grey area relevant documents, and hot documents. If you think you know, write a comment below and Losey will privately respond to either “confirm or deny.”

Consider what can be done to counteract corrupt SMEs. Have you ever served under such a person? Without naming names, please, can you share in a comment below what you think they were doing and why?

Students are invited to leave a public comment below. Insights that might help other students are especially welcome. Let’s collaborate!

_

e-Discovery Team LLC COPYRIGHT 2017

ALL RIGHTS RESERVED

_


Five Tips To Avoid Mistakes In Electronic Document Review

January 9, 2017

These tips are based on a long life of litigation legal practice, including thousands of document reviews going back to 1978. I have seen hundreds of mistakes over the years, especially in the last decade when my work as a lawyer has been limited to electronic discovery. Many of these blunders were made by “the other side.” Some were funny and made me smile, others were not and led to motions of all kinds. Keeping it real, I have made my own fair share of errors too. Those lessons were painful, but are now deeply etched. No doubt I would have made many more errors, but for the generous guidance provided by more senior and experienced attorneys that I have had the very good fortune to work with. It is with this great debt in mind that I offer up these tips.

Click here to download a Word version. [An earlier version of this article was published last year.]

Some Mistakes are Funny

On the funny side of observing document review mistakes, I will never forget the time, not too long ago, when the other side produced documents to us with the most important ones placed together up front. That was a surprising electronic zipped production to open. It was fairly obvious what had happened. The highly relevant documents were not mixed in as they should have been with the other more plebeian merely relevant documents. Instead, the hot documents were all together at the front of the production with the lowest numbers. (Sixth tip – never do that!) Our team laughed at the error, as we easily and quickly found lots of great stuff to help our case. Still, we kept a discreet silence and did not gloat. (Seventh tip – Never do that either, at least not in front of them!)

Opposing counsel, who later became a friend, admitted the error to me months after the case settled. He found out what happened a few days too late. Even he chuckled as to how inadvertently “nice” they were. As is often the case, the mistake did not really matter in the end. We would have recognized the hot documents anyway. As usual when errors happen in e-discovery, he blamed the vendor. They almost always get blamed for mistakes, but, the truth is, vendors are just tools of the attorneys (no offense dear vendors, tools are important). The attorneys are almost always the ones ultimately responsible for screw-ups.

Lessons of History

The five tips shared here are born out of a long history of document review. How relevant could past legal practice be, you might ask? In 1980, just like today, document discovery was a search for documents and communications that have probative value. The tools and document forms have changed, but not the basic tasks. The federal rules have changed too, but not that much, and the ethics of an attorney controlled discovery system, not at all.

Discovery has always been a search to determine what really happened, to sort out the conflicting stories and find evidence for use at trial. Legal counsel never creates facts. That is called falsification of evidence and is a crime. Attorneys just find the facts and then do the best they can with them; make them look as good as possible by legal argument and clever presentation. The discovery effort has always been a fairly cooperative one between attorneys. It has always been a question of trust but verify. Of course, there have always been a few slime balls in the Bar who do not get that, but that is what judges (and Bar ethics committees) are for, and they soon sniff out the weasels. All things evolve and change, but some basic patterns remain the same.

By the early nineties I sometimes had to look beyond paper files and investigate computers for possible evidence. That occasionally happened in trade-secret cases, much like today. Forensics was fairly easy back then. My favorite ESI search and review tool was Norton Utilities, which I had been using since the mid-eighties. Like most of the computer lawyers around in those days, as we were called, I was by necessity a DOS master and, until around 1997, a one-man IT department for my law firm. It only took a few hours a week to do that for my then twenty-person law firm, along with the help of an outside “computer repairman.” I would always learn a lot from those guys.


The frequency of document reviews that included computer files increased somewhat in the early nineties as law firm clients began using more technologies. By then most corporations and many individuals had begun to rely on computers for work, although almost nobody but a few techno-nerds used email, electronic messaging and pre-Internet online communities. (I was considered an odd-ball hobbyist for using electronic messaging with CompuServe, The Source, The Well, etc. in the mid to late eighties, and the Internet since 1993-94 with Mosaic, then Netscape.) Instead, facsimile machines were the rage at that time, and they just generated more paper discovery.

Although the presence, or not, of computer files was a discovery issue in trade-secret and non-compete cases in the early 90s, electronic communications discovery was still not a factor. The adoption of tech by businesses and lawyers seemed slow to me then, and still seems slow today. (When will companies and law firms adopt the AI technologies that have been readily available for years now?)

Discovery of computer files, as e-discovery was then called, started to take off in the late nineties as corporate email finally became popular. It was part of the public’s discovery of the Internet. I had the opportunity back in 1996 to write a chapter on Internet law for the then popular book by Macmillan (Que), Special Edition, Using the Internet (3rd Ed. 1996), which incredibly is still sold on Amazon.


My chapter in the book was the first after the introduction and was titled by the editors “Your Cyber Rights and Responsibilities: Law and Etiquette.” I still smile when I see how they tasked me not only with explaining all of the Law of the Internet, but also proper online etiquette. I tried to address the legal issues in 52 pages (I pretty much ignored the etiquette part), including discussion of all of the key cases of the day. I covered things like free speech, online agreements, privacy rights, crime, security and cryptology (I even included a coded message, which, surprisingly, the editor decrypted and then made me clean up). These are all still hot issues.

When businesses started using the Internet too, the discovery and review of electronic information really started to take off. That is when electronic document review was truly born. That is also when the first e-discovery vendors like Kroll and Attenex (now FTI) started to become large national organizations.

By the early turn of the century, potential evidence in the form of computer files and emails was multiplying like tribbles. The amount of electronic evidence started to explode. It has been a dangerous avalanche of e-discovery overload ever since. The needle in the haystack problem was born that still challenges document review today. See Document Review and Predictive Coding: Video Talks – Part One.

Like several others I sensed the danger in the information explosion, saw how it was overwhelming discovery and making it too expensive. For that reason in 2006, again like several others (although I was the only one in Florida), I stopped practicing as a commercial litigator and limited my work to e-discovery only. Since that time electronic document reviews have been front and center in my practice. To be honest, I have not even seen an original paper document in discovery since that time, although I have heard they still exist. (Other attorneys have shown me their paper cuts to prove it. What a dangerous job paper document reviews can be.)

Five Videos Explain the Five Tips

The five tips shared here are rooted in the ancient history of paper productions, and pre-vendor computer file search, but are designed for current electronic practices and post 2015 amended rules of procedure. After a lifetime of work in this area, there are more tips I could provide, and will do so in the future, I’m sure, but these are the ones that occur to me today. The videos below explain these five tips and how you can implement them.

In this opening eleven-minute video I share what may be the most important tip of all, the avoidance of time pressures and resultant hurried activities.

Tip # 1 – Never Put Yourself in a Time Bind – Be Proactive

________

The next video explains the second tip, Ethics. It is always important to do the right thing, including the production of requested relevant documents that will harm your client and their case. Ethics in document review, as in all other areas of legal practice, indeed all other areas of life, is imperative, not discretionary. My thanks to the legal mentors in my past who drilled this into me from my first day out of law school. Any success I have enjoyed in my career I owe, at least in part, to their good influences.

Call this Ethics advice the Boy Scout tip if you wish, but it really works to avoid a panoply of errors, including potentially career-ending ones. It also helps you to sleep at night and have a clean conscience. The slippery slopes of morality are where the worst errors are made in all legal tasks, but this is particularly true in document review. Discovery in our system is run by lawyers, not judges, magistrates, or special masters. It is based on lawyers’ faithful conduct and compliance with the rules, including the all-important rules requiring the voluntary production of evidence harmful to a client (a notion strange to many legal systems outside of the U.S.).

Lawyers know the rules, even if their clients do not, and it is critical that they follow them earnestly, holding up against all pressures and temptations. At the end of the day, your reputation and integrity are all that you have, so compromising your ethics is never an acceptable alternative. The Rules of Professional Conduct must be the guiding star of all legal practice, including electronic document review. It is your job as a lawyer to find the evidence and argue its meaning; never to hide it. This video is a reminder of a core truth of lawyer obligations as officers of the court.

Tip #2 – Ethics and Electronic Discovery

For more of Losey’s thoughts on ethics and e-discovery, see Lawyers Behaving Badly: Understanding Unprofessional Conduct in e-Discovery, 60 Mercer L. Rev. 983 (Spring 2009); Mancia v. Mayflower Begins a Pilgrimage to the New World of Cooperation, 10 Sedona Conf. J. 377 (2009 Supp.); e-Discovery for Everyone, Chapters 15-19 (ABA, 2017).

Our third tip is Focused Concentration, which was mentioned in passing in the Part One video on Time, and also in tips four and five, on Worms and Check Again. The Focus tip is based on my own experiences in cultivating the ability to concentrate on legal work, or anything else. It runs contrary to the popular but erroneous notion, a myth really, that you can multi-task and still do each task efficiently. Our brain does not work that way. See, e.g., Crenshaw, The Myth of Multitasking: How “Doing It All” Gets Nothing Done; and the work of neuroscientist Daniel J. Levitin, who has found the only exception is adding certain background music. All document reviewers who wear headsets, myself included, know this exception very well.

Tip #3 – Focused Concentration

For more on quality control and improved lifestyle by focused attention and other types of meditation, see my earlier video blog, Document Review and Predictive Coding: Video Talks – Part Six, especially the 600-word introduction to that video that includes information on the regular meditation practices of Supreme Court Justice Stephen Breyer, among others. See A Word About Zen Meditation. This practice helped Steve Jobs, and helps Justice Breyer and countless others. It could help you too. Also see these excellent online services, Insight Timer and Mindfulness App. These practices will, at the very least, allow for more focused attention to what you are doing, including document review, and thus greatly reduce mistakes.

The next tip, Worms, is a simple technical one, unique to e-discovery, where WORM is an acronym for write once, read many times. I prefer to make productions on write-once, recordable-only discs, that is, CD-R or DVD-R, and not by file transfers. I do not want to use a CD-RW or DVD-RW, meaning one that is rewritable.

Tip #4 – Use WORMS to Produce

Speaking of WORMs, did you know that the SEC requires all broker-dealers to preserve their records for three years in a format that prevents alteration? That means our Write Once Read Many times format. SEC Interpretation: Electronic Storage of Broker-Dealer Records, 17 CFR Part 241 [Release No. 34-47806] (5/12/13).

On December 21, 2016, twelve large broker-dealer firms agreed to pay fines totaling $14.4 million to the Financial Industry Regulatory Authority (FINRA) over allegations, in FINRA’s words, that “they failed to preserve electronic records in a WORM format that couldn’t be altered.” This has to be the all time most expensive “can of worms.”
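In the same write-once spirit, though not something covered in the video, a simple added safeguard is to generate a cryptographic hash manifest of the production set before burning it to the disc, so any later alteration of a produced file can be detected. The folder and manifest names below are hypothetical; this is just one minimal way to do it.

```python
# Sketch only: build a SHA-256 manifest of a production set before burning it
# to CD-R or DVD-R. The directory and manifest names are hypothetical.
import hashlib
from pathlib import Path

PRODUCTION_DIR = Path("production_vol_001")        # hypothetical production folder
MANIFEST = PRODUCTION_DIR / "manifest_sha256.txt"

def sha256_of(path: Path) -> str:
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1024 * 1024), b""):
            digest.update(chunk)
    return digest.hexdigest()

with MANIFEST.open("w") as out:
    for file in sorted(PRODUCTION_DIR.rglob("*")):
        if file.is_file() and file != MANIFEST:
            out.write(f"{sha256_of(file)}  {file.relative_to(PRODUCTION_DIR)}\n")

print(f"Manifest written to {MANIFEST}")
```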

The fifth tip, Check Again, has to do with the importance of redundancy in quality control, subject only to proportionality considerations, including the tip to spot check your final production CD. I discuss briefly the tendency of lawyers to be trapped by paralysis by analysis, and why we are sometimes considered deal killers by business people because we focus so much on risk avoidance and over-think things. There has to be a proportional limit on the number and cost of double-checks in document review. I also mention in the fifth tip my Accept on Zero Error and ei-Recall checks, which are quality assurance efforts that we make in larger document review projects.

Tip #5 – Check Again
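For readers curious what a recall check of that general kind involves, here is a rough sketch of an elusion-sample recall estimate. The counts and the simplified confidence interval below are illustrative assumptions only; they are not Losey’s ei-Recall calculation, which should be consulted directly for the real method.

```python
# Rough sketch of an elusion-based recall estimate -- assumed counts and a
# simplified normal-approximation interval; NOT the exact ei-Recall method.
import math

produced_relevant = 9_000      # relevant documents found and produced
discard_pile_size = 500_000    # documents withheld as irrelevant (the "null set")
elusion_sample = 1_500         # random sample drawn from the discard pile
relevant_in_sample = 3         # relevant documents that sample turned up

p = relevant_in_sample / elusion_sample
missed_estimate = p * discard_pile_size            # point estimate of false negatives

# Simple 95% normal-approximation bound on the sample proportion (crude with
# so few hits; a proper binomial interval is the better practice).
half_width = 1.96 * math.sqrt(p * (1 - p) / elusion_sample)
missed_high = (p + half_width) * discard_pile_size

recall_point = produced_relevant / (produced_relevant + missed_estimate)
recall_low = produced_relevant / (produced_relevant + missed_high)

print(f"estimated recall about {recall_point:.0%}, lower bound roughly {recall_low:.0%}")
```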

____________

These are five tips to help everyone doing electronic document review. They are not necessarily the “top five,” but they are all important. We suggest you drill these five best practices into your document review team.

For more information on best practices of document review see these three periodically updated resources:

 

 

