Looking at the Bigger Picture
It is easy to nitpick, particularly when reviewing someone else’s opinion. The discovery of an error is not necessarily a reason to reject a conclusion. Successful expert witness challenges must clear a higher standard than simply establishing that an expert made a mistake. Attacking an expert on trivialities can quickly become counterproductive. We see no reason to play gotcha with an expert.
Critical analysis of an expert’s work, we believe, is an opportunity to evaluate the story that expert tells. With that understanding comes either a basis to agree with the opinion or to articulate why the expert misses the overall mark.
Our Scrutiny Approach
Federal Rule of Evidence 702 sets the standard for expert challenges. It also provides a foundation for scrutinizing one’s own expert and/or the opposing party’s. Rule 702 states:
A witness who is qualified as an expert by knowledge, skill, experience, training, or education may testify in the form of an opinion or otherwise if the proponent demonstrates to the court that it is more likely than not that:
(a) the expert’s scientific, technical, or other specialized knowledge will help the trier of fact to understand the evidence or to determine a fact in issue;
(b) the testimony is based on sufficient facts or data;
(c) the testimony is the product of reliable principles and methods; and
(d) the expert’s opinion reflects a reliable application of the principles and methods to the facts of the case.
The evolution of Rule 702 from Daubert is worth observing. The Daubert standard weighs the following factors in assessing whether an expert’s methodology is valid:
- Whether the technique or theory in question can be, and has been, tested;
- Whether it has been subjected to publication and peer review;
- Its known or potential error rate;
- The existence and maintenance of standards controlling its operation; and
- Whether it has attracted widespread acceptance within a relevant scientific community.
While issues such as a “known or potential error rate” are not spelled out in Rule 702, Rule 702 carries the spirit of Daubert, as error rate is an indicator of “reliable principles and methods.” There is a beauty in Rule 702’s clean brevity, but within that brevity lies ample depth for a vigorous examination of an expert opinion. Whether or not an expert’s opinion is challenged before the court, knowledge of the weaknesses in that expert’s work opens other opportunities.
Examples of some of the things we may look at, whether assisting an expert or opposing one, include:
- Assumptions: Understanding an expert’s opinion requires, at a minimum, knowing the assumptions the opinion is based on. Some experts clearly state their assumptions, while others do not. Would changing an incorrect assumption alter the expert’s conclusion? Does an underlying assumption render an expert’s conclusion irrelevant? Does an assumption reveal bias? Examining assumptions can be a rich source of insight into the expert’s work. And, when working with experts, establishing what those assumptions will be is critically important.
- Audience: Who is the expert’s report written for? We often see expert reports that state they are written exclusively for the expert’s client. This raises a question of whether the opposing party or a trier of fact is permitted to use the report. But more importantly, it raises a question of whether the report contains sufficient information for a third party to understand it.
- Biases for the Expert’s Client: There is nothing per se improper about an expert witness and the expert’s client having a good business and/or personal relationship. But whether that relationship has influenced the expert’s work, affecting its reliability, is something to be watchful for. We can assist in making that determination.
- Complexity: Experts routinely perform complex work in the development of their opinions. There is nothing inherently wrong with that. But complexity can also be used as a mask, and a tool of intimidation, for work that has a questionable foundation.
- Contradiction: Many experts have experience testifying and are leaders in their field. By examining their past testimony and the positions they’ve taken within their profession, often through publishing, it may be possible to spot issues on which they have contradicted themselves.
- Cognitive Biases: Everyone is subject to potential cognitive biases, and experts are no different. We look for biases that can adversely affect the reliability of the expert’s opinion.
- Errors of Omission and Commission: Mistakes happen. Carelessness can as well. We work to identify whether those issues are present in an expert’s work and whether they adversely affect the expert’s opinion.
- Ethical Issues: If past ethical lapses exist in the expert’s history, or there appear to be ethical issues associated with the expert’s work, it is important to know about them.
- Ipse Dixit: Experts basing aspects of their analysis solely on their “professional experience” is unfortunately common. So too are examples of “black box” results, such as calculations generated from a computer model that the expert can’t explain but insists must be trusted. Spotting this ipse dixit thinking is not always easy, but when it serves as even a component of the foundation of the opinion, it should be scrutinized. Conclusions should never be shrouded in mystery.
- Methodologies: Knowing and understanding the methodology the expert used can create opportunities. Was the methodology correctly employed? Is the methodology generally accepted, controversial, or one of a kind? What are the inherent limitations of the methodology? Is the methodology appropriate considering the theory of the case? Is the methodology supported by other methodologies the expert used?
- Qualifications: Rule 702 requires that an expert’s qualifications come from knowledge, skill, experience, training, or education. We help confirm those qualifications.
- Relevance: It is not uncommon to see brilliant work, performed by well-qualified and brilliant experts, that isn’t relevant to the litigation. Rule 702 requires that an expert’s work reliably address the facts of the case. Incorrect assumptions, vague or misapplied instructions from the client, and a misunderstanding of the theory of the case, among other issues, can result in an expert’s conclusion missing the mark.
- Replicability: Good and well-documented work should be reproducible. Replicability doesn’t mean the answer is the correct one insofar as the litigation is concerned, but it does indicate that the methodology is credible. When an expert’s work isn’t replicable, either because the procedure used is uncertain or because the results don’t align, the reliability of that work can come into question.
- Statistical and Research Problems: Statistical modeling, surveying, extrapolations, studying a sample, or other forms of research are routinely a core part of an expert’s opinion. If an error exists in the expert’s analysis, we can often spot it in this area; the brief sketch following this list illustrates one common example.
- The Ultimate Issue: Rule 704 permits an expert’s opinion to “embrace an ultimate issue.” If an expert’s opinion meets this standard, was that the intent of counsel? An opinion that is well paired to the theory of the case (the “ultimate issue”) is different from one that arrives there haphazardly. The latter may be evidence of the expert overreaching, or of the expert and counsel not being on the same page. It can also create a situation where small differences exist between the expert’s theory of the case and counsel’s. Those cracks can lead to problems.
- Understandability: As simple as it may sound, it is important to confirm whether an expert’s analysis makes sense. Is it possible to follow the logic and the story the expert tells to understand the conclusion? When evaluating your own expert, will opposing counsel and/or the trier of fact be able to follow the logic? If not, it’s typically because there is a problem with it.
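To make the statistical point concrete, the sketch below is a minimal, hypothetical illustration rather than a method drawn from any particular expert or case: it computes the margin of error for a survey proportion under simple random sampling. The 62% figure, the sample sizes, and the margin_of_error helper are invented for illustration. The takeaway is that the same headline number can carry very different uncertainty depending on sample size, and a report that presents such a number without disclosing any “known or potential error rate” invites scrutiny.

```python
import math

def margin_of_error(p_hat: float, n: int, z: float = 1.96) -> float:
    """Approximate 95% margin of error for a proportion p_hat
    estimated from a simple random sample of n responses."""
    return z * math.sqrt(p_hat * (1 - p_hat) / n)

# Hypothetical example: an expert reports that 62% of surveyed
# customers were confused. The reliability of that figure depends
# heavily on how many people were actually surveyed.
small_sample = margin_of_error(0.62, 50)     # roughly +/- 13 points
large_sample = margin_of_error(0.62, 1000)   # roughly +/- 3 points

print(f"n = 50:   62% +/- {small_sample:.1%}")
print(f"n = 1000: 62% +/- {large_sample:.1%}")
```

The same 62% point estimate is far less reliable when it rests on 50 responses than on 1,000, which is precisely the kind of undisclosed weakness our review of statistical and research work is designed to surface.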