Why Interpretation is the Cornerstone of Evidence-based Data Driven Practice

This post responds to a comment by Richard Puyt; I thought I would try to explain my ideas on interpretation and evidence more completely.

First, a first-order belief of mine: data-driven practices that are supported and validated by research evidence are the best way to establish or improve business practices. Common sense is not a good way to run your business, because it is often wrong. However, you also need a good theory or mental framework to make sense of your data, and you need a broad evaluation framework to understand and explain how your research relates to your practice. Without good frameworks, your level of analysis falls back to common sense, no matter how much data you have available. It can simply become a case of garbage in = garbage out.

This is the point Stanley Fish makes in the NY Times Opinionator blog when he says:

. . . there is no such thing as “common observation” or simply reporting the facts. To be sure, there is observation and observation can indeed serve to support or challenge hypotheses. But the act of observing can itself only take place within hypotheses (about the way the world is) . . . because it is within (the hypothesis) that observation and reasoning occur.  (I blogged about this before here)

Your observations, be they data, measures or research results, need to be interpreted, and that can only occur within an interpretive framework such as a good theory or hypothesis. Furthermore, the quality of your analysis will depend as much on the quality of your interpretive framework as it does on the quality of your data.

Examples

Performance Measurement:  (I previously blogged about this here.)  Any performance measure implies a theoretical rationale that links performance with the measure. This theoretical relationship can be tested, validated and improved over time. It is not just that you are using a data-driven performance system, but that you also have a well-supported way of interpreting the data.
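To make this concrete, here is a minimal Python sketch of what "testing the rationale behind a measure" could look like. The measure, the outcome, the numbers and the 0.5 threshold are all made-up illustrations rather than a prescribed method; the point is only that the link between measure and performance is itself something you check against data.

```python
# Minimal sketch: checking whether a performance measure actually tracks
# the outcome it is assumed to represent. All numbers are illustrative.
import numpy as np

# Quarterly values of a measure (e.g. a service KPI) and the business
# outcome the theory says it should track (e.g. customer retention).
measure = np.array([0.62, 0.71, 0.68, 0.75, 0.80, 0.78, 0.85, 0.83])
outcome = np.array([0.55, 0.60, 0.58, 0.66, 0.70, 0.69, 0.74, 0.72])

# Pearson correlation as a first, crude test of the theoretical link.
r = np.corrcoef(measure, outcome)[0, 1]
print(f"correlation between measure and outcome: {r:.2f}")

# A weak link means the interpretive framework, not just the data,
# needs revisiting: the measure may not mean what the theory claims.
if r < 0.5:  # threshold is illustrative, not a rule
    print("weak link: revisit the rationale behind this measure")
else:
    print("measure looks consistent with its theoretical rationale")
```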

Research Evidence: When conducting a quantitative study, care is taken in choosing a sample and in controlling for a wide range of potential confounding variables. The resulting effects may show a causal relationship that can be trusted. However, you cannot then assume that these results can be directly applied to your business, where all the confounding variables are back in play and where the sample and context may be different. It may be a very important piece of evidence, but it should only be a piece in a larger body of evidence. This evidence can be the basis for a theory (what Fish calls a hypothesis) and used as a basis for a practice that is data-driven (what Fish calls observation), but this practice needs to be tested and validated on its own merits, not merely because it relates to a research study.
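As an illustration of testing the practice on its own merits, here is a hedged Python sketch of re-checking a published effect in your own operational data. The workflow names, the simulated timings and the use of a Welch t-test are assumptions made for the example, not the method of any particular study.

```python
# Minimal sketch: re-testing a research finding in your own context,
# where the study's controls are no longer in place. Data is simulated.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Suppose a controlled study found that a new workflow shortens task
# completion time. Check whether the effect appears in local data too.
old_workflow = rng.normal(loc=30.0, scale=6.0, size=120)  # minutes
new_workflow = rng.normal(loc=28.5, scale=6.0, size=120)

t_stat, p_value = stats.ttest_ind(new_workflow, old_workflow, equal_var=False)
improvement = old_workflow.mean() - new_workflow.mean()

print(f"local improvement: {improvement:.1f} minutes (p = {p_value:.3f})")
# The published study is one piece of evidence; this local test is another.
# Only together, and repeated over time, do they justify the practice.
```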

This is the basis of evidence-based, data-driven practice: good data, derived from good measures, with a good understanding of how the measures relate to your practice, an understanding that is tested over time. This is not too hard to do, and it should be a foundation of business education.

2 thoughts on “Why Interpretation is the Cornerstone of Evidence-based Data Driven Practice”

  1. Hi Howard,
    The last paragraph captures the essence of the strength and weakness of evidence-based/informed (management) decision making. What is the measure? What is the definition of the measure, and what is the quality of the data on which conclusions are drawn and decisions are made? Before you know it, you’ll wander off into the realm of the philosophy of science.

    Thanks also for the interesting link you left on my blog. I’ll return the favour by giving you this one: http://bobsutton.typepad.com/my_weblog/2009/11/intuition-vs-datadriven-decisionmaking-some-rough-ideas.html in which he muses about intuition and data-driven decision making.

    By the way, have you read the books by Nassim Nicholas Taleb?
    I recommend Fooled by Randomness and The Black Swan. They deal with decision making, risk appraisal and, most of all, epistemological arrogance, ignorance and misperception.

    Have a nice weekend!

  2. Richard,
    Thanks for your response. I’ve taken it as a challenge to clarify or adapt, and I value such challenges greatly.

    1. A specific response – It’s not that I want to broach esoteric topics like those found in the philosophy of science, but I do want to understand issues at their fundamental level and not reduce evidence-based practice to a simplistic case of following rules. Understanding the fundamentals, and doing what is needed accordingly, will give clear and rational direction to any task. A simplistic understanding of the evidence-based movement will make it susceptible to becoming just another passing fad.
    2. I’m working on another post to include more background and evidence relevant to this general topic before moving on.
    Thanks again for your interest and attention.
