Why Interpretation Is the Cornerstone of Evidence-Based, Data-Driven Practice

This post responds to a comment by Richard Puyt; I thought I would try to explain my ideas on interpretation and evidence more completely.

First, a first-order belief of mine: data-driven practices, supported and validated by research evidence, are the best way to establish or improve business practices.  Common sense is not a good way to run a business, because it is often wrong.  However, you also need a good theory or mental framework to make sense of your data, and you need a broad evaluation framework to understand and explain how your research relates to your practice.  Without good frameworks, your level of analysis falls back to common sense, no matter how much data you have available.  It simply becomes a case of garbage in, garbage out.

This is the point Stanley Fish makes in the NY Times Opinionator blog when he says:

. . . there is no such thing as “common observation” or simply reporting the facts. To be sure, there is observation and observation can indeed serve to support or challenge hypotheses. But the act of observing can itself only take place within hypotheses (about the way the world is) . . . because it is within (the hypothesis) that observation and reasoning occur.  (I blogged about this before here)

Your observations, be they data, measures, or research results, need to be interpreted, and that can only occur within an interpretive framework such as a good theory or hypothesis.  Furthermore, the quality of your analysis will depend as much on the quality of your interpretive framework as it does on the quality of your data.

Examples

Performance Measurement:  (I previously blogged about this here.)  Any performance measure implies a theoretical rationale that links performance with the measure.  This theoretical relationship can be tested, validated, and improved over time.  It is not just that you are using a data-driven performance system, but that you also have a well-supported way of interpreting the data.
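As a concrete illustration, here is a minimal sketch, not from the original post, of what testing the measure-performance link might look like. The measure, the outcome, and all numbers are hypothetical; the point is only that the assumed relationship is something you can check against data.

```python
# Minimal sketch: testing whether a performance measure actually tracks the
# outcome our theory says it should. All names and numbers are hypothetical.
import numpy as np
from scipy import stats

# Hypothetical monthly data
calls_handled_per_hour = np.array([12, 15, 11, 18, 14, 16, 13, 17, 19, 10])
customer_retention_rate = np.array([0.81, 0.84, 0.79, 0.88, 0.83,
                                    0.85, 0.80, 0.87, 0.90, 0.78])

# Strength of the assumed measure-outcome link
r, p_value = stats.pearsonr(calls_handled_per_hour, customer_retention_rate)
print(f"correlation r = {r:.2f}, p = {p_value:.3f}")

# If the link is weak or unstable over time, the interpretive framework
# (the rationale connecting measure and performance) needs revision,
# not just more data collection.
```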

Research Evidence: When conducting a quantitative study, care is taken in choosing a sample and in controlling for a wide range of potential confounding variables.  The effects reported as research results may therefore reflect a causal relationship that can be trusted.  However, you cannot then assume that these results apply directly to your business, where all the confounding variables are back in play and where the sample and context may differ.  The study may be a very important piece of evidence, but it should only be one piece in a larger body of evidence.  This evidence can be the basis for a theory (what Fish calls a hypothesis) and used as the basis for a data-driven practice (what Fish calls observation), but this practice needs to be tested and validated on its own merits, not simply because it relates to a research study.
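One way to do that testing, sketched below under purely hypothetical assumptions (the groups, metric, and numbers are my own illustration, not the post's), is to pilot the research-derived practice in your own context and compare outcomes against teams that did not adopt it.

```python
# Minimal sketch: validating a research-derived practice on its own merits
# by piloting it in your own context. All labels and numbers are hypothetical.
import numpy as np
from scipy import stats

pilot_teams = np.array([0.62, 0.71, 0.68, 0.74, 0.66, 0.70])       # adopted the practice
comparison_teams = np.array([0.58, 0.63, 0.60, 0.65, 0.59, 0.61])  # business as usual

t_stat, p_value = stats.ttest_ind(pilot_teams, comparison_teams)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")

# A favorable result is evidence that the practice works in *this* context,
# with everyday confounds back in play -- which is the test that matters.
```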

This is the basis of evidence-based, data-driven practice: good data, derived from good measures, with a good understanding of how the measures relate to your practice, an understanding that is tested over time.  This is not too hard to do, and it should be a foundation of business education.

Evidence and Interpretation: Two Sides of the Same Coin

Robert Aronowitz wrote an interesting historical analysis of the background to the current mammogram debate, titled Addicted to Mammograms.  His analysis provides another layer of meaning to the debate, which is exactly my point: evidence must be interpreted.  Aronowitz infers that the people on the Preventive Services Task Force, who made recommendations based on their interpretation of the evidence, didn't understand how it would be re-interpreted in the media and the health industry, especially in the context of the current health insurance reform debate.  But evidence and interpretation are two sides of the same coin.  One side may be stamped with permanent marker and the other with erasable marker, but you can't have a one-sided coin.  . . . Well, . . . maybe physicists can have a one-sided coin, but not the rest of us mortals.

A New Path for Organizational Learning? Developing Discipline-Specific Higher-Order Thinking Skills for Evidence-Based Practice

I am thinking of two ways of addressing evidence-based practice, two ways in which one might devise consultative approaches for moving organizations toward evidence-based practice.  The one I have been discussing lately is to evaluate processes, practices, and practice protocols in terms of the evidence for their validity.  A second way is an educational approach: to develop individual and team abilities in the higher order thinking skills that are necessary to collect and use evidence in daily decision-making.  This is the approach taken by Middendorf and Pace (2004).  As they point out, the types of higher order skills needed in many situations are often tied to specific disciplinary ways of thinking rather than to generic formulas for higher order thinking.  Their way of modeling the analysis skills needed to interpret and apply evidence is called decoding the disciplines, which can be conceived of as seven steps for uncovering and resolving problematic or unsuccessful thinking (a rough sketch of how these steps might be tracked follows the list):

  1. Identify bottlenecks: places where evidence is not being used or where analysis is breaking down.
  2. Identify how experts respond to these types of situations.
  3. Identify how expert thinking can be modeled.
  4. Devise feedback methods to scaffold expert thinking.
  5. Devise ways to motivate learners to progress toward expert thinking.
  6. Devise assessments to monitor progress.
  7. Plan for sharing what is learned and for making this approach a part of the organizational culture.
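Here is the rough sketch mentioned above: a minimal, purely illustrative data structure (my own, not Middendorf and Pace's) for tracking an organizational bottleneck through the seven decoding steps.

```python
# Minimal sketch: tracking a bottleneck through the seven decoding steps.
# The step wording, class names, and example are hypothetical illustrations.
from dataclasses import dataclass, field
from typing import List, Optional

DECODING_STEPS: List[str] = [
    "identify bottleneck",
    "identify expert response",
    "model expert thinking",
    "devise feedback to scaffold expert thinking",
    "devise ways to motivate learners",
    "devise assessments to monitor progress",
    "plan for sharing and embedding in the culture",
]

@dataclass
class Bottleneck:
    """A place where evidence is not being used or analysis breaks down."""
    description: str
    completed_steps: List[str] = field(default_factory=list)

    def complete_step(self, step: str) -> None:
        if step not in DECODING_STEPS:
            raise ValueError(f"unknown step: {step}")
        self.completed_steps.append(step)

    def next_step(self) -> Optional[str]:
        remaining = [s for s in DECODING_STEPS if s not in self.completed_steps]
        return remaining[0] if remaining else None

# Usage
b = Bottleneck("Managers ignore churn data when planning staffing")
b.complete_step("identify bottleneck")
print(b.next_step())  # -> "identify expert response"
```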

The latest issue of The Chronicle of Higher Education (11-18-09) reports on the attempt to develop this approach at Indiana University in Bloomington.  David Pace's history courses at IU attempt to develop two skills that he feels are core to the discipline of history: “assembling evidence and interpreting it”.

“Students come into our classrooms believing that history is about stories full of names and dates,” says Arlene J. Díaz, an associate professor of history at Indiana who is one of four directors of the department’s History Learning Project, as the redesign effort is known. But in courses, “they discover that history is actually about interpretation, evidence, and argument.”

The Chronicle reports that the history curriculum at IU is now organized around specific analytic skills and the course levels at which each should be mastered.

Volume 98 of the journal New Directions for Teaching and Learning was devoted entirely to this topic.  It includes examples of the decoding methodology applied to history, marketing, statistics, genetics, molecular biology, astronomy, the humanities, and physiology, along with a chapter devoted to supporting the assessment step.

I have a kind of initial excitement about this approach.  I've long known that learning and education are important to all kinds of organizations today, and I've always been enamored of the meme that businesses must become more like universities.  Decoding the Disciplines is a potential methodology that could cross over between these two very different universes and also provide a model for organizational learning.

References

Middendorf, J., & Pace, D. (2004). Decoding the Disciplines: A Model for Helping Students Learn Disciplinary Ways of Thinking. New Directions for Teaching and Learning, 98, 1-12. Available at http://www.iub.edu/~tchsotl/part3/Decoding%20Middendorf.pdf

Glenn, D. (2009). A Teaching Experiment Shows Students How to Grasp Big Concepts. The Chronicle of Higher Education, Nov 18, 2009.

Communicating Evidence Over Different Cognitive Frameworks: Overcoming Incommensurability in Communication Frameworks Between Research and Practice

My recent posts have highlighted the differences between scientific research and other types of practice as they relate to the design of evidence-based practice.  Previously I discussed how the larger scope of practice changes the epistemological needs of practice knowledge.*  In this post I take up Nicolay Worren's paper, which suggests that the cognitive frameworks managers use to guide and control business practice are also different from those used to disseminate scientific research.  (Worren, N., Moore, K. & Elliott, R. (2002). When Theories Become Tools: Towards a Framework for Pragmatic Validity. Human Relations, 55(10), 1227-1250.)

Nicolay notes that science is typically conducted and disseminated within propositional frameworks (often steeped in dense scientific vocabulary), while managers depend more on narrative and visual cognitive and communication frameworks constructed in everyday language.  This can result in two problems:

1. People do not really understand the generalizable meaning of research because it is buried within obscure propositional frameworks.  Good communication must be constructed with the ability to span different cognitive and situational frameworks.  Consider the following quote from R.L. Ackoff **:

Until we communicate to our potential users in a language they can understand, they and we will not understand what we are talking about. If Einstein could do it with relativity theory, we should be able to do it with systems thinking (Einstein and Infeld, 1951). It is easy to hide the ambiguity and vagueness in our own thinking behind jargon, but almost impossible to do so when speaking or writing in ordinary language.
We have developed a vocabulary that equips our students with the ability to speak with authority about subjects they do not understand. Little wonder they do not become effective spokespersons to potential users.
Ackoff, R.L. (2006). Why Few Organizations Adopt Systems Thinking, http://ackoffcenter.blogs.com/ackoff_center_weblog/files/Why_few_aopt_ST.pdf

2. Managers must deal with practices that have a wide scope and a high level of complexity.  Just as this creates different epistemological requirements for knowledge, it also entails different cognitive requirements for understanding and communicating.  While propositional frameworks are good for maintaining precision in deductive arguments, they lack the speed of communication and the ability to convey complexity, change over time, or emotion that narrative and visual frameworks provide.

Different cognitive frameworks can make the languages of research and practice not only difficult to translate but almost incommensurable.  Again, I don't think that research and practice are incommensurable, but you will need to engage in inductive processes to bring them together appropriately.

Notes

* Primarily this refers to the inability to design practices scientifically, that is, with the amount of variable control necessary to ensure the level of internal validity we see in research.  Without this level of control, generalizing research findings to uncontrolled situations is questionable.  This does not mean that research is not relevant.  It means that decisions depend on inductive processes (as opposed to the deductive processes common in research) and that these processes are aligned with a goodness-of-fit model of verification (as opposed to deductive truth).

** I’m also indebted to Nicolay for pointing me to Ackoff and this source.  See his blog post:

http://www.nicolayworren.com/2009/11/russell-ackoff-1919-2009.html