Successful Practice Requires Science and Aesthetics: Trusting in Data and Beauty

In Praise of Data and Science

MIT’s Technology Review posted the article Trusting Data, Not Intuition.  The primary idea, which comes from Ronny Kohavi of Microsoft (and formerly of Amazon), is to use controlled experiments to test ideas.  The article can be summarized as follows:

(W)hen ideas people thought would succeed are evaluated through controlled experiments, less than 50 percent actually work out. . . . use data to evaluate an idea rather than relying on . . . intuition.  . . .  but most businesses aren’t using these principles.  . . .What’s important, Kohavi says, is to test ideas quickly, allowing resources to go to the projects that are the most helpful.  . . . “The experimentation platform is responsible for telling you your baby is really ugly,” Kohavi jokes. While that can be a difficult truth to confront, he adds, the benefit to business—and also to employees responsible for coming up with and implementing ideas—is enormous.

This article further supports my thesis that evidence-based practice, analytics, measurement, and practical experimental methodology are closely related, mutually supportive, and form a natural synthesis.
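
Kohavi’s point can be made concrete with the simplest kind of controlled experiment, an A/B comparison of conversion rates.  The sketch below is a hypothetical illustration, not anything from the article: the user counts, conversion numbers, and the 0.05 threshold are all assumptions made for the example.

```python
# A minimal, hypothetical A/B comparison: did variant B convert better than
# control A, or is the difference plausibly noise? Counts and the 0.05
# threshold below are invented for illustration.
from math import sqrt, erf

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference in conversion rates (pooled SE)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided normal tail
    return z, p_value

# Hypothetical experiment: 10,000 users per arm, 2.0% vs. 2.3% conversion.
z, p = two_proportion_ztest(conv_a=200, n_a=10_000, conv_b=230, n_b=10_000)
print(f"z = {z:.2f}, p = {p:.3f}")
if p < 0.05:  # pre-chosen significance threshold (an assumption of this sketch)
    print("Keep investing in the variant.")
else:
    print("No reliable improvement detected; redirect resources.")
```

The design choice worth noting is that the decision rule is fixed before looking at the data, which is what lets the experiment, rather than intuition, declare the baby ugly.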

In Praise of Aesthetics

I do believe that, while trusting science is an important idea, that trust should be tempered because science is a tool for decision-making and acting, not a general method for living.  A successful life of practice is a balance between the empirical and the aesthetic.  You could say that aesthetics, looking at life emotionally and holistically, is the real foundation of our experience and of how we live.  Within that frame, it is helpful to step back reflexively and consider using empirical tools to benefit our experience, but without denying our aesthetic roots.  Wittgenstein wrote on this (from the Stanford Encyclopedia of Philosophy article on Wittgenstein’s Aesthetics):

“The existence of the experimental method makes us think we have the means of solving the problems which trouble us; though problem and method pass one another by” (Wittgenstein 1958, II, iv, 232).

For Wittgenstein complexity, and not reduction to unitary essence, is the route to conceptual clarification. Reduction to a simplified model, by contrast, yields only the illusion of clarification in the form of conceptual incarceration (“a picture held us captive”).

What I want is access to the tools of science and the wisdom to know when to choose that reflexive stance.  What I’m against is:

the naturalizing of aesthetics—(which) falsifies the genuine complexities of aesthetic psychology through a methodologically enforced reduction to one narrow and unitary conception of aesthetic engagement.

A Place for Cognitive Tools in Evidence-based Practice

Vygotskian educational psychology places a high priority on mediational artifacts, or cognitive tools: things like knowledge, concepts, criteria, and schemas. These tools mediate cognition and are instrumental to activity as subjects work on an object to produce an outcome.

Activity as Vygotsky's Unit of Analysis

I spoke here about how these three elements form a unity and how the activity is the central unit of analysis.  Let’s consider an activity example relevant to evidence-based practice.
A clinician (the subject) uses the idea of evidence-based practice (the mediating artifact) to examine routine aspects of their practice (the object) with the goal of changing their practice to improve their patients’ health (the outcome).  If you find that evidence-based changes are not being made in a field, where would you look for the problem?  Many analyses have implied that there is a problem with the subjects: they are simply not using the available evidence, or their knowledge base is deficient.  I would say that it is much more likely that the solution can be found by developing an appropriate mediating artifact that can support clinicians in examining their practice.
This was the focus of Gal’perin, a prominent follower of Vygotsky.  He argued that not all cognitive tools (mediators) are of sufficient quality and that the quality of development (like the development of evidence-based practice) depends most on the quality of the cognitive tools.  Specifically, he thought that cognitive tools should be organized around, and support, the psychological functioning of the subject.  So, what are the psychological functions around which you might organize the concept of evidence-based practice?
  • First, don’t focus on the evidence; focus on the practice and use a tool that brings evidence to a practice focus.  An example might be a checklist used by a surgical team as they prepare for surgery.  The checklist reflects the available evidence and allows the team to bring that evidence to their practice focus, while preserving their cognitive capacity for addressing important aspects of their practice.
  • Second, use cognitive tools to organize information and to orient evidence toward action.  A research finding may represent important evidential information, but findings are seldom oriented to practice in a way that naturally leads to action.  An example is a network security assessment I developed.  It reflects HIPAA security requirements (the evidence) in a series of 46 questions.  The questions were structured not only to assess security status, but also to clarify an action plan that would improve that status.  This again reduces the cognitive load needed to take in an enormous amount of information in a short time span.  (A minimal sketch of this question-to-action structure follows below.)
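
A rough sketch of that question-to-action structure is below.  It is illustrative only: the questions, requirement labels, and actions are invented for the example and are not items from the actual 46-question instrument.

```python
# Hypothetical sketch: each assessment item couples a question (what to check),
# the requirement it reflects (the evidence), and a concrete remediation step.
# The items below are invented for illustration, not the actual instrument.
from dataclasses import dataclass

@dataclass
class AssessmentItem:
    question: str       # what the practitioner checks
    requirement: str    # the standard or evidence the question reflects
    action_if_no: str   # the next step when the answer is "no"

ITEMS = [
    AssessmentItem(
        question="Are accounts with access to patient data reviewed quarterly?",
        requirement="Access-management requirement (illustrative)",
        action_if_no="Schedule and document a quarterly account review.",
    ),
    AssessmentItem(
        question="Is backup media encrypted and stored off-site?",
        requirement="Data-protection requirement (illustrative)",
        action_if_no="Enable backup encryption and arrange off-site storage.",
    ),
]

def action_plan(answers):
    """Turn 'no' (or missing) answers directly into an ordered action plan."""
    return [item.action_if_no for item in ITEMS
            if not answers.get(item.question, False)]

if __name__ == "__main__":
    answers = {ITEMS[0].question: True, ITEMS[1].question: False}
    for step in action_plan(answers):
        print("-", step)
```

The point is not the code but the shape of the tool: each “no” answer lands the practitioner directly on an action, so the evidence does the organizing rather than the practitioner’s working memory.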
Vygotsky developed this idea of mediational tools, or cognitive artifacts, during the 1920s, but with the increasing importance of knowledge and other cognitive artifacts, it has never been more relevant or important.  Vygotsky was thinking mainly of children’s development, but his theory is also relevant to adults and their cognitive functioning in their work lives.

Critical Thinking, Scientific Reasoning, and the Incorporation of Evidence into Everyday Practice: A Conceptual Symbiosis

It seems to me that there is a natural affinity between evidence-based practice, scientific reasoning and critical thinking.  I think Kuhn (quoted in Dawson, 2000) captures the essence of this symbiosis:

I have undertaken here to show that these two abilities–the ability to recognize the possible falsehood of a theory and the identification of evidence capable of disconfirming it–are the foundational abilities that lie at the heart of both informal and scientific reasoning. These abilities lie at the heart of critical thinking, which similarly can be regarded, at the most global level, as the ability to justify what one claims to be true (Kuhn, 1993).

Some background considerations and directions for future thoughts and research.

  1. I’m taking the perspective that what cognitive control we have over our decisions and actions is mediated by our beliefs, theories, schemas, and prior knowledge.  Without this mediation, everyday actions would represent an unbearable cognitive load.
  2. Although there are good strategies for enabling critical thinking, at its core critical thinking is the ability and disposition to seek disconfirming evidence and use it to change our minds (beliefs, schemas, theories, etc.).
  3. Although we often equate scientific thinking with the scientific method (hypothesis testing), the core of its reasoning is also the disposition to seek and make use of disconfirming evidence.
  4. Evidence-based organizations must actively support critical thinking through their culture and in the organization of their internal processes and practices.
  5. Practice validity (seeking evidence for the validity of organizational practices) is the ability to justify the efficacy of our actions, just as Kuhn considers critical thinking to be a way to justify our claims to truth.

A shout-out to Harold Jarche, whose post Critical thinking in the organization led me down this primrose path.

References

Dawson, R. (2000). Critical Thinking, Scientific Thinking, and Everyday Thinking: Metacognition about Cognition. Academic Exchange Quarterly. Accessed 4-8-10 at http://www.thefreelibrary.com/Critical+Thinking,+Scientific+Thinking,+and+Everyday+Thinking:…-a067872702

Kuhn, D. (1993). Connecting scientific and informal reasoning. Merrill-Palmer Quarterly, 39(1), 74-103.

Two Different Ways of Implementing Evidence-based Practice and their Different Requirements for Evidence

It intuitively seems to me that there are two ways of applying evidence in Evidence-based Management.

  1. One I’ll call evidence-based decision-making (EBDM), bringing evidence into decision processes.
  2. The other I’ll refer to as evidence-supported interventions (ESI), specific practices that are empirically supported.

I suspect that EBDM will be the tougher nut to crack in practice.  This is because decision-making is often context-dependent, involves ill-structured problems, and can be cognitively complex.  (See March, 1991, for one take on this complexity.)  Decision processes require a higher level of interpretation of the evidence and can easily fall prey to logical errors.  Most thinking on decision-making has stressed that research should begin by analyzing how people make decisions in real time, not as some sort of abstract logical process.  As Daniel Kahneman (2003) puts it:

psychological theories of intuitive thinking cannot match the elegance and precision of formal normative models of belief and choice, but this is just another way of saying that rational models are psychologically unrealistic (p. 1449).

Nonetheless, evidence should inform decision processes, and I believe that evidence-supported protocols, as one example, can prepare the decision space for better decision-making outcomes.  However, this type of process also brings me closer to the second way of applying evidence: evidence-supported interventions.

Mullen, Bledsoe, & Bellamy (2008) define Evidence-supported Interventions (ESI) as

specific interventions (e.g., assessment instruments, treatment and prevention protocols, etc.) determined to have a reasonable degree of empirical support.

(Other names might include evidence-based practices, empirically supported treatments, or empirically informed interventions.)  In implementation settings, ESIs function as standardized practices: practices where all or a portion of the operational, tactical, logistical, administrative, or training aspects conform to a specific and unified set of criteria.  In other words, the contexts of implementation allow practices to be replicated exactly as they were defined and constructed in the supporting research.  In being evidence-based, it is important that critical issues flow both ways: if the contexts do not allow replication, or present confounding variables and complexity not addressed in the research, that will necessarily reduce the level of support that can be claimed for any research-supported practice.

There are many differences between EBDM and ESIs.  I would like to focus here on the different role that theory plays in each.  No data or practices are completely theory-free; all are theory- and value-laden to some extent.  Every datum, hypothesis, or piece of knowledge depends on assumptions and implications that are based in some way on theory.  But they do not all depend in the same way or to the same extent.  I will borrow from Otto & Ziegler (2008) to explain how some of these differences can be ascribed to either causal descriptions or causal explanations.  First, I agree with Otto and Ziegler, who say that

Probably, it is fair to say that the quest for causal explanation is theory driven, whereas causal description is not necessarily grounded in theory.

ESIs, because they focus on replication, do not need to be as concerned that they are relying on causal descriptions.  EBDM, however, uses evidence in a more theoretical way than in the replication of a standard practice.  Because it deals with complex, context-dependent reasoning, it needs evidence that is valid in a causally explanatory manner.

Two observations: From a strict positivist perspective, this creates a problem for EBDM because of the difficulty of achieving the necessary level of causal explanation.  Positivism lives better with an ESI approach because it can depend on causal description.  An EBDM approach, instead, must adopt an argumentative role in validating evidence.  This is the approach that validity theory has taken.  Validity theory began within a positivist framework centered on a criterion approach to validity.  As it became more and more apparent that constructs (theoretical concerns) were the central issue, it adopted a unified construct validity perspective that required an argumentative approach.  In this approach validity is never an either-or proposition, but rather a question of the level of validity achieved.  While this is not necessarily the clearest way, it is pragmatic and practical and can be implemented across a wide variety of practice settings.

Two Conclusions:

  1. EBDM is concerned with supporting naturalistic decision processes with evidence that is empirically and theoretically supported and can be easily included in that decision process.
  2. ESIs are concerned with practices, protocols and processes that can function in a standardized manner through the replication of empirically supported research interventions.

References

Kahneman, D. (2003). Maps of Bounded Rationality: Psychology for Behavioral Economics. The American Economic Review, 93(5), 1449-1475.

March, J. G. (1991). How Decisions Happen in Organizations. Human-Computer Interaction, 6, 95-117. Accessed 02-15-2010 at http://choo.fis.utoronto.ca/fis/courses/lis2176/Readings/march.pdf

Mullen, E. J., Bledsoe, S. E., & Bellamy, J. L. (2008). Implementing Evidence-Based Social Work Practice. Research on Social Work Practice, 18(4), 325-338.

Otto, H., & Ziegler, H. (2008). The Notion of Causal Impact in Evidence-Based Social Work: An Introduction to the Special Issue on What Works? Research on Social Work Practice, 18(4), 273-277.

Why Interpretation is the Cornerstone of Evidence-based Data Driven Practice

This post responds to a comment by Richard Puyt; I thought I would try to explain my ideas on interpretation and evidence more completely.

First, a first-order belief of mine: data-driven practices that are supported and shown to be valid by research evidence are the best way to establish or improve business practices.  Common sense is not a good way to run your business, because it is often wrong.  However, you also need a good theory or mental framework to make sense of your data, and you need a broad evaluation framework to understand and explain how your research relates to your practice.  Without good frameworks, your level of analysis falls back to common sense, no matter how much data you have available.  It can simply become a case of garbage in, garbage out.

This is the point Stanley Fish makes in the NY Times Opinionator blog when he says:

. . . there is no such thing as “common observation” or simply reporting the facts. To be sure, there is observation and observation can indeed serve to support or challenge hypotheses. But the act of observing can itself only take place within hypotheses (about the way the world is) . . . because it is within (the hypothesis) that observation and reasoning occur.  (I blogged about this before here)

Your observations, be they data, measures, or research results, need to be interpreted, and that can only occur within an interpretive framework such as a good theory or hypothesis.  Furthermore, the quality of your analysis will depend as much on the quality of your interpretive framework as it does on the quality of your data.

Examples

Performance Measurement:  (I previously blogged about this here.)  Any performance measure implies a theoretical rationale that links performance with the measure.  This theoretical relationship can be tested, validated, and improved over time.  It is not just that you are using a data-driven performance system, but that you also have a well-supported way of interpreting the data.
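
As a minimal sketch of what testing that relationship might look like (the measure, the outcome, and the quarterly figures below are all hypothetical), one simple starting check is whether the measure actually tracks the outcome it is theorized to drive:

```python
# Hypothetical sketch: does a performance measure track the outcome it is
# supposed to predict? A weak or unstable correlation over time is a cue to
# revisit the theoretical rationale behind the measure, not proof of causation.
from statistics import mean, stdev

def pearson_r(xs, ys):
    """Sample Pearson correlation between two equal-length sequences."""
    mx, my, sx, sy = mean(xs), mean(ys), stdev(xs), stdev(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / ((len(xs) - 1) * sx * sy)

# Invented quarterly data: the measure (e.g., calls resolved per hour) and the
# outcome it is assumed to drive (e.g., customer retention, in percent).
measure = [12.1, 13.4, 11.8, 14.2, 15.0, 13.9]
outcome = [88.0, 89.5, 87.2, 90.1, 91.3, 89.8]

print(f"r = {pearson_r(measure, outcome):.2f}")
```

A single correlation is only the crudest version of the idea; the point is that the rationale linking measure to performance is something you keep testing, not something you assume.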

Research Evidence: When conducting a quantitative study, care is taken in choosing a sample and in controlling for a wide range of potential confounding variables.  The resulting effects may show a causal relationship that can be trusted.  However, you cannot then assume that these results can be directly applied to your business, where all the confounding variables are back in play and where the sample and context may be different.  The study may be a very important piece of evidence, but it should only be a piece in a larger body of evidence.  This evidence can be the basis for a theory (what Fish calls a hypothesis) and used as a basis for a practice that is data-driven (what Fish calls observation), but this practice needs to be tested and validated on its own merits, not simply because it relates to a research study.

This is the basis of evidence-based, data-driven practice: good data, derived from good measures, with a good understanding of how the measures relate to your practice, an understanding that is tested over time.  This is not too hard to do and should be a foundation of business education.

A New Path for Organizational Learning? Developing Discipline Specific Higher Order Thinking Skills for Evidence-based Practice

I am thinking of two ways of addressing evidence-based practice, two ways in which one might devise consultative approaches for moving organizations toward evidence-based practice.  The one I have been discussing lately is to evaluate processes, practices, and practice protocols in terms of the evidence for their validity.  A second way is an educational approach: to develop individual and team abilities in the higher-order thinking skills necessary to collect and use evidence in daily decision-making.  This is the approach taken by Middendorf and Pace (2004).  As they point out, the types of higher-order skills needed in many situations are often tied to specific disciplinary ways of thinking rather than to generic formulas for higher-order thinking.  Their way of modeling the analysis skills needed to interpret and apply evidence is called decoding the disciplines, which can be conceived as seven steps for uncovering and resolving problematic or unsuccessful thinking:

  1. Identify bottlenecks: places where evidence is not being used or where analysis is breaking down.
  2. Identify how experts respond to these types of situations.
  3. Identify how expert thinking can be modeled.
  4. Devise feedback methods to scaffold expert thinking.
  5. Devise ways to motivate learners to progress toward expert thinking.
  6. Devise assessments to monitor progress.
  7. Plan for sharing learning and making this approach a part of the organizational culture.

The latest issue of The Chronicle of Higher Education (11-18-09) reports on the attempt to develop this approach at Indiana University in Bloomington.  David Pace’s history courses at IU attempt to develop two skills that he feels are core to the discipline of history: “assembling evidence and interpreting it”.

“Students come into our classrooms believing that history is about stories full of names and dates,” says Arlene J. Díaz, an associate professor of history at Indiana who is one of four directors of the department’s History Learning Project, as the redesign effort is known. But in courses, “they discover that history is actually about interpretation, evidence, and argument.”

The Chronicle reports that the history curriculum at IU is now organized around specific analytic skills and the different course levels by which they should be mastered.

Volume 98 of the journal New Directions for Teaching and Learning was devoted entirely to this topic.  It includes examples of the decoding methodology as it is applied to history, marketing, statistics, genetics, molecular biology, astronomy, the humanities, physiology, and a specific chapter devoted to supporting the assessment step.

I have a kind of initial excitement about this approach.  I’ve long known that learning and education are important to all kinds of organizations, and I’ve always been enamored of the meme that businesses must become more like universities.  Decoding the Disciplines is a potential methodology that could cross over between these two very different universes and also provide a model for organizational learning.

References

Middendorf, J. & Pace, D. (2004). Decoding the Disciplines: A Model for Helping Students Learn Disciplinary Ways of Thinking, New Directions for Teaching and Learning, 98, 1-12.

available at http://www.iub.edu/~tchsotl/part3/Decoding%20Middendorf.pdf

Glenn, D. (2009). A Teaching Experiment Shows Students How to Grasp Big Concepts. The Chronicle of Higher Education, Nov. 18, 2009.

More on the Research Practice Gap and Evidence-Based Practice

How Do People Approach Evidence-Based Practice?

Tracy at the Evidence Soup blog has a recent post that got me thinking that the processes supporting Evidence-based Practice (EBP) must be centered on actual clinical practices (not some abstract formulation of practice) and that these processes should include both research and clinical expertise.  Tracy reviews an article in the July issue of Clinical Child Psychology and Psychiatry (How do you apply the evidence? Are you an improver, an adapter, or a rejecter? by Nick Midgley).  I hope to review the article myself soon, but my library resources do not yet have the July issue, so my take at this time depends on Tracy’s description.

Here is my first take on the article:

Rejectors seem to be rejecting a positivist version of EBP when they discuss normative, prepackaged practices.  This is defensible; there is no reason to follow in the positivists’ footsteps.

Innovators seem to be focusing on a top-down “push” approach.  First, while research in this vein is important, technology and networks are moving toward a pull approach: giving answers to practitioners when they need them.  Second, in addition to a top-down approach, there is also a need for a deep bottom-up understanding of practice: understanding practice needs and devising how dissemination models can meet those needs.  Understanding transfer problems may have the question backwards.

Adapters – I like this approach for the most part, with two caveats.  First, it looks like it is falling into the qualitative/quantitative divide that I dislike.  I believe that you choose the methodology to fit the research question.  Qualitative research is needed to gain a deep understanding of practices or to unearth value issues, but I’ve seen too many qualitative studies that tried to answer quantitative-type research questions (e.g., which intervention is better).  Second, coming from a validity perspective, I believe that all kinds of data can be integrated to arrive at an inferential judgement on practice validity.  Especially in medicine, I think we often have correlation-based research data without much theory or practice-based understanding.  We need to understand practices from multiple perspectives that come together, like the pieces of a puzzle, to make a coherent picture.

Another Way to Approach the Research Practice Gap from a Post-Positivist Perspective

One of Samuel Messick’s validity innovations was to connect construct validity with utility, values, and consequences in a progressive matrix.  His original matrix can be found on page 27 of his 1995 Am Psych article, available here.  What I have done is adapt this matrix to what it might look like for Evidence-Based Practice.  (The graphic is at the end of this post.)  (I believe Messick’s use of the term Test Use is analogous to clinical experience, which I have termed Clinical Evidence and Judgement.  Tests exist as artifacts, and I believe that a practice, although more concrete, can also be analyzed as an artifact in much the same way that Messick analyzes tests.)

Messick presents this as a matrix, which is how I have used it as well, but it could also be viewed as a stepwise process.

  • Step 1. Inferences from research data and syntheses form the evidentiary basis for Practice Validity (PV).
  • Step 2. PV + Clinical Evidence and Judgement form the evidentiary basis for the Relevance and Utility (RU) of the practice.
  • Step 3. PV + Inferences from Research form the consequential basis that informs the clinician of the Value Implications (VI) of a practice.
  • Step 4. PV + RU + VI + Social Consequences form the consequential basis for clinical evidence regarding practice use.

The bottom line is that the clinical evidence for using a practice is the total of practice validity, judgements of relevance and utility, the value implications drawn from research inferences, and evidence for the personal and social consequences of the practice.
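
As a purely illustrative restatement, the four steps can be written as a mapping from each judgement to the evidence that feeds it.  The step labels come from the adapted matrix above; the dictionary form is my own simplification and does not compute or weigh anything.

```python
# Illustrative only: the dependency structure of the four steps above,
# written out so the flow from evidence to judgement is explicit.
steps = {
    "Practice Validity (PV)":     ["inferences from research data and syntheses"],
    "Relevance and Utility (RU)": ["PV", "clinical evidence and judgement"],
    "Value Implications (VI)":    ["PV", "inferences from research"],
    "Clinical evidence for use":  ["PV", "RU", "VI", "social consequences"],
}

for judgement, basis in steps.items():
    print(f"{judgement} <- {' + '.join(basis)}")
```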

Discussion always welcome!

[Graphic: Messick’s progressive matrix adapted for evidence-based practice]

It’s Time for Change in Scientific Communication and Dissemination

In Publishing Science on the Web, John Wilbanks of the Common Knowledge blog joins the discussion about the future of scientific dissemination in the digital age:

. . . science is already a wiki if you look at it a certain way. It’s just a really, really inefficient one – the incremental edits are made in papers instead of wikispace, . . .  And the papers are written in a highly specialized form of text that demonstrates the expertise of the writer in the relevant domain, but can form a language barrier to scientists outside the domain.  . . .  How can we get to enough technical standards so that this kind of science can be harvested, aggregated, and mashed up by people and machines into a higher level of discipline traversal? . . .  But the language barrier among scientists is preserved – indeed, made worse – by the lack of knowledge interoperability at the machine level. It’s the Tower of Babel made digital.

This raises two really important issues in scientific communication and dissemination that are critical for technological progress and for evidence-based practice.  One is the organization of scientific findings, which are scattered across various journals instead of gathered in collaborative, consolidated instruments like wikis.  Time is the real information problem today, and some form of wiki is the answer.  The second issue is knowledge interoperability.  Precise language is important in scientific communication, but I still get the feeling that current writing styles and vocabularies in many disciplines, when you look at their function, have more to do with politics than communication.

Thanks to George Siemens for the pointer.

Making Inferences about the Use of Artifacts in Practices

This post is an attempt to think through what I brought up in my last post: applying the concept of validity to practice.


I remember hearing in school that validity asked the question: “Does the test measure what it’s intended to measure?”  The problem with this type of approach is that it leads you in circles, both practically and epistemologically.  Messick changed this to a question that was, quite literally, more consequential: Is there evidence that the use of the test brings about, or contributes to, the results you intended?


If you view a test or assessment as an artifact embedded in a practice, you can apply the same type of logic to any artifact that plays an active role in that practice.  In artifact creation, as in Holmstrom’s article, the logic of validity could be applied as a guide.  Although there would still be an artistic element, it would not be random or unsupported.


There is an evidentiary aspect to this way of considering artifacts.  Validity is all about evidence: theoretical evidence, process evidence, empirical evidence, consequential evidence, generalizability evidence.  In fact, validity theory can be a way of accounting for evidence from all types of methodology.  Randomized controlled trials are still the best way of judging validity for certain types of research questions, but any type of method can contribute evidence.

Summary: Validity is a well-developed body of thought that can be applied to making inferential judgements about the evidence supporting the use of assessments, or other active artifacts, in the context of a specific practice.

Writing to Tame the Chaos

I recently found great writing and revising prompts and suggestions in the Tomorrow’s Professor blog article by Gina Hiatt, Ph.D.: 851. Reducing Over-Complexity in Your Scholarly Writing.

The first one struck me as an illustration of distributed cognition: how we use external aids to add structure and extend our thinking.

Write to find out what you think. Your thoughts will be somewhat muddled until you get them in writing. Don’t go around and around in circles internally until you know what to write. Write before you know what you’re going to say.

Learn to tolerate some degree of confusion, and yes, complexity in your early writing. I’ve noticed that many academics get panicky when their first draft is a mess. It’s supposed to be a mess! Have faith in the revision process.

I really do need to get something down “on the page” before I understand the implications of what I’m thinking.  Writing compensates for the limitations of short-term and working memory, but it’s more than that too!  It’s also the back-and-forth, give-and-take of the revising process.  I’m revising my thoughts and ideas while I get the first words down.  This is one of the main reasons I blog: to work out ideas on the page and over time.

Good thinking, writing, and communicating should go hand in hand.  I also think there are no principled breaks in the chiastic relationships between thought, writing, and communicating.  There is a common assumption that academic writing is for scholars and not for the rest of us.  In one sense academics are writing for other academics, and thus their writing serves an instrumental purpose, but good writing should also serve a broader purpose: it should communicate, and inspire good thinking, for non-academics.

Why are non-academics not exposed to good thinking, and why are academics not writing in ways that better influence practitioners?

  • I think there are low expectations for non-academics. To anyone familiar with the literature on teacher expectations, this should send up red flags!  It is easy to downplay potential via expectations.
  • I think academics get too caught up in a need for complexity that serves only disciplinary vocabulary and categorization schemes, not the underlying thinking.  Look at the current economic and financial mess and those complex derivatives.  It’s looking more and more like fairly simple fraud that people perpetrated on themselves and on the rest of us by using complex vocabulary and mathematical formulas to cover up what was basically simple.  Thought can be complex, but there is a lot of complex writing that shouldn’t be.

I shouldn’t be ranting, not with MY dissertation; I’m just working to try to get better.