A Practice Perspective on the Quants and the Humanists

Lee Drutman responded to Timothy Egan’s New York Times article about creativity and Big Data.

First, Egan says that companies like Amazon that are built on quantitative methods are not creative because they have “marginalized messiness”.  Drutman responds that “(d)ata analysis and everything that goes into it can be highly creative”, meaning (I guess) that Quants can get down in the mess too.  Both are good points, but both miss another aspect that unites the arts / humanities and the sciences, and this is the heart of my argument.  Both are creating practices that affect our lives in important ways.  The point is that we all create.  It’s not whether we are or are not creative; it’s a question of what we are creating.  From John Shotter’s Cultural Politics of Everyday Life:

But now, many take seriously Foucault’s (1972: 49) claim that our task consists of not – of no longer – treating discourses as groups of signs . . . but as practices that systematically form the objects of which they speak.

In other words, the question is not whether the Quants are creative, but whether their analyses treat me as an object to be controlled, or as a human being whose being the analysis respects.  That’s called ontologically responsible assessment.  Again, from Shotter:

I want to argue not for a radical change in our practices, but for a self-conscious noticing of their actual nature.

We should offer people clear and understandable analysis that helps them make new connections, but that also respects and is responsible to their rights as persons.  Yes, as Lee claims, the sciences and the humanities can work together.  But beyond that, they are both human-based social practices.  If we see them as practices à la Foucault, they have far more in common than not.  Both are not only creative; they are creating.

Why Interpretation is the Cornerstone of Evidence-Based Data-Driven Practice

This post responds to a comment by Richard Puyt; in it I try to explain my ideas on interpretation and evidence more completely.

First, a first-order belief of mine: data-driven practices, supported and validated by research evidence, are the best way to establish or improve business practices.  Common sense is not a good way to run your business, because it is often wrong.  However, you also need a good theory or mental framework to make sense of your data, and you need a broad evaluation framework to understand and explain how your research relates to your practice.  Without good frameworks, your level of analysis falls back to common sense, no matter how much data you have.  It simply becomes a case of garbage in = garbage out.

This is the point Stanley Fish makes in his NY Times Opinionator blog when he says:

. . . there is no such thing as “common observation” or simply reporting the facts. To be sure, there is observation and observation can indeed serve to support or challenge hypotheses. But the act of observing can itself only take place within hypotheses (about the way the world is) . . . because it is within (the hypothesis) that observation and reasoning occur.  (I blogged about this before here)

Your observations, be they data, measures, or research results, need to be interpreted, and that can only occur within an interpretive framework such as a good theory or hypothesis.  Furthermore, the quality of your analysis will depend as much on the quality of your interpretive framework as it does on the quality of your data.

Examples

Performance Measurement:  (I previously blogged about this here.)  Any performance measure implies a theoretical rationale that links the measure to performance.  This theoretical relationship can be tested, validated, and improved over time.  It is not just that you are using a data-driven performance system, but that you also have a well-supported way of interpreting the data.
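
As a minimal sketch of what “testing the relationship” might look like in practice, the snippet below checks how well a proxy measure tracks the outcome it is supposed to predict.  The measure, outcome, and all numbers are invented for illustration; the point is only that the link between measure and performance is itself something you can examine.

```python
# Hypothetical check of a performance measure against the outcome it
# is supposed to predict. Names and data are illustrative only.
import numpy as np
from scipy.stats import pearsonr

# Proxy measure collected per employee (e.g., weekly calls handled)
measure = np.array([120, 95, 140, 110, 80, 130, 105, 90])
# Outcome the measure is assumed to track (e.g., customer retention %)
outcome = np.array([88, 82, 91, 86, 75, 90, 84, 79])

r, p_value = pearsonr(measure, outcome)
print(f"correlation r = {r:.2f}, p = {p_value:.3f}")

# A weak or unstable correlation suggests the theoretical rationale
# linking the measure to performance needs revisiting, not more data.
```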

Research Evidence: When conducting a quantitative study, care is taken in choosing a sample and in controlling for a wide range of potential confounding variables.  The resulting effects may show a causal relationship that can be trusted.  However, you cannot then assume that these results apply directly to your business, where all the confounding variables are back in play and where the sample and context may differ.  The study may be a very important piece of evidence, but it should be only one piece in a larger body of evidence.  This evidence can be the basis for a theory (what Fish calls a hypothesis) and for a practice that is data-driven (what Fish calls observation), but that practice needs to be tested and validated on its own merits, not merely because it relates to a research study.
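
One way to test a practice on its own merits, sketched below with invented numbers, is a simple local experiment: split your own customers or units into control and treatment groups and compare outcomes, rather than assuming the published effect transfers to your context.

```python
# Hypothetical local experiment: does the practice suggested by a
# study actually work in *your* context? All numbers are invented.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(42)

# Outcome metric (e.g., order value) for units kept on the old practice...
control = rng.normal(loc=100.0, scale=15.0, size=200)
# ...and for units switched to the research-backed practice.
treatment = rng.normal(loc=104.0, scale=15.0, size=200)

t_stat, p_value = ttest_ind(treatment, control)
print(f"mean lift = {treatment.mean() - control.mean():.1f}, p = {p_value:.3f}")

# Even a "validated" effect from the literature earns trust here only
# if it replicates in this sample, with these confounds back in play.
```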

This is the basis of evidence-based, data-driven practice: good data, derived from good measures, with a good understanding of how the measures relate to your practice, an understanding that is tested over time.  This is not too hard to do, and it should be a foundation of business education.

Scanning Horizons: Data-Driven Practice

Summary: Where no standards exist, data becomes more important in guiding practice.  Construct measurement is also important, because it generates data that is relevant to practice and of high quality.

I proposed that management education and practice should become much more experimental and data-driven in nature — and I can tell you that it is amazing to realize how little businesses know and understand how to create and run experiments or even how to look at their own data! We should teach the students, as well as executives, how to conduct experiments, how to examine data, and how to use these tools to make better decisions.  (Dan Ariely, 2009, in Technology Review)

A second horizon exists where measurement is needed but no standards exist.  Without standards, experimental methodology is a reasonable path.  The important tasks are to design measures and to develop a clear logic leading from experimental results to improved practice.  Six Sigma is an example of this kind of approach.  What can make it perplexing is the difficulty of developing measures when practice is rooted in social variables.  This calls for building measures on complex educational, social, or psychological constructs and basing experiments on those measures.  Some companies that follow a balanced scorecard approach could improve it through better measurement of the relevant constructs.
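
As a hedged sketch of one small piece of construct measurement, the snippet below computes Cronbach’s alpha, a standard reliability estimate, for a hypothetical multi-item survey scale (say, employee engagement).  The items and responses are invented; the technique itself is the standard one.

```python
# Hypothetical reliability check for a multi-item construct measure
# (e.g., an employee-engagement scale). Responses are invented.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) response matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of scale totals
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Five respondents answering four 1-5 Likert items.
responses = np.array([
    [4, 5, 4, 4],
    [3, 3, 2, 3],
    [5, 5, 5, 4],
    [2, 2, 3, 2],
    [4, 4, 4, 5],
])

print(f"Cronbach's alpha = {cronbach_alpha(responses):.2f}")

# Alpha near or above 0.8 suggests the items hang together well enough
# to treat the scale total as a measure of the underlying construct.
```

Reliability is only the first hurdle, of course; whether the construct actually relates to practice is the kind of theoretical link the previous post argued must be tested over time.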