I. The Issue: Where should we go with higher education?

In the future-of-higher-education conversation, I take the perspective that the future is now, or rather, that it is about following the trends that are discernible now.  It is less about what to expect in the future than about what should be happening now.  These future questions address two potential areas: new possibilities enabled by technology, and the weaknesses of current pedagogy and curriculum.

1. New possibilities enabled by technology include things like distance education, open source content and programming, and increased connectivity through social media.

2. Weaknesses of current pedagogy and curriculum (which open the door to potential university competitors or collaborators) are more interesting because they are about change, a much more difficult process.  This area also includes critiques of social science and educational paradigms (such as education, sociology, or MBA programs) and the question of who can tap into areas of discontent, especially discontent over the value of education and its continually rising costs.

This will take more than one post and could become a research project.  I’ll begin by looking at the critiques of pedagogy and curriculum.

II. The Critiques of the Current System

(a) The New York Times has hosted a debate on teacher preparation.  The debate is wide-ranging, but there is a significant current of opinion around the idea that education degrees do little to prepare you for the classroom skills you will need in real schools.

(b) Seth Godin has a similar take on business schools (another arm of the social sciences).  First, Seth says business schools are good for three things:

  1. as a screen for future employers
  2. to build a network
  3. and the third:

“(and least important) reason to go to business school is actually to learn something. And this is where traditional business schools really fail. The core curriculum at business schools is as close to irrelevant as you can imagine.”

Seth goes on to indicate that the following list is what people really need and what business schools do not supply:

  1. Finding, hiring, and managing supergreat people
  2. Embracing change and moving quickly
  3. Understanding and excelling at business development and at making deals with other companies
  4. Prioritizing tasks in a job that changes every day
  5. Selling — to people, to companies, and to markets

This is an interesting critique, although it’s a bit oversimplified.  Seth has completed an alternative MBA program, and I’m guessing that his program is more about changing pedagogy with a more open curriculum.  Open source education may open a very large universe of potential curriculum, and the new pedagogy is how you navigate through this universe.  It highlights the increasing importance of trust: in yourself, in students, in teachers and mentors, and in the educational aspects of the collaboration process.

(c) In a Harvard Business IdeaCast podcast (also available through iTunes), Henry Mintzberg (of McGill University) suggests closing down MBA programs, arguing that management cannot be studied out of context and that trying to do so leads to too many false positive decisions (to use a research-based metaphor).

III. My Take

(a) People need resources, now more than ever, and university resources are a potentially large source, but we need a longer time horizon.  If we are forced to leave the university to find resources, it will be a huge loss of potential.

(b) How can we devise a pedagogy that is future-oriented?  You might learn something today, but you’ll need a reminder 10 years from now, or you’ll need to revisit it in 10 years, or you’ll need that support community in 10 years and all the years in between.  How can we pedagogically organize content (curriculum) so that it can be accessed, synthesized, and further developed in a just-in-time fashion?  (That last one sounds like both a pedagogical and a technological question.)  Organizing things must be part of it.  Social media must be part of it.  Teaching process (not content) must be part of it, but in an organized, layered approach.  Following a rote process will not do (see the post on Bill Starbuck); you need to understand what the process is doing so that it can be shaped appropriately to the context.

(c) We need to re-think what it means to be an educated person in many different fields.  A common problem with many programs is that they think they already know what to do, and maybe 30 years ago there was an illusion that they did.  What is an educated person; what can they do; who can they become?

I’ll chew on that for a while.

The Big Shift: Moving to a Social-Cultural-Constructivist Educational Framework for Organizational Learning

While reading Jay Cross’s comments on John Hagel’s definition of the Big Shift, the thought came to me that this is really a redefining of knowledge management within a framework that would be acceptable to a social-cultural-constructivist.  Here is a list of Hagel’s definition categories and my thoughts about them.
From knowledge stocks to knowledge flows: I interpret this as a shift from an attempt to objectify knowledge to the recognition that knowledge is bounded by people and contexts, and that knowledge becomes useful when actualized in real-time processes.  You don’t need a database of content that was written for different contexts and different times.  Instead you need access to conversations with people who have a degree of shared understandings (cognitive contexts).
From knowledge transfer to knowledge creation:  Constructivism is often considered synonymous with discovery learning, and I don’t think that is correct, but learning is a building process.  Except for modeling (think: mirror neurons), transfer isn’t a valid metaphor for learning.  Better metaphors are creating, building, or growing.  These are literal metaphors if you think of learning as the neurology of synaptic development.  Knowledge creation is often achieved by synthesizing new connections among pieces of previous knowledge, and learning is represented neurologically by making new connections between existing neurons.
From explicit knowledge to tacit knowledge:  I really don’t like the term tacit knowledge; I’ve never seen a good definition.  Sometimes it means explicit knowledge that hasn’t yet been well expressed; sometimes it refers to contextual elements.  I’ve always believed that knowing only exists for doing things, the idea that the deed precedes the word.  Sometimes explicit knowledge is just about trying to ascribe more capability to abstract knowledge than it is able to handle.  Let’s just accept that knowing is for doing; it’s one of the main reasons for getting learning out of the classroom and into the world.  Hagel doesn’t seem to recognize this yet, which may be why I don’t get much value from his paragraph on tacit knowledge.
From transactions to relationships:  Trust is indeed becoming more and more important.  I also relate the idea of trust to Umair Haque’s idea of profiting by creating thick value, doing things that make people’s lives better.  I really believe that the transition from transactions to relationships and from thin value to thick value has a lot more to do with financial and accounting frameworks than it appears on the surface.  The financial setup has to fit the situation correctly, especially if finance is driving your activity.
From zero sum to positive sum mindsets:  This has a lot to do with boundary crossing, open source, and the aforementioned transactions-to-relationships paragraph.  A major goal of every organization should be identifying its zero-sum process pockets and thinking about moving them to positive-sum frameworks.  Often the key is not in the processes themselves, but in the frameworks and cultural understandings that support those processes.
From push programs to pull platforms:  People tend to think of social media here, but that’s just a technology platform.  What is needed first is a cultural platform that makes employees partners, and then a relationship platform that blurs organizational boundaries so there is a network to pull from.  While technology can facilitate much, people are the foundation and institutions are important facilitators.
From stable environments to dynamic environments:  This is not a choice; environments are becoming more dynamic.  The trick is to develop resilience: the ability to identify when change is needed and to adapt in a timely fashion, without letting change become disruptive from a cognitive or work-flow standpoint.  Sense, learn, respond; it needs to happen all the time and at all levels.  Organizations can cope if individuals are always learning and striving to improve (something I believe is part of human nature, provided organizations do not build structures that stifle it) and if organizations take steps to keep their policy structures flexible.  Refer to the previous paragraph on transactions and relationships: it is important that employees trust their organization and that the organization trusts its employees.  It’s about creating thick value through and through.
Again, this is all pretty much consistent with a social-cultural-constructivist psychological and educational framework.  Previous ideas about knowledge management could be thought of as a management corollary to positivist psychology: a rational view that just doesn’t square with the way things seem to work in real life.

Network ROI

An interesting IBM article that I was thankfully pointed to by the Evidence Soup Blog:
Wu et al. (2009), Value of Social Networks: A Large Scale Analysis on Network Structure Impact to Financial Revenue of Information Technology Consultants.
Care is needed in interpreting this research, as it is correlational and cannot imply causation, but I would emphasize three findings:

  1. I believe there is evidence that diversity in project teams (maybe coupled with good communication skills) improves performance.  Wu provides evidence that this could apply to communication networks as well.
  2. Having access to powerful individuals (in the hierarchy) improves the performance of project teams.
  3. The social networks of the entire project team seem to be more important than the networks of individuals.

This would imply that companies should encourage the development of strong, diverse project team networks and should support the involvement of upper-level management in those networks.

Channeling Pandora: Ideas from a 2007 Interview with Management Professor Bill Starbuck

Reading through documents from the Stanford Evidence-based Management blog site, I came across an interesting article, “(Un)Learning and (Mis)Education Through the Eyes of Bill Starbuck: An Interview with Pandora’s Playmate” (Michael Barnett, 2007, Academy of Management Learning and Education, 6(1), 114-127).

Starbuck seems to be concerned with two things: (1) methodological problems in research and (2) un-learning or fossilized behavior in organizations.
On methodology:  You can’t just apply standard statistical protocols in research and expect to get good answers.  You must painstakingly build a reasonable methodology, fitting your methods to contexts and tasks, much like you are fitting together the pieces of a puzzle.  A personal example: I consulted practically every textbook I had when developing methods for my dissertation, but the ones I used most were introductory statistical texts.  I kept asking myself: what am I doing, what are the core statistical concepts I need to do this, and how can I shape my methods so that these core concepts fit my tasks?  Almost all advanced statistical techniques are an extrapolation of the concepts found in introductory statistics, and you can’t really understand how to use these advanced procedures until you understand their core and how they fit your methodological circumstances.  As Starbuck points out, the concept of statistical significance is the beginning of results reporting, not the end.  You must go on to build a case for substantive importance.  He points out that mistakes are common in reporting effect sizes.  I believe this often happens because people simply apply a statistical protocol instead of understanding what their statistics are doing.
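To make the significance-versus-substance point concrete, here is a minimal sketch (hypothetical data, my own illustration rather than anything from the interview) of how a large sample can produce an impressively small p-value for a difference whose effect size is trivial:

```python
# Hypothetical illustration: statistical significance is not substantive importance.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 50_000                                            # very large samples
group_a = rng.normal(loc=100.0, scale=15.0, size=n)
group_b = rng.normal(loc=100.3, scale=15.0, size=n)   # tiny true difference

# Standard two-sample t-test: with n this large, p will very likely fall well below .05
t_stat, p_value = stats.ttest_ind(group_a, group_b)

# Cohen's d with a pooled standard deviation: the effect size stays negligible
pooled_sd = np.sqrt((group_a.var(ddof=1) + group_b.var(ddof=1)) / 2)
cohens_d = (group_b.mean() - group_a.mean()) / pooled_sd

print(f"p-value:   {p_value:.2g}")
print(f"Cohen's d: {cohens_d:.3f}")   # roughly 0.02, far below even a "small" effect
```

The p-value alone would let you proclaim a finding; the effect size forces you to argue whether it actually matters.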

A favorite issue of mine (one that Starbuck implies but does not address directly) is the lack of a theoretical framework.  Without theory, you are flying empirically blind.  Think of the four blind men empirically describing an elephant by holding a trunk, a leg, the body, and the tail.  Vision (or collaboration) would have allowed the men to “see” how their individual observations fit together as a whole.  You must begin with the empirical, but reality is always larger than your empirical study, and you need the “vision” of a theoretical framework to understand the larger picture and how things fit together.  Theory is thus an important part of your overall methodological tack.

On (un)learning: Starbuck discusses the need to unlearn, or to change organizational processes in response to a changing environment.  It is a problem where past successes cloud your vision, obscuring the fact that what worked before is no longer working.  The problem is not that people can’t think of what to do to be successful; it’s that they already know what to do, and their belief keeps them from seeing that there even is a problem, or from seeing the correct problem.  Starbuck talks about problem solving in courses he taught.  People often found that the problem they needed to solve was not the problem they initially had in mind.  Their biggest need was to change beliefs and perspectives.

The psychologist Vygotsky spoke of something very similar as fossilized behavior.  When someone is presented with a novel problem, they must work out the process of solving it externally.  Later, the process is internalized to some extent and becomes somewhat automated, requiring much less cognitive load.  After more time it can become fossilized, that is, behavior that is no longer tied to a process or reason but continues as a sort of tradition or habit.  This applies at the organizational level as well as the individual psychological level.  I would like to investigate the concept of organizational resilience as a possible response to fossilized organizational behavior, as well as a way of responding to extreme events.  This would emphasize an ability to change in response to environmental demands.  Starbuck thinks that constant change is too disruptive to organizations, but I believe there may be a combination of processes, capabilities, and diversity that enables organizations to sense and respond, not necessarily constantly, but reasonably on an ongoing basis as well as when the inevitable black swan happens.

Beware the Statistical “Dragon King”

A power law describes a relationship between two quantities, typically between event frequency and event size, where the larger the event, the smaller the frequency.  (Think of a long-tail distribution.)  Recently, large events have been referred to as black swans: rare, large, improbable events (Taleb, 2007).  Predicting black swans is difficult because there are too many variables and unknowns in prediction, but their effect sizes make them too problematic to ignore.

The Physics arXiv Blog recently discussed a proposition by Didier Sornette of the Swiss Federal Institute of Technology.  Sornette says that outsized outliers are more common than they should be under power-law distributions because they are subject to feedback loops.  He terms these outliers dragon kings.  In real-life examples (at least it seems to me), these feedback loops are often social, a case of jumping on the bandwagon.  This is another reason that black swans are much more common than power laws say they should be.
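As a rough illustration (my own hypothetical numbers and exponent, not Sornette’s analysis), the sketch below shows the frequency-size trade-off a pure power law implies, and why an event class that turns up far more often than that trade-off allows is a candidate dragon king:

```python
# Hypothetical power-law sketch: frequency falls as a power of event size.
import numpy as np

alpha = 2.0                                  # assumed power-law exponent
sizes = np.array([1.0, 10.0, 100.0, 1000.0])
relative_freq = sizes ** (-alpha)            # 1, 1e-2, 1e-4, 1e-6

for s, f in zip(sizes, relative_freq):
    print(f"event size {s:>6.0f}  expected relative frequency {f:.0e}")

# A feedback loop (e.g., a bandwagon effect) can push one class of events far
# above this line.  If size-5000 events are observed noticeably more often than
# the expectation below, the tail is hiding a dragon king.
outlier_size = 5000.0
print(f"power-law expectation for a size-{outlier_size:.0f} event: {outlier_size ** (-alpha):.0e}")
```

The point is not the specific numbers, but that a risk model fitted only to the power-law line will systematically understate how often these amplified events occur.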

This is very relevant for risk management calculations.  If you are preparing for potential risks, beware not only of black swans (rare events with large effects that are hard to predict because you don’t know the future of many variables), but also of dragon kings (feedback loops that increase the effect size of somewhat rare events, making them more common than a power-law distribution would predict).  It provides a rationale for developing resilient organizations, with the ability to change quickly in response to environmental events, instead of relying on cost-probability decision matrices.

More on the Research Practice Gap and Evidence-Based Practice

How Do People Approach Evidence-Based Practice

Tracy at the Evidence Soup Blog has a recent post that got me thinking that the processes supporting Evidence-based Practice (EBP) must be centered on actual clinical practices (not some abstract formulation of practice) and that these processes should include both research and clinical expertise.  Tracy reviews an article in the July issue of Clinical Child Psychology and Psychiatry (How do you apply the evidence? Are you an improver, an adapter, or a rejecter? by Nick Midgley).  I hope to review the article myself soon, but my library resources do not yet have the July issue, so my take at this time depends on Tracy’s description.

Here is my first take on the article:

Rejecters seem to be rejecting a positivist version of EBP when they discuss normative, prepackaged practices.  This is defensible; there is no reason to follow in the positivists’ footsteps.

Improvers seem to be focusing on a top-down “push” approach.  First, while research in this vein is important, technology and networks are moving toward a pull approach: giving answers to practitioners when they need them.  Second, in addition to a top-down approach there is also a need for a deep bottom-up understanding of practice: understanding practice needs and devising how dissemination models can meet those needs.  Framing this as a transfer problem may have the question backwards.

Adapters – I like this approach for the most part, with two caveats.  First, it looks like it is falling into the qualitative/quantitative divide that I dislike.  I believe that you choose the methodology to fit the research question.  Qualitative research is needed to reach a deep understanding of practices or to unearth value issues, but I’ve seen too many qualitative studies that tried to answer quantitative-type research questions (e.g., which intervention is better).  Second, coming from a validity perspective, I believe that all kinds of data can be integrated to arrive at an inferential judgement on practice validity.  Especially in medicine, I think we often have correlational research data without much theory or practice-based understanding.  We need to understand practices from multiple perspectives that come together like the pieces of a puzzle to make a coherent picture.

Another Way to Approach the Research Practice Gap from a Post-Positivist Perspective

One of Samuel Messick’s validity innovations was to connect construct validity with utility, values, and consequences in a progressive matrix.  His original matrix can be found on page 27 of his 1995 American Psychologist article, available here.  What I have done is adapt this matrix to what it might look like for Evidence-Based Practice.  (The graphic is at the end of this post.)  I believe Messick’s use of the term Test Use is analogous to Clinical Experience, which I have termed Clinical Evidence and Judgement.  Tests exist as artifacts, and I believe that practice, although more concrete, can also be analyzed as an artifact in much the same way that Messick analyzes tests.

Messick uses a matrix, which I have used as well, but it could also be viewed as a stepwise process:

  • Step 1. Inferences from research data and syntheses form the evidentiary basis for Practice Validity (PV).
  • Step 2. PV + Clinical Evidence and Judgement form the evidentiary basis for the Relevance and Utility (RU) of the practice.
  • Step 3. PV + Inferences from Research form the Consequential basis that informs the clinician of the Value Implications (VI) of a practice.
  • Step 4. PV + RU + VI + Social Consequences form the Consequential basis for clinical evidence regarding practice use.

The bottom line is that the clinical evidence for using a practice is the sum of practice validity, judgements of relevance and utility, the value implications from research inferences, and evidence for the personal and social consequences of the practice.

Discussion always welcome!

[Graphic: the adapted Messick matrix for Evidence-Based Practice (8-7-post table.001)]

A Pedagogical Belief Statement

Learning at Work Podcast

This is the first in a series of podcasts that will be split between educational and music performance issues.  This first video podcast is sort of a pedagogical belief statement.  It is summed up on the final screen:

Most educational artifacts in use today are rooted in concepts tailored to the needs of the Middle Ages and ancient Greece.

We need newly designed artifacts:

  • New collaborative institutional structures to promote true lifelong learning.
  • New future oriented pedagogies that support network learning.
  • A focus on relationships as the source of pedagogy.
  • An educational commitment that transcends the boundaries of time and space.

Please comment!  I’m hoping to have a development conversation around this theme over time to expand on this basic idea.

We Need Innovation in Creating Thick Value

A Live Science article, Obama: Key to Future Is Innovation by Robert Roy Britt, discusses Obama’s call for innovation through education and controlling health care costs.  The problem is that the concepts are a little too vague and unfocused.  What we need most is innovation in value creation similar to that called for by Umair Haque, who writes in the Edge Economy Blog about The Value Every Business Needs to Create Now.  He advocates for thick value.  Thin value is “built on hidden costs, surcharges, and monopoly power”, while thick value is “awesome stuff that makes people meaningfully better off”.  As a country, we either need people to shift to thick value or we need to start picking winners and losers, with the losers coming from the ranks of the thin-value group.  If we reduce the cost of health care, it has to come out of someone’s pocket unless that someone starts to create thick value.

Some History and Opinions on Validity

Lee J. Cronbach and Samuel Messick

Most of my ideas on validity are based on the ideas of Cronbach and Messick.  Lee J. Cronbach and Paul E. Meehl wrote the 1955 classic paper Construct Validity in Psychological Tests, in which they were the first to present a fully formed conception of construct validity.  Working out the details of this paper eventually led to a unitary concept centering on construct validity.  Tests and assessments, to varying degrees, are measures of constructs, not things in themselves.  Validity is the process of assembling various sorts of data to make inferences about the meaning and utility of the measurement of these constructs.  Messick lists six types of validity data, which I discussed here.  Messick was instrumental in advocating for a unitary conception of validity, but I give him most credit for adding consequential evidence as a category of data for making inferences about validity.  Validity is not just about technical standards; it’s about making accomplishments, using assessments as artifacts in processes that attempt to do things.

I admire these men not just because they coined specific approaches, but because they dealt seriously with the big philosophical issues of their time.  Philosophies like positivism and objectivism were being criticized and challenged during the second half of the 20th century.  They did not ignore these critiques; they took them seriously and addressed them head-on through their development of validity.

Why Is Validity So Seldom Addressed as a Basis for Evidentiary Considerations?

This is also about the 20th-century paradigm wars.  Most people in the 20th century fell into one of two camps: positivist or post-positivist.  Neither camp had much interest in Messick’s and Cronbach’s construct validity.  Positivists wanted simple empirical evidence of validity that could be expressed authoritatively in some sort of criterion coefficient.  They had little interest in any manner of inference that was complicated by theory or consequences, in a validity that had to be argued instead of proclaimed.  Post-positivists, on the other hand, often rejected measurement and empirical methods in favor of textual and critical methods.  These textual and critical methods were needed, but they somewhat threw out the baby with the bath water.  By rejecting measurement approaches in general, they never saw that Cronbach and Messick came to share many of the same concerns and were working out a critique of positivism in their own way.

There were few people in the middle ground back then, but I believe that is changing today.  There is an increasing call for evidence-based approaches and a renewed interest in both empirical and rational methodologies.  To avoid falling into the same problems as the positivists, we should seriously consider the kind of post-positivist, rational-empirical approach to validity advocated by Cronbach and Messick.