Monday 27 April 2015

The School Research Lead and Cognitive Biases


Acknowledging the influence and impact of cognitive biases on evidence-informed practice is essential for the School Research Lead. Why? Well, evidence-informed practice is built on the notion of teachers using their judgement to make decisions which directly or indirectly benefit their pupils and colleagues.  If those decisions result from systematic errors in judgement due to cognitive limitations, motivational factors and the environment, they are less likely to bring about favourable outcomes.   In this post we will examine the work of Daniel Kahneman and others to help us gain a better understanding of cognitive biases.  Finally, we will examine what strategies can be adopted to mitigate the impact of cognitive biases, while at the same time developing our expertise as evidence-informed practitioners. 

What do we mean by cognitive biases?

Amos Tversky and Daniel Kahneman introduced the term 'cognitive bias' to describe systematic errors in judgement and decision-making which can be due to cognitive limitations, motivational factors and the environment. Recently Geoff Petty posted a blog which looked at the impact of confirmation bias on evidence-based practice.  Nevertheless, there is a wide range of cognitive biases, other than confirmation bias, which can impact upon evidence-informed practice; these are summarised by Wilke and Mata (2012).
  • Confirmation bias - the tendency to selectively search or interpret information in a way that confirms your perceptions or hypotheses.
  • Conjunction fallacy - the tendency to assume that specific conditions are more probable than a single general one.
  • Endowment effect - the tendency for people to place more value on an object they already own than they would pay to acquire it.
  • Fundamental attribution error - the tendency to over-emphasise personal factors and underestimate situational factors when explaining other people's behaviour.
  • Gambler's fallacy - the tendency to think that future probabilities are changed by past events.
  • Halo effect - the tendency for a person's positive or negative traits to extend from one area of their personality to others' perceptions of them. 
  • Hindsight bias - a memory distortion in which, given feedback about the outcome of an event, people's recalled judgements of the likelihood of that event are typically closer to the actual outcome than their original judgements were.
  • Illusory correlations - the tendency to identify a correlation between a certain type of action and effect when no such correlation exists.
  • In-group bias - the tendency for people to give preferential treatment to others they perceive to be members of their own group.
  • Mere exposure effect - the tendency by which people develop a preference for things merely because they are familiar with them.
Let's now look at one of the cognitive biases - the Halo Effect - in more detail.

Cognitive bias and the Halo Effect

Phil Rosenzweig, in his book The Halo Effect ... and the Eight Other Business Delusions That Deceive Managers, describes the Halo Effect as when, for example, a company's sales and profits are up (a school's retention, achievement and success rates), people (inspectors or other significant stakeholders) may conclude that this is the result of brilliant leadership and strategy, or of a strong and coherent corporate (school) culture. When performance falters (success rates or position in league tables fall), people conclude it is the result of weak leadership and management (the Headteacher/Principal), and that the company was arrogant or complacent (the school was coasting). The reality may be that little has changed: the school's or company's performance creates a HALO, which shapes the way judgements are made about outcomes for learners; teaching, learning and assessment; and leadership and management.

What can we do - a check-list for rooting out cognitive biases

At the core of evidence-informed practice is the notion of making decisions within schools through the conscientious, explicit, judicious and skilful use of teacher expertise, school evidence, research evidence and the values and preferences of stakeholders.  It should be self-evident that cognitive bias is a real threat to making decisions which lead to favourable outcomes.

Kahneman, Lovallo and Sibony (2011) provide guidance on how to find dangerous biases before they lead to poor decision-making.  Accordingly, Kahneman et al suggest you apply the following check-list.

  • Is there any reason to suspect that the team or individual making the recommendation is making errors motivated by self-interest?  If so, review the proposal with care and challenge any underpinning assumptions and evidence.  What are the proposers going to 'get out' of the proposal?
  • Has the team fallen in love with its proposals? Use this check-list to test the thinking and evidence which lie behind the proposals.
  • Were there dissenting opinions, and were these opinions fully explored? Is the team a victim of 'group-think'? If so, discreetly seek out dissenting opinions or other sources of evidence.
  • Could the diagnosis be overly influenced by an analogy to a memorable success? Has something similar, but different, worked in the past?  For example: 'Remember when we did this? That worked, so let's do the same.'
  • Are credible alternatives included with the recommendation? Have alternative ways of accessing evidence been considered?  Have different ways of conducting a study been considered, be they quantitative, qualitative or both?
  • If this decision were to be made again in a year's time, what information would you want, and can you get more of it now? Is an internal source of evidence available to help inform the decision? Have interim reports of national studies been published?
  • Do you know where the numbers came from? Are there unsubstantiated numbers? Have they been extrapolated from historical data?  Have assumptions been made about the direction of cause and effect which do not stand up to critical scrutiny?
  • Is the team assuming that a person, organisation or innovation which is successful in one area will be just as successful in another? Just because a particular approach worked in, say, the science department, does that mean it will work in art or a humanities subject?
  • Is the base case overly optimistic? Have the proposers assumed there will be no delays or barriers to implementation?  Most planners underestimate the resources needed to complete a project.
  • Is the worst case bad enough?  Has a pre-mortem been conducted?  Imagine that the project has failed, then work back to identify what could have contributed to the failure.
  • Is the recommending team overly cautious?  Have the proposers under-estimated the potential benefits? Have they down-played some of the positive indirect benefits?
What are the implications for the School Research Lead?

Initially, four implications come to mind when thinking about how to minimise the risk of cognitive biases when engaging in evidence-informed practice within a school.  First, make sure the check-list is applied.  It will not get rid of all the potential biases, though it will be a start.  Second, use all of the check-list; do not cherry-pick those bits of the check-list which 'suit' your current stance or position.  Third, separate the recommenders from the decision-makers in the decision-making process: proposed research projects need to be scrutinised by others, especially for ethical considerations, before approval is given.  Finally, remember that getting decisions right is an art, not a science.  You won't be able to absolutely prove a research proposal is a good idea, but what you can do is increase your chance of success, be aware of the odds and manage the inevitable trade-offs.

As for next week, I will be posting from researchED New York, so hope to have some fascinating news, ideas and perspectives to share.

References

Kahneman, D., Lovallo, D. and Sibony, O. (2011) Before you make that big decision ... Harvard Business Review, June 2011.

Rosenzweig, P. (2007) The Halo Effect ... and the Eight Other Business Delusions That Deceive Managers, Free Press, London.


Wilke, A. and Mata, R. (2012) Cognitive Bias. In: V. S. Ramachandran (ed.) The Encyclopedia of Human Behavior, vol. 1, pp. 531-535. Academic Press.



Sunday 19 April 2015

Research Leads Brighton and keeping groupthink at bay?

researchED has done it again, with another fantastic event.  Last Saturday, Brighton University played host to the third researchED Research Leads conference.  As usual, the speakers provided a range of insights into the use of evidence to inform practice.  These individuals, willing to give up a Saturday to engage in real and effective professional learning, included:
  • Daniel Muijs - Can we (reliably) measure teacher effectiveness? 
  • Nick Rose - Developing Tools for Teacher Inquiry
  • James Mannion - Praxis : Professional Development through Research Inquiry
  • Louise Bamfield and Paul Foster - Building Research Rich Schools and Alliances
  • Rebecca Allen - How can you know what works in your school?
  • Lia Commissar - Teachers discussing learning with psychologists and neuroscientists - an online platform
However, there is always a danger when meeting up with like-minded individuals that you reinforce existing 'biases' or 'predilections' and engage in groupthink.   Yet challenging your biases is an essential component of both genuine evidence-informed practice and the role of the school research lead.  First, challenging biases is an essential part of evidence-informed practice, as it is necessary to actively seek out evidence and opinions which do not conform to your views or hypotheses. Second, School Research Leads need to be mindful that in the first instance they may be working with volunteers - or shall we say 'believers' - who are like-minded, which may result in groupthink, leading to poor decisions being made about what should be inquired into, and how that inquiry should take place.

With the danger of groupthink in mind, Sunstein and Hastie's 2014 book Wiser: Getting Beyond Groupthink to Make Groups Smarter provides a range of insights into why and how group decision-making can go wrong. Sunstein and Hastie identify four problems that groups often run into:

  • Amplifying, rather than correcting, individual errors of judgement
  • Cascade effects, as members follow what others say or do
  • Polarising, adopting more extreme positions than the ones they began with
  • Emphasising what everybody knows instead of focusing on critical information held by a few.
It doesn’t take a lot of imagination to see how School Research Leads may have to face and manage these four problems. However, help is at hand: Sunstein and Hastie suggest a number of ways to reduce the chance of failure.

Inquisitive and self-silencing leaders – who refuse to state a view at the outset and allow time and space for other ideas and perspectives to come to the fore.  In other words, when leading a session, avoid starting off with the phrase 'Well, the evidence says ...'

Priming critical thinking – encourage the development of social norms where robust discussion is welcomed, and try to reduce the incidence of subtle cues which reinforce silence and the non-disclosure of information.  When conducting meetings, leave time for a competing view to emerge, particularly if a speedy consensus has been reached on what the evidence says.

Reward group success – emphasise that individuals will be rewarded on the basis of the success of the group, rather than individual outcomes, as this should provide better access to the thinking of a range of individuals.  

Role assignment – ensure that all individuals within the group are told that each holds relevant and different information to contribute, relevant to the success of the group.  This is particularly the case in schools, as evidence-informed practice involves accessing information from all members of the school community, including administrators, teaching assistants and site management staff. Ask of everyone: 'Tell me - what do you think?'

Perspective changing – all this requires is a simple question: if we were to ask someone else from a different department, school or sector, what would they think about the proposal or idea?

Sunstein and Hastie also recommend consideration of other more sophisticated though challenging methods to improve group decision-making – devil’s advocacy, red teams and the Delphi method – which I leave for you to explore for yourself.

To conclude, it is great to attend events such as the researchED research leads one-day conferences; on the other hand, there is a real risk that they can lead to existing biases and prejudices being confirmed.  These biases may impact on how school research leads conduct their work within their own schools, and may contribute to groupthink and poor decisions in the development of a school-based evidence-informed culture.  However, there are a number of techniques which can be used to minimise this risk, all of which involve the careful and mindful management of groups.  Future posts will explore issues related to this post, such as cognitive bias and Edgar Schein’s concept of ‘humble inquiry’.


Monday 13 April 2015

School Research Leads and Ambitiously seeking to improve practice through the use of evidence

Last week's post, adapting the work of Barends, Rousseau and Briner (2014), provided a definition of evidence-informed practice for teachers and schools.  This post, using the first of the 9 A's outlined in the definition - Ambitiously: seeking to improve practice through the use of evidence - will detail both what ambitiously means, and the steps which can be taken to help realise that ambition.  

So what does Ambitiously: seeking to improve practice through the use of evidence mean in practice?  Well, there are clearly similarities with Hargreaves and Fullan's (2012) guidance for teachers on how they can develop their professional capital.  In the first of their 10 action points, Hargreaves and Fullan state teachers need to become a true pro:

Teachers are dedicated.  They care about their subjects and their students. They put in endless extra hours.  It is a lot.  But it is not enough.  Not if you want to teach like a pro.  Teaching like a pro means preparing yourself properly: putting in years of study and practice until you reach your 10,000 hours of highly accomplished performance and then honing your skills even more as you help develop the next generation of teachers ... Teaching like a pro means connecting with the latest research evidence, inquiring into your own practice - with other colleagues and other schools, down the street and across the world - to find new ideas, get advice, and sift what works from what doesn't. (p155)

In other words, your ambition is to become a true pro in the use of evidence-informed practice to bring about favourable outcomes for those affected by your actions.  To achieve this there are four principles: first, aim to be expert in the use of evidence; second, where evidence is referred to, check-it-out; third, just because you think you know what the evidence is, check-it-out; fourth, you are responsible for your own professional learning.  

Aim to be expert in the use of evidence  

In my post of 16 February 2015, I described the Dreyfus model of human learning, which identifies five levels within the human learning process: novice; advanced beginner; competent; proficient; and expert.  Brown and Rogers (2014) - in a study designed to facilitate the use of evidence-informed practice - combine the Dreyfus model with the work of Flyvbjerg (2001) and Hall and Hord (2001) to create a typology of descriptors of expertise in the use of evidence. 

Self-assess yourself against these descriptors and identify your current level of expertise in the use of research evidence in your day-to-day practice.  Make a plan for how you can move from your current level of expertise to the next stage; for example, how do you move from being a competent to a proficient researcher?  Identify a small step that you can take right now to help you make that change.  But remember, expertise in the use of research evidence is just one part of being an expert evidence-informed practitioner; you will also need to develop your expertise in the use of the other three sources of evidence (stakeholder views, school data and professional expertise). 

Whenever evidence is referred to - check-it-out 
Just because an expert or guru cites evidence, you still need to check-it-out. Hargreaves and Fullan cite the 10,000-hour rule for becoming expert or proficient, which gained its prominence through the work of Malcolm Gladwell and his 2008 book Outliers.  Gladwell draws attention to the work of Anders Ericsson and his colleagues, who argue that even the most gifted performers required 10,000 hours of deliberate practice.  Nevertheless, research by Hambrick et al (2014) reported that only a third of the differences in performance, in elite music and chess, can be explained by accumulated differences in deliberate practice. Let's be clear: I am not arguing that becoming an expert evidence-informed practitioner does not take time and deliberate practice. Instead, I am arguing that you need to check the evidence, especially when it is referred to by experts or gurus.  If in doubt - CHECK-IT-OUT. 

Just because you heard someone say something at a conference does not mean his or her view has not changed. 
Even when you think you know what the evidence is - CHECK-IT-OUT.  At researchED's 2014 conference I saw David Weston, of the Teacher Development Trust, give a fantastic talk on how a teacher's expertise tends to develop rapidly in the first five years of teaching experience and then quickly flattens out, with teachers of twenty years' experience being no more effective than those with five years' experience.  As such, I was going to use this summary of research to justify the claim that teachers should engage in evidence-informed practice as a career-long strategy for professional learning. Nevertheless, I thought I'd better check the evidence, and thanks to David himself I've been pointed in the direction of research evidence suggesting the relationship between years of experience and teacher expertise is more complicated than first thought.  For example, Kraft and Papay's (2014) analysis shows that teacher effectiveness is linked to professional environments, with teachers in more supportive environments improving their effectiveness more over time than teachers working in less supportive contexts. 

It's down to you  
Finally, acknowledge that getting better at using evidence is a career-long activity and is not something confined to the first few years of professional development.  No one else is responsible for your professional learning; your head-teacher or head of department may be able to help with time and resources, but they are not responsible for your learning.  Becoming an expert evidence-informed practitioner is important and should, wherever appropriate, be prioritised over the unimportant and yet somehow urgent activities that so often consume our daily working lives. 

References
Barends, E., Rousseau, D. M. and Briner, R. B. (2014) Evidence-Based Management: The Basic Principles. Amsterdam: Centre for Evidence-Based Management.
*This definition is partly adapted from the Sicily statement of evidence-based practice: Dawes, M., Summerskill, W., Glasziou, P., Cartabellotta, A., Martin, J., Hopayian, K., Porzsolt, F., Burls, A. and Osborne, J. (2005) Sicily statement on evidence-based practice. BMC Medical Education, 5(1).

Hargreaves, A. and Fullan, M. (2012) Professional Capital: Transforming Teaching in Every School, Routledge, Abingdon.