Friday, 13 October 2017

The school research lead, the 5 Whys and appraising school data

As a school research lead, one of your key tasks will be to help colleagues interpret the quantitative data generated by your school.  So in this post I am going to suggest that we look at a technique called the 'five whys', which you can use to analyse data in a way that will help get to the very heart of any underlying issue (Pojasek, 2000).  In doing so, we will use a case study where last year's GCSE results in a particular department have been especially disappointing.

Asking ‘why’ five times

The 'five whys' is a simple technique, which involves asking the question 'why' at least five times so that you can get to the root cause of a problem.  The process tends to come to an end when it is no longer possible to come up with an answer to 'why'.  But first let's look at what happens when you ask 'why' only once and then come up with a fairly 'lazy' answer.

Problem: A subject’s examination results are substantially below the previous year’s results - the 1 why

Q Why are this department's examination results below those of the previous year?

A Because both the Head of Department and the teacher who taught this subject are newly qualified and relatively inexperienced, and need support and improvement targets.

However, we know from the work of Crawford and Benton (2017) that almost all of the change in a school's examination results can be explained by year-to-year changes in the pupil cohort.  So let's have a go with the 5 whys.

Problem: A subject’s examination results are substantially below the previous year’s results - the 5 whys

Q Why are examination results below the previous year’s results?
A Because this year a weaker cohort of students took the subject.

Q Why did a weaker cohort of students take the subject this year?
A Because ‘stronger’ students who would normally take this subject chose other subjects.

Q Why did the stronger students choose other subjects?
A Because in the year before the students chose their ‘options’, they had been taught predominantly by non-specialist teachers who were adequate rather than inspiring.

Q Why did non-specialist teachers deliver this subject?
A Because all teachers had to have a full timetable.

Q Why did all teachers have to have a full timetable?
A Because, due to financial pressures, it was not viable to have teachers on ‘light’ timetables.
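If you like to keep a record of these discussions, the chain of questions and answers above can be captured in a few lines of code. This is a minimal sketch only: the class and method names (FiveWhys, add_why, root_cause) are my own illustrative inventions, not part of Pojasek's technique.

```python
# Minimal sketch of recording a 'five whys' analysis.
# The problem and the whys below come from the GCSE case study;
# all names here are illustrative, not from Pojasek (2000).

class FiveWhys:
    def __init__(self, problem):
        self.problem = problem
        self.whys = []  # list of (question, answer) pairs

    def add_why(self, question, answer):
        self.whys.append((question, answer))

    def root_cause(self):
        # The last answer reached is treated as the root cause.
        return self.whys[-1][1] if self.whys else None

analysis = FiveWhys("Examination results are below the previous year's results")
analysis.add_why("Why are results below last year's?",
                 "A weaker cohort of students took the subject")
analysis.add_why("Why did a weaker cohort take the subject?",
                 "Stronger students chose other subjects")
analysis.add_why("Why did stronger students choose other subjects?",
                 "They had been taught predominantly by non-specialist teachers")
analysis.add_why("Why did non-specialists deliver the subject?",
                 "All teachers had to have a full timetable")
analysis.add_why("Why did all teachers need a full timetable?",
                 "Financial pressures made 'light' timetables unviable")

print(analysis.root_cause())
# -> Financial pressures made 'light' timetables unviable
```

The value of writing the chain down like this is that the record shows how far from the 'lazy' first answer the root cause actually sits.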

Pojasek (2000) identifies a number of benefits which come from asking 'why' five times.  First, once you have got the hang of it, it's a pretty quick and easy technique to use.  Second, it helps you think through an issue so that you can drill down to the underlying cause of the problem.  Third, it may help you change your perception of the root cause of a problem.  That said, there are a couple of clear challenges in using the 'five whys'.  One is the need for strong facilitation skills, as the focus is on getting to the root cause of an issue rather than allocating blame.  There's also the problem that there may be multiple issues in play, so it may be difficult to isolate the root cause.

And some final words

In these times of acute financial pressures on schools it needs to be emphasised that decisions often have long-term consequences - and what may be a quick fix for the current year may cause substantive problems in years to come.


CRAWFORD, C. & BENTON, T. 2017. Volatility happens: Understanding variation in schools’ GCSE results. Cambridge Assessment Research Report. Cambridge, UK: Cambridge Assessment.

POJASEK, R. B. 2000. Asking “Why?” five times. Environmental Quality Management, 10, 79-84.

Saturday, 7 October 2017

Leading the research and evidence-based school: Are you ready?

When introducing evidence-based practice into your school, a major challenge you will face is having to diagnose your school’s readiness to both engage with and implement research and other evidence. One way of addressing this challenge is to use the Promoting Action on Research Implementation in Health Services (PARiHS) framework (Kitson, Harvey, & McCormack, 1998), which provides a useful conceptual and pragmatic heuristic to help you think about the implementation of research within your school. The PARiHS framework describes the successful implementation of research into practice as a function of the interaction of three core elements: the quality and type of the evidence; the context, setting or environment into which the research is introduced; and the approaches and techniques of facilitation.

Evidence – this consists of four elements: academic/scientific research, practitioner expertise, school/organisational data and the views of stakeholders.

Context – this is the environment in which the proposed change – either an evidence-informed intervention or the adoption of evidence-based school leadership – is being implemented. It can be divided into three sub-elements: the prevailing culture; school leadership; and the school’s approach to accountability and development.

Facilitation – this is where one or more persons – for example, senior leaders, school research leads and champions – make things easier for others, and it includes the characteristics of facilitators: their role, style and skills. In this context, the facilitator’s role is to help people understand the change required and what needs to be done to bring it about.

Now each of these elements and sub-elements is placed on a low-to-high continuum, with Kitson et al. (1998) stating

‘…that for the implementation of research into practice to be successful there needs to be a clear understanding of the nature of the evidence being used, the quality of context in terms of its ability to cope with change and type of facilitation needed to ensure a successful change process’ (p152)

For each of these three elements a range of conditions may exist which indicate the likelihood of the successful implementation of research and evidence-based practices. These conditions are illustrated in Figure 1.

Figure 1 Conditions for evidence, context and facilitation

Likelihood of success of implementing change: low → high

Evidence

Academic/scientific research
High: systematic reviews; randomised controlled trials

Practitioner expertise
Low: expert opinion divided; several ‘camps’
High: high level of consensus; consistency of view of the value of evidence

Organisational data
Low: little detailed data available; data comes from a restricted number of sources
High: high level of detailed quantitative data available; data available from multiple sources

Stakeholder views
Low: stakeholders not involved
High: partnerships with full range of stakeholders

Context

Culture
Low: task driven; low regard for individuals; low morale; little or no CPD; focus on the immediate
High: learning school; pupil centred; values people; focus on CPD; focus on capacity and capability building

Leadership
Low: lack of vision; diffuse roles; lack of team roles; poor leadership; poor organisation or management of the school
High: clarity of vision; clear roles; effective team work; effective organisational structures; clear leadership

Accountability and development
Low: absence of audit and feedback, peer reviews, performance review and external evaluation
High: internal measures regularly used; audit or feedback used routinely; peer review; external measures

Facilitation

Support structures
Low: not in place; lack of journal clubs/research learning communities; no guidance on processes; no external partnerships with research schools or HEIs; time not made available
High: part of routine processes; journal clubs and research learning communities embedded; clear guidance on processes; partnerships with research schools and HEIs; dedicated and ring-fenced time available

Position in school
Low: change agenda
High: change agenda successfully negotiated

Style and skills
High: range of style and flexibility; consistent and appropriate presence and support

Adapted from Kitson et al. (1998), Figure 3, p. 151

As such, those schools with the greatest chance of successfully implementing evidence-based practice and associated innovations would appear to be predominantly on the right-hand, 'high' side of the continuum, whereas those schools with significant work to do to increase their chances of successfully implementing evidence-based practice would have features primarily located on the left-hand side of the continuum.

Now having undertaken an initial assessment of your school’s readiness to use research and evidence, try to plot where you and your school are on the following evaluative grid. In doing so, you are going to focus on your evaluation of the evidence and context elements of the PARiHS model.

Figure 2 The PARiHS Diagnostic and Evaluative Grid - adapted from Kitson et al., 2008

In Figure 2, Q1 represents a school which has a weak context, though strong evidence on which decision-makers and the main stakeholders can agree. Q2 represents the ideal situation for the implementation of evidence-based practice, where there is a strong, supportive school context and agreement on the strength of the evidence available. Q3 is where there is weak context and weak evidence; a school in this quadrant is not well placed to take advantage of any agreement about the strength of the evidence. Finally, Q4 represents a situation where the school has a strong context but where there is little or no agreement about the strength of the evidence available to bring about changes in practice.
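For readers who find a rule easier to scan than a diagram, the quadrants above can be expressed as a simple lookup over the two axes. This is an illustrative sketch only; the function name and labels are mine, not part of the published framework.

```python
# Illustrative sketch: mapping the two PARiHS axes (strength of evidence,
# strength of context) to the Q1-Q4 quadrants described in the text.
def parihs_quadrant(evidence_strong, context_strong):
    if evidence_strong and not context_strong:
        return "Q1: strong evidence, weak context"
    if evidence_strong and context_strong:
        return "Q2: strong evidence, strong context - ideal for implementation"
    if not evidence_strong and not context_strong:
        return "Q3: weak evidence, weak context"
    return "Q4: weak evidence, strong context"

print(parihs_quadrant(evidence_strong=True, context_strong=False))
# -> Q1: strong evidence, weak context
```

Plotting your own school is then a matter of judging each axis honestly against the conditions in Figure 1.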

So what does this mean for the facilitation of evidence-based school leadership?  Drawing on the work of Greenhalgh (2017), if a primary concern is that colleagues are not aware of the research evidence available to them as teachers and school leaders, a priority for you as the school research lead/champion may be to help individuals gain a greater awareness of the available evidence and how to evaluate it.  Alternatively, if there is a recognition that the school context is weak, there may be a greater focus on putting in place enabling conditions - such as focussing on pupil and staff learning, ensuring individuals have clear roles and responsibilities, and providing an appropriate organisational framework, such as journal clubs or time for evidence-based CPD.

However, whenever we look at a conceptual model or heuristic we need to ask whether there is robust evidence demonstrating the efficacy of the approach. For as Greenhalgh (2017) notes in the context of healthcare, there are no studies of how PARiHS has been used as the original authors intended; indeed, studies of PARiHS have tended to use the framework to look back at what had been done. That said, as Greenhalgh notes, PARiHS has what she calls 'face validity' - that is, it seems intuitively sensible that evidence, context and facilitation are all key to implementation. Furthermore, at a broad-brush level, the PARiHS framework is sufficiently flexible to allow its application to many different situations and examples (Greenhalgh, 2017).

And finally

You may wish to use the PARiHS framework as an initial diagnostic which captures your school’s readiness to engage with the implementation of research and evidence-based practice. If it works for you, fantastic. If not, there may be other models which work in your context, such as the NFER self-assessment tool.

Thursday, 28 September 2017

Teachers and research use - Guest post by Professor Chris Brown

As avid readers of this blog will know, evidence informed practice (EIP) is considered to be the hallmark of high performing education systems (Furlong, 2014), and is also regarded by many as a prerequisite for effective teaching and learning (OECD et al., 2016). It is also suggested (e.g. OECD, 2016) that optimal forms of EIP involve teachers collaborating to use research to address school priorities, where these priorities coincide with the day to day realities of the classroom (e.g. teachers' use of research to improve pupil behavior or pedagogy). At the same time, little research has been undertaken into how this optimal situation can be achieved (Cain, 2015).

Recent work examining the use of research by teachers (e.g. Brown and Zhang, 2016; Stoll and Brown, 2015) suggests that it is possible to characterize teachers’ EIP behaviors according to a combination of their attitudes towards using research for school improvement and their actual engagement with research. Using teachers’ research-use attitudes and engagement as the axes of a 2 x 2 matrix (see Figure 1) means we can begin to think about four evidence-use ‘types’: type 1 represents teachers working collaboratively using research to address school improvement priorities; type 2 teachers are those willing to work collaboratively to engage with research, but who lack the skills/experience required; type 3 teachers are those who work individually, using research to tackle ‘local’ issues of teaching and learning; and finally, type 4 teachers reject any form of research use.
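The 2 x 2 matrix just described can be sketched as a simple classification over the two axes. This is a hypothetical illustration of my own; the labels paraphrase the four types above rather than quoting Brown and Zhang's wording.

```python
# Illustrative sketch of the four teacher evidence-use types, keyed on
# (positive attitude towards research use for school improvement,
#  actual engagement with research). Labels paraphrase the text above.
def evidence_use_type(positive_attitude, engages_with_research):
    if positive_attitude and engages_with_research:
        return "Type 1: collaborative research use for school improvement"
    if positive_attitude and not engages_with_research:
        return "Type 2: willing to collaborate, but lacks skills/experience"
    if not positive_attitude and engages_with_research:
        return "Type 3: individual research use on 'local' classroom issues"
    return "Type 4: rejects any form of research use"

print(evidence_use_type(positive_attitude=True, engages_with_research=True))
# -> Type 1: collaborative research use for school improvement
```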

Figure 1: teacher evidence use types

Recently I worked with a federation of three small Church Infant Schools based in south Hampshire to develop a model of professional learning where four of the statutory staff professional development (INSET) days allocated to schools in England are dedicated solely to evidence-informed professional development. Using a cycle of enquiry, the approach aims to enable teachers to engage collaboratively with research, develop new practices, trial these practices, measure their impact and then roll out the most successful within and across schools in the Federation. To help design the approach I wanted to assess the context for the model using the teacher research types set out above, which I did via teacher interviews.

As well as examining which teachers were situated where, interviewing them also meant I could undertake a ‘cross case’ approach to explore the factors that informed their attitudes towards research use for school improvement as well as their engagement with research. Beginning with the latter, it would seem that key to driving teachers’ actual use of research is their first hand engagement with it. For instance, those teachers who recognized the benefits of using research were typically those who had recently completed postgraduate study.

A fundamental part of what drives positive attitudes towards research use for school improvement, on the other hand, is the extent to which the use of research is perceived as being something that should extend beyond the local setting: in other words, teachers’ collaborative and networked orientations (e.g. their use of learning conversations and networked learning conversations) and the extent to which evidence-use signifies not just a tool, but something that leads to 21st century teaching and learning within what the OECD refers to as ‘learning organisations’ (OECD, 2016). Related to this was teachers' recognition that senior leaders were encouraging of the EIP agenda and, vitally, were also engaging in acts (such as timetabling) to enable networked collaboration. Where participants held negative attitudes towards research use for school improvement, they highlighted a lack of support to encourage them to engage in research use that involved collaboration with colleagues.

Overall, then, it would seem that to support type 1 perspectives, school leaders need to enable teachers to get their ‘hands dirty’, but they also need to ensure all teachers act more readily in the spirit of what Brown and Zhang (2016) refer to as the culture of the networked research learning community. It is this use of networks, in ways that can produce a multitude of benefits at a variety of levels, that is likely to be the key to unlocking the potential that the optimal rational position of EIP has to offer.

Brown, C., and Zhang, D. (2016) Is Engaging In Evidence-Informed Practice In Education Rational? Examining The Divergence Between Teachers’ Attitudes Towards Evidence Use And Their Actual Instances Of Evidence Use In Schools, British Educational Research Journal, 42, 5, pp. 780-801.

Cain, T. (2015) Teachers’ engagement with published research: addressing the knowledge problem, Curriculum Journal, 26, 3, pp. 488-509

Furlong, J. (2014) Research and the Teaching Profession: building capacity for a self-improving education system. Final report of the BERA-RSA Inquiry into the role of research in the teaching profession, (London, BERA).

OECD (2016) What makes a school a learning organization, available at:, accessed on 25 July, 2016.

See, B. H., Gorard, S. and Siddiqui, N. (2016) Teachers’ use of research evidence in practice: a pilot study of feedback to enhance learning, Educational Research, 58, 1, pp. 56-72.

Southworth, G. (2009) Learning centred leadership, in: B. Davies (Ed), The essentials of school leadership (2nd edn) (pp. 91–111) (London, Sage).

Stoll, L. and Brown, C. (2015) Middle leaders as catalysts for evidence-informed change, in C. Brown (Ed), Leading the use of Research & Evidence in schools (pp. 66-77) (London, IOE Press).


A sample of 15 pages of Chris's new book is available here: use EMERALD30 to get 30% off at