Not all evidence is good evidence

We have a growing problem in evidence-based education. To the credit of many, especially the classroom teachers who have driven this and organisations such as ResearchEd, we have seen growing interest in evidence-based practice in education. However, the growing popularity of the movement is leading to a bandwagon-effect, in which everyone and every organisation in education is starting to claim to be evidence-based. In some cases this may be a valid attempt to make sure evidence is looked at in a sufficiently broad way, but too often it is leading us to a situation in which any old evidence will do as long as it supports an argument, however out-of-date it is. This is not good enough. Not all evidence is created equal, and one of the aims of the original evidence-based movement was precisely to improve the quality and use of evidence in education, to move us on from the practices that brought us learning styles and brain gym (e.g. Goldacre, 2013).

Discovery learning and the US science standards

In a new article, Zhang, Kirschner, Cobern & Sweller (2021) express some well-founded concerns about the ‘Next Generation Science Standards’ in the US, and the evidence that underpins them.

Essentially, the standards continue the prevailing direction of travel in US science education (as codified by earlier standards in the 1990s), which is to strongly promote inquiry-based pedagogy (now morphed into so-called ‘scientific practice’). This is the familiar idea that we learn science by engaging in inquiry, doing experiments and learning to ‘think like a scientist’. As Zhang et al. point out, this contradicts what we know about how pupils actually learn, and the important distinction between novices and experts in this learning. Expertise is built on knowledge, rather than just on a set of procedures or ways of thinking, as the concept of ‘thinking like a scientist’ implies. This role of knowledge has been confirmed across areas of expertise. The example of chess players is of course well-known, but knowledge has been found to underlie expertise in areas as diverse as wine tasting (Hughson & Boakes, 2002), hospitality management (Wilson-Wünsch et al., 2015), nursing (McNamara & Fealy, 2014) and engineering (Hanrahan, 2014).

As Zhang et al. argue, this approach also goes against a large number of studies showing that explicit instruction outperforms inquiry-based approaches (Ashman, 2018), and against recent analyses of PISA results specifically on science practice (e.g. Jerrim et al., 2019).


Yet still the proponents of the new US standards claim to be evidence-based. How come? As Zhang et al. (2021) show, this is due to reliance on a set of evaluation studies looking at the impact of specific programmes. These evaluations often fail to use a valid control group, or to partial out the impact of the different components of these often broad interventions.

This is clearly highly problematic. But why should we be concerned about what is happening in the US? Well, due to the outsized influence of US practice, it is more than likely that what happens there will influence us. When it rains in Washington, it starts drizzling in London. Moreover, we are seeing similar moves from some quarters here.

In the UK, many organisations are still promoting approaches based on inquiry learning in science, often striving to improve ‘engagement’ or to counter the declining science scores in PISA (this notwithstanding the clear evidence that the main reason for the latter is the abolition of KS2 Science SATs). Influential organisations like the Wellcome Trust appear to be following this strategy.

Quality matters

Being evidence-based means attending to the quality and breadth of the research we use. What is the current state of the art, how much evidence is there, and how have the studies been conducted? Of course, this is a challenge, and no one can be expected to look at all possible research in depth or to understand the intricacies of all methods. What we should do, however, is interrogate the quality of research people claim as evidence, and expect them to be able to warrant their claims that this is indeed valid and reliable research. We need to look at the credibility of sources. We also need to be sure that what we are told reflects what we know about the fundamental processes of learning. After all, if it doesn’t, how exactly is it supposed to work in the first place?


Ashman, G. (2018). The truth about teaching: An evidence-based guide for new teachers. London: Sage Publications.

Goldacre, B. (2013). Building evidence into education. London: Department for Education.

Hanrahan, H. (2014). The evolution of engineering knowledge. In: Young, M. & Muller, J. (eds.), Knowledge, expertise and the professions. Milton Park: Routledge, p. 109-127.

Hughson, A. & Boakes, R. (2002). The knowing nose: the role of knowledge in wine expertise. Food Quality and Preference, 13(7–8), 463–472.

Jerrim, J., Oliver, M., & Sims, S. (2019). The relationship between inquiry-based teaching and students’ achievement. New evidence from a longitudinal PISA study in England. Learning and Instruction, 61 (1), 35–44.

McNamara, M. & Fealy, G. (2014). Knowledge matters in nursing. In: Young, M. & Muller, J. (eds.), Knowledge, expertise and the professions. Milton Park: Routledge, p.157-170.

Wilson-Wünsch, B., Beausaert, S., Tempelaar, D. & Gijselaers, W. (2015). The making of hospitality managers: the role of knowledge in the development of expertise. Journal of Human Resources in Hospitality & Tourism, 14(2), 153–176.

Zhang, L., Kirschner, P., Cobern, W. & Sweller, J. (2021). There is an evidence crisis in science educational policy. Educational Psychology Review.
