Mixed methods, mixed outcomes? Combining an RCT and case studies to research the impact of a training programme for primary school science teachers

Judith Bennett, Pamela Hanley, Ian Abrahams, Louise Elliott, Maria Turkenburg-van Diepen

Research output: Contribution to journal › Article

Abstract

A randomised controlled trial (RCT) and a series of case studies were used to determine the impact of two variants of an intervention (a professional development programme) aimed at improving primary school science teachers’ subject and pedagogic content knowledge, and enhancing their subject leadership ability. Ninety-six schools were randomly assigned to full or partial treatment groups or a ‘business-as-usual’ control group. Quantitative data
were collected from teachers and pupils through an assessment of scientific knowledge based on standardised assessment items. Qualitative data were collected through interviews and lesson observation initially in thirty case study schools. There were three data collection points: pre- and post-intervention, and one year later. Guskey’s (1986) Levels of Professional Development Evaluation model was used as the analysis framework (Guskey, T. (1986). Staff development and the process of teacher change. Educational Researcher, 15(5), 5–12). The quantitative data from the teachers’ subject knowledge assessment indicated that neither the full nor the partial training programme had a statistically significant impact on teachers’ performance. In contrast, the qualitative data suggested that many teachers in the full treatment group believed that their subject knowledge had improved and reported increased confidence in their teaching of science. Lesson observations provided corroborating evidence of change in teachers’
practice, and some modest evidence of wider change in schools. There was no statistically significant improvement in pupil performance in subject knowledge assessments when teachers had participated in the intervention. In the context of research methods, the study suggests that a mixed-methods approach to evaluation is likely to yield a more rounded and nuanced picture of the overall impact of an intervention.
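
The design summarised above is a three-arm cluster-randomised trial: whole schools, rather than individual teachers or pupils, were allocated to the full treatment, partial treatment, or ‘business-as-usual’ control group. The Python sketch below is an illustration only, since the paper does not publish its allocation procedure; the school identifiers, the fixed seed, and the simple round-robin allocation to near-equal arms are assumptions made for the example.

import random

def randomise_schools(school_ids, arms=("full", "partial", "control"), seed=2019):
    """Allocate whole schools (clusters) to treatment arms at random."""
    rng = random.Random(seed)          # fixed seed so the allocation is reproducible
    shuffled = list(school_ids)
    rng.shuffle(shuffled)
    # Deal the shuffled schools out round-robin, giving near-equal
    # arm sizes (32/32/32 for 96 schools).
    return {school: arms[i % len(arms)] for i, school in enumerate(shuffled)}

if __name__ == "__main__":
    schools = [f"school_{n:02d}" for n in range(1, 97)]   # 96 hypothetical schools
    allocation = randomise_schools(schools)
    for arm in ("full", "partial", "control"):
        print(arm, sum(1 for a in allocation.values() if a == arm))

In practice, allocation for a trial of this kind would typically be stratified or blocked (for example by school size or prior attainment); the sketch is only meant to make the three-arm, 96-school structure concrete.
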
Language: English
Pages: 490-509
Number of pages: 20
Journal: International Journal of Science Education
Volume: 41
Issue number: 4
Early online date: 7 Jan 2019
DOI: 10.1080/09500693.2018.1563729
Publication status: Published - 2019

Cite this

Bennett, Judith ; Hanley, Pamela ; Abrahams, Ian ; Elliott, Louise ; Turkenburg-van Diepen, Maria. / Mixed methods, mixed outcomes? Combining an RCT and case studies to research the impact of a training programme for primary school science teachers. In: International Journal of Science Education. 2019 ; Vol. 41, No. 4. pp. 490-509.
@article{769be709f3fb43b691a36c4d40c90b27,
title = "Mixed methods, mixed outcomes? Combining an RCT and case studies to research the impact of a training programme for primary school science teachers",
abstract = "A randomised controlled trial (RCT) and a series of case studies were used to determine the impact of two variants of an intervention (a professional development programme) aimed at improving primary school science teachers’ subject and pedagogic content knowledge, and enhancing their subject leadership ability. Ninety-six schools were randomly assigned to full or partial treatment groups or a ‘business-as-usual’ control group. Quantitative datawere collected from teachers and pupils through an assessment of scientific knowledge based on standardised assessment items. Qualitative data were collected through interviews and lesson observation initially in thirty case study schools. There were three data collection points: pre- and post-intervention, and one year later. [Guskey, T. (1986). Staff development and the process of teacher change. Educational Researcher, 15(5), 5–12.] Levels of ProfessionalDevelopment Evaluation model was used as the analysis framework. The quantitative data from the teachers’ subject knowledge assessment indicated neither the full nor the partial training programmes had a statistically significant impact on teachers’ performance. In contrast, the qualitative data suggested that many teachers in the full treatment group believed that their subject knowledge had improved and reported increased confidence in their teaching of science. Lesson observations provided corroborating evidence of change in teachers’practice, and some modest evidence of wider change in schools. There was no statistically significant improvement in pupil performance in subject knowledge assessments when teachers had participated in the intervention. In the context of research methods, the study suggests that a mixed-methods approach to evaluation is likely to yield a more rounded and nuanced picture of the overall impact of an intervention.",
keywords = "Mixed methods, randomised controlled trial, teacher professional development",
author = "Judith Bennett and Pamela Hanley and Ian Abrahams and Louise Elliott and {Turkenburg-van Diepen}, maria",
year = "2019",
doi = "10.1080/09500693.2018.1563729",
language = "English",
volume = "41",
pages = "490--509",
journal = "International Journal of Science Education",
issn = "0950-0693",
publisher = "Taylor and Francis Ltd.",
number = "4",

}

Mixed methods, mixed outcomes? Combining an RCT and case studies to research the impact of a training programme for primary school science teachers. / Bennett, Judith; Hanley, Pamela; Abrahams, Ian; Elliott, Louise; Turkenburg-van Diepen, Maria.

In: International Journal of Science Education, Vol. 41, No. 4, 2019, p. 490-509.

Research output: Contribution to journal › Article

TY - JOUR

T1 - Mixed methods, mixed outcomes? Combining an RCT and case studies to research the impact of a training programme for primary school science teachers

AU - Bennett, Judith

AU - Hanley, Pamela

AU - Abrahams, Ian

AU - Elliott, Louise

AU - Turkenburg-van Diepen, Maria

PY - 2019

Y1 - 2019

N2 - A randomised controlled trial (RCT) and a series of case studies were used to determine the impact of two variants of an intervention (a professional development programme) aimed at improving primary school science teachers’ subject and pedagogic content knowledge, and enhancing their subject leadership ability. Ninety-six schools were randomly assigned to full or partial treatment groups or a ‘business-as-usual’ control group. Quantitative data were collected from teachers and pupils through an assessment of scientific knowledge based on standardised assessment items. Qualitative data were collected through interviews and lesson observation initially in thirty case study schools. There were three data collection points: pre- and post-intervention, and one year later. Guskey’s (1986) Levels of Professional Development Evaluation model was used as the analysis framework (Guskey, T. (1986). Staff development and the process of teacher change. Educational Researcher, 15(5), 5–12). The quantitative data from the teachers’ subject knowledge assessment indicated that neither the full nor the partial training programme had a statistically significant impact on teachers’ performance. In contrast, the qualitative data suggested that many teachers in the full treatment group believed that their subject knowledge had improved and reported increased confidence in their teaching of science. Lesson observations provided corroborating evidence of change in teachers’ practice, and some modest evidence of wider change in schools. There was no statistically significant improvement in pupil performance in subject knowledge assessments when teachers had participated in the intervention. In the context of research methods, the study suggests that a mixed-methods approach to evaluation is likely to yield a more rounded and nuanced picture of the overall impact of an intervention.

AB - A randomised controlled trial (RCT) and a series of case studies were used to determine the impact of two variants of an intervention (a professional development programme) aimed at improving primary school science teachers’ subject and pedagogic content knowledge, and enhancing their subject leadership ability. Ninety-six schools were randomly assigned to full or partial treatment groups or a ‘business-as-usual’ control group. Quantitative data were collected from teachers and pupils through an assessment of scientific knowledge based on standardised assessment items. Qualitative data were collected through interviews and lesson observation initially in thirty case study schools. There were three data collection points: pre- and post-intervention, and one year later. Guskey’s (1986) Levels of Professional Development Evaluation model was used as the analysis framework (Guskey, T. (1986). Staff development and the process of teacher change. Educational Researcher, 15(5), 5–12). The quantitative data from the teachers’ subject knowledge assessment indicated that neither the full nor the partial training programme had a statistically significant impact on teachers’ performance. In contrast, the qualitative data suggested that many teachers in the full treatment group believed that their subject knowledge had improved and reported increased confidence in their teaching of science. Lesson observations provided corroborating evidence of change in teachers’ practice, and some modest evidence of wider change in schools. There was no statistically significant improvement in pupil performance in subject knowledge assessments when teachers had participated in the intervention. In the context of research methods, the study suggests that a mixed-methods approach to evaluation is likely to yield a more rounded and nuanced picture of the overall impact of an intervention.

KW - Mixed methods

KW - randomised controlled trial

KW - teacher professional development

UR - http://www.scopus.com/inward/record.url?scp=85059678636&partnerID=8YFLogxK

U2 - 10.1080/09500693.2018.1563729

DO - 10.1080/09500693.2018.1563729

M3 - Article

VL - 41

SP - 490

EP - 509

JO - International Journal of Science Education

T2 - International Journal of Science Education

JF - International Journal of Science Education

SN - 0950-0693

IS - 4

ER -