Scientists spend considerable time critically analyzing ideas before publishing work or proclaiming findings. Students of science need guided practice in this skill, especially when the principles under study are complex. Lombardi, Brandt, Bickel, and Burg's 2016 article highlights why this matters: students need to evaluate claims in controversial socio-scientific texts because it is there that the knowledge gap between layperson and expert appears especially wide and multiple, often opposing, perspectives compete for recognition.
By providing competing accounts of climate change, the research team sought to identify the types of evaluations seventh-grade students made when considering these differing accounts during classroom instruction. They also hoped to determine how those evaluations related to students' statements of reasonableness, which they call plausibility judgments, after instruction.
The 85 participants were seventh-grade students at a single middle school, all enrolled in Earth Science. The students used a Model-Evidence Link (MEL) diagram to connect textual evidence to two models: one positing human factors and the resulting increase in greenhouse gases as the cause of climate change, and the other positing an increase in energy from the sun. Students then wrote statements explaining these connections. Lombardi and colleagues analyzed the written responses by identifying common categories across the samples and then ranking those categories by level of scientific understanding. Additionally, students completed several measures of their plausibility judgments about the two proposed causes of climate change (human factors or solar irradiance) and a third measure of climate change knowledge, with this last measure serving as the indicator of post-instructional knowledge.
Four categories of responses emerged from the analysis of students' written responses. The first, erroneous evaluation, included responses in which students did not demonstrate scientific understanding. The second, descriptive evaluation, was characterized by a superficial connection between evidence and models. Relational evaluations showed more elaboration, but the connections were often limited to surface similarities in the text. The final category, critical evaluation, was characterized by causal relationships and detailed reasoning. The research team determined that students' plausibility judgment scores and evaluation scores predicted post-instructional knowledge. In particular, students who were able to link evidence that contradicted a model earned higher evaluation scores. The authors suggest that science educators use this finding by emphasizing contradictory evidence as a way to enhance students' understanding of science content and of the process of evaluation.
These findings suggest that students grow as scientific thinkers when use of the MEL Diagram is combined with the expectation that they write evaluative statements about how the evidence links to each model, in this case shifting their judgments toward the scientific model that human factors contribute to climate change. The students in this study used the MEL Diagram for only 90 instructional minutes; they would likely continue to refine their evaluation skills and plausibility judgments if such activities were integrated throughout the curriculum. Further, scientists often work collaboratively, yet these students worked individually. A follow-up study might examine how the social dynamics of scientific argumentation affect students' levels of evaluation.
The overall importance of this study lies in guiding students to consider evidence and to critically analyze what they read, a skill that applies across subject areas and outside of school. Armed with these skills, students can engage in more systematic and rational analysis of information, whether they are in a classroom, interacting on social media, or watching television. This critical consumption of information benefits both students and the overall progress of knowledge.
Lombardi, D., Brandt, C. B., Bickel, E. S., & Burg, C. (2016). Students' evaluations about climate change. International Journal of Science Education, 38, 1392-1414. doi:10.1080/09500693.2016.1193912
By Jessica Vandenberg
Educational Psychology Doctoral Student, NC State