Title: Measuring Critical Thinking: Results from an Art Museum Field Trip Experiment
Author(s): Brian Kisida, Daniel H. Bowen and Jay P. Greene
Publisher: Journal of Research on Educational Effectiveness
Year: 2015
URL: http://dx.doi.org/10.1080/19345747.2015.1086915
Topics: arts education, museums, field trip, visual arts, critical thinking
Methods: randomized controlled trial involving 8,000 elementary-, middle-, and high-school students assigned by lottery to attend a field trip and facilitated tour of the Crystal Bridges Museum of American Art in Bentonville, Arkansas. Researchers collected demographic information on the students and conducted a textual analysis of essays the students wrote after the field trip in response to an image of a work of art. The essays were coded using a critical-thinking assessment rubric developed by the US Department of Education.
What it says: This study validates and expands upon the results of the authors’ 2013 Crystal Bridges study. The museum field trip was led by trained museum docents who facilitated open-ended, student-led discussion about artworks in the collection. Following the field trip, students completed surveys on their demographics, prior art consumption and production, knowledge of art, and attitudes toward cultural institutions. Students were also shown an image of a painting that was not part of the Crystal Bridges collection and given five minutes to write an essay describing what was going on in the painting and what they saw that led them to that conclusion. In the first semester of the experiment, as discussed in the 2013 study, students were shown a representational work of art. Students participating in the study’s second semester were shown an abstract work of art.
All students assigned to the treatment group demonstrated stronger critical-thinking skills in their essays than those in the control group. However, across the board, some aspects of critical thinking as measured by the seven-section rubric were more evident than others, and measurements were not consistent between student responses to representational and abstract pieces. Specifically, students responding to the representational painting offered many observations and interpretations in their written responses, whereas responses to the abstract piece were heavy on observation and light on interpretation. Instances of “problem finding,” “flexible thinking,” and “comparisons” were also less likely in response to the abstract work.
However, as reported in the 2013 study, a relatively modest “dose” of arts education – a single visit to the visual arts museum – produced a significant effect in the treatment group. Many of the students had never attended a school-based field trip before, and the authors note that students who reported prior exposure to arts education – including non-visual arts education – displayed stronger critical-thinking outcomes in general than students who reported little or no arts exposure. Female students and students from larger communities also scored higher on the critical-thinking rubric. Interestingly, students attending Title I schools showed significantly higher critical-thinking outcomes than their more affluent counterparts when responding to the representational artwork, but the differences were less pronounced for the groups responding to the abstract work.
What I think about it: The 2013 Crystal Bridges study was rightly lauded for its scale, clarity, and thoughtfulness. This 2015 follow-up continues in that mold. Randomized controlled trials such as this one are considered a gold standard for research, and the high level of inter-rater reliability among the researchers coding the student essays – who were not aware of any student characteristics, including whether the students were in the treatment or control group – leaves little to fault in the study’s design. There are limitations, of course: there is no way to know whether the effects on the treatment group last over time, for example, or whether they would remain consistent in an urban area that afforded residents more cultural opportunities. The difference in student responses to the abstract versus representational works of art also raises questions about the depth of conclusions to be drawn. Students were only given five minutes to write their essays, so the fact that they primarily stuck to observations and interpretations isn’t surprising; nor is it illogical that students working with the abstract piece offered fewer interpretations and more observations about the work. It would be interesting to see how the responses would have evolved if students were given more time to work on them. It would also be useful to know which elements of critical thinking were on display during the treatment group discussions at the museum. According to the authors: “The goal of the museum educators was to facilitate an open-ended, student-centered approach to discuss the works of art, encourage a deep level of engagement, and motivate students to seek out their own unique interpretations.” The extent to which students accomplished this, and the balance of observation versus interpretation in those discussions, may have shaped their ability to respond to the essay prompt in a short amount of time.
Another question emerges: how much of the impact on students’ critical thinking had to do with the field trip itself, and how much had to do with the arts-based nature of the experience? The authors note that “this research does not establish which components of the art museum experience were essential for increases in critical-thinking skills, or if these same effects could be generated from school-based arts exposure.” I wonder whether some of the effective components had nothing to do with the arts at all. If students were guided to discuss a representational photograph, or to observe an environment for a science class, would such observational practice lead to similar results? And how much, if any, of the critical-thinking gains exhibited in this study might transfer to other activities?
What it all means: Not many randomized controlled trials take place in arts education, so this one is heartening; perhaps it will serve as inspiration to other researchers interested not only in the impact of the arts on students, but in how critical-thinking skills are cultivated in the first place. Despite its scale, the study leaves several questions unanswered. It does confirm that, in the short term, students who participated in a field trip to the Crystal Bridges Museum were able to respond to works of art in a more robust way than students who did not. As with the first Crystal Bridges study, the fact that this effect is most pronounced for Title I students examining representational work seems worthy of further examination.