Drawing-Based Evaluation of Ecosystem Knowledge as an Alternative Assessment

Dentzau, M. W., & Martínez, A. J. G. (2016). The development and validation of an alternative assessment to measure changes in understanding of the longleaf pine ecosystem. Environmental Education Research, 22, 129-152.

Many informal environmental education programs lack protocols for assessing their participants' knowledge. When assessment does occur, it typically relies on conventional multiple-choice, true-or-false, and short-answer questions. The researchers in this study investigated an alternative, drawing-based method of assessing student understanding.

The study took place at an informal environmental education facility, The E.O. Wilson Biophilia Center at Nokuse Plantation (“The Center”), which focuses on re-establishing the longleaf pine ecosystem in the Florida Panhandle and educating visitors about biodiversity, balanced ecosystems, conservation, preservation, and restoration. The Center and the school districts whose students attend Center programs sought an alternative to conventional knowledge assessments.

The Center chose a drawing-based assessment because staff perceived it to be minimally intrusive, easy to administer and score, and enjoyable for participants. Existing research on the efficacy of drawings as learning assessment tools is mixed, however: some studies find a link between drawings and student understanding, while others find none. The researchers therefore sought to determine whether drawings could effectively measure changes in fourth-grade (9–10 years old) students' understanding of the local ecosystem after they participated in programs at The Center. As part of that effort, they also aimed to develop a reliable rubric for scoring the drawing-based assessment.

Over two years, the researchers collected pre- and post-program drawings from fourth-grade students who attended multiple days of programming at The Center. The assessment, called Draw a Longleaf Pine Forest Ecosystem (D-LLPFE), asked students to draw what they thought the longleaf pine forest ecosystem looked like in northern Florida, and it was designed to measure understanding of the ecosystem's ecologically important aspects. Classroom teachers administered it before students attended The Center's program and again after the program. In the first year, the researchers collected drawings from 205 students who attended the program for two days and from 201 students who attended for five days; that year, they did not provide a script explaining the activity. In the second year, they collected drawings from 293 students who attended the program for four days and provided both a script and drawing paper.

Using pilot results and observed trends, the researchers created a scoring rubric built around the key ecological features of the ecosystem that The Center's program emphasized. The rubric included 20 items grouped under Fauna, Flora, Ecosystem Diversity, and Characteristics Specific to Longleaf Pine and Forest Processes; the highest possible score was 20 points. The researchers refined the rubric through several iterations with independent science educators, examined inter-rater reliability, and collected evidence of the assessment's content validity.
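
The rubric itself is not reproduced in this summary, but the scoring logic it describes (one point per depicted item, 20 items, a 20-point maximum, scores checked across raters) can be sketched as follows. The item names, ratings, and the simple percent-agreement measure below are illustrative assumptions, not the D-LLPFE items or the authors' inter-rater statistic.

```python
# Illustrative sketch only: hypothetical item names and made-up ratings,
# not the D-LLPFE rubric items or the authors' scoring procedure.

# Each drawing is scored on binary items (present = 1, absent = 0); in the
# actual rubric there are 20 items, so totals range from 0 to 20.
EXAMPLE_ITEMS = [
    "fauna_item_1", "fauna_item_2",      # Fauna (placeholder names)
    "flora_item_1", "flora_item_2",      # Flora (placeholder names)
    "diversity_item_1",                  # Ecosystem Diversity (placeholder)
    "longleaf_specific_item_1",          # Longleaf Pine / Forest Processes (placeholder)
]

def rubric_score(ratings: dict) -> int:
    """Sum the 0/1 item ratings for one drawing."""
    return sum(ratings.values())

def percent_agreement(rater_a: dict, rater_b: dict) -> float:
    """Simple inter-rater agreement: share of items both raters scored identically."""
    matches = sum(1 for item in rater_a if rater_a[item] == rater_b[item])
    return matches / len(rater_a)

# Hypothetical ratings of one drawing by two independent scorers.
rater_a = {item: 1 for item in EXAMPLE_ITEMS}
rater_b = dict(rater_a, flora_item_2=0)   # the second rater disagrees on one item

print(rubric_score(rater_a))                 # 6: all six example items present
print(percent_agreement(rater_a, rater_b))   # ~0.83: agreement on five of six items
```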

Analyses of the first-year data showed improvement from pre-test to post-test in students' portrayal of important aspects of the longleaf pine ecosystem for both the two-day and five-day groups. Based on the first-year results, the researchers modified four rubric items for the second year. Analyses of the second-year data likewise showed an increase from pre-test to post-test in the depiction of important ecosystem components; comparing pre- and post-program drawings, the researchers found a statistically significant increase of 11.7% in average scores. They found no significant differences in scores based on students' gender or ethnicity.
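
This summary reports the size and significance of the pre/post difference but not the statistical test behind it. The sketch below shows one common way such a comparison could be made, a paired t-test on each student's pre- and post-program rubric totals; the scores, the choice of test, and the percentage calculation are illustrative assumptions, not the authors' analysis.

```python
# Illustrative sketch only: the scores below are invented, and this summary does
# not state which statistical test the authors used. A paired t-test is one
# common way to compare pre- and post-program scores for the same students.
from scipy import stats

pre_scores = [3, 5, 4, 6, 2, 5, 4, 3, 6, 5]    # hypothetical pre-program rubric totals (0-20 scale)
post_scores = [5, 7, 6, 8, 4, 6, 5, 6, 9, 7]   # hypothetical post-program rubric totals (0-20 scale)

result = stats.ttest_rel(post_scores, pre_scores)

mean_pre = sum(pre_scores) / len(pre_scores)
mean_post = sum(post_scores) / len(post_scores)
gain = (mean_post - mean_pre) / 20 * 100       # mean gain as a share of the 20-point maximum

print(f"t = {result.statistic:.2f}, p = {result.pvalue:.4f}, gain = {gain:.1f} percentage points")
```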

The researchers pointed out several important limitations of the study: restrictions on drawing size and the time allowed for the task, differences in how teachers introduced the task, and students' desire to “make it look nice” may all have influenced whether certain ecosystem components were left out of the drawings. In the second year, however, the researchers mitigated some of those limitations by providing a standard assessment prompt and an explanatory script. They also interviewed 41 second-year students, asking clarifying questions about their drawings. Although the interviews provided some evidence of the assessment's validity, the students' responses also pointed to remaining limitations, such as time restrictions and differences in drawing ability.

The Bottom Line

A drawing-based assessment, paired with a scoring rubric, may offer a reliable, valid, and useful alternative assessment option for particular purposes and contexts, such as the one in this study: assessing fourth-graders' learning about the key ecological features of a specific ecosystem. Drawings may be especially well suited to informal educational settings because they are enjoyable and minimally intrusive and, the authors argue, easy to administer and score. In this program, comparing students' pre- and post-participation drawings helped assess changes in ecological understanding. Even so, evidence of an assessment's reliability and validity is important to gather, particularly if the data will be used for high-stakes or evaluative purposes.