Funders in a growing number of fields are placing increasing emphasis on evidence-based practice (EBP) when deciding which kinds of programs to fund. The author of this paper, who works for Outward Bound Canada, questions whether this trend is appropriate in the realm of adventure education and therapy. Broadly, evidence-based practice refers to a way of managing, and especially funding, programs based on research results that demonstrate their effectiveness. Evidence-based medicine, social work, and health services are now common. In this system, some research results carry more weight than others: Anecdotal evidence, testimonials, and personal communications rank lowest. Next come observational studies, interviews, qualitative studies, and expert opinions, among other research approaches. The next tier includes quasi-experimental designs, cohort-controlled studies, and case-control studies. The highest ranking goes to randomized controlled trials (RCTs) and other true experimental designs.
The author questions whether this emphasis on experimental designs for evaluating programs is the best way to understand how effective outdoor education and therapy programs are. He believes that “EBP's value system suggests most published forms of evidence gained from non-RCT methodologies, along with practitioner knowledge and the collective history of a field, are treated as near irrelevant, effectively ignoring alternative knowledge claims.” He cites others who have criticized the shift toward EBP in medicine on the grounds that it ignores physician knowledge, basic research, and the field's shared knowledge.
Another potential problem with the EBP approach is that a funder will typically designate a program that has demonstrated success through experimental methods as a “model” program that other programs should follow in order to receive funding. The author cautions that this kind of approach could lead the field to over-replicate a single model, crowding out creative new and different approaches. He also points to research finding that some model programs in substance abuse, including the D.A.R.E. program, were based on tenuous scientific conclusions. Yet these programs were held up as models, and many others were built on those early, questionable research results.
The author expresses deep concern that a move toward EBP “may compromise the development of a meaningful and inclusive research agenda in adventure education and therapy.” He urges program managers and researchers to think carefully about the extent to which the field should adopt the EBP approach. “There is no doubt that research in experiential education can be improved and that flirting with the EBP paradigm will move researchers to pursue more rigorous research designs, regardless of methodology. I simply recommend proceeding with caution.”
The Bottom Line
Many fields are moving toward evidence-based practice (EBP), a system in which programs are funded according to their demonstrated effectiveness. In this system, the most highly regarded evidence of effectiveness comes from rigorous, randomized, controlled experiments. But the author of this paper argues that this emphasis on controlled experiments may not be entirely appropriate for outdoor adventure education and therapy, because it could devalue the perspectives of outdoor leaders and participants. He also raises concerns about the consequences of replicating only those programs that have demonstrated success through controlled experimental methods. Although most researchers and practitioners agree that greater rigor in evaluating experiential programs is desirable, fully embracing the EBP approach could create problems for the field, including the neglect of qualitative, practice-based evidence and alternative knowledge systems.