Page 12: References & Additional Resources
To cite this module, please use the following:
The IRIS Center. (2010). Fidelity of implementation: Selecting and implementing evidence-based practices and programs. Retrieved from https://iris.peabody.vanderbilt.edu/module/fid/
Barr, J. E., Tubman, J. G., Montgomery, M. J., & Soza-Vento, R. M. (2002). Amenability and implementation in secondary school antitobacco programs. American Journal of Health Behavior, 26(1), 3–15. Retrieved from http://www.atypon-link.com/PNG/doi/pdf/10.5555/ajhb.2002.26.1.3?cookieset=1
Buher-Kane, J., Peter, N., & Kinnevy, S. (2005). Building an evaluation tool kit for professional development. The Evaluation Exchange, 11(4). Retrieved on October 30, 2009, from http://www.hfrp.org/evaluation/the-evaluation-exchange/issue-archive/professional-development/building-an-evaluation-tool-kit-for-professional-development
California Department of Education. (2007). Getting Results fact sheet: What does Getting Results say about implementing programs with fidelity? Retrieved on October 30, 2009, from http://www.gettingresults.org/c/@QcSm_DdsuAAVA/Pages/[email protected]
Carroll, C., Patterson, M., Wood, S., Booth, A., Rick, J., & Balain, S. (2007, November). A conceptual framework for implementation fidelity. Implementation Science, 2(40). Retrieved on October 30, 2009, from http://www.implementationscience.com/content/2/1/40
Coalition for Evidence-Based Policy. (2003, December). Identifying and implementing education practices supported by rigorous evidence: A user friendly guide. Retrieved on October 30, 2009, from http://www.ed.gov/rschstat/research/pubs/rigorousevid/rigorousevid.pdf
Council for Exceptional Children. (2009). Council for Exceptional Children’s questions & answers: How the American Recovery and Reinvestment Act impacts special education and early intervention. Retrieved on October 30, 2009, from http://www.cec.sped.org/Content/NavigationMenu/PolicyAdvocacy/CECPolicyResources/EconomicStimulus/Stimulus_Q_A.htm
Dane, A. V., & Schneider, B. H. (1998). Program integrity in primary and early secondary prevention: Are implementation effects out of control? Clinical Psychology Review, 18(1), 23–45.
Dufrene, B. A., Noell, G. H., Gilbertson, D. N., & Duhon, G. J. (2005). Monitoring implementation of reciprocal peer tutoring: Identifying and intervening with students who do not maintain accurate implementation. School Psychology Review, 34(1), 74–86.
Durlak, J. A. (1998). Why program implementation is important. Journal of Prevention & Intervention in the Community, 17(2), 5–18.
Durlak, J. A., & DuPre, E. P. (2008). Implementation matters: A review of the research on the influence of implementation on program outcomes and the factors affecting implementation. American Journal of Community Psychology, 41, 327–350.
Dusenbury, L., Brannigan, R., Falco, M., & Hansen, W. B. (2003). A review of research on fidelity of implementation: Implications for drug use prevention in school settings. Health Education Research, 18(2), 237–256.
Engelmann, S., Osborn, S., & Hanner, S. (1997). Corrective reading comprehension skills: Comprehension B2. Chicago: SRA.
Fixsen, D. L., & Blase, K. L. (2009). Implementation: The missing link between research and practice. The National Implementation Research Network. Retrieved October 24, 2011, from http://www.fpg.unc.edu/~nirn/resources/publications/NIRN_brief_1_2009.pdf
Fixsen, D. L., Naoom, S. F., Blase, K. A., Friedman, R. M., & Wallace, F. (2005). Implementation research: A synthesis of the literature. Tampa, FL: University of South Florida, Louis de la Parte Florida Mental Health Institute, The National Implementation Research Network. Retrieved on October 30, 2009, from http://cfs.fmhi.usf.edu/resources/publications/NIRN_Monograph_Full.pdf
Florida Center for Reading Research. (2007). Guidelines for reviewing a professional development program. Retrieved on October 30, 2009, from http://www.fcrr.org/FCRRReports/guides/gpdrrp.pdf
Florida Center for Reading Research. (2007). Guidelines for reviewing a reading program. Retrieved on October 30, 2009, from http://www.fcrr.org/FCRRReports/guides/grrp.pdf
Foorman, B. R., & Moats, L. C. (2004). Conditions for sustaining research-based practices in early reading instruction. Remedial and Special Education, 25(1), 51–60.
Fullan, M. (2001). Leading in a culture of change. San Francisco, CA: Jossey-Bass.
Georgia Department of Education. (2008). Response to intervention: The Georgia student achievement pyramid of interventions. Retrieved on October 30, 2009, from http://public.doe.k12.ga.us/DMGetDocument.aspx/Response%20to%20Intervention-%20%20The%20GA%20Student%20Achievement%20Pyramid%20of%20Interventions%20Sept%2024,%202008.pdf?p=6CC6799F8C1371F602EFD9AD5D5961F5ADF3144E74E105E01A0134927B8716A3&Type=D
Gersten, R., Chard, D., & Baker, S. (2000). Factors enhancing sustained use of research-based instructional practices. Journal of Learning Disabilities, 33(5), 445–457.
Gersten, R., Vaughn, S., Deshler, D., & Schiller, E. (1997). What we know about using research findings: Implications for improving special education practice. Journal of Learning Disabilities, 30(5), 466–476.
Gersten, R., Vaughn, S., & Kim, A. (2004). Introduction: Special issue on sustainability. Remedial and Special Education, 25(1), 3–4.
Getting Results. (n.d.). About fidelity of program implementation. Retrieved on October 30, 2009, from http://www.gettingresults.org/c/@wEUVvNSAJckIQ/Pages/[email protected]
Greenwood, C. R., Tapia, Y., Abbott, M., & Walton, C. (2003). A building-based case study of evidence-based practices: Implementation, reading, behavior, and growth in reading fluency, K–4. The Journal of Special Education, 37(2), 95–110.
Gottfredson, G. D., Gottfredson D. C., & Czeh, E. R. (2000). National study of delinquency prevention in schools. Marriottsville, MD: Gottfredson Associates, Inc.
Gresham, F. M., Gansle, K. A., & Noell, G. H. (1993). Treatment integrity in applied behavior analysis with children. Journal of Applied Behavior Analysis, 26(2), 257–263.
Gresham, F. M., MacMillan, D. L., Beebe-Frankenberger, M. E., & Bocian, K. M. (2000). Treatment integrity in learning disabilities intervention research: Do we really know how treatments are implemented? Learning Disabilities Research & Practice, 15(4), 198–205.
Gunn, B. (2009). Developing structures for improving the implementation of core, supplemental, and intervention programs. PowerPoint presentation retrieved on October 30, 2009, from http://www.nevadareading.org/resourcecenter/readingprograms.attachment/300169/Program_Implementation_Fidelity-Developing_Structures.ppt
Guskey, T. R. (2000). Evaluating professional development. Thousand Oaks, CA: Corwin.
Herman, J., & the Tennessee Intervention Group. (2006, June). Tennessee Reading First intervention guide. Retrieved on October 30, 2009, from http://www.tennessee.gov/education/readingfirst.doc/TNRFInterventionGuide.pdf
Holcombe, A., Wolery, M., & Snyder, E. (1994). Effects of two levels of procedural fidelity with constant time delay on children’s learning. Journal of Behavioral Education, 4(1), 49–73.
Horner, R., Sugai, G., & Roger, B. K. (n.d.). School-wide positive behavior support: Providing state-wide leadership. Retrieved on November 12, 2009, from http://www.pbis.org/common/pbisresources/presentations/robintro.ppt
Indiana Department of Education. (n.d.). Needs assessment worksheet. Retrieved on October 30, 2009, from www.doe.in.gov/sdfsc/pdf/N-Aworksheet.pdf
Johnson, E., Mellard, D. F., Fuchs, D., & McKnight, M. A. (2006). Responsiveness to Intervention (RTI): How to do it. Lawrence, KS: National Research Center on Learning Disabilities.
Joyce, B., & Showers, B. (2002). Student achievement through staff development. Retrieved on October 30, 2009, from http://literacy.kent.edu/coaching/information/Research/randd-engaged-joyce.pdf
Johnson, E., Mellard, D. F., Fuchs, D., & McKnight, M. A. (2006). Responsiveness to intervention (RTI): How to do it – Section 4: Fidelity of implementation. Lawrence, KS: National Research Center on Learning Disabilities. Retrieved on October 30, 2009, from http://www.rti4success.org/images/stories/RTIManual/rtimanualsection4fidelityofimplementation.pdf
Kendall, P. C., Gosch, E., Furr, J. M., & Sood, E. (2008). Flexibility within fidelity. Journal of the American Academy of Child & Adolescent Psychiatry, 47(9), 987–993.
Klingner, J. K., Vaughn, S., Hughes, M. T., & Arguelles, M. E. (1999). Sustaining research-based practices in reading. Remedial and Special Education, 20(5), 263–274.
Lane, K. L., Bocian, K. M., MacMillan, D. L., & Gresham, F. M. (2004). Treatment integrity: An essential—but often forgotten—component of school-based interventions. Preventing School Failure, 48(3), 36–43.
Lillehoj, C. J., Griffin, K. W., & Spoth, R. (2004). Program provider and observer ratings of school-based preventive intervention implementation: Agreement and relation to youth outcomes. Ames, IA: Institute for Social and Behavioral Research, Iowa State University.
Massachusetts Department of Elementary and Secondary Education. (2008, December). Rubric for evaluating math intervention materials. Retrieved on November 2, 2009, from http://www.doe.mass.edu/frameworks/summit/TImodel_rubric.pdf
McIntosh, K., & Krugly, A. (2009, October). Sustaining school-wide PBS: The principal’s perspective. PowerPoint presentation at the PBIS Forum, October 9, 2009. Retrieved on November 9, 2009, from www.cenmi.org/Portals/3/…/McIntosh_Sustainability_handout.pdf
Montana Office of Public Instruction. (n.d.). Fidelity. Retrieved on October 30, 2009, from http://opi.mt.gov/Pub/RTI/EssentialComponents/Fidelity/Reading/Resources/Fidelity.pdf
Mowbray, C. T., Holter, M. C., Teague, G. B., & Bybee, D. (2003). Fidelity criteria: Development, measurement, and validation. American Journal of Evaluation, 24(3), 315–340.
National Center on Response to Intervention. (n.d.). Fidelity: Three dimensions. Retrieved on October 30, 2009, from http://www.rti4success.org/index.php?option=com_content&task=view&id=735&Itemid=2
Noell, G. H., Gresham, F. M., & Gansle, K. A. (2002). Does treatment integrity matter? A preliminary investigation of instructional implementation and mathematics performance. Journal of Behavioral Education, 11(1), 51–67.
Noell, G. H., Witt, J. C., Slider, N. J., Connell, J. E., Gatti, S. L., & Williams, K. L. (2005). Treatment implementation following behavioral consultation in schools: A comparison of three follow-up strategies. The School Psychology Review, 34(1), 87–106.
North Dakota Department of Public Instruction. (n.d.). Needs assessments. Retrieved on October 30, 2009, from www.dpi.state.nd.us/grants/needs.pdf
The Out-of-School Time Resource Center. (2009, January). The Out-of-School Time Resource Center survey toolkit. Retrieved on November 9, 2009, from http://www.sp2.upenn.edu/ostrc/research/…/OSTRCSurveyToolkit_000.pdf
The Out-of-School Time Resource Center. (n.d.). The OSTRC survey. Available by request from http://www.sp2.upenn.edu/ostrc/resources/prodeveval/index.html
Protheroe, N. (2008, October). The impact of fidelity of implementation in effective standards-based instruction. Principal, 88(1), 38–41.
Reynolds, S. (2007). Peer coaching: Building relationships and growing professionally. Retrieved on October 30, 2009, from http://www.pde.state.pa.us/able/lib/able/fieldnotes07/fn07peercoaching.pdf
Ringwalt, C. L., Ennett, S., Johnson, R., Rohrbach, L. A., Simmons-Rudolph, A., Vincus, A., & Thorne, J. (2003). Factors associated with fidelity to substance use prevention curriculum guides in the nation’s middle schools. Health Education & Behavior, 30(3), 375–391. Retrieved on October 30, 2009, from http://heb.sagepub.com/cgi/content/abstract/30/3/375
RTI Teaching Learning Connections. (n.d.). Fidelity of implementation. Retrieved on October 30, 2009, from http://rtitlc.ucf.edu/Awareness/documents/Fidelity_of_Implementation.pdf
Sherman, R., Dlott, M., Bamford, H., McGivern, J., & Cohn, M. (2003, August). Evaluating professional development resources: Selection and development criteria. Retrieved on October 30, 2009, from http://www.pro-net2000.org/CM/content_files/99.pdf
Siegfried, E., Osborn, J., Osborn, S., & Zoref, L. (1988). Reading Mastery V: Teacher’s guide. Chicago, IL: Science Research Associates.
Siegfried, E., Osborn, J., Osborn, S., & Zoref, L. (1988). Reading Mastery VI: Teacher’s guide. Chicago, IL: Science Research Associates.
Simmons, D. C., & Kame’enui, E. J. (2003, March). A consumer’s guide to evaluating a core reading program grades K–3: A critical elements analysis. Retrieved on November 2, 2009, from http://www.doe.virginia.gov/VDOE/Instruction/Reading/ConsumerGuideReading.pdf
Sloboda, Z., Stephens, P., Pyakuryal, A., Teasdale, B., Stephens, R. C., Hawthorne, R. D., Marquette, J., & Williams, J. E. (2008). Implementation fidelity: The experience of the adolescent substance abuse prevention study. Health Education Research, 24(3), 394–406.
Truancy and Dropout Prevention Program. (n.d.). Needs assessment. Retrieved on October 30, 2009, from http://www.fsu.edu/~truancy/needs.html
Truancy and Dropout Prevention Program. (n.d.). Research activities and outcomes. Retrieved on October 30, 2009, from http://www.fsu.edu/~truancy/research.html
U.S. Department of Education. (2009, April). American Recovery and Reinvestment Act of 2009: Using ARRA funds provided through Part B of the Individuals with Disabilities Education Act (IDEA) to drive school reform and improvement. Retrieved on October 30, 2009, from http://www.ed.gov/policy/gen/leg/recovery/guidance/idea-b-reform.pdf
U.S. Department of Education. (2009, April). Guidance: Funds for Part B of the Individuals with Disabilities Education Act made available under the American Recovery and Reinvestment Act of 2009. Retrieved on October 30, 2009, from http://www.ed.gov/policy/gen/leg/recovery/guidance/idea-b.pdf
U.S. Department of Education. (2009, April). Guidance: Funds under Title I, Part A of the Elementary and Secondary Education Act of 1965 made available under the American Recovery and Reinvestment Act of 2009. Retrieved on October 30, 2009, from http://www.ed.gov/policy/gen/leg/recovery/guidance/title-i.pdf
U.S. Department of Education. (1998). Implementing schoolwide programs: An idea book on planning. Retrieved on October 30, 2009, from http://www.ed.gov/pubs/Idea_Planning/Step_2.html
U.S. Department of Education, Office of Special Education Programs. (2005, June). School-wide evaluation tool (SET): Version 2.1. Retrieved on November 12, 2009, from http://www.pbis.org/common/pbisresources/tools/SET_v2.1.doc
U.S. Department of Education, Office of Special Education Programs. (n.d.). School-wide positive behavior supports Website. Accessed on November 12, 2009, at http://www.pbis.org/school/swpbs_for_beginners.aspx
Vaughn, S., Klingner, J., & Hughes, M. (2000). Sustainability of research-based practices. Exceptional Children, 66(2), 163–171.
Wallace, F., Blase, K., Fixsen, D., Naoom, S. (2008). Implementing the findings of research: Bridging the gap between knowledge and practice. Alexandria, VA: Educational Research Service.
Wilder, D. A., Atwell, J., & Wine, B. (2006). The effects of varying levels of treatment integrity on child compliance during treatment with a three-step prompting procedure. Journal of Applied Behavior Analysis, 39(3), 369–373.
Witt, J. C. (2009, September). What do we know about assessing and improving fidelity of RTI? PowerPoint presentation at the 2009 Fall Conference of the Student Support Team Association for Georgia Educators, Dublin, GA.
American Psychological Association. (2005). Policy statement on evidence-based practice in psychology. Washington, D.C.: Author. Retrieved on November 10, 2011, from https://www.apa.org/practice/resources/evidence/index
In this policy statement, the American Psychological Association outlines its position on evidence-based practice in psychology, the role of clinical expertise, and the implications for clinical practice.
Baker, S., Gersten, R., Dimino, J. A., & Griffiths, R. (2004). The sustained use of research-based instructional practice: A case study of peer-assisted learning strategies in mathematics. Remedial and Special Education, 25(1), 5–24.
In this article, the authors describe the results of a study of PALS implementation at the elementary school level. Their findings suggest that a number of factors, including high levels of professional development and the overall flexibility of the PALS strategy, contributed to a substantial degree of implementation fidelity among teachers and to significant improvement in student outcomes.
Center for American Progress, & Education Resource Strategies. (2009, April). Realigning resources for district transformation: Using American Recovery and Reinvestment Act funds to advance a strategic education reform agenda. Retrieved on November 2, 2009, from https://www.americanprogress.org/issues/education-k-12/reports/2009/04/22/5959/realigning-resources-for-district-transformation/
In this report, the progressive think tank assesses the state of education reform and suggests ways to deploy funds made available through the 2009 American Recovery and Reinvestment Act so that those monies advance student achievement.
Collier-Meek, M. A., Fallon, L. M., Sanetti, L. M. H., & Maggin, D. M. (2013). Focus on implementation: Assessing and promoting treatment fidelity. TEACHING Exceptional Children, 45(5), 52–59.
This informative overview of treatment fidelity, “the link between evidence-based interventions and changes in student performance,” includes a comprehensive definition of the subject, as well as notes and thoughts on the development of effective systems and procedures, ways to review student-outcome data, and effective methods of providing performance feedback, among much else.
Council for Exceptional Children. (2009, April). American Recovery and Reinvestment Act Questions and Answers: How the American Recovery and Reinvestment Act Impacts Special Education and Early Intervention. Retrieved on November 10, 2011, from http://www.cec.sped.org/Content/NavigationMenu/PolicyAdvocacy/CECPolicyResources/EconomicStimulus/ARRA_Q&A_Final_April_2009.pdf
The Council for Exceptional Children presents this Q&A on how the 2009 federal American Recovery and Reinvestment Act affects special education and early intervention.
De Fazio, C. M., Fain, A. C., & Duchaine, E. L. (2011). Using treatment integrity in the classroom to bring research and practice together. Beyond Behavior, Fall 2011.
This short article outlines the basics of treatment integrity in the implementation of interventions. Besides a definition of the key terms, the authors here discuss the relationship between treatment integrity and classroom outcomes, as well as ways in which treatment integrity data can be collected and utilized. Sample treatment integrity forms are included as reference points.
Dude, M., Duchnowski, A., & Clarke, S. (n.d.). Treatment integrity within applied research settings. PowerPoint presentation. Retrieved on November 2, 2009, from http://rtckids.fmhi.usf.edu/presentations.cfm
These PowerPoint slides take a look at the application of treatment integrity within applied research for students and their families. Included are notes on the benefits of treatment integrity, links between assessment and intervention, and possible barriers.
Greenwood, C. R., Terry, B., Arreaga-Mayer, C., & Finney, R. (1992). The classwide peer tutoring program: Implementation factors moderating students’ achievement. Journal of Applied Behavior Analysis, 25, 101–116. Retrieved on November 3, 2009, from https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1279659/
In this article, the authors outline the results of their research to assess how implementation in classwide peer tutoring programs affects student outcome. Their findings suggest that variation in implementation did indeed influence student response. A discussion of implications follows.
Griffith, A. K., Duppong Hurley, K., & Hagaman, J. L. (2009). Treatment integrity of literary interventions for students with emotional and/or behavioral disorders. Remedial and Special Education, 30(4), 245–255.
This paper examines some 44 studies published over three decades to assess the effect of treatment fidelity on the education of children with emotional or behavioral disorders. The authors find that treatment integrity data were reported in only about half of the studies. Directions for future research are included.
Guskey, T. R. (2002). Does it make a difference? Evaluating professional development. Educational Leadership, 59(6), 45–51. Retrieved on November 10, 2011, from https://www.ascd.org/el/articles/does-it-make-a-difference-evaluating-professional-development
Thomas Guskey outlines in five steps a method for improving a school’s professional development program. A working definition of evaluation, as well as a look at the critical levels of professional development evaluation, are included.
Harvard Family Research Project. (2004). Promoting quality through professional development: A framework for evaluation. Retrieved on November 2, 2009, from https://archive.globalfrp.org/publications-resources/browse-our-publications/promoting-quality-through-professional-development-a-framework-for-evaluation
This publication of Harvard’s Issues and Opportunities in Out-of-School Time Evaluation project examines the impact of evaluative approaches to OST programs. On hand is an overview of professional development, an answer to the question “why evaluate professional development initiatives,” and a model evaluation framework.
Hirschstein, M. K., Edstrom, L. V., Frey, K. S., Snell, J. L., & McKenzie, E. P. (2007). Walking the talk in bully prevention: Teacher implementation variables related to initial impact of the Steps to Respect program. School Psychology Review, 36, 3–21.
Here the authors discuss the results of their research into the effects of teacher implementation on the effectiveness of an anti-bullying program. Their findings indicate that fidelity of implementation had a positive effect on outcomes across a range of program components. The study design is detailed, and further implications are discussed.
Jones, H. A., Clarke, A. T., & Power, T. J. (2008). Expanding the concept of intervention integrity: A multidimensional model of participant engagement. In Balance, 23, 4–5.
This article explores the current definition of what constitutes implementation fidelity and suggests the feasibility of an expansive, multifaceted approach to improving student outcomes.
Killion, J. (2005/2006). Evaluating the impact of professional development in eight steps. Evaluation Exchange, 11(4). Retrieved on November 2, 2009, from https://archive.globalfrp.org/evaluation/the-evaluation-exchange/issue-archive/professional-development/evaluating-the-impact-of-professional-development-in-eight-steps
This eight-step outline for assessing the effect of professional development implementation can serve as an orderly and user-friendly guide for those engaged in the often-difficult and confounding process.
Kimpston, R. D. (1983, April). Curriculum fidelity and implementation tasks employed by teachers. Presentation at the Annual Meeting of the American Educational Research Association, Montreal, Canada.
This presentation highlights a study that attempts to delineate the gap between planned curricula and those actually delivered in classrooms. The results demonstrate that adaptation by teachers was common and that the significant variance between what districts prescribed and what was actually taught warranted further study.
King, R., & Torgesen, J. K. (2006). “Improving the effectiveness of reading instruction in one elementary school: A description of the process.” In P. Blaunstein & R. Lyon (Eds.), It Doesn’t Have to be This Way. Lanham, MD: Scarecrow Press, Inc.
The chapter presented here takes a detailed look at a “whole school change project that took place over six years” at an elementary school in Florida. In it, the authors enumerate the school’s efforts to increase the reading outcomes of the students through implementation and monitoring of an evidence-based reading program.
Kreider, H., & Bouffard, S. (2005/2006). A conversation with Thomas R. Guskey. Evaluation Exchange, 11(4). Retrieved on November 2, 2009, from https://archive.globalfrp.org/evaluation/the-evaluation-exchange/issue-archive/professional-development/a-conversation-with-thomas-r.-guskey
Dr. Thomas Guskey, a professor in the College of Education at the University of Kentucky and a noted expert on the subject, discusses his method of professional development evaluation.
Kretlow, A. G., & Blatz, S. L. (2011). The ABCs of evidence-based practice for teachers. TEACHING Exceptional Children, 43(5), 8–19.
This article lays out the case for the use of evidence-based practices and programs, especially in the special education classroom, where, the authors argue, they are most needed. Included here are thoughts on individual cases, as well as attempts to preemptively address questions that teachers might pose regarding evidence-based practices. A figure including some tips on how to navigate the What Works Clearinghouse Website is a useful addition.
Mitchem, K., Wells, D., & Wells, J. (2003). Using evaluation to ensure quality professional development in rural schools. Journal of Research in Rural Education, 18(2), 96–103.
Here the authors describe their proposal to increase the effectiveness of professional development practices in rural schools. The key, they believe, is an evaluative approach that helps to create a coherent program of professional development training.
Noell, G. H. (2008). “Research examining the relationship among consultation process, treatment integrity, and outcomes.” In W. P. Erchul & S. M. Sheridan (Eds.), Handbook of research in school consultation: Empirical foundations for the field (pp. 315–334). Mahwah, NJ: Erlbaum.
In this chapter, the author takes a look at how and to what extent treatment integrity affects student outcomes in mathematics. Findings indicate that teachers who completed a consultation on implementation fidelity saw an improvement in their students’ overall performance.
O’Donnell, C. L. (2008). Defining, conceptualizing, and measuring fidelity of implementation and its relationship to outcomes in K–12 curriculum intervention research. Review of Educational Research, 78(1), 33–84. Retrieved on November 2, 2009, from http://rer.sagepub.com/cgi/content/abstract/78/1/33
This review identifies a paucity of studies on evaluation methods for common interventions (particularly those related to core curricula). To remedy this situation, the paper clarifies the nature of curriculum intervention and examines evaluation criteria and fidelity measures.
Power, T. J., Blom-Hoffman, J., Clarke, A. T., Riley-Tillman, T. C., Kellerher, C., & Manz, P. (2005). Reconceptualizing intervention integrity: A partnership-based framework of three follow-up strategies. School Psychology Review, 34, 87–106.
Here the authors examine the efficacy of a tri-part follow-up program for ensuring implementation fidelity. Findings suggest that fidelity of implementation must be frequently reassessed and, if necessary, corrected.
Simmons, D. C., & Kame’enui, E. J. (2006). A consumer’s guide to evaluating a core reading program. Center on Teaching and Learning, College of Education, University of Oregon. Retrieved on August 22, 2022, from http://reading.uoregon.edu/cia/curricula/con_guide.php
This tool enumerates in clearly outlined questions—and across categories—the characteristics that should be present in a comprehensive core reading program for grades K–3.
Slavin, R. E. (2004). Built to last: Long-term maintenance of Success for All. Remedial and Special Education, 25(1), 61–66.
In this article, the author examines the success of the Success for All education reform program. Despite a downbeat take on school reform writ large (“The story of educational innovation over the long run is a depressing one.”), he finds that Success for All has been implemented effectively, and over the long term, by those schools that have engaged with it. He ends on a hopeful note about the future of the program and the possibility of its expansion to more schools.
Sterling-Turner, H. E., Watson, T. S., & Moore, J. W. (2002). The effects of direct training and treatment integrity on treatment outcomes in school consultation. School Psychology Quarterly, 17(1), 47–77.
This study examines the effects of the direct training of consultees on implementation fidelity. It supports a general conclusion that greater integrity yielded more positive learner outcomes. A number of specific cases are detailed and considered.
U.S. Department of Education. (2009, April). American Recovery and Reinvestment Act of 2009: IDEA recovery funds for services to children and youths with disabilities. Retrieved on November 2, 2009, from https://dpi.wi.gov/sites/default/files/imce/sped/pdf/aara-idea-factsheet.pdf
This publication of the U.S. Department of Education outlines uses of funds made available under the American Recovery and Reinvestment Act of 2009 for IDEA. Included are notes on relevant fiscal issues and accountability principles.
U.S. Department of Education. (2009, April). American Recovery and Reinvestment Act of 2009: Using ARRA funds to drive school reform and improvement. Retrieved on November 2, 2009, from https://ccip.ode.state.oh.us/documentlibrary/ViewDocument.aspx?DocumentKey=66235
This publication of the U.S. Department of Education outlines uses of funds made available under the American Recovery and Reinvestment Act of 2009 for the purpose of school reform and improvement. Included are notes on framing questions for decision making, examples of fund dispersal, and tips on establishing data systems.
The OSEP Technical Assistance Center on Positive Behavioral Interventions and Supports https://www.pbis.org/
Funded by the Office of Special Education Programs, this center provides schools with capacity-building information and technical assistance in order to help them to identify, adapt, and sustain effective school-wide disciplinary practices.
State Implementation and Scaling up Evidence-based Practices (SISEP) Center https://fpg.unc.edu/
This site, hosted by the University of North Carolina at Chapel Hill and funded in part by OSEP, acts as a resource to help states successfully implement and sustain the use of evidence-based practices in their schools. On hand is a host of resources, including notices about upcoming workshops, information briefs, and notes on Communities of Practice, among many others.