ED514190
Background/context:
Description of prior research, its intellectual context and its policy context.
In the present education policy environment a high priority has been placed on improving
teacher quality and teaching effectiveness in U.S. schools (Darling-Hammond et al., 2009;
Obama, 2009). Standards-based educational improvement requires teachers to have deep
knowledge of their subject and the pedagogy that is most effective for teaching the subject.
States and school districts are charged with establishing and leading professional development
programs, some with federal funding support, that address major needs for improved teacher
preparation. The issue of teacher quality, including teacher preparation, ongoing professional
development, and teacher effectiveness in classrooms, is at the heart of efforts to improve the
quality and performance of our public schools.
More recently, several major research synthesis projects have broadly analyzed evidence
on the effects of mathematics and science teacher preparation and development initiatives on
student achievement. One approach to reviewing evidence across studies is to apply a logic
model and to examine the effects of teacher preparation on student achievement through
intervening variables such as teacher knowledge and instructional practices (Clewell et
al., 2004; Ingvarson, Meiers & Beavis, 2005). This kind of full analytic model allows educators
and leaders to identify key decisions about the organization, delivery and support of teacher
development that contribute to positive outcomes.
State and local education agencies are responsible for directing and managing the use of
federal funds for teacher development and improvement as well as guiding programs supported
by states. Additionally, states are now required under NCLB to report on the qualifications of
teachers in core academic subjects and the proportion of teachers that receive high quality
professional development each year. Finally, states provide leadership for local systems on how
to design, select, and implement professional development for teachers. Strong, research-based
program designs, and evidence on their effects, are now in high demand across the U.S.
States, and in turn local districts, seek models for designing and implementing effective
professional development, particularly models supported by research evidence.
The intended audiences for the study’s findings are education leaders, decision-makers
and researchers. The study design builds on prior research and reporting on professional
development programs and evaluation findings (Blank, de las Alas & Smith, 2007, 2008). The
study was designed to measure and summarize consistent, systematic findings across multiple
studies that show significant effects of teacher professional development on student achievement
gains in K-12 mathematics or science.
The meta analysis study focused on identifying and analyzing research studies that
measured effects of teacher professional development with a content focus on math or science.
The meta analysis was carried out to address two primary questions:
The study took place in the United States over a period of two years from 2006 to 2008,
with analysis extended to the first part of 2009.
Across all the studies reviewed, the focus was on teachers in public elementary and
secondary schools teaching math or science at one or more grades K-12 and teachers who
participated in a professional development program aimed at improving their teaching in math or
science.
The meta analysis identified 16 studies of programs that had significant effect sizes and
provided teachers with professional development in mathematics or science. The information
available on program interventions indicated they included combinations of learning activities
such as summer institutes, coursework, study group, classroom mentoring, and professional
networking. Eight of the programs also offered teachers opportunities to put newly learned
lessons from the professional development into practice by leading classroom instruction, and
seven of the programs brought teachers in to observe a classroom with either an exemplary
teacher modeling instruction or a peer teacher implementing lessons learned during the
professional development.
More details about program characteristics are available in Table 2. <Insert Table 2 here.>
Research Design:
Description of research design (e.g., qualitative case study, quasi-experimental design, secondary analysis, analytic
essay, randomized field trial).
Meta analysis
The design for the meta analysis built on prior studies in education (Borman et al., 2002;
Yoon et al., 2007; Lipsey & Wilson, 2001) and applied their methods to findings about
professional development across states and districts.
The design had four steps:
The design for the meta-analysis was also informed by a review of findings on teacher
development programs conducted by the American Institutes for Research (Yoon et al., 2007).
Figures 1 and 2 illustrate the process in more detail. <Insert Figures 1, 2>. In particular, the
meta-analysis study design centered on two areas: capturing the characteristics of the
professional development programs discussed in the studies and documenting the resulting
measurable student outcomes the studies attribute to the professional development programs.
The search process for potential studies included published and unpublished works as
well as evaluation reports from funded state and federal professional development projects. The
study authors conducted an intensive electronic search, using multiple and well-known databases
and meta-databases. In addition, searches were conducted targeting certain periodicals in which
evaluation studies of professional development programs would be featured. Publications and
databases of major education research centers were also examined. Moreover, the study authors
contacted principal investigators listed by program grants from the U.S. Department of
Education Title II-B project evaluations and the research studies funded by the Institute of
Education Sciences, the NSF Teacher Preparation Continuum and MSP project evaluations, and
studies of the Local Systemic Initiatives. Lastly, cross-checks were carried out with findings
from prior reviews and synthesis studies in teacher professional development. Four hundred
sixteen studies or reports were identified for pre-screening. A review of the corresponding
abstracts of those studies reduced the count to 74 studies. These remaining studies were
screened by a team of trained coders who used a coding form and a coding-and-reconciliation
software program developed by AIR (Yoon et al., 2007). Figure xx
outlines the document review process and the resulting studies included in the meta analysis.
<Insert Figure xx>. Meta-regression analyses were conducted with the remaining sixteen
studies, with the focus on studies that featured professional development in mathematics, since
these studies produced a greater number of effect sizes than professional development in
science.
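The pooling and meta-regression steps described above can be sketched with inverse-variance weighting. A minimal sketch follows; the effect sizes, variances, and contact-hour values are hypothetical illustrations, not values drawn from the studies reviewed.

```python
import math

def weighted_mean_effect(d, var):
    """Fixed-effect pooled effect size: inverse-variance weighted mean.

    d   -- list of study effect sizes (e.g., Cohen's d)
    var -- list of their sampling variances
    Returns (pooled effect, standard error of the pooled effect).
    """
    w = [1.0 / v for v in var]
    pooled = sum(wi * di for wi, di in zip(w, d)) / sum(w)
    se = math.sqrt(1.0 / sum(w))
    return pooled, se

def meta_regression_slope(d, var, x):
    """Weighted least-squares slope of effect sizes on one moderator,
    a minimal stand-in for the meta-regression step."""
    w = [1.0 / v for v in var]
    sw = sum(w)
    xbar = sum(wi * xi for wi, xi in zip(w, x)) / sw
    dbar = sum(wi * di for wi, di in zip(w, d)) / sw
    cov = sum(wi * (xi - xbar) * (di - dbar) for wi, xi, di in zip(w, x, d))
    varx = sum(wi * (xi - xbar) ** 2 for wi, xi in zip(w, x))
    return cov / varx

# Hypothetical study-level inputs: effect sizes, variances, PD contact hours
effects = [0.11, 0.64, 0.20]
variances = [0.04, 0.09, 0.05]
hours = [30, 80, 45]
pooled, se = weighted_mean_effect(effects, variances)
slope = meta_regression_slope(effects, variances, hours)
```

Studies with smaller sampling variances receive larger weights, so more precise studies dominate the pooled estimate; the slope asks whether, across studies, a moderator such as contact hours goes with larger effects.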
Findings / Results:
Description of main findings with specific details.
Conclusions:
Description of conclusions and recommendations based on findings and overall study.
Based on the results of the meta analysis of findings from teacher professional
development studies, several recommendations can be made about meta analysis methods
and their use by researchers, evaluators and education leaders.
This meta analysis review did not include systematic identification or review of
intervening measures of the professional development treatment, such as measures of gains in
teacher knowledge, improvement in practices, or fidelity of implementation of what was learned.
Several of the studies identified did report analysis of differences on these kinds of measures
between teachers in the treatment and control groups. Further analysis across studies would
provide stronger evidence and useful information about the relationship between professional
learning of teachers from a specific initiative and subsequent improved learning by students.
Appendix A. References
References marked with an asterisk (*) indicate studies included in the meta analysis.
Ball, D. L., & Bass, H. (2000). Interweaving content and pedagogy in teaching and learning to
teach: Knowing and using mathematics. In J. Boaler (ed.), Multiple perspectives on the
teaching and learning of mathematics. (pp. 83–104). Westport, CT: Ablex.
Banilower, E. R., Boyd, S. E., Pasley, J. D., & Weiss, I. R. (2006, February). Lessons from a
decade of mathematics and science reform: A capstone report for the Local Systemic
Change through Teacher Enhancement Initiative. Retrieved from Horizon Research, Inc.
website: http://www.pdmathsci.net/reports/capstone.pdf
Birman, B. F., & Porter, A. C. (2002). Evaluating the effectiveness of education funding streams.
Peabody Journal of Education, 77(4), 59–85.
Birman, B. F., Le Floch, K. C., Klekotka, A., Ludwig, M., Taylor, J., Walters, J., et al. (2007).
State and local implementation of the No Child Left Behind Act: Volume II — teacher
quality under NCLB: Interim report. Washington DC: U.S. Department of Education,
Office of Planning, Evaluation and Development, Policy and Program Studies Service.
Blank, R. K., de las Alas, N., & Smith, C. (2007, February). Analysis of the quality of
professional development programs for mathematics and science teachers: Findings from
a cross-state study. Retrieved from Council of Chief State School Officers website:
http://www.ccsso.org/content/pdfs/year%202%20new%20final%20NSF%20Impde%20F
all%2006%20%20Report%20-032307.pdf
Blank, R. K., de las Alas, N., & Smith, C. (2008, February). Does teacher professional
development have effects on teaching and learning? Evaluation findings from programs
in 14 states. Retrieved from Council of Chief State School Officers website:
http://www.ccsso.org/content/pdfs/cross-state_study_rpt_final.pdf
Borko, H. (2004, November). Professional development and teacher learning: Mapping the
terrain. Educational Researcher, 33(8), 3-15.
Borman, G. D., Hewes, G. M., Overman, L. T., & Brown, S. (2002, November). Comprehensive
school reform and student achievement: A meta-analysis (Report No. 59). Baltimore,
MD: Center for Research on the Education of Students Placed At Risk, Johns Hopkins
University.
Carey, K. (2004, Winter). The real value of teachers: Using new information about teacher
effectiveness to close the achievement gap. Thinking K-16, 8(1), 3–41.
Chambers, J. G., Lam, I., & Mahitivanichcha, K. (2008, September). Examining context and
challenges in measuring investment in professional development: a case study of six
school districts in the southwest region. (Issues & Answers Report, REL2008-No. 037).
Retrieved from U.S. Department of Education, Institute of Education Sciences website:
http://ies.ed.gov/ncee/edlabs/regions/southwest/pdf/REL_2008037.pdf
Choy, S. P., Chen, X., & Bugarin, R. (2006, January). Teacher professional development in
1999-2000: What teachers, principals, and district staff report. (NCES 2006-305).
Washington, DC: National Center for Education Statistics.
Clewell, B. C., Cosentino de Cohen, C., Campbell, P. B., Perlman, L., Deterding, N., Manes, S.,
et al. (2004, December). Review of evaluation studies of mathematics and science
curricula and professional development models. Report submitted to the GE Foundation.
Unpublished manuscript.
Cohen, D. K., & Hill, H. C. (1998). Instructional policy and classroom performance: The
mathematics reform in California (RR-39). Retrieved from Consortium for Policy
Research in Education website: http://www.cpre.org/Publications/rr39.pdf
Corcoran, T. B. (2007). Teaching matters: How state and local policymakers can improve the
quality of teachers and teaching. (CPRE Policy Briefs RB-48). Philadelphia, PA:
Consortium for Policy Research in Education, University of Pennsylvania.
Corcoran, T., & Foley, E. (2003). The promise and challenge of evaluating systemic reform in an
urban district. Research perspectives on school reform: Lessons from the Annenberg
Challenge. Providence, RI: Annenberg Institute at Brown University.
Council of Chief State School Officers. (2006). Improving evaluation of teacher professional
development in math and science, year 1 project report. Washington, DC: Author.
Darling-Hammond, L. (1999). Teacher quality and student achievement: A review of state policy
evidence. Retrieved from Center for the Study of Teaching and Policy website:
http://depts.washington.edu/ctpmail/PDFs/LDH_1999.pdf
Desimone, L. M., Porter, A. C., Garet, M. S., Yoon, K. S., & Birman, B. F. (2002). Effects of
professional development on teachers’ instruction: Results from a three-year longitudinal
study. Educational Evaluation and Policy Analysis, 24(2), 81–112.
Frechtling, J. (2001). What evaluation tells us about professional development programs in math
and science. In C. R. Nesbit, J. D. Wallace, D. K. Pugalee, A.-C. Miller, & W. J. DiBiase
(Eds.), Developing Teacher Leaders: Professional Development in Science and
Mathematics (pp. 17–42). Columbus, OH: ERIC Clearinghouse for Science Mathematics,
and Environmental Education.
Garet, M. S., Birman, B. F., Porter, A. C., Desimone, L., Herman, R. & Yoon, K. S. (1999).
Designing effective professional development: Lessons from the Eisenhower program
and technical appendices (Report No. ED/OUS99-3). Washington, DC: American
Institutes for Research.
Garet, M. S., Porter, A. C., Desimone, L., Birman, B. F., & Yoon, K. S. (2001). What makes
professional development effective? Results from a national sample of teachers.
American Educational Research Journal, 38(4), 915–945.
Grant, S. G., Peterson, P. L., & Shojgreen-Downer, A. (1996). Learning to teach
mathematics in the context of systemic reform. American Educational Research
Journal, 33(2), 509–541.
Guskey, T. R. (2003, June). What makes professional development effective? Phi Delta Kappan,
84(10), 748–750.
Harris, D. N., & Sass, T. R. (2007, March). Teacher training, teacher quality and student
achievement. (Working Paper 3). Retrieved from National Center for Analysis of
Longitudinal Data in Education Research website:
http://www.caldercenter.org/PDF/1001059_Teacher_Training.pdf
*Heller, J. I., Curtis, D. A., Rabe-Hesketh, S., Clarke, C., & Verbencoeur, C. J. (2007, August
29). The effects of "Math Pathways and Pitfalls" on students' mathematics achievement:
National Science Foundation final report. Retrieved from ERIC database. (ED498258)
Hiebert, J. (1999, January). Relationships between research and the NCTM standards. Journal
for Research in Mathematics Education, 30(1), 3–19.
Hill, H. C., Schilling, S. G., & Ball, D. L. (2004, September). Developing measures of teachers’
mathematics knowledge for teaching. Elementary School Journal, 105(1), 11.
Ingvarson, L., Meiers, M. & Beavis, A. (2005, January 29). Factors affecting the impact of
professional development programs on teachers’ knowledge, practice, student outcomes
& efficacy. Education Policy Analysis Archives, 13(10). Retrieved from
http://epaa.asu.edu/epaa/
Kennedy, M. (1998). Form and substance in inservice teacher education (Research Monograph
No. 13). Madison, WI: University of Wisconsin-Madison, National Institute for Science
Education.
Lipsey, M. W., & Wilson, D. B. (2001). Practical meta analysis. Applied Social Research
Methods Series (Vol. 49). Thousand Oaks, CA: Sage.
Loucks-Horsley, S., Hewson, P., Love, N., & Stiles, K. E. (1998). Designing professional
development for teachers of science and mathematics. Thousand Oaks, CA: Corwin
Press.
*META Associates. (2006, March). Northeast Front Range math/science partnership (MSP) to
increase teacher competence in content. Year 2 evaluation report: January 1, 2005–
December 31, 2005. Golden, CO: Author.
*META Associates. (2007, March). Northeast Front Range math/science partnership (MSP) to
increase teacher competence in content. Final evaluation report: January 1, 2004–
December 31, 2006. Golden, CO: Author.
*Meyer, S. J., & Sutton, J. T. (2006, October). Linking teacher characteristics to student
mathematics outcomes: Preliminary evidence of impact on teachers and students after
participation in the first year of the Math in the Middle Institute Partnership. Paper
presented at the MSP Evaluation Summit II, Minneapolis, MN.
National Center for Education Statistics. (n.d.). Statewide longitudinal data systems grant
program: Grantee state [Website]. Retrieved from
http://nces.ed.gov/Programs/SLDS/stateinfo.asp
National Commission on Teaching & America’s Future (1996). What matters most: Teaching for
America’s future. New York: Author.
*Niess, M. L. (2005). Oregon ESEA Title IIB MSP: Central Oregon consortium. Report to the
U.S. Department of Education, Mathematics and Science Partnerships. Corvallis, OR:
Department of Science & Mathematics Education, Oregon State University.
Obama, B. (2009, March 10). Taking on education [Web log message]. Retrieved from
http://www.whitehouse.gov/blog/09/03/10/Taking-on-Education/
O’Reilly, F. E., & Weiss, C. H. (2006, April). Opening the black box: Using theory-based
evaluation to understand professional development for k-12 teachers of math and
science. Paper presented at the Annual Meeting of the American Educational Research
Association, San Francisco, CA.
*Palmer, E. A., & Nelson, R. W. (2006, September). Researchers in every classroom. Evaluation
report, 2005-06. Barnes, WI: ASPEN Associates.
*Rubin, R. L., & Norman, J. T. (1992). Systematic modeling versus the learning cycle:
Comparative effects of integrated science process skill achievement. Journal of Research
in Science Teaching, 29, 715–727.
Scher, L. S., & O’Reilly, F. E. (2007, March). Understanding professional development for k-12
teachers of math and science: A meta-analysis. Paper presented at the Annual Meeting of
the American Educational Research Association, Chicago, IL.
Showers, B., Joyce, B. & Bennett, B. (1987). Synthesis of research on staff development: A
framework for future study and state-of-the-art analysis. Educational Leadership, 45(3),
77-87.
*Siegle, D., & McCoach, D. (2007). Increasing student mathematics self-efficacy through
teacher training. The Journal of Secondary Gifted Education, 18(2), 278–331.
Supovitz, J. A. (2003). Evidence of the influence of the National Science Education Standards on
the professional development system. In K. S. Hollweg & D. Hill (Eds.), What is the
influence of the National Science Standards? (pp. 64–75). Washington, DC: National
Academy Press.
Wayne, A. J., Yoon, K. S., Zhu, P., Cronen, S., & Garet, M. S. (2008, November).
Experimenting with teacher professional development: Motives and methods.
Educational Researcher, 37(8), 469–479.
Weiss, I. R., Banilower, E. R., McMahon, K. C., & Smith, P. S. (2001). Report of the 2000
national survey of science and mathematics education. Retrieved from Horizon Research,
Inc. website: http://2000survey.horizon-research.com/reports/status/complete.pdf
Wilson, S. M., & Berne, J. (1999). Teacher learning and the acquisition of professional
knowledge: An examination of research on contemporary professional development.
Review of Research in Education, 24, 173–209.
Yoon, K. S., Duncan, T., Lee, S. W.-Y., Scarloss, B., & Shapley, K. (2007). Reviewing the
evidence on how teacher professional development affects student achievement (Issues &
Answers Report, REL 2007-No. 033). Retrieved from Institute of Education Sciences
website: http://ies.ed.gov/ncee/edlabs/regions/southwest/pdf/REL_2007033.pdf
Figure 1 (logic model): High Quality PD (Content-focused; Active Learning; Coherence;
Duration/Frequency; Collaborative Participation) → Teacher Instructional Knowledge &
Practices/Skills → Effects on Students (Measures of Achievement; Cohorts over Time;
Student Unit Records; Linked to Teachers)
Figure 2. Steps in Study Design
Identification and collection of potential studies
• Prior and ongoing literature reviews and meta-analyses
• Electronic search through databases, meta-databases & use of search terms
• Journal search
• Research centers
• NSF & US ED recently completed studies
Pre-screening
Data analysis
• Test of homogeneity
• Descriptive statistics
• Correlations between professional development programs and student outcome effect sizes
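The test of homogeneity among the data analysis steps is commonly Cochran's Q; a minimal sketch follows, with hypothetical effect sizes and variances (not values from the reviewed studies).

```python
def q_homogeneity(d, var):
    """Cochran's Q statistic for homogeneity of effect sizes.

    Under the null hypothesis of one common effect, Q follows a
    chi-square distribution with k - 1 degrees of freedom.
    Returns (Q, df).
    """
    w = [1.0 / v for v in var]
    pooled = sum(wi * di for wi, di in zip(w, d)) / sum(w)
    Q = sum(wi * (di - pooled) ** 2 for wi, di in zip(w, d))
    return Q, len(d) - 1

# Hypothetical effect sizes and sampling variances for three studies
Q, df = q_homogeneity([0.11, 0.64, 0.20], [0.04, 0.09, 0.05])
# Compare Q against the chi-square critical value (5.99 at p = .05, df = 2):
# a Q below the critical value is consistent with one common underlying effect.
```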
† For Cohen's d, an ES from 0.0 to 0.3 is a "small" effect, > 0.3 and < 0.8 a "medium" effect, and ≥ 0.8 a "large" effect.
Table 1 – continued
Study | Study Design | Grade/School Level; Content Area | Treatment Teachers N (All Teachers) | Median Effect Size | Number of Effects | Student Outcome Measure
Palmer & Nelson, 2006* | QED | Gr. 5-10; Science | 16 (43) | .11 | 5 | Northwest Evaluation Association assessments, pre-posttest gain
Rubin & Norman, 1992 | RCT | Middle; Science | 7 (16) | .64 | 8 | Middle Grades Integrated Process Skill Test, pre-posttest
 | | | | .12 | | Group Assessment of Logical Thinking Test, pre-posttest
Saxe, Gearhart, & Nasir, 2001 | QED | Elementary; Math | 17 (6) | xx | | Study-specific assessments (Computational Scale)
 | | | | 1.63 | 6 | Study-specific assessments (Conceptual Scale), posttest
Scott, 2005 | QED | Gr. 3; Science | 3 (6) | .20 | 2 | Iowa Test of Basic Skills, pre-posttest gain
Siegle & McCoach, 2007 | RCT | Gr. 5; Math | 7 (15) | .20 | 2 | Math Achievement Test
Snippe, 1992 | RCT | High; Math | 87 (198) | -.01 | 21 | Terra Nova
 | | | | .20 | | ACCUPLACER
 | | | | .06 | | WorkKeys
Walsh-Cavazos, 1994 | QED | Gr. 5; Math | 4 (6) | .26 | 2 | PSG Achievement Assessment, pre-posttest gain
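The effect sizes in Table 1 are standardized mean differences. As a short sketch of Cohen's d and the small/medium/large bands from the table footnote (the summary statistics below are hypothetical, not from any study in the table):

```python
import math

def cohens_d(mean_t, mean_c, sd_t, sd_c, n_t, n_c):
    """Cohen's d: treatment-control mean difference over the pooled SD."""
    pooled_sd = math.sqrt(((n_t - 1) * sd_t ** 2 + (n_c - 1) * sd_c ** 2)
                          / (n_t + n_c - 2))
    return (mean_t - mean_c) / pooled_sd

def band(d):
    """Small/medium/large bands as given in the Table 1 footnote."""
    d = abs(d)
    if d >= 0.8:
        return "large"
    if d > 0.3:
        return "medium"
    return "small"

# Hypothetical posttest summaries: treatment 78 (SD 10, n 25), control 74 (SD 10, n 25)
d = cohens_d(78.0, 74.0, 10.0, 10.0, 25, 25)   # = 0.4, a "medium" effect
```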
Table 2: Professional Development Features
Table 4: Correlation Table of Math Post-Only Professional Development Design Elements
1 2 3 4 5 6 7 8 9 10 11 12 13
Time
1. Contact Hr. 1
2. Frequency .741** 1
3. Duration .834** .623** 1
PD Activities
4. Summer Institutes .577** .399** .655** 1
5. College Courses .744** -.171 .596** .618** 1
6. Conferences -.196 .094 .146 -.403** -.249* 1
7. Study Group -.694** -.253 -.602** -.524** -.369** .287* 1
Active Learning
8. Lead Discussion -.196 .094 .146 -.403** -.249* 1.000** .287* 1
9. Learning Network -.657** .048 -.601** -.351** -.471** .249* .796** .249* 1
10. Develop Assessments -.138 .398** .135 .345** -.249* -.172 .021 -.172 .155 1
11. Observe Teachers -.154 .562* .084 .418** -.360** -.249* -.298* -.249* -.093 .692** 1
12. Classroom Mentoring -.421** -.571** -.742** -.394** -.028 -.347** .579** -.347** .502** -.347** -.502** 1
Coherence
13. Link to curriculum, goals .043 -.161 .106 -.406** -.244* .221 .163 .221 -.158 -.080 -.324** -.059 1
Two-tail test: * significant at p<.05; ** significant at p<.01
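The two-tailed significance stars in Table 4 come from testing each Pearson correlation against zero. A minimal sketch with hypothetical paired measures (the t statistic is compared with the t distribution on n - 2 degrees of freedom):

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation of two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

def t_statistic(r, n):
    """t statistic for H0: rho = 0; two-tailed test against t(n - 2)."""
    return r * math.sqrt((n - 2) / (1.0 - r ** 2))

# Hypothetical paired design-element measures for five programs
hours = [10, 20, 30, 40, 50]
duration = [12, 18, 33, 38, 52]
r = pearson_r(hours, duration)
t = t_statistic(r, len(hours))
# With n = 5 (df = 3), |t| > 3.182 corresponds to p < .05 two-tailed
```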