Data-Driven Decision Making and Dewey's Science of Education
Natalie Schelling, Lance E. Mason
Education and Culture, Volume 37, Number 1, 2021, pp. 41-59 (Article)
Abstract
This paper considers elementary teachers’ perceptions of data-driven
decision making (DDDM) through the lens of Dewey’s science of
education. Dewey argues that educational science can be used to
improve teaching and learning, but it must be flexible, attendant
to multiple dimensions of growth, and grounded in real classroom
experiences with teachers at the helm. Results indicate that teachers
have a vision of classroom assessments consistent with Dewey’s vision,
but often feel inhibited by the constraints of standardized assessments.
Keywords: DDDM, science, assessment, standardization
Introduction
This paper uses results from a qualitative study to consider how data can be used
for elementary teachers’ instructional planning in a manner consistent with
Dewey’s vision for a science of education. Dewey1 argues that science can be used
to make human endeavors, including education, more intelligent and socially
useful. This is ostensibly consistent with the current conception of data-driven
decision making (DDDM), the systematic use of student data to inform decisions in
educational settings.2 However, Dewey3 also cautions against the oversimplification
of using physical science as the model for human understanding, stating, “When
we introduce a like simplification into social and moral subjects we eliminate
the distinctively human factors.” Dewey4 argues that a “science of education” is
particularly complex and multifaceted, as many past and present factors influence
whatever conclusions can be derived (19). He aims at a human science that is flexible,
responsive, and recognizes both linear and nonlinear dimensions of educational,
social, and moral growth.
The contemporary American education system has veered far from Dewey’s
vision. Today’s prevailing educational science emphasizes teachers’ data use as
evidence of their instructional effectiveness, but is often based upon the kind
of oversimplified physical-science model that Dewey cautioned against.
Background
Data-Driven Decision Making
Definition and Purpose
The emphasis on data-driven decision making (DDDM) is largely the result of
No Child Left Behind (NCLB) legislation in the United States,5 which emphasized measurable outcomes of
education, prioritizing the use of standardized tests and their resulting data to inform
educational decisions. Following NCLB, many individual states adopted systems of
standardized assessment for students and of evaluations for teachers that emphasize
using assessment data to plan instruction. In many states, the results of standardized
assessments and evaluations of teacher effectiveness impact the funding allocated to
schools and districts. Indiana adopted the Indiana Statewide Testing for Educational
Progress (ISTEP) standardized assessment system and the RISE evaluation of teacher
effectiveness, which includes the use of DDDM within the classroom.6
Method
Participants
Given the influence of state policy on DDDM and teachers’ practices, the current
study focused on one state, Indiana. As a result of NCLB, Indiana developed a state
assessment system, ISTEP, and a teacher evaluation system, RISE, which empha-
sizes assessment practice and DDDM.29 A total of nine elementary school teachers
participated in three focus groups. The teachers were purposefully sampled from
elementary schools in Indiana to represent various (1) grade levels; (2) teaching
experience; (3) school SES; (4) district locations (see Table 1). The percentage of
students receiving free or reduced lunch was used as an indicator of school SES.30
Geographical locations were identified using the National Center for Education
Statistics classifications31 (see Table 1 for classification descriptions).
Procedures
Convenience sampling was used to identify participants for the first focus group.
A university practicum coordinator made recommendations for other districts to
represent different locations and school SES. After the university practicum coor-
dinator helped make initial contact with the district, convenience sampling was
used to identify participants. Focus groups were conducted at times and locations
convenient for the participating teachers.
Table 1 (excerpt). Participant characteristics

Teacher                       Grade   Years Teaching   % Free/Reduced Lunch   Locale*
Leslie (Reading specialist)   3–5     4                60                     Town: Distant
Annie                         5       16               49                     Town: Distant
Laura                         4       7                60                     Town: Distant

*Town: Distant—Territory inside an urban cluster that is more than 10 miles and less than
or equal to 35 miles from an urbanized area; City: Small—Territory inside an urbanized
area and inside a principal city with population less than 100,000; Suburb: Large—Territory
outside a principal city and inside an urbanized area with population of 250,000 or more
Measures
A structured interview protocol was used to guide the focus group conversation.
Interview questions were created based on the guidelines suggested by Fishbein and
Ajzen32 to address constructs within the theory of planned behavior (TPB). The current study primarily analyzed
four main questions:
1. What do you think about conducting DDDM?
2. What individuals or groups approve/disapprove of you con-
ducting DDDM?
3. Do people who are important to you think you should conduct DDDM?
4. Is conducting DDDM something you chose to do?
Analysis
Focus groups were audio recorded, then transcribed using the software Temi.
Researchers used deductive coding to categorize responses according to the research
constructs (i.e., social norms and autonomy). After data analysis, member checks
were conducted.34
Results
Attitudes
Teachers were asked about their instrumental attitudes, that is, their thoughts about the value of DDDM.
In general, teachers expressed positive instrumental attitudes. They thought that
conducting DDDM was helpful and important to give their instruction further
direction and target interventions for struggling students. Teachers discussed using
DDDM as a basis to define attainable goals for students’ differing needs. In partic-
ular, teachers thought that DDDM was helpful when assessment data reaffirmed
their practice or showed students’ growth. Noting mounting pressure to “justify”
or “prove” the effectiveness of their practices to stakeholders such as administrators
and parents, teachers thought it was beneficial to have tangible data that supported
the efficacy of their teaching. Katie reflected positively on the benefits she perceived
from using DDDM in her own classroom:
If you wanted to say, [a student might] have a learning disability, you want
test data and interventions that have data [that] will back that up. It gives you
good feedback if you’re doing your job or not. It helps with the grouping.
It helps you get a real pulse on how the class is doing collectively and indi-
vidually. It gives you support and it gives something that’s not subjective,
so you can look, not that it’s a comparison contest, but my team, we look
at each other’s data. Like in our meetings we’ll literally say this person,
their kids score in the nineties, like sometimes they’ll get like a hundred
percent and they’ll be like, what did you do? And then it’ll be a very helpful
conversation for us as teachers. So there’s benefits professionally and then
there’s benefits for the kids for individualized instruction and then there’s
benefits for the parents too.
Like Katie, most teachers discussed benefits not only to their own practice, which
benefited their students, but also for stakeholders to understand what and how
they were teaching, particularly given the high stakes associated with teaching
evaluations.
Across the focus groups, teachers discussed the limitations of standardized assess-
ment. Teachers reflected on the inability of standardized assessment to capture the
holistic view of the student, which is more attainable with classroom assessment.
Teachers emphasized that standardized assessment, and the analysis of the data,
occurred within the context of the already busy classroom, where they are teaching
but also attending to students’ other needs.
Social Norms
Participating teachers were asked to identify individuals who influenced their
DDDM practices, particularly who expected them to conduct DDDM. In each focus
group, participants indicated that principals and district administrators greatly
emphasized the use of DDDM. This was especially important because administrators,
particularly principals, played a large role in implementing the RISE
evaluation. Teachers also acknowledged that administrators, attending to the stakes
associated with standardized assessment data, were under pressure themselves.
This pressure, both on teachers and their students, created a negative culture of
assessment. Teachers with principals who focused on results, rather than progress,
were more hesitant to discuss data with colleagues and generally less confident
in their DDDM practices. In addition, the high-stakes teaching evaluation pro-
cess, which emphasizes DDDM, caused teachers anxiety to the detriment of their
instructional effectiveness.
Faith continued to explain that students would compete over test scores, rather
than setting their own personal goals. In these situations, DDDM is not achieving
its intended aims of allowing data to inform the creation of more effective learning
for students. Instead, some principals focus on results as the sole metric of
success, ignoring the other needs of students and teachers, which is detrimental
to student learning.
Autonomy
Similar to their attitudes, teachers’ perceptions of their autonomy in DDDM differed
for standardized and classroom assessments. In general, teachers felt that they had
some degree of autonomy. There was an expectation from administrators and state-
level stakeholders to conduct assessment, but teachers felt that, at least for classroom
assessment, they were able to design assessments as needed and interpret the data
as it was useful to them, attending to the particular context of their class. Teachers
agreed that, regardless of external expectations, they would still implement their
own assessments to gauge the progress of their students. However, the emphasis
of standardized assessment in DDDM, as reflected in teaching evaluations, led the
teachers to feel that they had less autonomy. In reflecting on the implementation
of and interpretation of standardized assessment, teachers agreed that they mostly
complied with the expectations of their administrators and state guidelines.
Discussion
Consistent with Dewey’s philosophy, teachers recognized that educational research,
specifically using student data, was useful and important in the instructional design
process. On Dewey’s points of epistemological and methodological flexibility, teach-
ers often identified student data from classroom assessment as assisting in a broader
and continuous diagnostic evaluation of students and their needs, rather than as
providing an immediate or static prescription for practice. Teachers benefited
from analyzing data in a collaborative, deliberative process with other teachers and
professionals in which the data were but one component of a bigger picture in the
process. A collaborative approach also allowed teachers to gain multiple perspec-
tives on students’ progress, perhaps from others who had taught the students. Here,
consistent with Dewey’s vision, points of data were revisited and reconsidered in
light of the particular experiences of the teachers with a given student, in addition
to considering a range of other metrics, both formal and informal, at the teachers’
disposal. Further, teachers in schools where conversations about data were collab-
orative and low-stakes felt more comfortable having authentic and meaningful
discussions, which ultimately best served the needs of students.
Also consistent with Dewey’s assertion that research be grounded in con-
cerns from actual classrooms, teachers especially benefited from instances in which
they collected and interpreted their own assessment data. Teachers believed that
Conclusion
The conclusions of this study suggest that DDDM, when used in a manner consistent
with Dewey’s science of education, can assist classroom learning. The results of the
qualitative study identify a great deal of alignment between Dewey’s vision and the
needs of classroom teachers and their students. Like Dewey, teachers recognize
the need to contextualize data in light of the situational complexities before them,
and to use the information diagnostically, rather than punitively. The study shows
teachers using data in a deliberate manner with other professionals as part of a
broader picture of student growth.
Perhaps most evident is the alignment between Dewey and the teacher
respondents’ focus on such work being grounded in actual classrooms. Teachers
were strong proponents of using classroom assessment in ways that were useful
Notes
1 John Dewey, Democracy and Education (New York: Free Press, 1916); John Dewey,
The Sources of a Science of Education (Reprint, New York: Liveright, 2013).
2 Ellen B. Mandinach, “A Perfect Time for Data Use: Using Data-Driven Decision Making
to Inform Practice,” Educational Psychologist 47 (2012): 71–85.
3 John Dewey, The Later Works, 1925–1953, Volume 3: 1929, edited by Jo Ann Boydston
(Carbondale, IL: Southern Illinois University, 1984), 173.
4 Dewey, The Sources of a Science of Education, 19.
5 Viki M. Young and Debbie H. Kim. “Using Assessments for Instructional Improvement:
A Literature Review,” Education Policy Analysis Archives 18, no. 19 (2010).
6 Indiana Department of Education, “RISE Evaluation Model: Evaluator and Principal
Handbook Version 3.0,” Indiana Department of Education, January 2020.
7 Mandinach, “A Perfect Time for Data Use.”
8 Laura Hamilton, Richard Halverson, Ellen Mandinach, Jonathan A. Supovitz, Jeffrey C.
Wayman, and What Works Clearinghouse, "Using Student Achievement Data to Support
Instructional Decision Making," IES Practice Guide, NCEE 2009-4067 (National Center for
Education Evaluation and Regional Assistance, 2009).
Bibliography
Armstrong, Jane, and Katy Anthes. “How Data Can Help.” American School Board Journal
188, no. 11 (2001): 38–41.
Carlson, Deven, Geoffrey D. Borman, and Michelle Robinson. “A Multistate District-Level
Cluster Randomized Trial of the Impact of Data-Driven Reform on Reading and
Mathematics Achievement.” Educational Evaluation and Policy Analysis 33, no. 3
(2011): 378–98.
Creswell, John W. Research Design: Qualitative, Quantitative, and Mixed Method Approaches,
4th ed. Thousand Oaks, CA: Sage Publications, 2014.
Dewey, John. Democracy and Education. New York: Free Press, 1916.
Dewey, John. The Later Works, 1925–1953, Volume 3: 1929. Edited by Jo Ann Boydston.
Carbondale, IL: Southern Illinois University, 1984.
Dewey, John. The Sources of a Science of Education. Reprint. New York: Liveright, 2013.
Fishbein, Martin, and Icek Ajzen. Predicting and Changing Behavior. New York: Psychology
Press, 2015.
Hamilton, Laura, Richard Halverson, Ellen Mandinach, Jonathan A. Supovitz, Jeffrey C.
Wayman, and What Works Clearinghouse. “Using Student Achievement Data to Support
Instructional Decision Making.” IES Practice Guide. NCEE 2009-4067. National Center
for Education Evaluation and Regional Assistance, 2009.
Hoover, Nancy R., and Lisa M. Abrams. "Teachers' Instructional Use of Summative Student
Assessment Data." Applied Measurement in Education 26, no. 3 (July 2013): 219–31.
doi:10.1080/08957347.2013.793187.
Indiana Department of Education. “Indiana Department of Education School & Community
Nutrition Programs.” Accessed 2017. https://www.doe.in.gov/nutrition/school
-nutrition-programs.
Indiana Department of Education. “RISE Evaluation Model: Evaluator and Principal
Handbook Version 3.0.” Indiana Department of Education, January 2020. https://www
.doe.in.gov/sites/default/files/evaluations/rise-handbook-30.pdf
Mandinach, Ellen B. “A Perfect Time for Data Use: Using Data-Driven Decision Making to
Inform Practice.” Educational Psychologist 47 (2012): 71–85. https://doi.org/10.1080/0
0461520.2012.667064.