Education Sciences (Article)
Special Issue: The Quality of Classroom Assessments, edited by Prof. Dr. Anders Jönsson
https://doi.org/10.3390/educsci9020150
Insights Chinese Primary Mathematics Teachers Gained into Their Students’ Learning from Using Classroom Assessment Techniques
Xiaoyan Zhao 1,2,*, Marja van den Heuvel-Panhuizen 1,3,4,* and Michiel Veldhuis 3,5,*
1 Freudenthal Institute, Faculty of Science, Utrecht University, 3584 CC Utrecht, The Netherlands
2 School of Teacher Education, Nanjing Normal University, Nanjing 210023, China
3 Freudenthal Group, Faculty of Social and Behavioural Sciences, Utrecht University, 3584 CS Utrecht,
The Netherlands
4 Faculty of Education, Art and Culture, Nord University, 8049 Bodø, Norway
5 iPabo University of Applied Sciences, 1061 AD Amsterdam, The Netherlands
* Correspondence: [email protected] (X.Z.); [email protected] (M.v.d.H.-P.);
[email protected] (M.V.)
Received: 13 March 2019; Accepted: 11 June 2019; Published: 18 June 2019
Abstract: In this study, we explored the insights that Chinese primary mathematics teachers gained
into their students’ mathematical understanding from using classroom assessment techniques
(CATs). CATs are short teacher-initiated targeted assessment activities proximate to the textbook,
which teachers can use in their daily practice to make informed instructional decisions. Twenty-five
third-grade teachers participated in a two-week program of implementing eight CATs focusing on the
multiplication of two-digit numbers, and filled in feedback forms after using the CATs. When their
responses described specific information about their students, emphasized the novelty of the gained
information, or referred to a fitting instructional adaptation, and these reactions went together
with references to the mathematics content of the CATs, the teachers’ responses were considered
as evidence of gained insights into their students’ mathematics understanding. This was the case
for three-quarters of the teachers, but the number of gained insights differed. Five teachers gained
insights from five or more CATs, while 14 teachers did so only from three or fewer CATs, and six
teachers showed no clear evidence of new insights at all. Despite the differences in levels of gained
insights, all the teachers paid more attention to descriptions of students’ performance than to possible
instructional adaptations.
1. Introduction
information about their students’ learning [3]. Teacher-led assessment activities that are interwoven
with instruction and fully integrated in the teachers’ daily teaching practice, such as questioning,
observing students, and giving quizzes or teacher-made written assignments, can provide insights
about students’ thinking and about what productive and actionable next instructional steps might
be taken [4]. When the assessment focuses on figuring out what students know, or what difficulties
students have, for the purpose of making decisions about further instruction, it is considered as
formative assessment. Formative assessment in which the teacher has the lead is often referred to as
classroom assessment [5–10].
What information can be collected through classroom assessment depends largely on what
assessment activities are conducted. Helpful assessment activities are those which offer teachers
a window into the students’ thinking to uncover their mathematical conceptions and skills [10].
Therefore, much attention has been paid to gaining knowledge about how mathematics teachers can
improve their assessment activities to acquire adequate information about their students’ development
(see, for example, Schoenfeld [11]). Research has shown that using various oral questioning strategies
and written tasks, and then analyzing students’ responses, offers mathematics teachers opportunities
to reveal their students’ understanding [11,12]. In particular, challenging students with open-ended
problems enables teachers to diagnose students’ understanding and reveal their methods of problem
solving [12,13]. Other measures that make assessments by teachers more informative are using
rubrics [14] or concept maps [15] as frameworks for analyzing students’ responses. Both measures
were found to assist teachers in identifying gaps in their students’ understanding of the particular
mathematical topics under investigation.
Characteristic of the aforementioned studies on assessment techniques is that the techniques are
general in nature. They can be applied to any subject and to any mathematical topic. When teachers
are provided with such examples of assessment techniques, it can happen, as was found by James and
McCormick [22] (p. 976), that some teachers understand the “spirit” behind the assessment techniques,
and thus are able to adapt them to their teaching, but that others just catch the “letter” of them and
carry them out in a ritualized and mechanistic way. The latter may be the result of providing teachers
with assessment techniques that are not directly related to the content the teachers are teaching at that
moment. To avoid this, and to have assessment techniques that can generate indications for further
instruction, the techniques should be content-dependent.
A study in which this content-dependent approach was adopted is that of Phelan et al. [23].
The aim of their study was supporting teachers to assess students’ learning in pre-algebra. To find out
what had to be assessed, an expert panel was organized to map algebra knowledge and its prerequisites.
Such a map was used to design the questions and tasks that could provide teachers with the necessary
information. This innovative content-dependent approach to assessment, which differs from providing
teachers with general assessment guidelines, turned out to be rather successful, and apparently had a
positive impact on students’ learning [23].
To make the assessment even closer to the teaching at hand, Veldhuis and Van den
Heuvel-Panhuizen [19] took the textbook used by the teachers as a starting point. They designed brief
and targeted activities, called classroom assessment techniques (CATs), that teachers could use in their
daily practice to reveal information about students’ learning of a particular mathematical concept or
skill. The ultimate goal of CATs is providing teachers with deep insights into students’ mathematical
thinking to make adequate instructional decisions. This requires a skillful way of questioning [24,25],
or, in the words of Heritage and Heritage [26] (p. 187), “questioning lies at the epicenter of formative
assessment.” For the CATs, this implies that they were designed to serve as an eye-opener for teachers
to acquire knowledge about their students’ learning that they did not have before. This goes beyond
knowing whether students are able to flawlessly carry out particular calculation procedures. Instead,
the CATs are intended to delve deeper and reveal whether and how students understand the underlying
concepts of problems and see the relationships between problems, and to what extent they are flexible
in solving problems. Therefore, rather than just repeating the tasks in the textbook, CATs present
the content to be assessed from a different perspective and in an unfamiliar way. In addition to the
content-dependency, what is particularly innovative about the CATs is that teachers are offered a new perspective for looking at students’ understanding. This makes CATs different from the usual ways of assessing students, but at the same time, these new activities stay close to teachers’ familiar daily teaching practice. Moreover, to make the CATs manageable for the teachers, they have a format that supports teachers in gathering information about their students efficiently and makes the assessment feasible to carry out.
The two main formats that Veldhuis and Van den Heuvel-Panhuizen [19] used for their CATs were
red/green cards and worksheets. With the students responding to a question by holding up a red or a
green card, the teacher can quickly gather information about the group as a whole. The worksheets,
mostly containing a few problems on a specific mathematical concept or skill, are meant to provide
teachers with more information on individual students’ strategy use.
providing information to students than on getting information from them. Furthermore, when taking
action to understand students’ thinking, teachers are more likely to do so before lessons than during
or after lessons [33,34]. Moreover, when teachers plan their teaching, textbooks serve as the main
source, rather than findings from assessing their students’ learning [32,35]. Also, the exercises in
textbooks have an important role in the decisions that teachers make about assessment activities. Yet,
such exercises may be more suitable for summative assessment than for classroom assessment [36].
According to Liu [36], this may lead to teachers focusing on assessing the result of learning, namely
what basic knowledge and skills students have acquired, instead of assessing how students developed
their mathematical thinking during the learning process. Furthermore, studies have revealed that
only very limited attention has been paid to improving teachers’ assessment practice to get more
information about students’ learning [37,38].
Taking into account the promising international findings about the use of assessment techniques,
we explored whether this approach to assessment could assist Chinese primary mathematics teachers
in their assessment practice. Specifically, as a sequel to the studies carried out in the Netherlands [19]
in which classroom assessment techniques (CATs) for primary mathematics education were developed
and teachers were supported in using CATs, we investigated the use of CATs in China. Six third-grade
mathematics teachers of two primary schools in Nanjing, China, participated in a pilot study [39].
The focus of this pilot study was on assessing the topic of division, in particular three-digit numbers
divided by a one-digit number. In line with the Dutch studies [19], the CATs in the Chinese pilot
study were also based on a textbook analysis and formulated in such a way that they were not just
a repetition of what is in the textbook. In this way, CATs might give teachers access to a deeper
level of students’ skills and understanding. It was found that teachers recognized that it can be very
revealing to challenge students with questions that are not completely prepared by the textbook. Also,
they appreciated the use of red/green cards for providing quick information. In general, teachers were
positive about the CATs as a way to reveal their students’ learning in an effective and efficient way.
2. Methods
In the study, Chinese teachers were asked to use a number of CATs in their regular teaching of
multiplication during the first two weeks of the second semester of Grade 3. Teachers were informed
about the CATs through a teacher guide and two researcher-led meetings. Data on how teachers
used the CATs and what insights they got from them were gathered through feedback forms and a
teacher-written final report.
2.1. Participants
For practical reasons, we decided to set up the study in Nanjing. We contacted three local
teaching research offices, which are responsible for inspecting the educational quality of the schools
and providing professional development to primary school teachers in their administrative districts.
One of these offices volunteered to participate. To include various schools in terms of the school’s
reputation, educational quality, and location, nine out of 40 primary schools in its district were selected
by this local teaching research office. The Grade 3 mathematics teachers and their students of these nine
schools took part in the study. Thus, our sample consisted of 25 teachers and their students in 25 classes.
In all the classes, the same textbook series was used, namely the Sujiaoban (苏教版) textbook [40].
[Figure: the textbook’s presentation of the multiplication algorithm for 24 × 12, with the partial products 48 (the amount in 2 boxes) and 240 (the amount in 10 boxes) adding up to 288 (the amount in 12 boxes), followed by the practice problems 22 × 23, 32 × 31, and 42 × 21.]
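Written out, this worked example can be rendered as follows (our rendering of the layout shown in the figure):

$$
\begin{array}{rl}
24 & \\
\times\ 12 & \\
\hline
48 & \text{(the amount in 2 boxes)} \\
240 & \text{(the amount in 10 boxes)} \\
\hline
288 & \text{(the amount in 12 boxes)}
\end{array}
$$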
In the subsequent new lesson, the students are prompted to further strengthen their understanding
of the structure of the multiplication algorithm. To achieve this, the textbook offers only the start of the
algorithm for 24 × 53 (Figure 3). The students have to complete the remaining steps of the algorithm.
Right after this, the textbook provides a description in words of the steps to be taken when carrying
out the multiplication algorithm of two-digit numbers. The students are told to first choose the digit
in the ones place of the lower number to multiply the upper number, and then do the same for the
digit in the tens place of the lower number. After this, for every calculated product, they have to write
the last digit of the product in the same column as the digit chosen from the lower number. Finally,
the students need to add the two products.
[Figure 3. Multiplication algorithm for 24 × 53 for which the start is given: only the first partial product, 72, is already filled in.]
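For reference, completing this algorithm amounts to the following (our own worked-out arithmetic, not shown in the textbook at this point):

$$
24 \times 53 = 24 \times 3 + 24 \times 50 = 72 + 1200 = 1272.
$$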
The ratio aspect of multiplication is dealt with in revision lessons halfway and at the end of the
chapter. In the first problem offered to the students, they have to find out how many pencils there are
in 10, 20, 40, and 80 boxes when in one box, there are 10 pencils. The problem is presented in a ratio
table (Figure 4). The first column shows that in five boxes, there are 50 pencils in total. The students
have to fill in the remaining empty cells. In the end, they have to explain what they can discover
from the ratio table. The focus in this problem is on the external ratio between the number of boxes
and of pencils, or in other words, on the functional relationship between them. This is even clearer
in the next ratio-table-like problem (Figure 5), in which the students are explicitly asked to multiply
two given numbers. Also, this functional relationship is emphasized by the notation of the accompanying exercises, in which the students are required to fill in the empty frames.
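A sketch of what the completed ratio table of Figure 4 would contain (our reconstruction, based on the 10 pencils per box given in the problem):

$$
\begin{array}{l|ccccc}
\text{number of boxes} & 5 & 10 & 20 & 40 & 80 \\
\hline
\text{number of pencils} & 50 & 100 & 200 & 400 & 800
\end{array}
$$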
for doing two assessments, if necessary. Here, as examples, we explain four CATs in detail. Three are
meant for assessing multiplication with multiples of 10 (CAT-1), the structure of the multiplication
algorithm (CAT-3), and the ratio aspect of multiplication (CAT-4). Finally, near the end of the chapter,
when students have learned the multiplication algorithm for two-digit numbers, it is assessed whether
the students’ understanding goes beyond mechanically carrying out the algorithm (CAT-8). To show
the possible ways of collecting information with the CATs, we chose two CATs of each format: CAT-1
and CAT-4, which had a red/green card format, and CAT-3 and CAT-8, which had the individual
worksheet format. In addition, CAT-1, CAT-3/CAT-4, and CAT-8 were meant to be used at the beginning, in the middle, and at the end of teaching this chapter, respectively. A further reason for discussing these CATs anticipates our finding that the CATs differed in the degree to which they gave teachers insights. Choosing these CATs gave us the opportunity to provide a fair picture of what CATs can bring about: CAT-1 and CAT-4 helped fewer teachers gain insights than CAT-3 and CAT-8.
does not necessarily mean that students understand what they are doing and understand the structure
of multiplications with two-digit numbers.
CAT-3 (Figure 7) has a worksheet format and is meant to give teachers an extra opportunity
to assess whether their students can identify the components of a multiplication and how they
understand what is behind the algorithm. In this CAT, the same numbers are used as in the textbook,
namely 24 × 53. However, now students have to unravel this multiplication instead of carrying it out.
By using distributive and associative properties, this can lead to four sub-multiplications, namely 3 × 4,
3 × 20, 50 × 4, and 50 × 20, or in any other order. The teacher hands out the worksheet, checks students’ responses after class, and uses the gained information for decisions about further instruction
in the next lessons.
[Figure 7. CAT-3 worksheet, with the prompt “24 × 53 means that you have to calculate …” to be completed by the students.]
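To illustrate the intended unraveling (a worked sketch of our own), applying the distributive property twice gives the four sub-multiplications named above:

$$
24 \times 53 = (20 + 4) \times (50 + 3) = 50 \times 20 + 50 \times 4 + 3 \times 20 + 3 \times 4 = 1000 + 200 + 60 + 12 = 1272.
$$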
This CAT has a red/green card format and is called ‘Completing the ratio table’. It is meant
to challenge students in their work with the ratio table and give teachers extensive information about
students’ understanding of the ratio aspect of multiplication. The difficulty for the students in this CAT
is that the number of pencils in one box is not given. Moreover, they are not allowed to calculate this
number. The numbers in the ratio table have been chosen in such a way that students are prompted to
find other methods to fill in the empty cells. For example, if in six boxes there are 72 pencils, then you
also know how many there are in 12 boxes. Similarly, if in six boxes there are 72 pencils and in 11 boxes
there are 132, then you can also know directly how many pencils there are in 17 boxes. Reasoning and
calculating similar to this means that the ratio table is not only used vertically, but also horizontally.
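Worked out (our own arithmetic for the reasoning just described):

$$
\begin{aligned}
6 \text{ boxes} \to 72 \text{ pencils} &\;\Rightarrow\; 12 \text{ boxes} \to 2 \times 72 = 144 \text{ pencils},\\
6 \text{ boxes} \to 72,\ 11 \text{ boxes} \to 132 &\;\Rightarrow\; 17 \text{ boxes} \to 72 + 132 = 204 \text{ pencils}.
\end{aligned}
$$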
To a certain degree, the textbook also gives opportunities to elicit this richer method of using the ratio
table, when the students are asked to reflect on what they discovered in the ratio table. In particular,
any case explicitly promoted in CAT-4 and can provide teachers with extra information about their
students’ understanding of the ratio aspect of multiplication.
In this CAT, students have to solve multiplication problems without using the algorithm. The main
idea is that when students cannot solve a multiplication problem without using the algorithm, they will
probably not have a sufficient understanding of multiplication, which might put them in trouble when
learning to solve more complicated multiplication problems with, for example, three-digit numbers or
decimal numbers. Examining the worksheets after class offers the teacher clues about whether and
what instructional supports students need before finishing this chapter.
1. Referring to the mathematical content the CAT is supposed to assess. For this, teachers can use their
own words or give a clear description of the purpose of the CAT by using (partly) the wording
that appeared in the teacher guide. However, this criterion is not met when teachers only refer to
the CAT in general terms, without mentioning the mathematics assessed.
2. Providing specific information about students. This includes mentioning the proportion of students
showing a particular performance on the assessed content or describing the difficulties students
encountered with this content.
3. Describing the novelty of the gained information about students. This means that teachers learn
something “new”, “unexpected”, “surprising”, or “that was not known before” about students’
understanding of the assessed content.
4. Explaining an instructional adaptation matching the findings from the CAT. Such an instructional
adaptation has to correspond to the information about the assessed content as revealed by using
the CAT; general phrases such as “providing additional exercises” or “give extra instruction” are
not sufficient.
Showing that one has learned something from doing an assessment is a multifaceted phenomenon.
It can be expressed in different ways. Teachers can say something about the performance of their
students, can emphasize that they discovered new information in the students’ performance, or can
discuss their decisions about further teaching. All these responses can indicate that teachers have
learned something from assessing their students with the CATs. Yet, to fully classify a teacher’s
response as having gained insights from a CAT, a first requirement is that the teacher refers to the
mathematical content the CAT is supposed to assess. Just talking about students’ performance in
general terms is not sufficient. So, our final decision rule to qualify a teacher’s response as having
gained insights is that it should meet Criterion 1 and at least one of Criteria 2, 3, and 4. Based on this decision rule, a final round of checking was carried out by the first author. This resulted in qualifying 57 teachers’ responses out of the total of 200 possible responses (25 teachers × 8 CATs)
as having gained insights. Table 1 provides examples of the qualifications of teachers’ responses
about CAT-1.
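Expressed compactly, the decision rule can be written as follows (a sketch in our own notation, not part of the study’s coding instrument):

```python
# Sketch of the decision rule for qualifying a teacher's response to a CAT as
# evidence of gained insights (our own formalization, not the study's coding tool).
# c1..c4 correspond to Criteria 1-4 described above.

def gained_insight(c1: bool, c2: bool, c3: bool, c4: bool) -> bool:
    # Criterion 1 (reference to the assessed mathematical content) is required,
    # plus at least one of Criteria 2, 3, and 4.
    return c1 and (c2 or c3 or c4)

# Hypothetical example: a response that names the assessed content and describes
# a fitting instructional adaptation, but reports nothing new or student-specific.
print(gained_insight(c1=True, c2=False, c3=False, c4=True))  # True
```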
Table 1. Examples of the teachers’ responses about CAT-1 Family problems, and whether they were qualified as showing gained insights. CAT: classroom assessment technique.
3. Results
Table 2. Overview of whether clear evidence of teachers (n = 25) gaining insights into their students’ understanding of multiplication by using the CATs was identified ¹.
Insight   Teacher   CAT-1   CAT-4   CAT-7   CAT-2   CAT-5   CAT-6   CAT-3   CAT-8   Total of ✓
High H01 ✓ ✓ ✓ ✓ ✓ ✓ ✓ 7
Insight H02 ✓ ✓ ✓ ✓ ✓ ✓ 6
H03 ✓ ✓ ✓ ✓ ✓ 5
H04 ✓ ✓ ✓ ✓ ✓ 5
H05 ✓ ✓ ✓ ✓ ✓ 5
Some S01 ✓ ✓ ✓ 3
Insight S02 ✓ ✓ ✓ 3
S03 ✓ ✓ ✓ 3
S04 ✓ ✓ ✓ 3
S05 ✓ ✓ ✓ 3
S06 ✓ ✓ ✓ 3
S07 ✓ ✓ 2
S08 ✓ ✓ 2
S09 ✓ ✓ 2
S10 ✓ 1
S11 ✓ 1
S12 ✓ 1
S13 ✓ 1
S14 ✓ 1
No N01 0
Insight N02 0
N03 0
N04 0
N05 0
N06 0
Total of ✓ 5 5 5 6 8 8 9 11 57
¹ Black cell means the teacher did not use the CAT; empty white cell means the teacher used the CAT but no clear
evidence of the teacher gaining insights from using the CAT was identified; white cell with “✓” means the teacher
used the CAT and clear evidence of the teacher gaining insights from using the CAT was identified.
To give more information about the specific insights the teachers appeared to gain into their
students’ mathematical understanding, we now focus on the teachers’ responses to the four earlier
described CATs (1, 3, 4, and 8). Based on the four criteria for gained insights, in the next sections,
we discuss for each of the CATs why we considered particular responses of the teachers as indications
for having gained insights.
contained more than two digits. For three teachers (H01, S06, S14), this came down to reporting
that most students were able to deal with this and that only a few students “could only solve [until]
97 × 80 and 97 × 800” (H01). Three teachers (H03, S09, S14) reported that fewer students provided
correct answers when the numbers became bigger, and that “only a minority of the students could
determine the rule” (S09). This came as a surprise to one teacher (S14), who explained that “in my
expectation, the vast majority of students would find the correct answers without being disturbed by
the increase in the number of zeroes”. Only two teachers (H03, S06) mentioned how they would adapt
their instruction; they were going to include analogous problems for students to practice with.
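The rule at stake in these family problems can be illustrated as follows (our own example, assuming the family starts from 97 × 8):

$$
97 \times 8 = 776, \qquad 97 \times 80 = 776 \times 10 = 7760, \qquad 97 \times 800 = 776 \times 100 = 77600.
$$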
In 19 teachers’ responses, the mathematical content assessed by CAT-1 was not mentioned.
When teachers described students’ performance, it was in very general terms. Some teachers
reported being satisfied with their students’ performance; for example, students were able to calculate
“according to the given characteristics” (S05), while other teachers pointed out their students’ shortcomings
in understanding “the rule” (H02, H05, S01). Similarly, when teachers wrote about their instructional
adaptations, they often used general terms such as “more exercises” (S07) or “extra instruction”
(N05). Interestingly, two teachers (S11, N04) decided not to adjust their further teaching because they
considered the content of CAT-1 to be too similar to what is in the textbook. In contrast, another teacher
(N01) provided as a reason for not adapting her instruction that “there is no such type of exercise in
the textbook”.
are able to learn more” (N03) or “providing extra exercises to revise this content” (N01). The other
seven teachers would not make any instructional adaptation. Two of them (S11, N04) did this because
their students did not have problems in solving the problems, and the remaining five teachers felt
using the algorithm was more suitable to solve these problems.
the CATs, and this could also be a reason that they did not notice or value the information about
students’ learning. When teachers can take part in the development of the CATs and are supported in seeing the connections between the CATs and the textbook content, this might lead to their gaining more insights into students’ understanding. In addition, the fact that the local teaching research office, rather than the individual teachers, decided to participate in this study may have influenced the teachers’ willingness to use CATs and fill in feedback forms. Also, the felt need of teachers to finish their
planned regular teaching may have been a factor that teachers considered when deciding whether to
use the CATs, and might have led to not always getting the most out of the CATs.
Having only a short intervention and not having the teachers involved in designing the CATs
are shortcomings of our study that should be kept in mind when interpreting our findings. Also,
it is important to take into account that our conclusions about whether the teachers obtained insights
from using CATs were based on their self-reported data. Further data collection, such as observing
teachers using CATs in class and directly asking them about their gained insights, could shed further light on the new information they gathered about their students’ mathematics learning. Furthermore,
examining students’ responses and worksheets could be included in order to triangulate teachers’
reported insights. Another limitation of our study is that the CATs used in the context of Chinese
primary mathematics education so far, including the pilot study [39], were designed based on one
particular textbook series and only involved teachers from the city of Nanjing. Whether Chinese
primary teachers who use different mathematics textbook series or who are from different regions
can get new insights from implementing CATs remains unclear. Further studies are necessary in
this respect.
A strong recommendation for additional studies is in any case to have a design in which teachers
are included in developing CATs. Moreover, future research may investigate why some teachers gain
fewer insights than others. To know more about this, it could be worthwhile to explore whether teachers
with different assessment profiles [38,42] benefit differently from using CATs. For example, recently,
it has been shown that significant differences exist between Chinese expert teachers in primary school
mathematics education and their non-expert colleagues in their perception and reported behavior
of understanding their students’ mathematics thinking [34]. Future research may also focus on how
to use the gained insights for making instructional decisions, because this is an issue to which the
teachers in our study did not pay much attention. In line with this, new research may be conducted to
examine whether and how teachers’ gained insights affect students’ performance.
Despite the aforementioned limitations and the questions that still have to be answered, we think
we can conclude that our study provides evidence that Chinese primary school mathematics teachers
may gain insights into their students’ understanding of mathematics from using assessment techniques
such as CATs. Yet, for the majority of the teachers in our study, it seems to be necessary to offer them
more time and support to get acquainted with this assessment approach. Also, more opportunities
could be provided to support teachers to see the connections between CATs and the textbook.
Using the CATs implies a strong formative approach to assessment. For Chinese primary school
mathematics teachers, who often put more emphasis on providing information to students than
on getting information about students [32], this may mean a change of perspective. Our CATs
may help them develop a more formative approach to assess students’ learning. More research is
certainly necessary, especially studies that investigate how teachers’ culturally-based beliefs about
teaching affect their formative use of assessment and that examine how to support teachers in becoming
independent users of formative assessment.
Author Contributions: Conceptualization, Methodology, Analyses, Writing: X.Z., M.v.d.H.-P., M.V.; Funding
Acquisition, X.Z. (China Scholarship Council) and M.v.d.H.-P. (Netherlands Organization for Scientific Research).
Funding: This research was funded by China Scholarship Council, grant number 201206860002; and the
Netherlands Organization for Scientific Research, grant number NWO MaGW/PROO: Project 411-10-750.
Acknowledgments: The authors thank Lianhua Ning from Nanjing Normal University, China, for helping to
contact schools, and all the teachers involved in this study for their cooperation and contribution.
References
1. Shavelson, R.J. The Basic Teaching Skill: Decision Making (R & D Memorandum No.104); Stanford University:
Stanford, CA, USA, 1973.
2. Parkes, J. Reliability in classroom assessment. In Sage Handbook of Research on Classroom Assessment;
McMillan, J.H., Ed.; Corwin Press: Thousand Oaks, CA, USA, 2013; pp. 107–123. Available online:
http://sci-hub.tw/10.4135/9781452218649.n7 (accessed on 11 March 2019).
3. Harlen, W. Assessment of Learning; Sage: London, UK, 2007; Available online: http://sci-hub.tw/10.4135/
9781446214695 (accessed on 11 March 2019).
4. Shepard, L.A.; Penuel, W.R.; Pellegrino, J.W. Using learning and motivation theories to coherently link
formative assessment, grading practices, and large-scale assessment. Educ. Meas. Issues Pract. 2018, 37(1),
21–34. [CrossRef]
5. Black, P.; Wiliam, D. Assessment and classroom learning. Assess. Educ. Princ. Policy Pract. 1998, 5, 7–74.
[CrossRef]
6. Brookhart, S.M. Classroom assessment: Tensions and intersections in theory and practice. Teach. Coll. Rec.
2004, 106, 429–458. [CrossRef]
7. De Lange, J. Framework for Classroom Assessment in Mathematics; NICLA; WCER: Madison, WI, USA, 1999.
8. Shepard, L.A. The role of assessment in a learning culture. Educ. Res. 2000, 29(7), 4–14. [CrossRef]
9. Stiggins, R.; Chappuis, J. Using student-involved classroom assessment to close achievement gaps. Theory
Pract. 2005, 44(1), 11–18. [CrossRef]
10. Wiliam, D. Keeping learning on track: Classroom assessment and the regulation of learning. In Second
Handbook of Research on Mathematics Teaching and Learning; Lester, F.K., Ed.; Information Age Publishing:
Greenwich, UK, 2007; pp. 1053–1098.
11. Schoenfeld, A.H. Summative and formative assessments in mathematics supporting the goals of the common
core standards. Theory Pract. 2015, 54, 183–194. [CrossRef]
12. Lin, P.J. Conceptualizing teachers’ understanding of students’ mathematical learning by using assessment
tasks. Int. J. Sci. Math. Educ. 2006, 4, 545–580. [CrossRef]
13. Panizzon, D.; Pegg, J. Assessment practices: Empowering mathematics and science teachers in rural
secondary schools to enhance student learning. Int. J. Sci. Math. Educ. 2008, 6, 417–436. [CrossRef]
14. Gallego-Arrufat, M.J.; Dandis, M. Rubrics in a secondary mathematics class. Int. Electron. J. Math. Educ.
2014, 9, 73–82.
15. Jin, H.; Wong, K.Y. Mapping conceptual understanding of algebraic concepts: An exploratory investigation
involving grade 8 Chinese students. Int. J. Sci. Math. Educ. 2015, 13, 683–703. [CrossRef]
16. Leahy, S.; Lyon, C.; Thompson, M.; Wiliam, D. Classroom assessment: Minute-by minute and day by day.
Educ. Leadersh. 2005, 63(3), 18–24.
17. Andersson, C.; Palm, T. The impact of formative assessment on student achievement: A study of the effects
of changes to classroom practice after a comprehensive professional development programme. Learn. Instr.
2017, 49, 92–102. [CrossRef]
18. Keeley, P.; Tobey, C.R. Mathematics Formative Assessment: 75 Practical Strategies for Linking Assessment,
Instruction, and Learning; Corwin Press: Thousand Oaks, CA, USA, 2011.
19. Veldhuis, M.; Van den Heuvel-Panhuizen, M. Supporting primary school teachers’ classroom assessment
in mathematics education: Effects on student achievement. Math. Educ. Res. J. 2019. Available online:
https://link.springer.com/article/10.1007/s13394-019-00270-5 (accessed on 3 June 2019).
20. Wiliam, D. Embedded Formative Assessment; Solution Tree: Bloomington, IN, USA, 2011.
21. Wylie, E.C.; Lyon, C.J. The fidelity of formative assessment implementation: Issues of breadth and quality.
Assess. Educ. Princ. Policy Pract. 2015, 22, 140–160. [CrossRef]
22. James, M.; McCormick, R. Teachers learning how to learn. Teach. Teach. Educ. 2009, 25, 973–982. [CrossRef]
23. Phelan, J.; Choi, K.; Niemi, D.N.; Vendlinski, T.; Baker, E.L.; Herman, J. The effects of POWERSOURCE©
assessments on middle-school students’ math performance. Assess. Educ. Princ. Policy Pract. 2012, 19,
211–230. [CrossRef]
24. Davidson, A.; Herbert, S.; Bragg, L.A. Supporting elementary teachers’ planning and assessing of
mathematical reasoning. Int. J. Sci. Math. Educ. 2018. [CrossRef]
25. Martino, A.M.; Maher, C.A. Teacher questioning to promote justification and generalization in mathematics:
What research practice has taught us. J. Math. Behav. 1999, 18, 53–78. [CrossRef]
26. Heritage, M.; Heritage, J. Teacher questioning: The epicenter of instruction and assessment. Appl. Meas.
Educ. 2013, 26, 176–190. [CrossRef]
27. Zhang, D.; Lee, P.Y. Examination culture and mathematics teaching. In Proceedings of the ICMI–China
Regional Conference of Mathematical Education, Beijing, China, 5–8 August 1991.
28. Li, Y. Purpose, function and types of mathematics assessment in China. ZDM 2000, 32, 192–200. [CrossRef]
29. Ministry of Education of the People’s Republic of China (MoE). Jichu jiaoyu kecheng Gaige Gangyao (Shixing)
[Curriculum Reform Outline of Basic Education (Trial Version)]. Available online: http://old.moe.gov.cn/
publicfiles/business/htmlfiles/moe/s8001/201404/167343.html (accessed on 11 March 2019).
30. Ministry of Education of the People’s Republic of China (MoE). Quanrizhi Yiwu Jiaoyu Shuxue Kecheng Biaozhun
(Shiyangao); [Mathematics Curriculum Standards of Nine-Year Compulsory Education (Trial Version)];
Beijing Normal University Press: Beijing, China, 2001.
31. Ministry of Education of the People’s Republic of China (MoE). Yiwu jiaoyu shuxue kecheng biaozhun
(2011 nian ban) [Mathematics Curriculum Standards of Nine-Year Compulsory Education (2011 Version)].
Available online: http://ncct.moe.edu.cn/2014/curriculumPolicy_1115/3175.html (accessed on 11 March 2019).
32. Cai, J.; Wang, T. Conceptions of effective mathematics teaching within a cultural context: Perspectives of
teachers from China and the United States. J. Math. Teach. Educ. 2010, 13, 265–287. [CrossRef]
33. Cai, J.; Ding, M.; Wang, T. How do exemplary Chinese and U.S. mathematics teachers view instructional
coherence? Educ. Stud. Math. 2014, 85, 265–280. [CrossRef]
34. Zhu, Y.; Yu, W.; Cai, J. Understanding students’ mathematical thinking for effective teaching: A comparison
between expert and nonexpert Chinese elementary mathematics teachers. Eurasia J. Math. Sci. Technol. Educ.
2018, 14, 213–224. [CrossRef]
35. Li, Y.; Chen, X.; Kulm, G. Mathematics teachers’ practices and thinking in lesson plan development: A case
of teaching fraction division. ZDM 2009, 41, 717–731. [CrossRef]
36. Liu, J. Xiaoxue Dinianji Shuxue Jiaokeshu Zhong De Xiti Yanjiu—Jiyu Zhongguo Xinjiapo Jiaokeshu Bijiao De
Shijiao; [Study in Mathematics Textbook Tasks for Lower Grades in Primary School—From the Perspective
of Comparison Between Textbooks in China and that in Singapore]. Master’s Thesis, Northeast Normal
University, Changchun, China, 2012.
37. Gu, F.; Gu, L. Characterizing mathematics teaching research specialists’ mentoring in the context of Chinese
lesson study. ZDM 2016, 48, 441–454. [CrossRef]
38. Zhao, X.; Van den Heuvel-Panhuizen, M.; Veldhuis, M. Chinese primary school mathematics teachers’
assessment profiles: Findings from a large-scale questionnaire survey. Int. J. Sci. Math. Educ. 2018, 16,
1387–1407. [CrossRef]
39. Zhao, X.; Van den Heuvel-Panhuizen, M.; Veldhuis, M. Teachers’ use of classroom assessment techniques in
primary mathematics education—an explorative study with six Chinese teachers. Int. J. STEM Educ. 2016,
3, 19. [CrossRef]
40. Jiangsu Phoenix Education Publishing House. Sujiaoban Jiaokeshu; (Xiaoxue Shuxue Sannianji Xiace)
[Sujiaoban Textbook (Mathematics Textbook for Grade 3 in Primary Education, Volume 2)]; Jiangsu Phoenix
Education Publishing House: Nanjing, China, 2014.
41. Heritage, M.; Kim, J.; Vendlinski, T.; Herman, J. From evidence to action: A seamless process in formative
assessment? Educ. Meas. Issues Pract. 2009, 28(3), 24–31. [CrossRef]
42. Veldhuis, M.; Van den Heuvel-Panhuizen, M. Primary school teachers’ assessment profiles in mathematics education. PLoS ONE 2014, 9, e86817. [CrossRef]
© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access
article distributed under the terms and conditions of the Creative Commons Attribution
(CC BY) license (http://creativecommons.org/licenses/by/4.0/).