Illustration 4: Supporting teachers in
analyzing the results of a test on area
The episode we report in this section
occurred in the Making Mathematics Reform a Reality (MMRR)
project described in Chapter 2. It was part of the field
experiences that took place in the first year of the
professional development program. In the MMRR project a
mathematics teacher educator was assigned to each school as
school facilitator to support participating teachers as they
implemented innovative instructional experiences in their
classes. The professional development activity described below
took place while one of the school facilitators worked with two
7th grade teachers implementing their first inquiry
unit, an adaptation of the inquiry on area described in Chapter
1.
The two teachers had designed a
comprehensive paper-and-pencil test to assess what their
students had learned about area at the end of the unit. This
test included items to assess whether students could compute
the area of different figures, describe the strategies they
used to solve these problems and show understanding of some
basic concepts about area. The teachers had already graded
these tests, but when the school facilitator asked them to say
what they thought their students actually learned about area
and what aspects of area might still be a problem, neither
teacher felt able to respond.
The facilitator then suggested that each
teacher select three or four student papers that presented
interesting differences in students’ responses and re-examine
these tests to determine what each student knew or did not know
about area. In the after-school meeting scheduled to discuss
their findings, both teachers expressed surprise at the
challenge this analysis presented, especially since grading the
test had been rather straightforward. In several cases, they
came to the meeting with just a guess about why a student might
have answered a question in a certain way. The discussion that
developed as everyone tried to make sense of such puzzling
responses was very informative. It often clarified some
mathematical points about area, uncovered the student’s
thinking process and helped teachers further articulate their
instructional goals for the unit. Since some student work
revealed particular misconceptions, the facilitator also asked
both teachers to brainstorm ideas about how to help each
student gain a better understanding, either in individual
after-school sessions or in future classroom instruction.
Although not planned as part of the
professional development program, this experience was an
eye-opener for both the teachers and the school
facilitator. Among other things, it engendered a greater
appreciation for the importance of analyzing students’
work, and it also called into question the grading process that
the teachers had so far taken for granted as a viable way to
measure student learning.
Main elements and variations
As stated at the beginning of the
chapter, analyzing students’ thinking involves primarily
the in-depth examination and discussion of selected artifacts
of students’ mathematical activity. Effective
implementations of this type of professional development also
require the following:
• Worthwhile student artifacts for analysis. Discussions around the selected artifact will
be rich only when the mathematical task(s) assigned to the
students admit more than one solution and/or method of
solution, and result in partial or incorrect solutions by some
students.
• Alternative interpretations to be examined. As teachers first analyze the artifacts, they
should be requested to generate a variety of hypotheses about
possible interpretations. The group can then examine each
hypothesis for its likelihood of being correct.
Although analyzing students’
thinking may at first appear straightforward, our illustrations
show that there is not just one way to implement this kind of
professional development. Considerable variations can occur
depending on the kind of student artifacts available, who
provides them, and how teachers analyze them.
For example, teachers can analyze
productively the following kinds of student artifacts:
• Written work students
produce in response to homework assignments or assessments.
• Videotaped “clinical interviews,” where the interviewer presents a student with
a mathematical task and asks probing questions about what the
child is doing and why.
• Videotaped excerpts and/or written transcripts of actual lessons in
which students actively discuss a mathematical topic, solve
problems in a group or report on the results of individual
and/or small-group work.
• “Cases” or narratives of classroom experiences
created to highlight the mathematical thinking and activities
of selected students.
The suitability of each type of artifact
depends on the goals of the professional development
experience. For example, among the artifacts listed above,
written work may reveal the least because it is only a product
of student thinking, and even the student’s written
explanation of his/her solution may not always be enlightening.
On the other hand, this kind of artifact presents some unique
advantages, as teachers can quickly skim through the work of
several different students, noting similarities and differences
that can generate interesting questions and speculations.
Clinical interviews are more likely to reveal the thinking
processes of an individual student working to solve a problem
alone. Video excerpts from a mathematics lesson may instead
allow teachers to analyze the interaction among several
learners working on a mathematical task. Finally, while videos
and/or transcripts of a problem-solving session can capture the
actual dialogue of students working on mathematical tasks, they
do not provide background information on the individual
learners or the instructional context to support
interpretations of the learning event. Cases, or classroom
narratives, on the other hand, usually do offer such
information, but they are necessarily based on the
writer’s interpretation of the event, which may unduly
influence the teachers’ analysis of the students’
thinking and reasoning.
Who provides the artifacts to be examined
can also affect the implementation of this type of professional
development. The main options in this case are as follows:
• The facilitator provides
the artifacts, or
• The teachers themselves
collect the artifacts from their own students.
Once again, each option has its strengths
and weaknesses. Only when the facilitator provides the
artifacts can these be carefully selected beforehand to
illustrate specific kinds of student strategies or
misconceptions. Also, some teachers may feel somewhat
uncomfortable and defensive when using their own
students’ work. On the other hand, teachers may be more
interested in and motivated to analyze their own students’
work. Moreover, collecting and making sense of their own
students’ work apprentices teachers immediately to the
daily process of analyzing student thinking. Several programs,
cognizant of the benefits and limitations of each option, do
both. That is, teachers experience a guided analysis of
pre-selected artifacts first, and then they collect and analyze
student work from their own classroom.
How the artifacts are analyzed also
varies, depending on the main goals of the professional
development experience. The most interesting variations occur
along the following dimensions:
• The
extent to which the facilitator structures and focuses the
analysis.
• The
role the facilitator plays in the analysis and/or discussion of
the artifacts.
• The
role that research-based knowledge of student thinking about
the mathematical topic plays in the analysis. It is worth
noting that, while using research is always highly desirable,
to date there are only a few mathematical topics for which
substantial research on student thinking is available.
• The
extent to which instructional implications of the analysis are
explicitly addressed.
• The
nature and extent of follow-up experiences that could extend
what teachers learn from analyzing the artifacts.
Analyzing students’ thinking can
occur in any of the formats we identified in Chapter 3: summer
institutes, university courses, workshops, study groups,
one-on-one interactions with a teacher educator, and
independent work.
Facilitators for this type of
professional development experience are most effective if they
understand clearly the mathematics principles underlying the
tasks being analyzed and know well the research on
students’ thinking in the particular mathematical topic.