Research, Field Tests, and Evaluation

The development of CMP3 built on the extensive knowledge gained from the development, research, and evaluation of CMP1 and CMP2. Describing how CMP1 and CMP2 were developed gives insight into the findings of a wide range of expert reviewers and into the extensive field testing that allowed the development of CMP3 to be more focused.

The Development Process

Before starting the design phase for the CMP2 materials, we commissioned individual reviews of CMP material from 84 individuals in 17 states and comprehensive reviews from more than 20 schools in 14 states. Individual reviews focused on particular strands across all three grades (such as number, algebra, or statistics), on particular subpopulations (such as students with special needs or those who are commonly underserved), or on topical concerns (such as language use and readability). Comprehensive reviews were conducted by groups that included teachers, administrators, curriculum supervisors, mathematicians, and experts in special education, language and reading-level analysis, English language learners, and issues of equity, among others. Each group reviewed an entire grade level of the curriculum. All responses were coded and entered into a database from which reports could be printed on any issue, or combination of issues, that would help an author or staff member design a Unit.

In addition, we issued a call for schools to serve as pilot schools for the development of CMP2. We received applications from 50 districts. From these we chose 15 districts, comprising 49 school sites in 12 states and the District of Columbia, and received evaluation feedback from these sites over the five-year cycle of development.

Based on the reviews, what they had learned from CMP pilot schools over a six-year period, and input from our Advisory Board, the authors started with grades 6 and 7 and systematically revised and restructured the Units and their sequence at each grade level to create a first draft of the revision. These drafts were sent to our pilot schools to be taught during the second year of the project, and they became the basis for substantial feedback from our trial teachers.

Here are examples of the kinds of questions we asked classroom teachers following each revision of a Unit or grade level.

Evaluation Process

"Big Picture" Unit Feedback

  • Is the mathematics important for students at this grade level? Explain.
  • Are the mathematical goals clear to you?
  • Overall, what are the strengths and weaknesses in this Unit?
  • Please comment on your students' achievement of mathematics understanding at the end of this Unit. What concepts/skills did they "nail"? Which concepts/skills are still developing? Which concepts/skills need more reinforcement?
  • Is there a flow to the sequencing of the Investigations? Does the mathematics develop smoothly throughout the Unit? Are there any big leaps where another Problem is needed to help students understand a big idea in an Investigation? What adjustments did you make in these rough spots?

Problem-by-Problem Feedback

  • Are the mathematical goals of each Problem/Investigation clear to you?
  • Are the language and wording of each Problem understandable to students?
  • Are there any grammatical or mathematical errors in the Problems?
  • Are there any Problems that you think could be deleted?
  • Are there any Problems that need serious revision?

ACE Exercise Feedback

  • Does the format of the ACE exercises work for you and your students? Why or why not?
  • Which ACE exercises work well, which would you change, and why?
  • What needs to be added to or deleted from the ACE exercises? Is there enough practice for students? How do you supplement and why?
  • Are there sufficient ACE exercises that challenge your more interested and capable students? If not, what needs to be added and why?
  • Are there sufficient ACE exercises that are accessible and helpful to students who need more scaffolding for the mathematical ideas? If not, what needs to be added and why?

Mathematical Reflections and Looking Back/Ahead

  • Are these reflections useful to you and your students in identifying and making more explicit the "big" mathematical ideas in the Unit? If not, how could they be improved?

Assessment Material Feedback

  • Are the check-ups, quizzes, tests, and projects useful to you? If not, how can they be improved? What should be deleted and what should be added?
  • How do you use the assessment materials? Do you supplement the materials? If so, how and why?

Teacher Content Feedback

  • Is the teacher support useful to you? If not, what changes do you suggest and why?
  • Which parts of the teacher support help you and which do you ignore or seldom use?
  • What would be helpful to add to or expand in the teacher support?

Year-End Grade-Level Feedback

  • Are the mathematical concepts, skills, and processes appropriate for the grade level?
  • Is the grade-level placement of Units optimal for your school district? Why or why not?
  • Does the mathematics flow smoothly for the students over the year?
  • Once an idea is learned, is there sufficient reinforcement and use in succeeding Units?
  • Are connections made between Units within each grade level?
  • Does the grade-level sequence of Units seem appropriate? If not, what changes would you make and why?
  • Overall, what are the strengths and weaknesses in the Units for the year?

Final Big Question

  • What three to five things would you have us seriously improve, change, or drop at each grade level?


CMP development followed the rigorous design, field-test, and evaluation loop pictured in the diagram.
The Units for each grade level of CMP1 and CMP2 went through at least three cycles of field trials, data feedback, and revision; when needed, a Unit went through a fourth round of field trials. This process of (1) commissioning reviews from experts, (2) running field trials with feedback loops for the materials, (3) conducting key classroom observations by CMP staff, and (4) monitoring student performance on state and local tests at trial schools constitutes research-based curriculum development. It takes five years to produce the final drafts of the Units that are sent to the publisher, and another 18 months of editing, design, and layout to produce the published Units. The result is a set of materials that are cohesive and effectively sequenced.
The Units for each grade level at CMP1 and CMP2 went through at least three cycles of field trials–data feedback–revision. If needed, Units had four rounds of field trials. This process of (1) commissioning reviews from experts, (2) using the field trials with feedback loops for the materials, (3) conducting key classroom observations by the CMP staff, and (4) monitoring student performance on state and local tests by trial schools comprises research-based development of curriculum. This process takes five years to produce the final drafts of Units that are sent to the publisher. Another 18 months is needed for editing, design, and layout for the published Units. This process produces materials that are cohesive and effectively sequenced.