To make sure that their students are on track to meet the standards of the No Child Left Behind Act, schools and districts are increasingly using data about student performance to drive decisions about instructional improvement.
A recent article in the American Journal of Education describes how three large urban districts are using data to identify areas of needed improvement in math and English, to provide an early warning system for meeting state standards, and to help teachers and principals with instructional planning and with guiding instruction in the classroom.
Kerri A. Kerr of New Leaders for New Schools and her fellow researchers write that “many teachers and principals in these three districts felt that state assessment data were not ideal for analyzing student performance and driving instructional decisions.
“School staff reported that state assessment data are neither timely enough nor adequately aligned with daily instruction to be particularly useful, are limited in subject and content coverage and often in the grade levels tested, and have a significant time lag before results are released,” the researchers write.
Among the strategies used by the three districts were:
- Developing interim assessments and a system for analyzing and reporting data
- Offering professional development and/or technical assistance on how to interpret and use student test results
- Revamping school improvement planning processes
- Encouraging structured review of student work
- Using a classroom observation protocol, the Learning Walk, to assess the quality of classroom instruction.
Instructional improvement a priority
Researchers Kerr, Julie A. Marsh, Gina Schuyler Ikemoto, Hilary Darilek and Heather Barney focused on three urban districts, identified only by the pseudonyms Monroe, Jefferson and Roosevelt (no locations are given), with minority populations ranging from about 70% to 85%.
Two of the districts have approximately 30,000 students, 2,000 teachers and 50 schools. The third has about 80,000 students, 5,000 teachers and 100 schools. The districts were chosen in part because they had made district-wide instructional improvement a high priority.
One of the key questions for the study was: What strategies did districts employ to promote instructional improvement through data-based decision making?
Two of the districts made a bigger commitment than the third to the use of data. Monroe, the largest district, administered a comprehensive set of standards-aligned assessments in all grades and core subjects, linked to a sophisticated data management system. According to the researchers, the system was designed to provide an early warning on whether students were meeting state standards. The district’s interim assessments were administered at the beginning and middle of the school year, and also at the end of the year in subjects not covered by a state test.
The majority of principals and district staff interviewed reported using the data regularly for a variety of decisions, including identifying students, teachers and schools needing additional support and deciding how to design that support, the researchers say. More than two-thirds of principals believed the assessments were a good measure of student progress, and 81% found the data moderately to very useful for making decisions related to instruction.
Teachers were somewhat less enthusiastic, with 59% finding the assessment data moderately or very useful for guiding instruction in their classrooms. Many teachers said classroom-based assessments were more thorough and provided more timely information, or that the district assessments duplicated what they already knew.
School improvement planning
Jefferson, one of the two smaller districts, implemented a new, data-driven school improvement planning (SIP) process. On surveys, 62% of Jefferson teachers reported that SIP had influenced their teaching practice. In contrast, just over a third of teachers in the other two districts said SIP had influenced them. Jefferson teachers were also more likely than teachers in the other districts to identify SIP as a district-wide reform priority and a focus of professional development. Teachers and principals in the district did, however, express concern that the SIP process was unnecessarily labor intensive.
Teachers and principals in all three districts generally recognized and valued district efforts to help them with data analysis, but staff at all levels in Jefferson and Monroe reported more extensive and frequent use of data to identify areas of weakness and to guide decisions about instructional approaches. Some 79% of the teachers responding to surveys at Jefferson and 72% of respondents at Monroe said their principals helped them shape their teaching practices based on analysis of state or district assessments, compared with 56% of respondents at Roosevelt.
Data were also collected on the quality of instruction in all three districts through Learning Walks, a program of the Institute for Learning. The program provides protocols, tools (e.g., rubrics) and professional development for staff on conducting “walks” (brief, frequent visits to classrooms), as well as on recording observations, analyzing evidence and evaluating the quality of instruction.
Among their findings and recommendations:
• Strong commitment from the top is vital to create a data-driven culture.
• To gain staff acceptance, it is important to invest sufficient resources in the data initiative.
• An online data system can play a key role in giving staff timely access to data and in encouraging greater individual use.
• Credible data are vital. Doubts about the accuracy and validity of measures greatly affect individual buy-in.
• Teachers need to feel they have discretion to veer from the curriculum when the data suggest another approach. (Many teachers do not feel they have the knowledge to interpret and use data effectively.)
• Assigning individuals with strong data-analysis skills to “filter” data and make them more usable for school staff can make a big difference.
“Strategies to Promote Data Use for Instructional Improvement: Actions, Outcomes and Lessons from Three Urban Districts,” by Kerri A. Kerr et al., American Journal of Education, August 2006, Volume 112, Number 4, pp. 496-518.
Published in ERN, October 2006, Volume 19, Number 7