Call to Order and Roll Call
The meeting of the Education Assessment and Accountability Review Subcommittee was held on Tuesday, December 10, 2013, at 1:00 PM, in Room 129 of the Capitol Annex. Representative Rita Smart, Co-Chair, called the meeting to order, and the secretary called the roll.
Present were:
Members: Senator Mike Wilson, Co-Chair; Representative Rita Smart, Co-Chair; Senator Gerald A. Neal; Representatives Tim Couch, Joni L. Jenkins, and Mary Lou Marzian.
Legislative Guests: Representatives Derrick Graham and James Kay.
Guests: Marty White, Kentucky Association of School Superintendents; and Clyde Caudill, Kentucky Association of School Administrators and Jefferson County Public Schools.
LRC Staff: Janet Stevens, Ben Boggs, and Lisa Moore.
Approval of the Minutes, November 18, 2013, Meeting
Representative Marzian moved to approve the minutes, and Representative Jenkins seconded the motion. The minutes were approved by voice vote.
Office of Education Accountability Study Report: Test Familiarity and Performance: Comparing Scores on Kentucky Core Content Tests and Unfamiliar Tests
Deborah Nelson, Research Analyst, Office of Education Accountability (OEA), said state test scores are often assumed to reflect student mastery of state curriculum standards in tested subjects. However, state tests assess some standards only partially and other standards not at all. Untested aspects of standards, such as the ability to read accurately and expressively, or the ability to write a research report, are important to ensure mastery and prepare students for future education and adult life. The gap between what can be tested and what should be learned raises questions about the valid interpretation and use of test scores.
Dr. Nelson said that, when state tests are used to hold schools accountable and become familiar over time, educators may focus more on frequently tested content and less on content that is seldom tested or not tested at all. In such a situation, gains observed on state tests may differ from actual gains in student mastery of content standards. If the teaching of tested versus untested content varies among schools, the comparability of test data can be reduced.
Test scores and site visit data were used to examine questions about the interpretation and use of state test scores. The OEA study compared schools’ reading and mathematics scores on the Kentucky Core Content Test (KCCT) to schools’ scores on unfamiliar reading and mathematics tests. The KCCT was the primary measure used to hold schools accountable for student learning from 1999 to 2011. Changes in Kentucky students’ reading and mathematics KCCT scores were also compared to changes in their reading and mathematics scores on the National Assessment of Educational Progress (NAEP).
Dr. Nelson said comparisons of KCCT results with other standardized test results did not reveal serious overall concerns about the interpretation of KCCT reading and mathematics scores. Rank changes of Kentucky elementary and middle schools were similar whether comparing KCCT scores from one year to the next or comparing KCCT scores to scores on an unfamiliar test; about half of schools stayed within 10 percentile points of their original rank, and most of the rest stayed within 30 percentile points. Changes in KCCT and NAEP scores did not differ dramatically. Between 2007 and 2011, Kentucky students actually made greater gains on the unfamiliar NAEP reading tests than on the KCCT reading tests.
Dr. Nelson said that, relative to their peers, students in some schools performed worse on unfamiliar tests than on the KCCT. The percentage of schools that dropped by more than 30 percentile points doubled when students took unfamiliar tests, amounting to seven percent or more of all schools. Higher-poverty schools were more likely than other schools to lose rank when students took unfamiliar tests, especially higher-poverty schools that were high performing on the KCCT. Between 2007 and 2011, students’ gains on the eighth grade KCCT mathematics test were more than two-and-one-half times as great as their gains on the NAEP eighth grade mathematics test.
Dr. Nelson said site visit data collected in eight elementary schools suggest that, while local assessment of students on state-tested content and in state-tested formats is common, local assessment varies for content not assessed on state tests. Because current efforts at improving education focus largely on the use of data, it is important that data reflect the state’s curriculum standards. If the data used to make instructional decisions are taken primarily from tests in state test formats, untested aspects of state standards may be overlooked.
Dr. Nelson said site visit data also provided preliminary evidence of another pattern: high state test scores of schools that focus heavily on tested content and formats may not always be entirely reflected on unfamiliar tests of similar content. Practices in higher-performing schools are often adopted by other schools. It is important that these practices be shown to improve student learning of all state standards and not just those that are tested.
Dr. Nelson said the study made three recommendations, which can be found in the study report provided in the meeting folder and available in the Legislative Research Commission (LRC) library.
Responding to a question from Senator Neal regarding inconsistent test scores among minority students, Dr. Nelson said changes in standard scores associated with the percentage of students in a school who were minorities did not follow consistent patterns. The standard scores of schools with the highest percentages of minority students increased, on average, on the Kentucky Performance Rating for Educational Progress (K-PREP) reading and mathematics tests and did not change substantially from the KCCT to the Iowa Test of Basic Skills (ITBS) mathematics tests. However, the drop in standard scores from the KCCT reading test in 2010 to the ITBS reading test was substantial in schools with high percentages of minority students. She concluded that standard scores of schools serving high percentages of minority students dropped substantially when students took an unfamiliar reading test consisting only of multiple-choice questions.
Responding to Senator Neal regarding the foundational skills of struggling students, Dr. Nelson said different schools prepare their students in very different manners. Some schools spend time preparing struggling students to answer questions on statewide assessments, while others teach content that they feel the student will understand and carry over to the next grade level. She said these are two very different approaches, but this particular OEA study did not collect enough data to contrast the two. It does raise a concern, however, that one approach may be more successful at raising KCCT scores while the other may be more successful when students take an unfamiliar test.
Responding to Representative Kay regarding examples of unfamiliar tests, Dr. Nelson said the unfamiliar tests used for the study were the K-PREP, ITBS, and NAEP tests. “Unfamiliar test” means that students and teachers have not seen the assessment, and teachers cannot teach to the questions.
Responding to Representative Kay’s question about students taking too many assessments, Dr. Nelson said this topic has received considerable attention in the media but was not examined in this study.
Senator Wilson moved to accept the OEA study report and Representative Marzian seconded the motion. The report was accepted by voice vote.
Representative Smart reminded members that the Kentucky Department of Education’s (KDE) response to the study is in members’ folders. Dr. Nelson added that most schools had indicated instruction time was reduced for science and social studies, as compared to the assessed subjects of reading and mathematics.
With no further business before the subcommittee, the meeting adjourned at 11:10 AM.