Local school officials have already parsed, and begun acting on, state test results for 2011-12, but the State Education Department only recently released its school report cards for that year to the public.
The reports offer a wealth of information about everything from enrollment numbers to students’ economic or racial background, to the per-pupil cost of education, to graduation rates and teacher certifications in each district. They can be seen at reportcards.nysed.gov.
Third- through eighth-graders take state tests annually in English language arts and mathematics, and fourth- and eighth-graders take them in science as well. They are scored on a scale of 1 to 4: 1 indicates below-standard performance; 2, a partial understanding of the subject matter; 3, proficiency; and 4, a performance that exceeds proficiency.
The state breaks results into the percentages of students scoring at levels 2 through 4, 3 through 4, and at level 4 alone, and compares them with the same school's results from the previous year and with results at other schools statewide.
For example, 96 percent of Sag Harbor’s third-graders scored a 2 or higher in 2011-12, compared to 76 percent in 2010-11; 77 percent scored between 3 and 4, compared to 67 percent in the previous year; and 16 percent scored a 4, compared to 5 percent in the previous year.
Statewide, the 2011-12 averages were 86 percent at 2 or higher, 56 percent at 3 or 4, and 7 percent at 4.
Students at public schools from Eastport-South Manor to Montauk tend to perform on par with or above their peers statewide on the tests, which are only one measure of student progress in each grade.
“These report cards are for the 2011-12 school year, which is two years back,” said Robert Tymann, the East Hampton School District’s assistant superintendent. “The information that the public now has is information we’ve been using all year to improve instruction.”
School districts received the 2011-12 state test results last fall and have already administered state tests for the 2012-13 school year. The State Education Department has been revising the tests each year to bring them more into line with new Common Core standards, which makes it difficult to use the assessments to make year-to-year comparisons.
Local administrators said the 2012-13 tests just administered in April, which follow the Common Core standards, were especially tough and are expected to result in lower scores statewide. “The state is predicting as much as 35 percent lower,” said Amagansett School Superintendent Eleanor Tritt.
“The scores will be dipping for everybody,” agreed Dr. Tymann. “What really matters is not, did you go from a 92-percent level 3 in a state test to an 83-percent level 3 in a state test,” he said. “That drop doesn’t matter as much as how much did everybody drop, and did we drop as much or less than everyone else?”
The reverse is true as well, according to the assistant superintendent. “Look at, let’s say, our science scores—our science scores are always high,” he said. But so are almost everybody’s, he added, so “you can’t say you’re doing fantastic.” Instead, you have to ask, “Is it a rigorous test, or is it an easy test?”
In Amagansett, the percentage of level 4 students went up in 2011-12, and the average score on each test from grades three to six (where the school ends) was above the averages both state- and countywide—“which we were pleased to see,” Ms. Tritt said.
She pointed out, however, that at a school as small as Amagansett’s, where enrollment stood at 101 in 2011-12, according to the state report card, even one child’s score can make a big difference when it comes to percentages.
“If it’s a class of 50 or 60, four kids can be 9 or 8 percent,” pointed out Springs School Principal Eric Casale. “I think … that’s why our curriculum is so much geared toward providing individual needs, and not a one-size-fits-all,” he said. “We do a lot of small-group instruction here.”
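The arithmetic behind that sensitivity is simple to sketch. The short example below is illustrative only; the cohort sizes are hypothetical and not drawn from any district's report card.

```python
def pct(count, total):
    """Share of a cohort, as a percentage rounded to one decimal place."""
    return round(100 * count / total, 1)

# How many percentage points a single student's score is worth
# in cohorts of different (hypothetical) sizes:
for total in (15, 50, 300):
    print(f"cohort of {total}: one student = {pct(1, total)} points")
# In a cohort of 15, one student swings the figure by 6.7 points;
# in a cohort of 300, by only 0.3.
```

At a school the size of Amagansett's, a handful of students moving between levels 2 and 3 can shift the reported proficiency rate by several points in either direction.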
To protect children’s privacy, the state suppresses test scores for student groups of fewer than five, a policy that often affects small schools such as Sagaponack, Wainscott and Bridgehampton.
School administrators can explore how their students performed on certain types of questions, allowing teachers to focus instruction on particular skills. Exploring the test results helped the Amagansett School hone its goals for this year—to focus on “math computation fluency and also vocabulary at all three levels,” according to Ms. Tritt.
“Once we look at the scores, we can go in and see how we did on each question,” Dr. Tymann agreed. “Yes, there are some areas we need to work on, as every district would say.” For example, he said, East Hampton has focused this year on teaching students how to read informational texts.
The state tests also break results down by student groups, which helps school officials look for gaps in achievement.
“We look at what the state calls subgroup reports,” said Nicholas J. Dyno, assistant superintendent for instruction at the Southampton School District. The state breaks down student results by gender, race, special- versus general-education status, disabilities, English proficiency, and economic disadvantage, for example.
According to Dr. Dyno, a recent statement from the State Education Department commissioner, Dr. John B. King Jr., acknowledged “the achievement gap being alive and well in New York State.”
Like many other districts, Southampton has seen some gaps and is working to address them. “I think what we’re learning is that we need to work harder to make sure that our ESL students are performing as well as our native language speakers,” Dr. Dyno said, speaking for schools across the state. He went on to add other groups—special-education students, African-American and Native American students, students who live in poverty—pointing out that many students fit into more than one of the subgroups the state identifies. “And then you have to look at how long a student’s been in a district,” he pointed out.
It can be hard to gauge improvement in closing achievement gaps, just as it is with student improvement by individual or by grade level, simply because the state tests have been changing so much.
“It’s not apples to apples,” Dr. Dyno said. “Different changes along the way have made it difficult to study.”
Despite what Dr. Dyno called growing pains associated with the new standards, he joined other administrators in saying he thought they would ultimately prove good for students. Increasingly, the state tests measure progress toward meeting the new standards, but Mr. Casale was hardly alone in pointing out that so do in-school tests and plain old teacher observation.
“We collect informal data every day,” the principal said. “Our goal here is to make sure every child is proficient, and we strive for that every single day.”