Who's testing who?


The results of state-mandated testing should have arrived within two weeks after the last student pressed send.

That was more than four months ago, and apart from a few concise emails, local teachers and administrators haven’t heard a word. 

Each year, students in grades three through eight and high school juniors are required to complete Smarter Balanced Assessment Consortium (SBAC) tests. The tests were administered in March and April by assessment company Measured Progress — the first year computer-based testing was used. However, an array of glitches and delays with testing in three states, including Montana, prompted some districts to opt out of the mandatory testing.

Federal law requires 95 percent of students to complete the SBAC tests; in Montana, however, only 82 percent were able to do so — 62,000 of 76,000 test-takers. Two Sweet Grass County students experienced testing glitches and were kicked out of the program, but were able to re-enter and eventually complete their testing.

The testing cost taxpayers $1.33 million, according to Emilie Ritter Saunders, the communications director for the Montana Office of Public Instruction (OPI). Saunders said OPI will, however, withhold monthly payments to the vendor until results are received. Per student, the new SBAC costs the state $27, versus $32 per student for last year’s exam.

On Sept. 4, the Associated Press reported the state will receive a $375,000 credit from Measured Progress to compensate for delayed results — less than 30 percent of the state’s total cost. 

Local school superintendents Mark Ketcham of Big Timber Grade School and Al Buerkle of Sweet Grass County High School expect results might appear in November — with an emphasis on “might.”

“During Montana’s 2014 field test, we were able to use American Institutes for Research proprietary software, which worked flawlessly. However, the state couldn’t afford AIR’s multi-million dollar price tag ... so, we contracted with Measured Progress and Smarter Balanced who in turn contracted with AIR to use an open-source version of its proprietary software. That’s what created the technical challenges — the conversion of data from AIR’s open-source system into Measured Progress’ system,” according to an OPI information sheet, dated Sept. 11, 2015.   

— Why results matter —

The results of state testing help educators and administrators gauge how their students are doing compared to other schools, and help them identify which standards they are excelling in and which ones need improvement. 

But both Buerkle and Ketcham noted that the SBAC isn’t the only assessment tool they have. 

“What our approach has been, and I’ve said this to our staff, is we teach like hell and whatever happens on the test is going to happen,” Buerkle said. “You should teach to a test — here’s my problem, I never felt that the CRTs were a good test to teach to. The state themselves would admit to you … the CRTs didn’t cover all those (state) standards. They were very strong in some areas in math and in other areas, they were totally silent.”

Buerkle was referring to the SBAC’s predecessor — the Criterion-Referenced Test (CRT). That test was revamped with the statewide adoption of Common Core, resulting in the SBAC. Instead of teaching to what he viewed as an inferior assessment, Buerkle’s staff focused on learner outcomes — a set of objectives for each course compiled by staff and revised every few years. However, Buerkle said the school does pay attention to state standards and seeks to rise above them, rather than teach to a substandard test. Whether the SBAC falls into that category remains to be seen.

“We do pay attention to the standards, but I don’t know the SBAC well enough to say whether it’s a good test to teach to — we’ll find out,” Buerkle said. 

First and foremost, they need the results. 

In the meantime, Buerkle will focus on the results of the college entrance test, the ACT, which is taken at the junior level. Buerkle also administers grade-level-appropriate variations of the ACT to freshmen and sophomores. That test, unfortunately, has also recently undergone an overhaul, so while those results are more timely, he doesn’t have years of data to compare them against. Comparing them to last year’s results would be akin to comparing apples to oranges.

“We were told we would have them by the end of the school year — May sometime … and they’re telling us we might get them in November and the word is ‘might,’ there is no guarantee. We might get results in November,” Buerkle said, of the SBAC results. “We’re not sure if it was because of the technology at the time of testing … or if it’s now that the data’s in, they don’t know quite how to handle the data.”

Buerkle also suggested that perhaps the results are so poor or scattered that Measured Progress doesn’t know how to handle them.

“This is just me guessing the scenario — back in my science teacher days … If I gave a test and everybody got a 30 on it, I had to go, I didn’t write a very good test,” Buerkle said. 

By Mackenzie Reiss / Pioneer Staff Writer

For more information on how area schools plan to deal with the lack of testing results and other considerations moving forward, pick up a copy of the Oct. 8 edition of the Pioneer or subscribe to our e-edition. Current subscribers are provided access to the e-edition at no additional charge.
