CCSSI Mathematics posted a scathing look at the items released by the Smarter Balanced Assessment Consortium (SBAC). While the rest of the internet seems to be obsessed with Georgia leaving the Partnership for Assessment of Readiness for College and Careers (PARCC), the real concern should be the quality of these test items.
Although CCSSI also diligently points out questions that are not well aligned to the standards, this is the least of my worries. Adjusting the difficulty of items and improving alignment are things that testing companies know how to do and deal with all the time. Computerized testing is the new ground, and a big part of why states are, rightfully, excited about the consortium.
The problem with the SBAC items is that they represent the worst of computerized assessment. Rather than demonstrating more authentic and complex tasks, they present convoluted scenarios and even more convoluted input methods. Rather than presenting multimedia in a way that is authentic to the tasks, we see heavy language describing how to input what amounts to multiple-choice or fill-in-the-blank answers. What I see here is not worth the investment in time and equipment that states are being asked to make, and it is hardly a "next generation" set of items that will allow us to attain more accurate measures of achievement.
SBAC looks poised to set up students to fail because of the machinations of test taking. This is not only tragic at face value, but all but assures an increase in test prep, since the items are less authentic.