Tuesday, March 11, 2014

Does the SAT meet standard?


Educational testing generates much hand-wringing.  Most often, it’s over the effectiveness, usefulness, or fairness of standardized testing of K-12 students.  More recently, the College Board stirred the cauldron of frustration and discontent by announcing major changes to its Scholastic Aptitude Test, or SAT--the traditional bellwether of a student’s college preparedness and anchor of the entrance application.

David Coleman, president of the College Board, announced changes that are supposed to make the test both a more realistic and accurate evaluation of what a student has already learned and a better predictor of how they’ll do in college.

On the reading section, for example, it will no longer be enough to answer a comprehension question correctly.  The test-taker must also select the passage from the text that supports the answer.  This evidence-based reading (and writing) is supposed to demonstrate deeper understanding.

Of course, it’s not at all clear that it will, because every test design has strengths and weaknesses.  In short, there are angles to take, games to play, and so on.  Each spring, I show my 8th graders how to answer multiple-choice Measurement of Student Progress (MSP) reading comprehension questions without even reading the passage.  They get about 5 out of 8 right, where random guessing would yield roughly 2 correct.
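For the curious, here is the back-of-the-envelope arithmetic behind that guessing baseline, sketched in a few lines of Python and assuming four answer choices per question (the exact MSP format isn’t spelled out here):

    # Expected score from random guessing on a multiple-choice set,
    # assuming (hypothetically) 8 questions with 4 answer choices each.
    questions = 8
    choices = 4

    expected_by_guessing = questions * (1 / choices)  # 8 * 0.25 = 2.0 correct
    observed_with_tricks = 5                          # roughly what the 8th graders score

    print(f"Expected correct by random guessing: {expected_by_guessing:.1f}")
    print(f"Typical score using test-taking tricks: {observed_with_tricks}")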

You might call this a “trick of the trade,” the test-taking trade.  The new SAT may shut down some of the tricks on the old SAT, but new angles will open up.  Clever and interested people will find them, we can be sure.  

It’s not hard to imagine, for instance, that if the new SAT questions demanding support from the text offer that support in multiple-choice form as well, the test could become an exercise in logic rather than reading: read the question, work out the answer from the most sensible “text support” options, and select.

But the problem with the SAT runs even deeper.  Coleman was purportedly one of the principal authors of the new Common Core State Standards (CCSS), and he has professed his intention to align the SAT with the CCSS.

Sounds sensible.  But it remains unclear why an SAT prep course that teaches tricks is any less appropriate than thirteen years of schooling intentionally designed to match the test.  After all, the SAT may come to measure simply the skills the CCSS emphasizes.

School lessons and the test, in other words, might become something of a closed loop, whereas before the pervasive use of test prep courses, the SAT was supposed to be a snapshot of aptitude, somewhat separate and different from school evaluations.

Aligning the SAT with the CCSS might thus ultimately render it less meaningful, not more, by making it just another round of the standardized tests already administered in high school.  Indeed, if the SAT is going to match the standards, why wouldn’t the standardized test results from high school suffice?

In any case, that discussion may be so much intellectual frippery.  A far greater concern is whether both the CCSS and SAT, especially in their intertwining, actually teach and test things that we value.  The jury is still out on this, but the evidence is mixed, at best.

The so-called “deep reading” that many laud in the CCSS, and that Coleman seems intent on testing in the SAT, may not make or prove students ready for college any more than the prior standards did.

An informal survey of a few college faculty turned up surprise at the idea of extensive re-readings of texts (as posited by the CCSS).  Social science and humanities professors (except perhaps those who specifically employ the scholarly technique of textual analysis) tend to prefer that students read widely in a topic to discern the breadth and depth of the “discussion” going on, then find their own way into that discussion with their own analytical insights.  Testing aptitude and preparation for this would be lengthy and costly (and is why high school grades do a better job of predicting college success).

So in the current educational culture we get the CCSS and the new SAT, and find the effort good.  After all, it’s not for lack of effort--as The News Tribune editorialized recently--that Tacoma’s schools aren’t getting better.  Too bad the standards don’t allow credit for effort.
