Does new SAT help with admissions decisions?
The College Board releases a positive report on the writing section, but many schools are doing their own studies.
For the past three years, when high school students have hit the SAT prep books, that's included a tuneup for a writing section. For colleges trying to predict student performance, the new test has been: (a) helpful, (b) not helpful, (c) both of the above, or (d) don't know.
So far, many colleges are answering "d."
The College Board, which administers the SAT, is offering another answer this summer in a much-anticipated study. The report shows the writing section to be more predictive of a student's first-year grades in college than the math or critical reading sections. The board also noted in its study that combining the entire SAT with high school grades was an even stronger predictor of performance.
The board also touts the broader educational benefits of the writing segment. "Since the SAT added writing, high schools in this nation are focusing more on teaching writing," says Laurence Bunin, senior vice president for the SAT. "That's really important for students – for their readiness for college and success in college."
Many schools, though, have been conducting their own examinations of the writing section, rather than depending on the board's study. "For a lot of colleges, the jury is still out," says David Hawkins, director of public policy for the National Association for College Admission Counseling (NACAC) in Alexandria, Va.
Some observers criticize the writing test for not producing a larger boost in the overall value of the SAT as a predictor of performance. The new SAT is just a "longer and more expensive" version of the old one, notes Robert Schaeffer of the National Center for Fair and Open Testing (FairTest), an assessment reform group in Cambridge, Mass.
Some colleges' own studies give the writing test a thumbs up. The University of Texas at Austin recently decided to count it in admissions decisions, and high scores will even earn credit for a freshman English course.
The Massachusetts Institute of Technology and Harvard University both require students to submit their scores on the SAT or ACT (a test run by a nonprofit of the same name, which offers an optional writing section). Their studies of the writing test will continue into the fall. But preliminary results at Harvard show the SAT writing test to be a good predictor of students' performance at college, says dean of admissions William Fitzsimmons.
That's not surprising, Mr. Fitzsimmons says, because the writing test has similarities to the SAT's Subject Tests, which are generally optional and assess knowledge in many academic areas such as biology and French. Harvard requires three Subject Test scores from applicants and finds those to be among the top predictors of performance.
Studies at the University of California found similar results years ago, and the UC system switched to a heavy emphasis on high school grades and SAT Subject Tests in 2001. UC played down the regular SAT partly because it had a negative impact on minority and low-income students' admissions eligibility.
The College Board added the writing section largely in response to such research. But critics say old, flawed ideas about testing IQ and the capacity to learn remain embedded in the SAT, despite its changes and the retirement of its former name, the Scholastic Aptitude Test.
Particularly in low-performing high schools, the main SAT sends a more daunting message, Mr. Schaeffer says, because of its traditional association with aptitude. "There's a fear of appearing stupid.... Whereas with achievement tests, [a low score simply means] you haven't learned the material," he says.
A steady trickle of schools has gone the SAT-optional route. About 760 four-year schools have reduced their emphasis on the scores or made them optional, according to FairTest.
Some see playing down the SAT as part of their commitment to a fair playing field for low-income and minority students. For instance, Wake Forest University in Winston-Salem, N.C., and Smith College in Northampton, Mass., both cited the correlation of high SAT scores with high family income when announcing their recent decisions to no longer require standardized admissions test scores.
Fourteen years ago, Susquehanna University in Selinsgrove, Pa., decided to let applicants send in a graded non-fiction writing sample from a class in lieu of SAT or ACT scores. The school wanted to "recruit more students who perhaps did well academically but didn't necessarily excel in standardized tests," says admissions director Chris Markle. About 15 percent of applicants choose "The Write Option," he says, and "we've found that the grades and graduation rates of those who applied with SATs were nearly identical to those who applied without them."
While liberal arts colleges may be well equipped to assess each applicant that closely, "larger universities really need this standardized tool [of the SAT] to deal with thousands of applications," says Mr. Bunin of the College Board. "The SAT is a fair national benchmark."
Peter Salins, former provost of the State University of New York and a professor at the Stony Brook campus, agrees. He says that from 1997 to 2001, SUNY campuses that increased selectivity, in part by requiring higher SAT scores, found their graduation rates rising significantly, while similar campuses that didn't boost their SAT profile saw much smaller gains.
Schools don't have to exclude anyone based on lower SAT scores, Mr. Salins says, but they can still use those scores to identify students who might need more help in certain areas.
A new independent study out of the University of Georgia Terry College of Business shows the SAT writing section predicts more than just first-year college grades, at least at large public institutions. For every 100 additional points students scored on the 800-point writing test, first-year students gained 0.07 on a 4-point GPA scale; in English classes, they gained 0.18 on GPA; and they took 0.54 more credit hours (a full load is 12 to 15 hours). Researchers controlled for factors such as parents' income and level of education, a methodology that some observers say the College Board should have used in its study.