Thomas More Wrote:
-------------------------------------------------------
Unless Sarah Lawrence is your universe of colleges/universities, the fact is that colleges have NOT dropped standardized tests, and most haven't dropped the SAT. Only a few highly selective colleges have gone "test optional," and even among those, there are many "shades of gray."
FairTest's list of all SAT-optional colleges/universities, even by their own extremely broad criteria, is missing a few colleges/universities that are, by most people's criteria, "selective."
http://fairtest.org/university/optional
The missing include:
Yale, Stanford, Harvard, Princeton, MIT, CalTech, Amherst, Carleton, Swarthmore, ... (Even Oberlin and Reed are amongst the missing...)
Here's a quote from Bob Schaeffer, spokesman for FairTest (the loudest voice for dropping the SAT):
http://voices.washingtonpost.com/college-inc/2010/06/update_test-optional_list_long.html
"Test-optional" has become a term-of-art referring to a range of practices. More precisely, we refer to our list as including schools which have deemphasized their use of ACT and SAT in admissions decisions. It includes colleges and universities which ignore ACT/SAT scores even when they are submitted, those that allow all students to choose whether their scores will be considered, those that extend this option only to applicants who meet other criteria (usually minimum class rank or GPA requirements; and those who allow other types of standardized exams (AP/IB/Subject Tests/Local Exams/Placement Tests)to substitue [sic] for the ACT/SAT. As the Chronicle of Higher Education described it, there lots of "shades of gray" as the movement away from exclusive reliance on the ACT/SAT expands and matures.
Note "range of practices," "deemphasized," "ignor even if submitted," "exclusive reliance"...
Also, consider what happened when FairTest added Colorado College to their list of "optionals." Colorado College's actual statement said:
A publicist said the new rules *do not mean* Colorado College is going "test-optional."
http://voices.washingtonpost.com/college-inc/2010/06/colorado_college_will_accept_a.html#more
Colorado College's actual policy requires tests of some sort that look a lot like the SAT:
"The three picks *must include* at least one math test and one verbal or writing test."
In other words, FairTest's list of ALL colleges with any "optionality" (http://fairtest.org/university/optional) simply papers over the fact that most of the colleges on it either use some equivalent test or still use the SAT/ACT in some way in their admissions process.
And consider a different fact. You appear to claim colleges are switching to the ACT because it is a useful measure of "actual knowledge":
"switch to the ACT which test
> the actual knowledge acquired in subjects in high
> school instead of attempting to measure abstract
> reasoning as the SAT claims it does. "
OK. Consider a related fact: ACT and SAT scores are *very* highly correlated. Sufficiently correlated that both the College Board and ACT publish a "concordance" showing how SAT and ACT scores can be converted (http://www.act.org/aap/concordance/index.html). ACT goes on to note that the tests are different (curriculum-based vs. "abstract reasoning"): "From a methodological standpoint, it is preferable to interpret and to use ACT and SAT scores separately. However, many institutions cannot develop and maintain separate systems; they must instead find a comparable score using concordance tables."
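To make the concordance idea concrete, here's a rough Python sketch of how a concordance lookup works: map a score on one test to the comparable score on the other, interpolating between table entries. The table values below are placeholders for illustration, NOT the official ACT/College Board concordance numbers.

```python
# Illustrative ACT-composite-to-SAT-total concordance table.
# Values are made up for illustration; the real table is published
# jointly by ACT and the College Board.
ACT_TO_SAT = {21: 990, 24: 1110, 27: 1220, 30: 1340, 33: 1460, 36: 1600}

def act_to_sat(act_composite):
    """Return the concorded SAT total for an ACT composite score,
    interpolating linearly between table entries and clamping at
    the ends of the table."""
    pts = sorted(ACT_TO_SAT.items())
    if act_composite <= pts[0][0]:
        return pts[0][1]
    if act_composite >= pts[-1][0]:
        return pts[-1][1]
    for (a0, s0), (a1, s1) in zip(pts, pts[1:]):
        if a0 <= act_composite <= a1:
            frac = (act_composite - a0) / (a1 - a0)
            return round(s0 + frac * (s1 - s0))

print(act_to_sat(30))  # exact table entry -> 1340
print(act_to_sat(25))  # interpolated between 24 and 27
```

An admissions office that receives only an ACT score would run it through a lookup like this to compare the applicant against SAT submitters on one scale.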
Or this from UT Austin's Admissions Office on the concordance of ACT & SAT scores (needed because UT Austin doesn't require both, and statistically valid because UT Austin gets so many applicants with one or both scores):
http://www.utexas.edu/student/admissions/research/ACT-SATconcordance.html
"As expected, there are high correlations between comparable ACT and SAT tests. (Composite/Total = .87, English/Verbal = .76, Reading/Verbal = .77, and Math/Quantitative = .86) As stated earlier the English+Reading/Verbal was .82. These correlations are consistent with earlier, national concordance studies. (See Table 4 below.) By contrast, correlations with class rank are lower because of vast differences in the ways the measures are designed and computed. Table 4 presents results from a national concordance study conducted jointly by ACT and ETS in which the population numbered 103,525."
Note, also "class rank" correlation is significantly lower, but similar (between .3 & .4).
On concordance tables, UT Austin notes that individual results will vary somewhat between the ACT and SAT, so a particular score on one does not guarantee the corresponding score on the other. However, they go on to state:
"ACT researchers also measured the strength of the relationship between the tests by computing the consistency of admissions decisions that would be made using ACT or SAT alone. (UT-Austin has never, and will never, use test scores alone to make admissions decisions; such a computation is meant only to illustrate the degree of agreement between the two tests.) The Consistency Rate is defined as the percentage of students for whom similar decisions would be made, given a specific cutoff score. The minimum consistency rates for the two tests were .81 for ACT English/SAT I Verbal, and .82 for the ACT Mathematics/SAT I Quantitative. For the ACT Composite/SAT Total combination, the minimum consistency rate was .84. This implies that, if test scores were used alone, at least 84% of admissions decisions would be the same for the ACT as for the SAT I. This compares to a consistency rate of approximately .89 for two equated and exchangeable versions of the ACT Composite score. Table 9 gives consistency rates for some of the values in the score distribution for the English, Mathematics, and Composite scores."
In other words, even ACT makes it clear that actual admissions decisions would not vary much between concordant SAT and ACT scores.
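The "consistency rate" ACT describes can be sketched in a few lines of Python: for each student with both scores, check whether an admit/deny decision based on an ACT cutoff agrees with the decision based on the corresponding SAT cutoff, and report the fraction that agree. The student scores and cutoffs below are made up for illustration.

```python
def consistency_rate(pairs, act_cutoff, sat_cutoff):
    """Fraction of students for whom a cutoff-based decision on the
    ACT agrees with the decision on the SAT.

    pairs: list of (act_score, sat_score) for the same students.
    """
    agree = sum(
        (act >= act_cutoff) == (sat >= sat_cutoff)
        for act, sat in pairs
    )
    return agree / len(pairs)

# Hypothetical students with both scores; cutoffs chosen arbitrarily.
students = [(30, 1350), (24, 1100), (28, 1200), (21, 1020), (33, 1380)]
rate = consistency_rate(students, act_cutoff=27, sat_cutoff=1300)
print(rate)  # 4 of 5 decisions agree -> 0.8
```

ACT's reported minimum consistency rate of .84 means that, computed this way over their national sample at any cutoff, at least 84% of decisions would agree, versus about .89 for two equated forms of the ACT itself.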