It probably doesn't compute because none of those bullet points accurately reflects what I've said. EDIT: Here's what I'm saying, neatly bulletized for your pleasure.

- The SAT measures many things, including core knowledge, reading comprehension, overall intelligence, and test-taking ability. Sorting out the impact of each of these factors on any individual student's score is difficult, if not impossible. The fact that simply learning test-taking strategies can improve a student's score on a section by 100 points or more (equal to the standard deviation!) suggests to me that this may be the single most important factor. One thing the SAT does NOT measure all that well, at least according to several reputable and impartial sources, is the likelihood of college success.

- I have no idea how much more students prepare for the SAT now compared to 30 years ago, and I never made a claim about this statistic one way or the other. It's clear, however, that a huge industry has grown up around the SAT and standardized testing, and this industry has a vested interest in maintaining or increasing the importance of these tests, as well as in making students want to improve their own performance on them.

- Check the graph I posted. Math scores have actually been HIGHER this decade than in the previous two, while critical reading scores have gone down. (Why isn't the College Board trumpeting this improvement? See point 2 above.)

- The SAT was never intended to measure knowledge. (At least not the SAT I; the subject-based SAT IIs are another matter.) Furthermore, since scoring procedures, test-taking demographics, normalization, and even the questions themselves all differ from year to year, drawing significant conclusions from a time series of scores is shaky at best.
When are we all going to meet up at the park so I can test your ankle-breaking crossover on the court?