
“David Coleman and New York Times Tied Together”

By Donna Garner 3.6.14

The March 6, 2014 article in the New York Times (URL posted at the bottom of this page) shows how David Coleman and the liberal media are tied together to try to fool the American public.  David Coleman is the president of The College Board and is “coincidentally” the lead writer of the Common Core English Standards.

I do not want to bore people by repeating what I wrote yesterday. Here is that article: “David Coleman, 2016 SAT: A Sow’s Ear”

 

ADDITIONAL COMMENTS BASED UPON THE NEW YORK TIMES ARTICLE FROM 3.6.14

 

We know what David Coleman’s intent is, and it is the same intent as found in the Common Core Standards – to indoctrinate America’s school children into the Obama administration’s social justice agenda.

 

The Common Core Standards (CCS) are certainly not going to raise students’ academic abilities and, therefore, will not prepare students to take the 2016 SAT. 

 

For one thing, the CCS are not built upon empirical reading research. Louisa Moats and Marilyn J. Adams started out helping to write the CCS because they hoped to make sure phonemic awareness/decoding skills, systematic reading instruction, a cognitive progression of writing skills, and other important skills were placed into the CCS.  That did not happen.  (1.22.14 – Huffington Post – http://www.huffingtonpost.com/mark-bertin-md/when-will-we-ever-learn_b_4588033.html)

 

Without those all-important skills, students will not be able to do well in school or on the 2016 SAT!  By neglecting the most important fundamentals of education, the CCS will set children up for failure.

 

The NYT/David Coleman story blows the essay issue out of proportion. The present SAT Reasoning Test (2005) has three parts: Math, Critical Reading, and Writing. The Writing section has two sub-scores; the essay counts for only 30%, while the 49 multiple-choice grammar/usage questions count for 70%.
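To make that 30/70 weighting concrete, here is a small arithmetic sketch. The actual equating The College Board uses to convert the two sub-scores into the 200-800 Writing score is proprietary, so the normalize-and-weight scheme below is purely hypothetical; only the 30/70 split and the published sub-score ranges (essay 2-12, multiple choice 20-80) come from the test design.

# A hypothetical normalize-and-weight scheme; The College Board's
# real equating is proprietary. Only the 30/70 split and the
# sub-score ranges are taken from the published test design.

def writing_score(essay_sub, mc_sub):
    """Illustrative combination of the two Writing sub-scores.

    essay_sub: essay sub-score, 2-12 (counts for 30%)
    mc_sub:    multiple-choice sub-score, 20-80 (counts for 70%)
    Returns a score on the familiar 200-800 scale.
    """
    essay_frac = (essay_sub - 2) / 10.0   # normalize essay to 0-1
    mc_frac = (mc_sub - 20) / 60.0        # normalize multiple choice to 0-1
    combined = 0.30 * essay_frac + 0.70 * mc_frac
    return int(round(200 + combined * 600))

# A mediocre essay (8 of 12) barely dents a strong grammar/usage
# performance (70 of 80): the multiple-choice questions dominate.
print(writing_score(8, 70))   # -> 658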

 

Any essay has to be scored subjectively, and the new 2016 SAT essay will be no different. No matter how well-defined a rubric is, the value system of the grader is still involved.  From all appearances, the new 2016 SAT will have even more open-ended, subjectively scored questions, which will decrease – not increase – the test’s credibility.

 

The essay on the present SAT has to be handwritten. The “secret” is that the purpose of the handwritten essay is to verify that students themselves can write a cogent paper.  College/university admissions officers have a password that allows them to view each student’s actual SAT essay (https://professionals.collegeboard.com/testing/sat-reasoning/scores/online-essay-viewing). The admissions officers can then compare a student’s admissions essay(s) with his verified SAT essay to make sure that the student wrote the admissions essay himself and did not hire somebody else to do it.  The 25-minute SAT essay therefore serves two purposes: It gives the student an opportunity to prove that he can write a cogent paper right on the spot, and it helps college/university admissions officers make sure students have the English proficiency to be successful in their courses if allowed to enroll.

 

The College Board produced good research on 6.18.08 showing how valid the new 2005 SAT version was.  The research showed that, in predicting students’ success in their first year of college, the Writing score was even more predictive than either Math or Critical Reading.  Since grammar/usage makes up 70% of the Writing score, it is logical to conclude that the strongest predictor of first-year college success rests largely on a student’s command of written English, yet the new 2016 SAT intends to remove that Writing section with its right-or-wrong grammar/usage questions and to make the essay optional.  What a loss for students and what a bad decision by The College Board to “dumb down” the 2016 SAT…

 

Below are the two validity studies done by The College Board and based upon the present 2005 SAT version:

 

Sources:

 

Validity of the SAT for Predicting First-Year College Grade Point Average (.pdf/550K) looks at the overall ability of the SAT to predict performance in the first year of college.

Differential Validity and Prediction of the SAT (.pdf/565K) evaluates if the SAT is fair and consistent across the key demographic variables of gender, race/ethnicity, and best language.

The final sample included 151,316 students attending 110 colleges and universities.

Results of the SAT Validity Studies

The College Board research studies analyzed the data submitted by the 110 colleges that participated in the College Board’s Admitted Class Evaluation Service (ACES). These colleges received their ACES study results in the fall and winter of 2007-08. Many other colleges and college systems, such as the University of California system, conducted their own studies. For both the University of California and the College Board studies, the results are similar. Writing is the most predictive section of the SAT, slightly more predictive than either math or critical reading. In the California study, SAT scores were slightly more predictive than high school grade point average (HSGPA). In the College Board analysis of the more than 150,000 students included in all 110 ACES studies, HSGPA was slightly more predictive than SAT scores.

Validity of the SAT for Predicting First-Year College Grade Point Average study

The main analytic method used for this study was the comparison of single and multiple correlations of predictors (SAT scores, HSGPA) with first-year college GPA (FYGPA). All correlations were corrected for range restriction.
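For readers unfamiliar with the correction mentioned above: validity studies correct for range restriction because admitted students are a narrower slice of the applicant pool, which depresses observed correlations. Whether the study used exactly the standard univariate (Thorndike Case II) formula is my assumption, and the numbers in the example are invented, but the sketch below shows the general idea.

import math

def correct_range_restriction(r, sd_applicants, sd_admitted):
    """Thorndike Case II correction for direct range restriction.

    r             -- correlation observed in the restricted (admitted) group
    sd_applicants -- predictor SD in the full applicant pool
    sd_admitted   -- predictor SD among admitted students
    """
    u = sd_applicants / sd_admitted
    return (r * u) / math.sqrt(1.0 + r * r * (u * u - 1.0))

# Invented numbers: an observed r of 0.35 in an admitted class whose
# SAT spread (SD 60) is much narrower than the applicant pool's (SD 100)
# corrects upward to about 0.53.
print(round(correct_range_restriction(0.35, 100.0, 60.0), 2))   # -> 0.53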

The results show that the SAT continues to be a very strong predictor of first-year college performance, and that the changes made to the SAT add to the test’s validity. Read a summary of the key findings. (.pdf/32K)

Differential Validity and Prediction of the SAT study

The purpose of this study was to assess the differential validity and differential prediction of the revised SAT for gender, racial/ethnic, and best language subgroups. Differential validity exists if the magnitude of the test-criterion correlation varies by subgroup. Differential prediction occurs when a test systematically over- or underpredicts the criterion (e.g., FYGPA) by subgroup. The results are similar to prior research indicating that changes to the SAT did not diminish the differential prediction and validity of the test, and the SAT continues to be a fair test for all students. Read a summary of the key findings. (.pdf/49K)
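To illustrate what “over- or underpredicts the criterion by subgroup” means in practice, here is a sketch using synthetic data (the study’s actual models and data are not reproduced here): fit one pooled prediction equation, then check whether any subgroup’s actual FYGPA sits systematically above or below what the equation predicts.

import numpy as np

# Synthetic data only; none of these numbers come from the study.
rng = np.random.default_rng(0)
n = 1000
sat = rng.normal(500, 100, n)            # hypothetical composite SAT scores
group = rng.integers(0, 2, n)            # two hypothetical subgroups
fygpa = 1.0 + 0.004 * sat + rng.normal(0, 0.4, n)

# One pooled regression: FYGPA ~ SAT
slope, intercept = np.polyfit(sat, fygpa, 1)
residuals = fygpa - (intercept + slope * sat)

# Differential prediction shows up as a nonzero mean residual for a
# subgroup: negative means the pooled equation overpredicts that
# group's grades, positive means it underpredicts them.
for g in (0, 1):
    print(g, round(float(residuals[group == g].mean()), 3))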

Implications of the studies

Both the College Board and the University of California studies indicate that writing is the most predictive section of the SAT. Colleges not requiring an admissions test with writing, therefore, are overlooking the most useful section of the test and one of the best predictors of college success to which they have access. Writing as a college-level skill is a crucial asset for student success, an important message reinforced by colleges that require admissions tests with a writing section.

=============================================================

 

http://professionals.collegeboard.com/profdownload/FYGPA_Validity_Summary_keyfindings.pdf

 

In March 2005, the College Board introduced a revised SAT, with an additional writing section and minor changes in content to the verbal and mathematics sections.

The results are similar to prior research indicating that changes to the SAT did not diminish the differential prediction and validity of the test, and the SAT continues to be a fair test for all students…

 

Among the three individual SAT sections, SAT writing has the highest correlation with FYGPA [first-year college grade point average]  (r = 0.51). The correlation is 0.48 for SAT critical reading and 0.47 for SAT mathematics. In fact, SAT writing alone has the same correlation with FYGPA as does SAT critical reading and SAT mathematics taken together…
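That last claim can be checked with a standard multiple-correlation calculation. The excerpt gives each section’s correlation with FYGPA but not the intercorrelation between critical reading and mathematics, so the 0.75 used below is my assumption; with it, the two sections together do come out to roughly R = 0.51, the same as SAT writing alone.

import numpy as np

r_cr, r_m = 0.48, 0.47     # section-FYGPA correlations reported above
r_cr_m = 0.75              # ASSUMED critical reading-math intercorrelation

r_xy = np.array([r_cr, r_m])
R_xx = np.array([[1.0, r_cr_m],
                 [r_cr_m, 1.0]])

# Multiple correlation of the two sections with FYGPA:
# R^2 = r_xy' * inverse(R_xx) * r_xy
R_squared = r_xy @ np.linalg.solve(R_xx, r_xy)
print(round(R_squared ** 0.5, 2))   # -> 0.51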

=======

An excellent SAT tutor and experienced English teacher wrote to me yesterday and stated:

 

Quote from NYT article:  “He [David Coleman] said he also wanted to make the test reflect more closely what students did in high school and, perhaps most important, rein in the intense coaching and tutoring on how to take the test that often gave affluent students an advantage.”

This is criminal!  It’s an outright attack on excellence and achievement! For many students, this is the only private tutoring they ever get, and in my state the public schools are so bad that these kids don’t learn the parts of speech, let alone how to think logically, analyze, and write coherent, well-organized essays. And notice: he’s complaining that some students did better on the test than others! He’s complaining about the free market system, about people who have flourished and want to give their children the best education they can, and about people who are not affluent, but make sacrifices to get top quality tutoring for their children. Instead, he wants a test so dumbed down that students don’t need any tutoring — they can pass it in their sleep. ‘Social justice’ seems to mean the lowest common denominator for everyone — EXCEPT the likes of Coleman, the billionaires, corrupt politicians, etc. They will no doubt get top quality education for their offspring. Shameful.

 

========

 

(3.6.14 – “The Story Behind the SAT Overhaul” – by Todd Balf – New York Times – http://www.nytimes.com/2014/03/09/magazine/the-story-behind-the-sat-overhaul.html?_r=1)
