One could say that Charles Murray has made a career of dancing on the "third rail" of social research, investigating the subjects of intelligence, the factors that influence it, and the role cognitive abilities play in our lives. It is not a topic for the academically timid.
His most famous work, The Bell Curve (1994), has been famously distorted by the politically correct as a study of race and IQ. In reality, Murray's book is largely an examination of measured intelligence and how it affects our society. Based on years of detailed research, Dr. Murray and his colleague (the late Richard J. Herrnstein) found that America is becoming increasingly stratified along the lines of intelligence. Under this market-driven system, the best schools attract the best students, who (in turn) acquire the skills and education needed to enter, and succeed in, the highest-paying professions. As a result, individuals with superior cognitive abilities tend to earn more money, and are far less likely to drop out of school, use drugs, or live in poverty than those at the lower end of the intelligence scale. Murray and Herrnstein found that IQ, not socioeconomic status, is the better predictor of who is most likely to fall victim to these outcomes.
The book's best-known (and most controversial) section deals with race and IQ. Based on a meta-analysis of then-available data, Murray and Herrnstein noted an average difference of one standard deviation between the IQ of an average white person (100 points) and a typical African-American (85 points). But Murray was careful to note that some of the pathologies that have long afflicted the African-American community are the result of factors other than intelligence, namely racism. More importantly, he found that differences in outcomes disappear when comparing individuals with the same cognitive ability. In other words, a black person and a white person with the same high IQ (say, 125) will often enjoy similar levels of education, wealth and success. That led Murray and Herrnstein to conclude that society ultimately differentiates along intelligence lines, not racial lines. It's also worth noting that virtually all of Murray's findings have been affirmed by other researchers, most notably Arthur Jensen of Berkeley.
Dr. Murray has long believed that this research should spark a new debate on how we educate our children and prepare them for life in an intelligence-stratified society. He's written a three-part series on that subject for The Wall Street Journal; Part I (which ran on Tuesday) is a pragmatic analysis entitled "Intelligence in the Classroom," which offers this sobering truth: "Half of all children are below average and teachers can only do so much for them." He views programs like "No Child Left Behind" as misguided, even counterproductive, because they assume that all students can be elevated to a certain level of ability in subjects like reading and math.
In Part II of his series (published today), Murray argues that far too many students are going to four-year colleges. Using IQ (again) as a predictor of success, Dr. Murray believes "it makes sense" for about 15-25% of the population to get a college education. Yet the latest enrollment figures indicate that about 45% of recent high school graduates enroll in a four-year college. Predictably, many of those students will drop out; others scrape by, earning degrees that require little intellectual rigor and impart skills with little value in the marketplace. Many of these students, according to Murray, would be better off in vocational training, which teaches skills that are in high demand, pay well, and "can't be outsourced to India." But many high school graduates eschew vocational training, which they view as a "second-class" education.
As a former classroom teacher, I can testify to the essential truths in the Murray series. At one point in my career, I even taught a vocational class that required students to investigate potential jobs and the education required to fill those positions. My school was located in a poor district in the rural South. Many of my students had limited cognitive abilities; more than a few had learning disabilities. Despite that, virtually all wanted to attend a four-year college and pursue careers that were (likely) beyond their reach. On the other hand, many had the ability to enter various trade occupations. But few students had any interest in becoming a plumber, electrician or stone mason, jobs which they wrongly viewed as "inferior."
Part III of Murray's series will be published tomorrow. While he deserves credit for tackling such tough subjects with honest, scientific inquiry, there is one element of the education/cognitive-ability debate that he has ignored: the exploding number of "adult" learners who fill programs that provide specialized college degrees, often on an "accelerated" schedule. Motivated by a desire to improve their lives, these students typically have much higher graduation rates than those entering college straight from high school.
But studies have shown that cognitive skills don't improve dramatically over the course of one's life. Why do so many of these adult students succeed, when Murray's intelligence "predictors" would suggest otherwise? Is it a case of underachievers finally getting with the program? The result of academic programs designed to ensure high graduation rates? Or simply an example of mature students seizing the opportunity to better themselves? A study of their graduation rates versus their cognitive ability levels would be an interesting adjunct to Murray's work.