Andrew Hacker published an op-ed piece in the New York Times titled "The Wrong Way to Teach Math." What follows is a rebuttal to his op-ed piece. I proceed essentially line-by-line through his article and point out the parts that are misleading, incorrect, or irrelevant. Hacker's words are in block quotes; my analysis follows.

Here's an apparent paradox: Most Americans have taken high school mathematics, including geometry and algebra, yet a national survey found that 82 percent of adults could not compute the cost of a carpet when told its dimensions and square-yard price. The Organization for Economic Cooperation and Development recently tested adults in 24 countries on basic "numeracy" skills. Typical questions involved odometer readings and produce sell-by tags. The United States ended an embarrassing 22nd, behind Estonia and Cyprus. We should be doing better. Is more mathematics the answer?

It is not clear where Hacker gets the figure of 82 percent. Presumably it comes from the OECD test mentioned in the next sentence, which he does not directly reference. The likely candidate is the PIAAC international report, but even there it is not clear how the 82 percent figure would be derived.

In that report (large pdf), the US ranks 22 out of 24 for numeracy. The adults tested are classified into various levels (level 5 at the top, and "below level 1" at the bottom). The question Hacker presents about carpets seems to me like it would fall into the Level 2 category. According to the PIAAC report, 28.7% of adults in the US scored below level 2, 61.34% below level 3, and 87.28% below level 5. It could be that this one particular question was somehow tricky for adults in the US, but the data is not readily available at that granularity.

Even so, the comparison that Hacker is trying to make is the wrong one from the beginning. If he wants to argue that courses in geometry and algebra are not necessary to answer that question, then he needs to compare the results of adults who successfully completed each of those courses with the results of those who did not. Hacker provides no evidence to support any claim that those with that education perform as poorly as those without.

With no links to the studies mentioned and no properly set up comparison, the 82 percent figure should be ignored.

Coming in 22nd out of 24 is indeed embarrassing, though it's strange to single out Estonia and Cyprus. There are, after all, 21 countries ahead of the US which could have been mentioned here.

The only statement in this paragraph that a reader should agree with is that the US should be doing better. Hacker follows this with a rhetorical question which he seems to want to answer by suggesting that what we need is not more mathematics, or even the same amount of mathematics, but in fact less mathematics.

In fact, what’s needed is a different kind of proficiency, one that is hardly taught at all. The Mathematical Association of America calls it “quantitative literacy.” I prefer the O.E.C.D.’s “numeracy,” suggesting an affinity with reading and writing.

That numeracy is hardly taught at all is completely false. Every mathematics course has learning outcomes that amount to demonstrating some degree of numeracy. All students take some mathematics, so every student is engaged in developing numeracy.

This is perhaps a minor point, but since Hacker brings it up, I will address it. He prefers the word "numeracy" because, to him, it better suggests an affinity with reading and writing. Proficiency in reading and writing is called literacy. The term "quantitative literacy" has the word literacy built into it. How this term fails to suggest a parallel with reading and writing is baffling.

Calculus and higher math have a place, of course, but it’s not in most people’s everyday lives. What citizens do need is to be comfortable reading graphs and charts and adept at calculating simple figures in their heads. Ours has become a quantitative century, and we must master its language. Decimals and ratios are now as crucial as nouns and verbs.

The juxtaposition of the first two sentences of this paragraph of Hacker's writing can only suggest that he believes these things do not happen in calculus and higher math courses (otherwise, he could simply omit the first sentence and go straight to discussing what citizens need). Courses in calculus and higher math (where "higher math" seems to be code for anything beyond basic statistics, basic algebra, geometry, trigonometry, and calculus) provide a setting in which graphs and charts are natural parts of the conversation, and which invites students to calculate simple figures without the aid of a calculator.

It sounds simple but it’s not easy. I teach these skills in an undergraduate class I call Numeracy 101, for which the only prerequisite is middle school arithmetic. Even so, students tell me they find the assignments as demanding as rational exponents and linear inequalities.

Queens College has no course in its catalog (Undergraduate Bulletin 2015-2016, the most recent edition available at the time of writing) called Numeracy 101, nor does the word "numeracy" appear anywhere in the catalog. The closest thing is a course offered by the Mathematics department called MATH 110: Mathematical Literacy -- An Introduction to College Mathematics. (Note the word "Literacy" in the title. Also note that the description of the course promises "[e]xtensive use" of "sophisticated graphing calculators.")

I will note that the MATH 110 course at Queens College is not an unusual course to be offered. Many colleges and universities across the nation offer a similar kind of course.

A look at the course offerings for Spring 2016, Summer 2016, and Fall 2016 indicates that Hacker is not scheduled to teach that course in the near future. In fact, the only courses he is scheduled to teach are ones offered by the Political Science department. That makes sense, since he is a member of the Political Science department at Queens College. And, in fact, his profile page on the Queens College website indicates that the only course he teaches is American Politics and Government (PSCI 100). Which is fine.

Queens College makes instructor evaluations for courses publicly available on its website. Hacker indeed taught MATH 110 in Fall 2014 and Fall 2013. The responses from the students are available there. (Click the "Details" tab.) One of the questions on the evaluation was "How difficult is the course? (1=Not at all difficult, 5=Extremely difficult)." According to the data, 8 of the 11 students responded with either a 1 or a 2. (I'm eyeballing the data, assuming that the bar for 4 corresponds to one student. It adds up. The vertical axis really should be labeled!) So if the assignments are demanding, it's not because the students find them difficult.

I’m sometimes told that what I’m proposing is already being covered in statistics courses, which have growing enrollments both in high schools and colleges. In 2015, nearly 200,000 students were taking advanced placement classes in statistics, over three times the number a dozen years ago. This might suggest we are on the way to creating a statistically sophisticated citizenry.

Again, no sources cited, but the information seems to match that on the College Board's website. The increase in enrollment in AP Statistics is outpacing the growth of the general population, so presumably, yes, we are creating a population with a greater understanding of statistics.

So I sat in on several advanced placement classes, in Michigan and New York. I thought they would focus on what could be called “citizen statistics.” By this I mean coping with the numbers that suffuse our personal and public lives — like figures cited on income distribution, climate change or whether cellphones can damage your brain. What’s needed is a facility for sensing symptoms of bias, questionable samples and dubious sources of data.

Where this expectation of his comes from is a mystery. It's fairly obvious that Hacker did not even bother to read the course description for the classes he was sitting in on. The description is publicly available: the AP Statistics Course Description, starting on page 6 of the pdf, gives a pretty clear idea of what to expect in an AP Statistics classroom.

My expectations were wholly misplaced. The A.P. syllabus is practically a research seminar for dissertation candidates. Some typical assignments: binomial random variables, least-square regression lines, pooled sample standard errors. Many students fall by the wayside. It’s not just the difficulty of the classes. They can’t see how such formulas connect with the lives they’ll be leading. Fewer than a third of those enrolled in 2015 got grades high enough to receive credit at selective colleges.

Describing the syllabus for the course as a "research seminar for dissertation candidates" is some bizarre form of character assassination. The topics listed are completely relevant and appropriate for a course in statistics. Yes, the terms used are technical, and that's because they have to be in order to be accurate. There is an opportunity here to see value in the precision of language associated with quantitative literacy, but that seems to be completely lost on Hacker.

The topics listed naturally invite examples that make connections to the lives the students will be leading. For example, least-squares regression lines are useful for describing the relationship between two variables. Hacker should recognize them as useful for deciding if, say, global climate change is related to the amount of CO2 released by human civilizations (something he previously mentioned as important for citizens to understand).
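To make the point concrete, here is a minimal sketch of the kind of least-squares calculation such a lesson could involve. The numbers and variable names below are invented for illustration; they are not real climate data and are not taken from any AP materials.

```python
# A minimal sketch: fitting a least-squares regression line with NumPy.
# The data points below are invented for illustration, not real measurements.
import numpy as np

co2_ppm = np.array([340.0, 355.0, 370.0, 385.0, 400.0, 415.0])   # hypothetical CO2 levels
temp_anomaly = np.array([0.10, 0.22, 0.33, 0.48, 0.62, 0.75])    # hypothetical temperature anomalies (deg C)

# np.polyfit with degree 1 returns the slope and intercept of the best-fit line.
slope, intercept = np.polyfit(co2_ppm, temp_anomaly, 1)

# The correlation coefficient measures how strong the linear relationship is.
r = np.corrcoef(co2_ppm, temp_anomaly)[0, 1]

print(f"anomaly ~= {slope:.4f} * CO2 + {intercept:.2f}  (r = {r:.3f})")
```

Reading the slope and the correlation coefficient off a fit like this is exactly the kind of "citizen statistics" Hacker says he wants.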

According to the data from the College Board, 32.5% of the AP Statistics students received a 4 or 5 on the exam in 2015, and those are the scores typically required by colleges and universities to receive AP credit. The conclusion that the rest of the students therefore "fall by the wayside" is a mischaracterization of the situation. Nothing in that 32.5% figure suggests that the rest of the students fail the course, do not learn anything, or are otherwise harmed by the experience. If Hacker wants to make that claim, the evidence given here does not support it.

Something similar occurred when the Carnegie Foundation for the Advancement of Teaching created a statistics course for 19 community colleges in 2012. It was advertised as an alternative to remedial algebra, with its sadistic attrition rates. In Statway, as it was called, here is some of what students were asked to master: chi-square test for homogeneity in two-way tables, line multiple representation of exponential models. Even with small classes and extra support, almost half of the students got D’s or F’s or dropped the class.

A 50% D/F/W rate is actually better than or equal to the rate for many college algebra courses. And the one-page summary of Statway suggests that the program is three times as successful as the traditional alternative. It is strange to characterize this program as a failure.

Stranger still, the people who bring you Statway also bring you Quantway, which focuses on quantitative reasoning. Wouldn't that be the more appropriate comparison in a discussion about effective quantitative literacy education?

And again Hacker attempts to vilify the contents of a statistics course by using the (appropriate and necessary) technical terms of the field. Yes, students are expected to understand chi-square tests: how to conduct them and how to interpret the results. Will the average citizen need to conduct these tests? Probably not. Will the average citizen need to make a decision based on the results of a statistical test conducted by someone else? Absolutely.
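For the curious, here is a minimal sketch of what a chi-square test for homogeneity looks like in practice. The two-way table below is made up purely for illustration and is not from the Statway materials.

```python
# A minimal sketch of a chi-square test for homogeneity in a two-way table.
# The counts are invented for illustration only.
import numpy as np
from scipy.stats import chi2_contingency

# Rows: two hypothetical groups; columns: "yes" / "no" responses.
observed = np.array([[45, 55],
                     [62, 38]])

chi2, p_value, dof, expected = chi2_contingency(observed)

# A small p-value suggests the two groups do not answer in the same proportions.
print(f"chi-square = {chi2:.2f}, p-value = {p_value:.4f}")
```

Interpreting that p-value (deciding whether the two groups plausibly behave the same way) is precisely the skill a citizen needs when reading a study someone else conducted.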

Remember that MATH 110 course Hacker taught in Fall 2014? Well, the college also posts grade distributions on the internet. In that semester, there were 13 sections of MATH 110 offered. The D/F/W rate for the course that semester was 18% (51/283). For Hacker's section, the D/F/W rate was 0% (0/19). (I have no explanation for 19 grades being assigned when 20 students were enrolled. The grade report includes all manner of withdrawals, and Hacker's section shows zero of these.) In fact, no one in Hacker's class received a grade lower than C+ that semester. I wonder what sort of mathematics would be able to determine whether that was unusual or not.
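Since the question is rhetorical, here is one candidate: a back-of-the-envelope binomial calculation. If we take the campus-wide D/F/W rate as a rough baseline for each student independently (a simplifying assumption, since sections are not random samples of the same student population), the chance of a 19-student section producing zero D/F/W outcomes works out to roughly 2%.

```python
# A minimal sketch, assuming each of the 19 students independently has the
# campus-wide D/F/W probability of about 18% -- a simplification, not a claim
# about how sections are actually populated.
from scipy.stats import binom

p_dfw = 51 / 283      # campus-wide D/F/W rate for MATH 110 that semester
n_students = 19       # graded students in the section in question

prob_zero = binom.pmf(0, n_students, p_dfw)   # probability of zero D/F/W grades
print(f"P(0 D/F/W out of {n_students}) ~= {prob_zero:.3f}")   # roughly 0.02
```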

The Carnegie and A.P. courses were designed by research professors, who seem to take the view that statistics must be done at their level or not at all. They also know that citizen statistics is not the route to promotions. In the same vein, mathematics faculties at both high schools and colleges dismiss numeracy as dumbing down or demeaning. In fact, figuring out the real world — deciphering corporate profits or what a health plan will cost — isn’t all that easy.

The research professors Hacker is pointing to are being unfairly portrayed. One objective of an AP course is to allow high school students to earn credit for a college course. College professors are the appropriate people to determine what that takes, since they are the ones who offer the college curriculum. Many of the researchers who contribute to designing the AP curriculum actively engage in researching education itself. Hacker wants this to sound inappropriate for some reason, but the situation is completely appropriate and reasonable.

The statement about promotions is vague and unsupported by any of the surrounding text. It's true that quality research is valued above quality teaching at many large research universities. But many colleges value quality teaching more and promotion at these institutions is impossible without a strong teaching record. So, yes, there are professors who care about providing quality education to their students. And his argument only gets weaker when the research being conducted is education research.

Hacker provides not a single example or reference that supports his claim that mathematics faculty look down on numeracy. This statement amounts to defamation.

The last statement here about predicting health care costs is actually a statement in favor of additional mathematics education. Figuring out what health care will cost and what to charge for an insurance policy is the job of actuaries. They use many mathematical tools, but primarily calculus-based probability and statistics, two things Hacker seems opposed to.

So what kinds of questions do I ask my students?
One exercise focuses on visualizing data. I have the class prepare a report on how many households in the United States have telephones, land and cell. After studying census data, they focus on two: Connecticut and Arkansas, with respective ownerships of 98.9 percent and 94.6 percent. They are told they have to choose one of the following charts to represent the numbers, and defend their choice.
The first chart suggests a much bigger difference, but is misleading because the bars are arbitrarily scaled to exaggerate that difference.

It is important that people are able to read these two graphs and see the difference in the way the information has been presented. However, the first chart does not suggest a "much bigger difference." The numerical labels say exactly what the difference between the two actually is. In fact, it's easier to read the first chart to determine the actual percentage-point difference between the two data points. The second chart, however, provides better context for these numbers.
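The charts themselves are not reproduced here, but the contrast is easy to recreate. Here is a minimal sketch (the axis limits are chosen arbitrarily for illustration) of the same two percentages drawn with a truncated axis and with a zero-based axis:

```python
# A minimal sketch of the two presentations: the same two percentages drawn
# with a truncated vertical axis and with a zero-based vertical axis.
import matplotlib.pyplot as plt

states = ["Connecticut", "Arkansas"]
ownership = [98.9, 94.6]

fig, (ax_zoomed, ax_full) = plt.subplots(1, 2, figsize=(8, 4))

ax_zoomed.bar(states, ownership)
ax_zoomed.set_ylim(94, 100)       # truncated axis visually exaggerates the gap
ax_zoomed.set_title("Truncated axis")

ax_full.bar(states, ownership)
ax_full.set_ylim(0, 100)          # zero-based axis shows the gap in context
ax_full.set_title("Zero-based axis")

for ax in (ax_zoomed, ax_full):
    ax.set_ylabel("Households with telephones (%)")

plt.tight_layout()
plt.show()
```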

I also ask them to discern and analyze changing trends. Each January, the National Center for Health Statistics releases its hefty “Births: Final Data.” Its rates and ratios range from the ages of parents to methods of delivery. I ask students to scan these columns, looking for patterns. They found, for example, that women in Nebraska are averaging 2.2 children, while Vermont’s ratio is 1.6. Any theories?

OK, great. Hacker's students can determine that 2.2 is larger than 1.6. That's not exactly worthy of bragging about.

The alarming part of this is that the lesson seems to end with speculation. Now that the students see that the birth rate in Nebraska is higher than the one in Vermont, they are encouraged to create theories to explain this. That's fine, but it cannot stop there if it's going to be called numeracy. The hypotheses need to be tested, likely by using other data sources to explain them, and that might involve things like least-squares regression or other statistical methods.

Other tables focus on changes over time. Fertility rates for white and black women in 1989 stood at 60.5 and 84.8 per thousand, a discernible difference. By 2014, they were 59.5 and 64.5, a much smaller gap. There’s a story here about how black women are reconfiguring their lives.

Indeed, there is a story here. But, again, stopping here is only encouraging speculation and not engaging in numeracy.

Finally, we talk about how math can help us think about reorganizing the world around us in ways that make more sense. For example, there’s probably nothing more cumbersome than how we measure time: How quickly can you compute 17 percent of a week, calibrated in hours (or minutes, or seconds)? So our class undertook to decimalize time.
Imagine if we had a 10-day week, each day consisting of 10 hours each. The class debated whether to adopt a three-day weekend, or to locate an “off-day” in midweek. Since a decimal week would have 100 hours, 17 percent is a flat 17 hours — no calculator required. You have to think both numerically and creatively if you want to, say, chuck out our current health care system and model the finances of a single-payer plan.

Hooray for the metric system?

How does any of this relate to health care?

And wasn't the main point that the lessons needed to be practical? How practical is it to change the way the world thinks about and measures time? It might make many things simpler, but it's not very likely to happen. Just try to get a US citizen to measure their daily commute in kilometers.
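For the record, the calculation Hacker presents as cumbersome takes one multiplication with the week we already have: 17 percent of 168 hours is 0.17 × 168 = 28.56 hours, a little over 28 and a half hours. No redesign of the calendar is needed to carry it out.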

Mathematicians often allude to “the law of the excluded middle” (a proposition must be true or false). The same phrase could be applied to a phenomenon in our own backyard. We teach arithmetic quite well in early grades, so that most people can do addition through division. We then send students straight to geometry and algebra, on a sequence ending with calculus. Some thrive throughout this progression, but too many are left behind.

I have absolutely no idea what point Hacker is trying to make here. It reads as though he is speaking off the cuff, not realizing that readers will be able to go back and examine what he said.

The assumption that all this math will make us more numerically adept is flawed. Deborah Hughes-Hallett, a mathematician at the University of Arizona, found that “advanced training in mathematics does not necessarily ensure high levels of quantitative literacy.” Perhaps this is because in the real world, we constantly settle for estimates, whereas mathematics — see the SAT — demands that you get the answer precisely right.

If you are looking to be "numerically adept" then you can only do worse by turning to something other than mathematics. The quote by Hughes-Hallett is completely out of context. In the same paper, Hughes-Hallett declares that "Although the knowledge of basic mathematical algorithms such as how to multiply decimals does not guarantee literacy, the absence of this knowledge makes literacy unlikely, if not impossible." Indeed, Hughes-Hallett is suggesting that mathematical proficiency is a prerequisite for numeracy.

Only Hacker is equating mathematics with the SAT exam.

Indeed, it often turns out that all those X’s and Y’s can inhibit becoming deft with everyday digits.

Hacker seems to be trying to suggest that algebra skills negatively impact numeracy skills. There is no evidence for this claim whatsoever.