Tuesday, November 27, 2018

The Case for Dropping Out of College

During the summer, my father asked me whether the money he’d spent to finance my first few years at Fordham University in New York City, one of the more expensive private colleges in the United States, had been well spent. I said yes, which was a lie.

I majored in computer science, a field with good career prospects, and involved myself in several extracurricular clubs. Since I managed to test out of some introductory classes, I might even have been able to graduate a year early, which would have produced substantial cost savings for my family. But the more I learned about the relationship between formal education and actual learning, the more I wondered why I’d come to Fordham in the first place.
* * *
According to the not-for-profit College Board, the average cost of a school year at a private American university was almost $35,000 in 2017, a figure I will use for purposes of rough cost-benefit analysis. (While public universities are less expensive thanks to government subsidies, the total economic cost per student-year, including the cost borne by taxpayers, is typically similar.) The average student takes about 32 credits’ worth of classes per year, with a bachelor’s degree typically requiring at least 120 credits in total. That works out to roughly $1,100 per credit, so a 3-credit class costs just over $3,000, and a 4-credit class a little more than $4,000.
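
To make that arithmetic explicit, here is a minimal sketch; the dollar and credit figures are the rounded averages quoted in the paragraph above, not precise tuition data.

```python
# A minimal sketch of the per-class cost arithmetic, assuming the College
# Board's ~$35,000 average annual cost and a typical 32-credit course load.
annual_cost = 35_000      # average private-university cost per year (2017)
credits_per_year = 32     # credits a typical full-time student takes per year

cost_per_credit = annual_cost / credits_per_year        # ~$1,094
print(f"Per credit:     ${cost_per_credit:,.0f}")
print(f"3-credit class: ${3 * cost_per_credit:,.0f}")   # ~$3,281
print(f"4-credit class: ${4 * cost_per_credit:,.0f}")   # ~$4,375
```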

What do students get for that price? I asked myself this question on a class-by-class basis, and found an enormous mismatch between price and product in almost all cases. Consider the two 4-credit calculus classes I took during freshman year. The professor had an unusual teaching style that suited me well, basing his lectures directly on lectures posted online by MIT. Half the class, including me, usually skipped his lectures and learned the content by watching the original material on MIT’s website. When the material was straightforward, I sped up the video. When it was more difficult, I hit pause, re-watched it, or opened a new tab in my browser to find a source that covered the same material in a more accessible way. From the perspective of my own convenience and education, they were probably among the best classes I’ve taken in college. But I was left wondering: Why should anyone pay more than $8,000 to watch a series of YouTube videos, available online for free, and occasionally take an exam?

Another class I took, Philosophical Ethics, involved a fair bit of writing. The term paper, which had an assigned minimum length of 5,000 words, had to be written in two steps—first a full draft and then a revised version that incorporated feedback from the professor. Is $3,250 an appropriate cost for feedback on 10,000 words? That’s hard to say. But consider that the going rate on the web for editing this amount of text is just a few hundred dollars. Even assuming that my professor is several times more skilled and knowledgeable, it’s not clear that this is a good value proposition.

“But what about the lectures?” you ask. The truth is that many students, including me, don’t find the lectures valuable. As noted above, equivalent material can usually be found online for free, or at low cost. In some cases, a student will even find that his or her own professor has posted videos of those very lectures. And the best educators, aided by the magic of video editing, often put out content that puts even the most renowned college lecturers to shame. If you have questions about the material, there’s a good chance you will find the answer on Quora or Reddit.

Last semester, I took a 4-credit class called Computer Organization. There were 23 lectures in total, each 75 minutes long, or about 29 hours of lecture time. I liked the professor and enjoyed the class. Yet once the semester was over, I noticed that almost all of the core material was covered in a series of YouTube videos totaling just three hours.

Like many of my fellow students, I spend most of my time in class on my laptop: Twitter, online chess, reading random articles. From the back of the class, I can see that other students are doing likewise. One might think that all of these folks will be in trouble when test time comes around. But watching a few well-chosen online videos is generally all it takes to master the required material. You see the pattern here: The degrees these people get say “Fordham,” but the actual education often comes courtesy of YouTube.

The issue I am discussing is not new, and predates the era of on-demand web video. As far back as 1984, the American educational psychologist Benjamin Bloom found that the average student who receives one-on-one tutoring performs about two standard deviations better than students taught in a regular classroom setting, outperforming roughly 98 percent of them. Even the best tutors cost no more than $80 an hour, which means you could buy 50 hours of their services for the pro-rated cost of a 4-credit college class that supplies 30 hours of (far less effective) lectures.
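
As a sanity check on that comparison, here is a small sketch using the rounded figures from the text: an $80-per-hour tutor and a 4-credit class costing about $4,000. (Using the unrounded ~$4,375 per-class cost from the earlier calculation would buy roughly 55 tutoring hours rather than 50.)

```python
# A rough sketch of the tutoring-vs-lecture comparison above, using the
# rounded figures from the text.
tutor_rate = 80        # dollars per hour for a high-end private tutor
class_cost = 4_000     # rounded pro-rated cost of a 4-credit class
lecture_hours = 30     # approximate lecture time that class provides

tutoring_hours = class_cost / tutor_rate    # 50 hours of one-on-one tutoring
print(f"Lecture hours bought:  {lecture_hours}")
print(f"Tutoring hours bought: {tutoring_hours:.0f} for the same money")
```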

All of these calculations are necessarily imprecise, of course. But for the most part, I would argue, the numbers I have presented here underestimate the true economic cost of a brick-and-mortar college education, since I have not imputed the substantial effective subsidies that come through tax breaks, endowments and support programs run by all levels of government.

So given all this, why are we told that, far from being a rip-off, college is a great deal? “In 2014, the median full-time, full-year worker over age 25 with a bachelor’s degree earned nearly 70% more than a similar worker with just a high school degree,” read one typical online report from 2016. The occasion was Jason Furman, then head of Barack Obama’s Council of Economic Advisers, tweeting out data showing that the ratio of an average college graduate’s earnings to a similarly situated high-school graduate’s earnings had grown from 1.1 in 1975 to more than 1.6 four decades later.

To ask my question another way: What accounts for the disparity between the apparently poor value proposition of college at the micro level and the statistically observed college premium at the macro level? A clear set of answers appears in The Case Against Education: Why the Education System Is a Waste of Time and Money, a newly published book by George Mason University economist Bryan Caplan.

One explanation lies in what Caplan calls “ability bias”: From the outset, the average college student is different from the average American who does not go to college. The competitive college admissions process winnows the applicant pool in such a way as to guarantee that those who make it into college are more intelligent, conscientious and conformist than the other members of their high-school graduating cohorts. In other words, when colleges boast about the “70% income premium” they supposedly provide students, they are taking credit for abilities that those students already had before they set foot on campus, and which they likely could retain and commercially exploit even if they never got a college diploma. By Caplan’s estimate, ability bias accounts for about 45% of the vaunted college premium, which would mean that a college degree actually boosts income by about 40 points, not the oft-cited 70.

Of course, 40% is still a huge premium. But Caplan digs deeper by asking how that premium is earned. And in his view, the extra income doesn’t come from substantive skills learned in college classrooms, but rather from what he calls the “signaling” function of a diploma: Because employers lack any quick and reliable objective way to evaluate a job candidate’s potential worth, they fall back on the vetting work done by third parties, namely colleges. A candidate who managed to get through the college admissions process, followed by four years of near-constant testing, is likely to be intelligent and conscientious, and can be relied on to conform to institutional norms. It doesn’t matter what the applicant was tested on, since it is common knowledge that most of what one learns in college will never be applied later in life. What matters is that these applicants were tested on something. Caplan estimates that signaling accounts for around 80% of the 40-point residual college premium described above, which, if true, would leave fewer than ten percentage points of the original 70 to be accounted for by skills actually learned in college. (...)
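
To make the bookkeeping concrete, here is a minimal sketch of that decomposition. The 45% and 80% shares are Caplan’s estimates as quoted above; treating the final residual as “learning” is the interpretation the passage implies, not a figure Caplan reports directly.

```python
# A sketch of the college-premium decomposition described above.
raw_premium = 70.0          # headline earnings premium, in percentage points
ability_bias_share = 0.45   # share Caplan attributes to pre-existing ability
signaling_share = 0.80      # share of the remainder he attributes to signaling

causal_premium = raw_premium * (1 - ability_bias_share)   # ~38.5 -> "about 40 points"
learning_part = causal_premium * (1 - signaling_share)    # ~7.7  -> "less than ten points"
print(f"Premium plausibly caused by college: ~{causal_premium:.0f} points")
print(f"Part left for actual learning:       ~{learning_part:.0f} points")
```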

Until now, I have discussed the value of a college education in generic fashion. But as everyone on any campus knows, different majors offer different value. In the case of the liberal arts, the proportion of the true college premium attributable to signaling is probably close to 100%. It is not just that the jobs these students seek typically don’t require any of the substantive knowledge acquired during their course of study; the courses also do little to improve students’ analytical skills. In their 2011 book Academically Adrift: Limited Learning on College Campuses, sociologists Richard Arum and Josipa Roksa presented data showing that, over their first two years of college, students typically improve their skills in critical thinking, complex reasoning and writing by less than a fifth of a standard deviation.

According to the U.S. Department of Commerce’s 2017 report on STEM jobs, even the substantive educational benefit to be had from degrees in technical fields may be overstated, since “almost two-thirds of the workers with a STEM undergraduate degree work in a non-STEM job.” Signaling likely plays a strong role in such cases. Indeed, since STEM degrees are harder to obtain than non-STEM degrees, they provide an even stronger signal of intelligence and conscientiousness.

However, this is not the only reason why irrelevant coursework pays. Why do U.S. students who want to enter medicine, one of the highest-paying professions, first need to complete four years of often unrelated undergraduate study? The American blogger and psychiatrist Scott Alexander, who majored in philosophy as an undergraduate and then went on to study medicine in Ireland, observed in his brilliant 2015 essay Against Tulip Subsidies that “Americans take eight years to become doctors. Irishmen can do it in four, and achieve the same result.” Law follows a similar pattern: it takes four years to study law in Ireland and five in France, while students in the United States typically spend seven years in school (four as undergraduates, then three in law school) before beginning the separate process of bar accreditation.

by Samuel Knoche, Quillette | Read more:
Image: uncredited