Saturday, May 16, 2026

Why the Future of College Could Look Like OnlyFans

Last week, I asked whether, as a forty-six-year-old father of two, I should keep contributing to my children’s college funds, or if perhaps some combination of anti-establishment fervor, A.I., and a shifting economy could save me some money. I don’t have a particularly good answer yet, at least not one good enough to inspire the purchase of a midlife-crisis car, my son’s and daughter’s futures be damned. But, after wrestling with that query in Part 1 of what will be a series of articles, I think there may be a better one to ask. The question is not, I think, “How will A.I. change higher education?” but rather “What irreversible changes have already taken place, and how will colleges and universities respond to them?”

I wanted to talk with someone who stood outside the polite consensus that holds that college as we know it will survive, if only because, as I wrote last week, humans will always want to differentiate their children from other people’s children. Hollis Robbins, a professor of English and a special adviser in the humanities at the University of Utah, and the former dean of arts and humanities at Sonoma State University, has been writing about A.I. and higher education for years on her Substack, “Anecdotal Value.” Through her writing on the subject, her own experiments with A.I., and her experience at both élite private and regional public universities, she has hashed out a theory of sorts. In Robbins’s opinion, an excessively bureaucratic, increasingly generic, and poorly taught version of higher education has taken hold around the country, and that has made the modern university seriously vulnerable to an A.I. takeover.

What can academics do about this? College, Robbins believes, should be more bespoke; schools should cultivate their own character based on the charisma of professors, the novelty of their inquiries, and the quality of their instruction. Today, thanks in part to the Common Application and to the always increasing pressure for students to go simply to the most prestigious college they can, even élite schools are becoming interchangeable. Brown and the University of Chicago have roughly the same pool of students as, say, Vanderbilt, or Georgia Tech. And, once the unique essence of a school has been lost, and the curricula have been standardized for maximum friendliness to students, who are treated as customer kings, A.I. may come to seem like a plausible alternative. In this view, rampant A.I.-assisted cheating, rapidly declining faith in the value of a college education, and general agita on the part of the nation’s faculty are all symptoms of a larger sickness: an academy that has been stripped of everything that once made it special. [...]

In a widely discussed Substack post from last year, titled “It’s Later Than You Think,” Robbins argued that artificial general intelligence would require a culling of sixty to seventy per cent of the country’s professors, and that every professor who wanted to keep their job should write a memo answering the question “What specific knowledge do I possess that AGI does not?” Faculty members who could not produce a compelling memo “with concrete defensible answers,” she wrote, “have no place in the institution.” The university in the age of A.I. will be leaner, odder, and more differentiated from its peers, she maintains, because “students cannot be expected to continue paying for information transfer that AGI provides freely.” Instead, they will “pay to learn from faculty whose expertise surpasses AI, offering mentorship, inspiration, and meaningful access to AGI-era careers and networks.” Any institution that does not adapt will die. “This isn’t a mere transformation but a brutal winnowing,” Robbins writes. “Most institutions will fail, and those that remain will be unrecognizable by today’s standards.”

I recently asked Robbins about how she came to this conclusion, and what, exactly, those surviving institutions might look like. This interview has been edited for length and clarity.

You’ve written a lot about how the modern university has primed itself for an A.I. takeover. How did that happen?

... The first two years of a college education are now more or less the same, regardless of where you go to school. Courses now need to be equivalent to one another, so that a student at one school will be learning something similar to a student at a different school. What that has done over time is create a system where it doesn’t really matter who is teaching the classes. We tell the student, “You’re special,” and we tell the faculty, “You’re not special.” This is the tension and the problem that is plaguing higher education and what’s made it so vulnerable to A.I. Everything else—whether Trump, the enrollment cliff, or whatever—is secondary to this tension. [...]

I’m not a car person, but I have friends who have fancy BMWs, and they have to go to their fancy BMW place to fix their car, because BMW parts are often very specific to BMWs. So what does it mean for higher ed when all the parts are interchangeable? Almost forty per cent of students transfer at least once from institution to institution, and that places additional pressure to make everything the same. What happens is that colleges make it easier for their students to transfer, because parents want to have some backup plan. The high number of transfers leads to more fungibility and commodification.

In a Substack post from last year, you suggested that sixty to seventy per cent of faculty will ultimately lose their jobs once generative A.I. starts to hit the classroom, and that those who survive will need to explain why they’re still needed. How do you think they should be proving their worthiness?

Higher education and professors can differentiate themselves from all this sameness by teaching at the edges of knowledge. My expertise, for example, is in the African American sonnet tradition. There are probably three people on the entire planet who know as much as I do about this tiny little thing, and so I’ve spent a lot of my time experimenting with these large language models to just see what they know about my field, and where the edges are. Specialists are going to be key to selling education as something the A.I. can’t do. When your daughter is going to go to school, in eight years, you are not going to want, for any money, to have her learn standard educational product that A.I. knows—and A.I. will know so much, right?

I’m not sure about that, because I do think that there’s value in her learning things that a computer knows. Human beings still play chess, even though a human being hasn’t beaten the best chess computers in twenty years—and I would think there’s still value in her understanding the basic theories and foundations of, say, chemistry. Even if A.I. knows all of that, she should probably know it, too, if she wants to understand what those edges of knowledge are, no?

So, in my ideal vision of the academy, you’re going to be in class with a mentor who isn’t going to have to teach you Chemistry 101 but will want to quickly move to where the edges are, to do something new. Maybe they would decide together to 3-D-print some new material that has never been printed before, or what have you. Whatever they decide together will not be something every university is going to be able to do. It will be what’s particular at this place. [...]

Does that lead to a kind of obscurity? It would seem to encourage the esoteric sort of inquiry that the public sometimes resists.

Well, I won’t use the word “obscurity.” I would say “specialization.”

Let me make a couple of predictions and distinctions. Social science is going to matter so much less when your daughter goes to college. It is already on its way out. A.I. can do it. And here’s an example of the type of inquiry I’m talking about: I have a weird, funny Twitter group about life on Mars. Someone will ask, for instance, if it’s true that you’re going to need kidney dialysis on the way back from Mars. Another person is theorizing about a 3-D printer that’s going to use Mars soil, which will allow people to build on Mars using its materials instead of shipping everything there. These sorts of inquiries are obscure, specialist, niche, at the edge. [...]

Does that mean kids will be coming to college with a different baseline of knowledge because of A.I.? That a lot of the canon in whatever field they choose will already have been transferred to their brains? I can’t help but remember my own experience as a freshman in college, being completely unprepared for an upper-level religion course, much less any edge-of-knowledge inquiry.

They’re going to be coming in with a different baseline. Once upon a time, you walked into class and a hundred per cent of what was delivered to you was through your professor. Now, you go to a class, maybe you’ll do the reading, but you’ll also ask ChatGPT or Claude. And so your course content is already coming from somewhere else. This is a problem that higher ed has not addressed substantially. What does it mean for me to grade you on something where you got all your information from somewhere else and not from my reading list? That is a complicated question. The only thing that works is for us to get to the edge quickly.

There’s a growing idea I’ve seen in some circles that college could be replaced by conversations between an A.I. tutor and a student. When I think about your model, I wonder why college even needs to exist. If I can just seek out a tutor, somebody that I like, and they just charge me a little bit, and we go through these edge-knowledge cases together, what’s the degree for? Couldn’t you, as Hollis Robbins—not only a specialist in African American sonnet traditions but also an idiosyncratic thinker on the subject of A.I. and the future of the academy—just set up your own shop?

I was in Austin, Texas, a couple of times in March with a bunch of twenty-five-year-old billionaires. This is what they’re looking at. Instead of having the credential from the institution, why not have the credential from the professor? If you have a Hollis Robbins education, what would that signal? What would that credential mean as opposed to a degree from a university? There was some conversation about what that would look like, and one guy at the end of the dinner said, “Instead of OnlyFans, it’s like OnlyProfessors.”

Do you think an OnlyProfessors model would be good? That the dissolution of the vast majority of the higher-education infrastructure, with this replacing it, would be a good outcome?

I worry about where the great middle of America is going to go. I do think students are going to have to withdraw enrollment from schools unless things change. And I don’t think institutions are going to change themselves. They’re caught up in this bureaucratic system, this transfer system, these standardization agreements across state lines, so that anybody can move anywhere. The idea of delivering a standard education product is so embedded within the current structure that it will never change unless students say, “This is not what I want from going to college.” So, yes, OnlyProfessors is an alternative. [...]

And the death of our current universities? What does that look like?

I think there’s contraction. The big flagships are going to stay the same, because they have the football players and all the other things. I’m at the University of Utah—I think it’s going to be fine. We’re going to pick up the lifeboats from the places that crumble. But, ultimately, at the very top, presidents and provosts are going to have to understand that expertise is their mission. Yale, even, went back to making their mission statement about knowledge, not about making a better world. We’re not in the making-a-better-world game anymore. We’re in the knowledge game, and that means getting rid of some of the feel-good stuff. [ed. Like humanities, civics, history, philosophy, logic...]

by Jay Caspian Kang, New Yorker | Read more:
Image: David Rowland/Getty
[ed. Couldn't disagree more. Started writing all the reasons why but then just figured 'eh... what's the use'. This really is a bizarre interview with... whoever this person is. I will say that if having ready information at your fingertips (or some personal esoteric knowledge) were all it took to be educated, Google would've put universities out of business a long time ago. There's a reason (with all the instructional videos on YouTube) that people still go to teachers.]