Under new changes proposed by the Trump administration, over 3 million struggling parents, children, people living with disabilities, and older Americans may lose access to food stamps, also known as the Supplemental Nutrition Assistance Program (SNAP). Children in families slated to lose their SNAP benefits will also lose critical school-lunch assistance.
The Trump administration wants to eliminate an eligibility criterion known as broad-based categorical eligibility (BBCE), which enables states to expand access to those in need of food assistance based on other programs they qualify for.
By eliminating it, the administration is effectively creating a benefits cliff, where a parent’s small raise at work — or a modest amount of savings — could end up disqualifying a family from SNAP entirely. That leaves them poorer for getting a raise or saving money, or else puts them at risk of their food aid falling through the bureaucratic cracks.
Failed twice
Trump and the Republicans attempted to get this reduction in the nation’s most effective social safety-net program rammed through Congress last year — and failed. They also failed in their attempt to significantly defund the program. So now Trump is attempting to reduce food access to families in need via executive fiat.
SNAP reduces poverty more simply and directly than nearly any other program. Because it’s responsive to the overall economy, it expands during economic downturns and contracts when poverty levels fall. This enables people to weather temporary economic hardship, stay above the official poverty level, and get money more quickly into the economy.
It also literally puts food into children’s mouths, while their parents work and save.
Why would the administration want to take critical food assistance away from children and families who need it? The administration has claimed ineligible people are using the program, perhaps fraudulently. But that’s unlikely.
Looking at figures through 2016, Forbes contributor Simon Constable calculated potentially fraudulent SNAP expenditures at under 1% of the cost of the program — a minuscule amount compared to behemoth agencies like the Pentagon, which can’t even pass an audit, and which nonetheless keeps getting budget increases.
Rigorous standards
SNAP, by contrast, “has some of the most rigorous program integrity standards and systems of any federal program,” adds Robert Greenstein of the Center on Budget and Policy Priorities, including for recipients who qualify through their participation in other programs.
According to the center, SNAP is one of the most effective economic stimulators per federal dollar spent of any program. During the economic downturn of 2009, for example, Moody’s Analytics estimated that every dollar increase in SNAP benefits that year generated $1.70 in economic activity.
Jill Lepore’s new “little book” is a historian’s attempt to mobilize her knowledge to political effect. Last year Lepore published These Truths: A History of the United States, a monumental and brilliantly assembled work of political history that “is meant to double as an old-fashioned civics book, an explanation of the origins and ends of democratic institutions.” The ideological essence of that work has been distilled in This America: The Case for the Nation. In a New York Times Op-Ed that accompanied its publication, Lepore urged Democratic presidential candidates to “speak with clarity and purpose about what’s at stake: the liberal nation-state itself.” Lepore went on:
The hard work isn’t condemning nationalism; it’s making the case for the liberal nation-state.
This is an argument of political necessity and moral urgency. So far, Democrats haven’t made it. Instead, in much the same way that they gave up the word “liberalism” in the 1980s, they’ve gotten skittish about the word “nation,” as if fearing that to use it means descending into nationalism.
Whether it is electorally efficient, in the short term, to revamp our use of the word “nation” is of course debatable. But the argument, as I understand it, is that icebergs of nationalism have been an ever-present, indeed defining feature of American history; and that to avoid them we must resolutely navigate by our best national ideals—“a revolutionary, generous, and deeply moral commitment to human equality and dignity.” (...)
As Lepore acknowledges, the equation of “nationalism” with hate and bigotry is far from universal: in postcolonial countries, the term is benignly connected to the enlightened (if mythic) conception of nationhood as the starting point of self-determination. America was once a colonial place, too, but its sense of itself as a nation, Lepore believes, was developed ex post facto. Nowhere in the Declaration of Independence, the Articles of Confederation, or the Constitution is the United States described as a nation. (Contrast this with, for example, the Republic of Ireland: the 1916 Proclamation of the Irish Republic and the 1937 Constitution begin with powerful assertions of Irish nationhood.) It took a great effort of politicking to unite states that did not much identify with one another, in spite of having in common the English language, whiteness, and Christianity. The US, Lepore says, is that rare thing: a state-nation. (...)
Lepore is aware of this fact—there’s little she isn’t aware of, one senses—and makes it integral to her argument, which is that the age-old struggle between illiberal and liberal tendencies is constitutive of the nation. Nationalism is currently thriving, she believes, because the discourse of American liberalism is deficient. First, that discourse undervalues the radicality and relevance of the country’s founding ideals; second, the preoccupation with the rights of subgroups is essential, certainly, but politically inadequate; third, and here I put the matter much more crudely than Lepore would, liberals must in some sense do battle for possession of the Stars and Stripes. However gauche or complicit it may seem, they must understand and unapologetically frame their values—which currently have a niche, somewhat subversive emphasis—as our core national values:
This America is a community of belonging and commitment, held together by the strength of our ideas and by the force of our disagreements. A nation founded on universal ideas will never stop fighting over the meaning of its past and the direction of the future…. The nation, as ever, is the fight.
When I moved to the United States in 1998, the nation was fighting with itself. My introduction to the country was framed by the televised impeachment proceedings against President Clinton. It was all very gripping—a kind of crash course in politics and government. Then came the 2000 election. What struck me, in the chaos that followed, was that the Republican Party enjoyed a mystifying presumption of legitimacy. Bush had prematurely positioned himself as the president-elect, and the media had largely deferred to him in this. It made no sense. Gore had won the popular vote by more than half a million; there were strong reasons to believe that the Democratic tally in Florida had been erroneously reduced by a faulty ballot design; black Floridians had experienced outrageous voting problems; and, astonishingly, the Republicans were actually trying to prevent an accurate count of the vote.
Why had Gore so quickly phoned Bush to concede an undecided election (a concession he soon retracted)? Why the curious timidity of Democrats in Florida and the unaccountable self-righteousness of their aggressive Republican counterparts? Were my eyes and ears fooling me, or was everybody somewhat scared of the Republicans? The penny finally dropped when the Republican majority in the Supreme Court incoherently decided, in Bush v. Gore, to halt the vote-counting while their candidate still held a lead. Oh, I thought to myself. It’s a deep-state thing. (...)
The term originates in Turkey. Like the United States, Turkey is a constitutional republic. Its democratic progress has been something of a bumpy ride. There have been three military coups since 1960, each more or less accepted by the Turkish people. They understood (if sometimes disputed) that the armed forces enjoyed an extralegal, almost spiritual authority to safeguard the legacy of Kemal Atatürk and, if necessary, to suspend the constitutional order when that legacy was threatened by civil unrest or dangerous political developments. The military—together with its allies in the state security and legal apparatus—came to be described as constituting, and acting on behalf of, the “deep state.”
The United States has secretive agencies that do legally dubious things, but it doesn’t have a deep state in the Turkish sense. It may be said to have a deep state in another sense, however: America. America preceded, and brought into being, the republic we now live in—the United States of America. Almost everyone still talks about America, not about the United States; about Americans, not USAers. America, in short, was not extinguished by the United States. It persists as a buried, residual homeland—the patria that would be exposed if the USA were to dissolve. Primordial America (at least in the popular imagination) was where folks prayed hard, worked hard on the land, and had rightful recourse to violence. In this imaginary place, people were white, Christian, English-speaking. They had God-given dominion over all the earth, and over every creeping thing that creepeth upon the earth. All of this inevitably informs the way American nationals apprehend one another and their country. They feel in their bones that some people are Americans and other people are merely citizens of the United States.
Our deep state doesn’t require conspiracies or coups or even self-awareness. It is a permanent ideological feature, like gravity. It reveals itself in our politics. A common trope—“Imagine if a Democrat did that”—refers to a state of affairs in which one party is bound by norms and rules, and the other party less so. One president must constantly generate his legitimacy, even as he scrupulously complies with the rules; another president benefits from a legitimacy so profound that his rule-breaking has the effect of rule-making. One group is perceived to be synthetic and unpatriotic, another as authentic and patriotic. This guy is a snowflake, that guy is a victim of persecution. And so on.
The unspoken ratio decidendi of Bush v. Gore is that, when it comes to the crunch, America trumps the United States and its papery constitutional affirmations. Democrats get this as much as Republicans do. Consciously or unconsciously, they know the score. They experience this knowledge mostly as fear.
This has implications for Lepore’s argument. She believes that, as a practical matter, liberal political messaging should vigorously equate our founding ideals with our sense of nationality. If anyone has done that, it is Obama. In 2009 (during a trip to Turkey, as it happens), he declared:
One of the great strengths of the United States is—although, as I mentioned, we have a very large Christian population—we do not consider ourselves a Christian nation or a Jewish nation or a Muslim nation; we consider ourselves a nation of citizens who are bound by ideals and a set of values.
This hasn’t quite panned out—neither the thesis, nor the political messaging. The problem isn’t rhetorical. It’s structural.
The cornerstone assertion of the Declaration of Independence is that government exists in order to secure the equal, inalienable rights of persons. This is the formal raison d’être and official ideology of the United States. It follows that those who fully embrace those rights—liberals—have political and patriotic legitimacy, and those who reject them lack legitimacy. Psychically, liberals often don’t seem to believe this. A deference to “Americans” inheres in their worldview, even if the Americans in question aspire to subvert our democracy. The “heartland” and “Middle America” (concepts that bring to mind the idea of la France profonde) still form a crucial part of the liberal political vocabulary, which continues to attach an emphatically American identity to the country’s white provincial population. If, as Lepore urges, we must think hard, even dangerously, about the nation and its history, the distinction between America and the United States should probably be reckoned with.
While there are significant similarities between Big Pharma and Big Tobacco, there is also a key difference that makes today’s story of corporate malfeasance even worse: the supply chain for tobacco is much simpler than that for opioids, which are, theoretically, tightly controlled substances that pass through a dizzying array of actors and regulators.
First, a doctor must write a prescription, which must be filled at a pharmacy, and is likely paid for by an insurance company. Depending on the needs of their customers, pharmacies place orders for these drugs (customers, it turns out, need a lot of them). Shipping companies then ferry the drugs from the manufacturers to the pharmacies. Overseeing this entire system is the DEA, which sets the quota for how many opioids a company is allowed to manufacture, and tracks where those pills go.
While politicians are making hay out of Big Pharma’s wanton greed and recklessness, far less attention has been paid to the DEA. Attorneys general suing Big Pharma recently unearthed a database that both the corporations and the government—each for their own self-interested reasons—fought to keep sealed, called the Automation of Reports and Consolidated Orders System (ARCOS). Mammoth in size and granular in detail, ARCOS tracks the shipments of every single controlled substance, from the company that manufactured it, to the company that shipped it, to the pharmacy that received it. It is the world atlas for how the opioid crisis began.
All told, from 2006 to 2012, roughly 76 billion oxycodone and hydrocodone pills criss-crossed America, according to a Washington Post analysis. While many of these pills went to legitimate patients, millions more were showered on troubled communities with a voracious thirst for pain relief. While drug manufacturers produced more and more opioids (approved by the DEA), and distributors shipped those pills to pharmacies all over the country (tracked by the DEA), drug companies saw record profits—and America’s overdose death rate soared off the charts.
“I think this [database] brings home what we all knew,” says Corey Davis, an attorney and public health expert at the Network for Public Health Law. “This wasn’t just incompetence on the part of the DEA and the Department of Justice, it was knowing and intentional failure to do what most people think is their jobs.”
What is the DEA’s job, exactly? Its first task, and the one most associated with the agency, is the Sicario-esque disruption of illicit flows of drugs coming into the U.S. from abroad, like intercepting speedboats filled with cocaine. Its other major responsibility is controlling licit pharmaceuticals. “The whole goal of the prescription system is to make sure that patients are getting their medications, and that medications are not going to those who aren’t patients,” which is called “diversion,” says Bryce Pardo, a drug policy researcher at the RAND Corporation. “That’s the whole point of the system, which was invented a hundred years ago. Clearly, the system broke. The system failed.” (...)
In drug policy scholarship, there is a concept called the “balloon hypothesis.” When one end of a balloon gets squeezed, the air inside, rather than disappearing, rushes to fill the other end of the balloon. The balloon hypothesis is used to describe, often critically, America’s drug enforcement strategy. If cocaine production in Colombia is stamped out, production will shift to, say, Peru. If the Dark Web’s Silk Road gets shut down, a new Dark Web market pops up. The air has to go somewhere.
The balloon hypothesis also applies to the ever-shifting demand for drugs. “Over a period of 20 years, the DEA provided the green light to a 39-fold increase in the oxycodone quota and a 12-fold increase in the hydrocodone quota, even as our opioid epidemic unfolded,” Senator Dick Durbin wrote in a letter to the editor of The Washington Post.
In other words, the prescription balloon expanded, under the DEA’s watch, big time. But starting in 2011, the prescription market finally began to shrink after Purdue Pharma reformulated its blockbuster drug OxyContin with so-called abuse deterrent technology, and pill mills serving the black market were shut down. The supply was squeezed. The air still had to go somewhere, and it rushed to deadlier opioids like heroin spiked with illicit fentanyl. With enforcement focused on prescription opioids, the overdose crisis got worse.
Dan Ciccarone, a physician-researcher at the University of California, San Francisco who studies heroin use, says the crisis unfolded in three waves: Prescription painkillers gave way to old-fashioned heroin, which gave way to illicit fentanyl. “Big Pharma has egg on its face,” Ciccarone says. “It obviously could have played a more responsible role here. But at the same time, I don’t want the buck to stop there.”
“The only solution to the puzzle is to focus on demand,” he adds. “And we’ve been avoiding this for years. We need to structurally reduce demand through a healthier society.” (...)
The Big Pharma lawsuits and the ARCOS database show how these pills landed in Mingo County. But they do not answer why. To researchers like Ciccarone, it’s no mystery. “Disenfranchisement, loneliness, lack of purpose, multigenerational job loss, lack of hope, and the lack of future,” he said, listing off the “social determinants of health” that determine why some people get addicted and not others.
Since Amazon has changed the way many Americans buy things, most of the big-box retailers have struggled. Fox Business recently reported that Walmart and its ilk are battling mightily to “fend off retail apocalypse.” But before the zombies come, the superstore giants want you to gas up and bravely drive over to your local strip center—at least once more, dear consumers—as legacy retailers try to create a new paradise in their old parking lots. Walmart executives announced late last year that they were “reimagining” the typical dull parking lots by building “town centers” that could “provide community space,” and “areas for the community to dwell.” And at least one new Walmart “experience” is now open for business in Temple, Texas.
The irony is thick, and not only under the numerous layers of asphalt that have paved over tens of thousands of acres of formerly verdant pasture: the superstores killed off the remaining family hardware stores and groceries on nearby Main Streets. Over the past few decades, Walmart’s easy, free parking and low prices kept many consumers buying there—at least until the e-commerce revolution of Amazon and its imitators challenged the old superstore model of cheap, easy suburban growth.
Now Walmart is trying desperately to think outside its big boxes, yet these attempts at “reimagining” can only lead to more decline for the old paradigm of giant retail. The stagnant model of suburban real-estate development may also be a harbinger of long-term American economic decline—unless government officials and policymakers reorient incentives and make smarter infrastructure choices. It isn’t Walmart alone that is struggling to adapt the old model, but Walmart’s prominent experiment at revitalizing its big boxes is worth following closely.
Today, as the superstores gesture toward creating more human-scaled places, Walmart appears to be finally catching on to New Urbanism, the movement that has promoted the return of traditional neighborhood planning for over a quarter century. Will Walmart, with all its resources and capital, be able to recreate all the benefits of a mixed-use Main Street in an old parking lot?
The renderings provided by architects suggest that existing Walmart parking lots might be rebuilt into retail and restaurant space alongside what in more urban settings are sometimes called “pocket parks,” small patches of grass and foliage that provide some relief from the asphalt. And there may be even more. Dog parks, food trucks, and European-style food halls all abound. (One proposal in the prosperous Kansas City suburb of Lee’s Summit, Missouri even imagines a grand piano atop an outdoor stage, as if a performer will belt out some show tunes or an operetta. So it’s not exactly Nordstrom, but they’re trying.) (...)
Still, for all of Walmart’s creative attempts at “reimagining,” which try to humanize suburban parking lots, they start to look a lot like typical shopping malls. Acres of parking remain, and huge arterial roads act as moats that prevent pedestrians from easily traveling beyond Walmart’s property. (...)
For decades, big-box stores were made possible by pouring billions into new suburban infrastructure, all of it subsidized by taxpayers. The good times of the 1980s and 1990s fueled the building of thousands more big-box stores, including the now-ubiquitous local Walmart. In the short term, jobs were created, and everyone was getting cheaper TVs and any number of other goods.
Of course there was never a free lunch, even at the stunningly cheap cafeterias of Ikea, those ultimate big boxes that outdo Walmart’s typical square footage with five football fields of space. Historians and economists may soon look back at these hubristic years, just prior to and after the new millennium, as the peak of the big-box sprawl experiment.
One landmark moment should not be forgotten from the annals of 1992, when President George H.W. Bush traveled on Air Force One to Arkansas to pay homage to ailing Walmart founder Sam Walton (and present him with the Presidential Medal of Freedom). Increased trade liberalization had begun during the elder-Bush years, and then in 1993, Clinton signed the North American Free Trade Agreement (NAFTA), for which Walmart had intensely lobbied, hoping to open thousands of stores south of the border.
After NAFTA, the decline of domestic manufacturing only accelerated. At the same time, desperate local municipalities seeking big-box employers gladly provided “tax incentives.” These big breaks, of course ultimately funded by taxpayers, paid for new roads—along with traffic signals, power, water, and policing. Decades later, the deferred maintenance is coming due, and who will foot the bill for all the infrastructure built to prop up the big-box experiment?
In America’s wealthier metropolitan zip codes, Walmart can “reimagine” a new place, seeking consumers to sip five-dollar lattes and buy artisanal burgers. But in many less prosperous burghs, stagnant wages won’t support such reinventions or fuel economic growth. Hundreds of Walmarts first creatively destroyed local businesses, but the company later shuttered some of its under-performing superstores—sometimes only a decade or two after they were built—leaving many towns that were already struggling even poorer.
by Lewis McCrary, The American Conservative | Read more: Image: Nicholas Eckhart/Flickr
The ideal woman has always been generic. I bet you can picture the version of her that runs the show today. She’s of indeterminate age but resolutely youthful presentation. She’s got glossy hair and the clean, shameless expression of a person who believes she was made to be looked at. She is often luxuriating when you see her – on remote beaches, under stars in the desert, across a carefully styled table, surrounded by beautiful possessions or photogenic friends. Showcasing herself at leisure is either the bulk of her work or an essential part of it; in this, she is not so unusual – for many people today, especially for women, packaging and broadcasting your image is a readily monetizable skill. She has a personal brand, and probably a boyfriend or husband: he is the physical realization of her constant, unseen audience, reaffirming her status as an interesting subject, a worthy object, a self-generating spectacle with a viewership attached.
Can you see this woman yet? She looks like an Instagram – which is to say, an ordinary woman reproducing the lessons of the marketplace, which is how an ordinary woman evolves into an ideal. The process requires maximal obedience on the part of the woman in question, and – ideally – her genuine enthusiasm, too. This woman is sincerely interested in whatever the market demands of her (good looks, the impression of indefinitely extended youth, advanced skills in self-presentation and self-surveillance). She is equally interested in whatever the market offers her – in the tools that will allow her to look more appealing, to be even more endlessly presentable, to wring as much value out of her particular position as she can.
The ideal woman, in other words, is always optimizing. She takes advantage of technology, both in the way she broadcasts her image and in the meticulous improvement of that image itself. Her hair looks expensive. She spends lots of money taking care of her skin, a process that has taken on the holy aspect of a spiritual ritual and the mundane regularity of setting a morning alarm.
The work formerly carried out by makeup has been embedded directly into her face: her cheekbones or lips have been plumped up, or some lines have been filled in, and her eyelashes are lengthened every four weeks by a professional wielding individual lashes and glue. The same is true of her body, which no longer requires the traditional enhancements of clothing or strategic underwear; it has been pre-shaped by exercise that ensures there is little to conceal or rearrange.
Everything about this woman has been pre-emptively controlled to the point that she can afford the impression of spontaneity and, more important, the sensation of it – having worked to rid her life of artificial obstacles, she often feels legitimately carefree. The ideal woman can be whatever she wants to be – as long as she manages to act upon the belief that perfecting herself and streamlining her relationship to the world can be a matter of both work and pleasure, or, in other words, of “lifestyle”. The ideal woman steps into a stratum of expensive juices, boutique exercise classes, skincare routines and vacations, and there she happily remains.
Most women believe themselves to be independent thinkers. Even glossy women’s magazines now model skepticism toward top-down narratives about how we should look, who and when we should marry, how we should live. But the psychological parasite of the ideal woman has evolved to survive in an ecosystem that pretends to resist her. If women start to resist an aesthetic, like the overapplication of Photoshop, the aesthetic just changes to suit us; the power of the ideal image never actually wanes. It is now easy enough to engage women’s skepticism toward ads and magazine covers, images produced by professionals. It is harder for us to suspect images produced by our peers, and nearly impossible to get us to suspect the images we produce of ourselves, for our own pleasure and benefit – even though, in a time when heavy social media use has become broadly framed as a career asset, many of us are effectively professionals now, too.
Today’s ideal woman is of a type that coexists easily with feminism in its current market-friendly and mainstream form. This sort of feminism has organized itself around being as visible and appealing to as many people as possible; it has greatly over-valorized women’s individual success. Feminism has not eradicated the tyranny of the ideal woman but, rather, has entrenched it and made it trickier. These days, it is perhaps even more psychologically seamless than ever for an ordinary woman to spend her life walking toward the idealized mirage of her own self-image. She can believe – reasonably enough, and with the full encouragement of feminism – that she herself is the architect of the exquisite, constant and often pleasurable type of power that this image holds over her time, her money, her decisions, her selfhood and her soul.
Figuring out how to “get better” at being a woman is a ridiculous and often amoral project – a subset of the larger, equally ridiculous, equally amoral project of learning to get better at life under accelerated capitalism. In these pursuits, most pleasures end up being traps, and every public-facing demand escalates in perpetuity. Satisfaction remains, under the terms of the system, necessarily out of reach.
But the worse things get, the more a person is compelled to optimize. I think about this every time I do something that feels particularly efficient and self-interested, like going to a barre class or eating lunch at a fast-casual chopped-salad chain, like Sweetgreen, which feels less like a place to eat and more like a refueling station. I’m a repulsively fast eater in most situations – my boyfriend once told me that I chew like someone’s about to take my food away – and at Sweetgreen, I eat even faster because (as can be true of many things in life) slowing down for even a second can make the machinery give you the creeps. Sweetgreen is a marvel of optimization: a line of 40 people – a texting, shuffling, eyes-down snake – can be processed in 10 minutes, as customer after customer orders a kale caesar with chicken without even looking at the other, darker-skinned, hairnet-wearing line of people who are busy adding chicken to kale caesars as if it were their purpose in life to do so and their customers’ purpose in life to send emails for 16 hours a day with a brief break to snort down a bowl of nutrients that ward off the unhealthfulness of urban professional living.
The ritualization and neatness of this process (and the fact that Sweetgreen is pretty good) obscure the intense, circular artifice that defines the type of life it’s meant to fit into. The ideal chopped-salad customer needs to eat his $12 salad in 10 minutes because he needs the extra time to keep functioning within the job that allows him to afford a regular $12 salad in the first place. He feels a physical need for this $12 salad, as it’s the most reliable and convenient way to build up a vitamin barrier against the general malfunction that comes with his salad-requiring-and-enabling job. As Matt Buchanan wrote at the Awl in 2015, the chopped salad is engineered to “free one’s hand and eyes from the task of consuming nutrients, so that precious attention can be directed toward a small screen, where it is more urgently needed, so it can consume data: work email or Amazon’s nearly infinite catalog or Facebook’s actually infinite News Feed, where, as one shops for diapers or engages with the native advertising sprinkled between the not-hoaxes and baby photos, one is being productive by generating revenue for a large internet company, which is obviously good for the economy, or at least it is certainly better than spending lunch reading a book from the library, because who is making money from that?”
On today’s terms, what Buchanan is describing is the good life. It means progress, individuation. It’s what you do when you’ve gotten ahead a little bit, when you want to get ahead some more. The hamster-wheel aspect has been self-evident for a long time now. But today, in an economy defined by precarity, more of what was merely stupid and adaptive has turned stupid and compulsory.
In his only novel, Seventy-Two Virgins, published in 2004, Boris Johnson uses a strange word. The hero, like Johnson himself at the time, is a backbench Conservative member of the House of Commons. Roger Barlow is, indeed, a somewhat unflattering self-portrait—he bicycles to Westminster, he is unfaithful to his wife, he is flippantly racist and politically opportunistic, and he is famously disheveled:
In the fond imagination of one Commons secretary who crossed his path he had the air of a man who had just burst through a hedge after running through a garden having climbed down a drainpipe on being surprised in the wrong marital bed.
Barlow, throughout the novel, is in constant fear that his political career is about to be ended by a tabloid scandal. In a moment of introspection, he reflects on this anxiety:
There was something prurient about the way he wanted to read about his own destruction, just as there was something weird about the way he had been impelled down the course he had followed. Maybe he wasn’t a genuine akratic. Maybe it would be more accurate to say he had a thanatos urge. [Emphases added]
The novel is a mass-market comic thriller about a terrorist plot to capture the US president while he is addressing Parliament in London. The Greek terms stand out. In part, they function as signifiers of social class within a long-established code of linguistic manners: a sprinkling of classical phrases marks one out as a product of an elite private school (in Johnson’s case, Eton) and therefore a proper toff. (Asked in June during the contest to replace Theresa May as Tory leader to name his political hero, Johnson chose Pericles of Athens.) The choice of thanatos is interesting, and the thought that he might have a death wish will ring bells for those who have followed the breathtaking recklessness of Johnson’s career. But it is akratic that intrigues.
The Leave campaign that Johnson led to a stunning victory in the Brexit referendum of June 2016 owed much of its success to its carefully calibrated slogan “Take Back Control.” Akrasia, which is discussed in depth by Socrates, Plato, and especially Aristotle in the Nicomachean Ethics, is the contrary of control. It means literally “not being in command of oneself” and is translated variously as “weakness of will,” “incontinence,” and “loss of self-control.” To Aristotle, an akratic is a person who knows the right thing to do but can’t help doing the opposite. This is not just, as he himself seems to have intuited, Boris Johnson to a tee. It is also the reason why he embodies more than anyone else a Brexit project in which the very people who promised to take back control are utterly incapable of exercising it, even over themselves. “Oh God, oh Gawd,” asks Barlow in a question that now echoes through much of the British establishment, “why had he done it? Why had he put himself in this ludicrous position?”
To grasp how Johnson’s akratic character has brought his country to a state approaching anarchy, it is necessary to return to the days immediately before February 21, 2016, when he announced to an expectant throng of journalists that he would support the Leave campaign. This was a crucial moment—polls have since shown that, in what turned out to be a very close-run referendum, Boris, as the mayor of London had branded himself, had a greater influence on voters than anyone else. “Character is destiny, said the Greeks, and I agree,” writes Johnson in The Churchill Factor, his 2014 book about Winston Churchill, which carries the telling subtitle “How One Man Made History.” While the book shows Johnson to be a true believer in the Great Man theory of history, his own moment of destiny plays it out as farce, the fate of a nation turning not on Churchillian resolution but on Johnsonian indecision. For Johnson was, in his own words, “veering all over the place like a shopping trolley.” On Saturday, February 20, he texted Prime Minister David Cameron to say he was going to advocate for Brexit. A few hours later, he texted again to say that he might change his mind and back Remain.
Sometime between then and the following day, he wrote at least two different columns for the Daily Telegraph—his deadline was looming, so he wrote one passionately arguing for Leave and one arguing that the cost of Brexit would be too high. (Asked once if he had any convictions, Johnson replied, “Only one—for speeding…”) Then, early on Sunday evening, he texted Cameron to say that he was about to announce irrevocably that he was backing Leave. But, as Cameron told his communications director, Craig Oliver, at the time, Johnson added two remarkable things. One was that “he doesn’t expect to win, believing Brexit will be ‘crushed.’” The other was staggering: “‘He actually said he thought we could leave and still have a seat on the European Council—still making decisions.’”
The expectation—perhaps the hope—of defeat is telling. Johnson’s anti-EU rhetoric was always a Punch and Judy show, and without the EU to play Judy, the show would be over. But the belief that Britain would keep its seat on the European Council (which consists of the leaders of each member state and makes most of the EU’s big political decisions), even if it left the EU, is mind-melting. Not only was Johnson unconvinced that he was taking the right side on one of the most important questions his country has faced since World War II, but he was unaware of the most basic consequence of Brexit. Britain had joined the Common Market, as it was then called, in 1973 precisely because it was being profoundly affected by decisions made in Brussels and was therefore better off having an equal say in those decisions. Johnson’s belief that Britain would continue to have a seat at the European table after Brexit suggested a profound ignorance not just of his country’s future but of its entire postwar past.
This ignorance is not stupidity—Johnson is genuinely clever and, as his fictional alter ego Barlow shows, quite self-aware. It is the studied carelessness affected by a large part of the English upper class whose manners and attitudes Johnson—in reality the product of a rather bohemian bourgeois background—thoroughly absorbed. Consequences are for the little people, seriousness for those who are paid to clean up the mess. (...)
Johnson has always understood that a vivid lie is much more memorable than a dull truth. He is a product of the tight little world of English class privilege in which the same people move from elite schools to elite universities to (often interchangeable) careers in politics and the media. (Johnson’s contemporaries at Oxford included David Cameron, a fellow member of the aggressively elitist Bullingdon Club; his own main rivals for the Tory leadership, Jeremy Hunt and Michael Gove; and the political editors of the BBC and Channel 4 who now report on him.) From Oxford he soon sailed into a position as a graduate trainee at The Times. It was there that he learned a valuable lesson: it pays to fabricate stories. The Times had to fire him because he sexed up a dull story by inventing lurid quotes and attributing them to a real Oxford historian (who happened to be his own godfather). Instead of ending his journalistic career, this was the seed from which it blossomed.
by Fintan O’Toole, NYRB | Read more: Image: Andrew Parsons/i-Images/eyevine/Redux
There is an old puzzle that goes like this: When you get to heaven, how will you know which man is Adam? The answer is that Adam will be the one man without a navel—he was never connected to an umbilical cord.
In the 1850s, evidence was mounting that life on Earth had existed longer than the six or seven thousand years that could be extracted from the Bible. Naturalist Philip Henry Gosse was a man of science as well as a committed Christian, and although he couldn’t deny this accumulation of facts, his allegiance to the Bible made it impossible for him to accept a multi-million-year evolution. His solution was to assert that God had brought the world into being entire on that first biblical morning, and with it the whole backstory of life on Earth: the Neanderthal skeletons in German caves, the sabertooths in California tar pits, all the countless remains of extinct shellfish, all the strata that only apparently took millennia to be laid down as seas rose and shrank away—and also the navel on a grown-up just-awakened Adam, and the beard on his face for that matter. God could do that, and he did. Why? Perhaps to challenge our faith: credo quia absurdum, as the early Christians could say about the weird things they were offered in sacred teachings—I believe it because it’s impossible. Gosse called the book in which he laid out this notion Omphalos, Greek for “navel.” And of course it can neither be proved nor disproved; no facts can be proffered that make it impossible, and none that can sustain it.
Ted Chiang, best known for the story that formed the basis of the Academy Award–winning film Arrival (2018), has brought out a new volume of stories called Exhalation, and in it is a story called “Omphalos”, which takes up Gosse’s paradox and reverses it. In Chiang’s story, Adam indeed had no navel—or rather the earliest human bodies, made in the beginning by God and recently discovered mummified and complete, have no navels. Ancient trees that God brought into being full-grown have not experienced the seasons, droughts, and injuries that create the rings by which their age can be calculated. Only after their instantaneous creation did they begin to produce those rings. Other lifeforms exhibit the same effect: they came to be full-grown, and only began to change in time. The proof of God’s love for his creations, including human beings, is found in close study of such natural phenomena. So this is not our world, and a scientist who works in it addresses her questions about it to God himself in prayer; like Gosse she is a careful, thorough, dedicated researcher. Nothing she knows or learns in her work can suggest any doubts to her, and she is right not to doubt. When doubts come, they come from discoveries in astronomy, not biology: God, it seems, has other worlds he is interested in.
Chiang used to worry whether he could make it as a writer, given how slowly he works—as it stands, all his published stories fit in two slim volumes, Exhalation and his first collection, Stories of Your Life and Others (2002). Yet from the first stories he published, Chiang established a style of storytelling that is his alone. Many of the stories turn on a possibility in physics or mathematics or biology that either creates a world unlike ours, or shows us that our own world is not what we think it is. In science fiction (SF)—which is what Chiang writes, though sometimes just barely—the science-fictional things are what bear the meaning and produce the emotional force of the story, and Chiang’s science-fictional things are like no other writers’, even when they turn on much-used (and abused) concepts such as quantum mechanics, time reversals, or alien contact.
Take the title story in Exhalation, which describes an enclosed universe of beings living lives entirely different from and yet precisely like our own. “It has long been said that air (which others call argon) is the source of life,” a narrator begins. “This is not in fact the case.” In (on?) this world it will turn out to be not the case, but for the beings of this world it might as well be. They are apparently metallic, and their “air” comes to them from tanks, called “lungs” herein, which are installed in their bodies and, when emptied each day, must be removed and replaced with full ones, available at public stations, where they also meet neighbors and chat. The argon (for that’s what it actually is) is piped up from a vast underground reservoir—“the great lung of the world.” When this is established for the reader, the biology gets stranger. These durable people rarely die; their titanium skins cover elaborate and delicate systems of rods and pistons by which they move and act. Brains, however, are more difficult to study, and the narrator of the story has to remove the plate at the back of his head, and with mirrors look in wonder and delight into the mechanism. At the same time (there is often an at-the-same-time in Chiang’s stories), inexplicable slowings of time are being observed; the differences of air pressure within persons and air pressure outside, which allow motion and even thought, are inching, through entropy, toward equivalence, thus death: there is no stopping it. The narrator is setting down his personal account of this for a probably imaginary future visitor, to whom he writes that “the tendency toward equilibrium is not a trait peculiar to our universe but inherent in all universes.” The same fate faces us. (...)
Chiang’s SF differs from most SF in many ways, but the most striking—and pleasing—difference is that there are almost no villains in his stories. He shares this with Ursula K. Le Guin, who wrote: “Herds of Bad Guys are the death of a novel. . . . Whether they’re labeled politically, racially, sexually, by creed, species, or whatever, they just don’t work.” The only true villain in this collection is Morrow, deceitful dealer in “paraself” technology in “Anxiety is the Dizziness of Freedom,” a story which could be called noir. Mostly Chiang’s characters tend to think, intensely; they explore, go wrong, puzzle out, work through—not only science problems but personal ones, though many of the latter are the result of the former.
by John Crowley, Boston Review | Read more: Image: Jiuguang Wang
On the morning of her 36th birthday, journalist Rachel Syme took to Twitter to ask: "I feel like 33-38 is a really tough age for a lot of women I know; feels like so many big decisions and future plans have to be squeezed into this lil window; just me?
"It's not just a baby decision which, yes, is huge in those years and looms over everything. It feels like all my friends this year are doing this huge re-evaluation of everything. It's a time of lurches and swerves."
It turned out that, no, it was not just her.
Instead, she had touched a nerve, prompting an avalanche of shared experiences and advice from strangers around the world who understood exactly how she felt. There were hundreds of responses, just under 1,000 retweets, and 9,000 likes.
Rachel told the BBC: "The messages, both private and public, just don't stop coming.
"I felt like somewhere in my youth, I decided that 36 was my 'scary age' but now it feels like I'm here and while things are coalescing both in good ways professionally and personally, it's also in a scary way."
She added that the people contacting her were "describing how they were 'going through the swerve' so that's what I'm now calling it".
Rachel said she would look around and see her friends who were in the same age bracket all experiencing this "unspoken period of change" involving major life decisions.
Some were new mums, others were breaking up with their long-term partners and others were moving across the country.
"I feel like nobody talks to you about what it's like to be this age. We have the youth; spunk, energy, beauty, and there's so many things people feel like they must do - but where are the conversations about all of the big decisions we need to make?"
Although the New Yorker writer hoped her vulnerability on Twitter would be a "generative exercise", she never expected it to spark such a global conversation.
"I read so many articles about people who live with their parents for longer than before, while we also know our generation has such little job security," she said.
She added that people take longer to settle down, live longer and have more choice. "Basically there is just so much going on."
by Dhruti Shah, BBC | Read more: Image: Rachel Syme
[ed. Certainly was a swerve time for me.]
Looks directly into camera: Did you really think we’d choose another show?
No, but seriously. We considered other very good series for this honor but kept coming back to Fleabag, the same way Fleabag, the character created and played by the magnificent Phoebe Waller-Bridge, keeps going back to the Priest during the perfect second season of this fantastic series. The attraction can’t be denied.
The six episodes that make up season two landed on Amazon Prime on May 17, two months after its initial U.K. airing on BBC, and the same weekend that the Game of Thrones finale aired. After a couple days of GOT-ending outrage and disappointment, Fleabag took over the TV discourse. The most massive show on television, one with dragons, battles that take days to shoot, and millions upon millions of viewers, was quickly overshadowed by a series about a woman resisting her feelings for a priest.
When people finished bingeing that second season, it was as if they wanted to shout their love for it from rooftops. The day after one of my best friends made her way through it, she texted me, “I finished Fleabag. Nothing will ever be that good again.” It didn’t even sound like hyperbole.
So what makes Fleabag season two elicit such responses at a time when it’s harder than ever for a single work of television to capture public attention? If I had to single out one thing, aside from Andrew Scott, a.k.a. the Hot Priest, it’s how unbelievably tight the show is. There are just six episodes of Fleabag. Each one lasts 27 minutes or less. From the very beginning, it drops us into a moving car and never lets up on the gas. In an extremely efficient kickoff, it recaps the major moments of the first season, advises in a single title card that season two begins exactly 371 days, 19 hours, and 26 minutes after that previous season ended, and shows us Fleabag in a bathroom, wiping a bloody nose for reasons we don’t yet understand. “This,” Fleabag explains, breaking the fourth wall in her signature fashion, “is a love story.” We don’t yet know why her face is bloody, or why there’s another bleeding woman in the bathroom with her, or who is standing right outside the door asking, “Can I do anything?” Smiles. Charm. Off we go.
Mainstream comedy tends to move at a much quicker clip than it did even a decade ago, the result, perhaps, of shorter attention spans, and the influence of lickety-split television like Arrested Development and 30 Rock. But some sitcoms move quickly simply to prove they can exceed the speed limit. Fleabag, on the other hand, has its own rhythms and invites us to keep up. Season two is really a dance, between Fleabag and her sister Claire, Fleabag and the audience, Fleabag and the Priest.
Oh, lordy, the Priest. The fascination with his character can seemingly be explained in the simplest of terms — he’s hot — but that doesn’t quite capture it. It’s the way that Scott and Waller-Bridge, who have enough chemistry to ignite several biology labs’ worth of Bunsen burners, relate to each other that makes him sexy. As he and Fleabag become more intimate, we, as viewers, palpably feel like we are part of this relationship as well. That’s a testament to the performances of the two actors, but it also speaks to the way that Waller-Bridge has orchestrated our relationship to Fleabag.
By turning us into her confidantes, she draws us into her reality, and therefore her new relationship, too. As the only person who notices that Fleabag regularly winks and comments to some unseen presence, the Priest also becomes aware of us. And because both Fleabag and the Priest are aware of us, we feel seen, in a way that few television shows ever see us. What might have been a clever little narrative device on another show suddenly has much deeper resonance because Waller-Bridge uses it with such smart and specific intent. She has made her two leads fall in love with each other, but she’s also made us fall in love with them and a whole season of television she’s created.
Life teaches us not to expect perfection. No relationship is perfect. No job is perfect. No movie or TV show is perfect. But then along comes something like Fleabag that says, actually, every once in a while, you get to have this. You get to have perfect.
Soft robots are getting more and more popular for some very good reasons. Their relative simplicity is one. Their relative low cost is another. And for their simplicity and low cost, they’re generally able to perform very impressively, leveraging the unique features inherent to their design and construction to move themselves and interact with their environment. The other significant reason why soft robots are so appealing is that they’re durable. Without the constraints of rigid parts, they can withstand the sort of abuse that would make any roboticist cringe.
In the current issue of Science Robotics, a group of researchers from Tsinghua University in China and University of California, Berkeley, present a new kind of soft robot that’s both higher performance and much more robust than just about anything we’ve seen before. The deceptively simple robot looks like a bent strip of paper, but it’s able to move at 20 body lengths per second and survive being stomped on by a human wearing tennis shoes. Take that, cockroaches. (...)
To put the robot’s top speed of 20 body lengths per second in perspective, have a look at this nifty chart, which shows the relative running speeds of some animals and robots versus body mass:
This chart shows the relative running speeds of some mammals (purple area), arthropods (orange area), and soft robots (blue area) versus body mass. For both mammals and arthropods, relative speed follows a strong negative scaling law with respect to body mass: speeds increase as body masses decrease. For soft robots, however, the relationship appears to be the opposite: speeds decrease as body mass decreases. For the little soft robots created by the researchers from Tsinghua University and UC Berkeley (red stars), the scaling law is similar to that of living animals: higher speeds were attained as body mass decreased.
If you were wondering, like we were, just what that number 39 is on that chart (top left corner), it’s a species of tiny mite that was discovered underneath a rock in California in 1916. The mite is just under 1 mm in size, but it can run at 0.8 kilometer per hour, which is 322 body lengths per second, making it by far (like, by a factor of two at least) the fastest land animal on Earth relative to size. If a human were to run that fast relative to our size, we’d be traveling at a little bit over 2,000 kilometers per hour. It’s not a coincidence that pretty much everything in the upper left of the chart is an insect—speed scales favorably with decreasing mass, since actuators have a proportionally larger effect. (...)
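If you want to check that arithmetic yourself, here is a minimal sketch in Python. The mite's exact body length (the article says only "just under 1 mm") and the 1.8-meter human "body length" are assumptions chosen for illustration, not figures from the paper:

```python
# Back-of-the-envelope check of the relative-speed figures above.
# Assumed values: 0.7 mm for the mite ("just under 1 mm") and 1.8 m
# for a human; both are illustrative guesses, not measured data.

MITE_SPEED_KMH = 0.8    # reported top speed of the mite
MITE_LENGTH_M = 0.0007  # assumed body length
HUMAN_LENGTH_M = 1.8    # assumed human "body length"

mite_speed_ms = MITE_SPEED_KMH * 1000 / 3600          # km/h -> m/s
body_lengths_per_sec = mite_speed_ms / MITE_LENGTH_M  # relative speed

human_speed_kmh = body_lengths_per_sec * HUMAN_LENGTH_M * 3.6

print(f"mite: {body_lengths_per_sec:.0f} body lengths/s")  # ~317
print(f"human equivalent: {human_speed_kmh:.0f} km/h")     # ~2,057
```

With those assumed lengths the numbers come out close to the article's figures (roughly 320 body lengths per second for the mite, and a human-scaled equivalent of a little over 2,000 km/h).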
The researchers also put together a prototype with two legs instead of one, which was able to demonstrate a potentially faster galloping gait by spending more time in the air. They suggest that robots like these could be used for “environmental exploration, structural inspection, information reconnaissance, and disaster relief,” which are the sorts of things that you suggest that your robot could be used for when you really have no idea what it could be used for.
I sometimes think that if you could look in the safe behind Jeff Bezos’s desk, instead of the sports almanac from Back to the Future, you’d find an Encyclopedia of Retail, written in maybe 1985. There would be Post-It notes on every page, and every one of those notes has been turned into a team or maybe a product.
Amazon is so new, and so dramatic in its speed and scale and aggression, that we can easily forget how many of the things it’s doing are actually very old. And, we can forget how many of the slightly dusty incumbent retailers we all grew up with were also once radical, daring, piratical new businesses that made people angry with their new ideas.
This goes back to the beginning of mass retail. In Émile Zola’s Au Bonheur des Dames, a tremendously entertaining novel about the creation of department stores in 1860s Paris, Octave Mouret builds a small shop into a vast new enterprise, dragging it into existence through force of will, inspiration, and genius. In the process, he creates fixed pricing, discounts, marketing, advertising, merchandising, display, and something called "returns." He sends out catalogs across the country. His staff is appalled that he wants to sell a new fabric at less than cost; "that’s the whole idea!" he shouts. Loss leaders are nothing new.
Meanwhile, the other half of the story follows the small, traditional shopkeepers in the area, who are driven out of business one by one. Zola sees them as part of the past to be swept away. They’re doomed, and they don’t understand—indeed, they’re both baffled and outraged by Mouret's new ideas. Here’s the draper Baudu:
The place would soon be really ridiculous in its immensity; the customers would lose themselves in it. Was it not inconceivable? In less than four years they had increased their figures five-fold… They were always swelling and growing; they now had a thousand employees and twenty-eight departments. Those twenty-eight departments enraged him more than anything else. No doubt they had duplicated a few, but others were quite new; for instance a furniture department, and a department for fancy goods. The idea! Fancy goods! Really those people had no pride whatever, they would end by selling fish.
Mouret had a catalogue, but it was Sears Roebuck that used catalogs to transform retail again. Pages from the retailer's 1908 catalog show that white-label and private-label products are not new either, and you can bet that Sears was using sales data to decide which market segments to enter next.
Amazon, of course, is the Sears Roebuck of our time, but it’s more than that. Amazon is systematically going through every branch of the idea tree around what retail is, and doing it without any pride. It’s trying everything that anyone has ever tried before, and anything else it can think of that might make sense as well. There is no one saying, "That’s a good idea, but we’re a website, so we wouldn’t do that."
The clearest place to see this is in Amazon’s moves into physical retail. This is the opposite of pride or "principle." Amazon’s job is "to get you the thing," not "to be a website," so what are the best ways to do it? What else might work? The project to make a convenience store with no human checkout process is an obvious experiment, now that machine learning and computer vision offer a route to making it work. (There are a number of startups pursuing all the possible vectors for doing this.)
More interesting, though, are the Amazon Four-Star stores, physical retail stores—currently in New York and Berkeley, California—that only sell products rated highly by users on its site. I joked on Twitter that they feel as though they were designed by very clever people who have seen shops in Google Street View, but never actually been inside one. There's a sense of cognitive dissonance: the selection of products appears to be completely random. There’s a rice cooker, a Harry Potter Lego set, a cushion, a Roomba, a mixing bowl, a book about trees... It makes no sense. (In the words of Zola's Baudu, “Those people have no pride!”)
Of course, sometimes "it makes no sense" is the right reaction (remember the Fire Phone, after all). But when clever people do things that make no sense, it can be worth looking twice. Is this a new discovery model? A different way to change how people think about purchasing? Well, it’s another experiment.
All of this reminds me of stories about early Google, and how the company systematically rethought everything from first principles. Sometimes this was just a painful waste of time, as it learned the lessons everyone else had already learned, but sometimes the result was Gmail or Maps.
Sometimes the experiment is still in progress: though Amazon has managed to put Alexa into more than 50 million homes, it’s not yet clear what strategic value it will gain (I wrote about this here). But it’s better to own the experiment and get the option value than to sit on the business you already have and watch someone else try something new.
On the other hand, it’s interesting that Amazon seems to be doing as much experimentation as possible around the logistics model—from stores to drones to warehouse robots of every kind—but much less around the buying experience, other than small-scale tests of the Four-Star stores. After all, historically, department stores were about pleasure as much as they were about convenience or price. They changed what it meant to "go shopping" and helped turn retail into a leisure activity.
MOSCOW - Pushing back against charges that Senator Mitch McConnell is a Russian asset, the Russian President, Vladimir Putin, said on Tuesday that McConnell “has never been an asset to any country.”
“You can scour the four corners of the globe, and you will not find a nation that would ever in a million years consider Mitch McConnell an asset,” Putin said.
The Russian President urged pundits who have called McConnell a Russian asset “to look up the word ‘asset’ in the dictionary.”
“You will find that ‘asset’ means a useful or valuable thing,” Putin said. “The only part of that definition that fits McConnell is ‘thing.’”
Pressing his case further, he said that it was debatable whether McConnell was even an asset to his home state of Kentucky. “Maybe compared to Rand Paul he is, but that’s setting the bar ludicrously low,” he said.
Concluding his remarks, Putin said that people who ask, “Who does Mitch McConnell work for?” are asking the wrong question. “The question should be ‘When has Mitch McConnell ever worked?’ ” he said.
Meet The Right-Wing Consultant Who Goes From State To State Slashing Budgets
A few days after a powerful earthquake hit the state last November, Alaska Gov. Mike Dunleavy (R) issued an order increasing the power of the state’s budget office, led at the time by a woman who had lived in Alaska a mere two weeks.
In her newly empowered role, Donna Arduin — an infamous budget-slashing expert — and Dunleavy went on to cut hundreds of millions of dollars from the state budget. They aim to trim even more in her second year in the remote state.
It’s hardly Arduin’s first rodeo. The budget consultant has served in several Republican-led governor’s offices, slashing state expenses while cutting or resisting efforts to increase tax revenue. (...)
Since arriving in Alaska last year, Arduin has led the governor’s attempt to cut a whopping $1.6 billion in spending from education, social services, the arts, and nearly every other corner of state government. Between legislative cuts and line-item vetoes, Dunleavy has so far cut “almost $700 million” from the budget in his first year, Arduin said in an interview earlier this month, despite a failed recent attempt by the legislature to override his vetoes.
“We’re about halfway solved,” she said. “We’re going to be looking towards reducing the budget another $700 million next year.”
The University of Alaska’s Board of Regents, at a meeting in which they declared financial exigency last week, sounded less enthusiastic. The institution has been “crippled,” its president said, by the governor cutting roughly 40% of the school’s state funding — over $130 million. Thousands of students across the state found their state-funded scholarships suddenly defunded with the school year looming. “We will not have a university after February if we don’t make a move,” one regent noted.
Another Alaskan who had scheduled a dentures appointment four weeks after having his teeth extracted was left with gums flapping in the wind, after the governor eliminated Medicaid dental coverage for adults. That saved the state $27 million.
But the steep cuts aren’t surprising to Americans in several other states. Following an internship in the Reagan-era Office of Management and Budget and stints at Morgan Stanley and elsewhere, Arduin has crisscrossed the country slashing state budgets left and right.
“I have no sympathy for people who want handouts from the government,” Arduin told Duke Magazine for a 2006 profile.
It shows. (...)
As then-Florida Gov. Jeb Bush’s budget director, Arduin pushed a plan that would have empowered a statewide board of appointed doctors, pharmacists and others to decide which drugs could be prescribed using Medicaid funds. To make her point, Arduin pointed to HIV-positive men receiving Viagra prescriptions. “If it were up to me, the state wouldn’t pay for it at all,” she said.
In the process of cutting $8.1 billion over five years in Florida, the Los Angeles Times later reported, “Florida eliminated money for eyeglasses, hearing aids and dentures for poor seniors and forced 55,000 low-income children onto health insurance waiting lists.”
At that point, Arduin was “on loan” from Bush’s office to then-California-governor-elect Arnold Schwarzenegger’s, first as an unpaid budget expert and then as the state’s full-time budget director. Arduin ultimately left California after 11 months as Schwarzenegger’s adviser. An initial budget proposed under Arduin’s leadership cut $274 million from programs for developmentally disabled people, the Los Angeles Times later reported, until furious protest led the governor to reconsider. Overall, the budget deficit she’d sought to tackle only got worse.
“We didn’t solve the problem. We made it worse,” Michael Genest, who worked with Arduin when she was California Department of Finance director and later held the same post, told the Anchorage Daily News in a profile of Arduin last week. “That was the tradeoff.”
One of her partners at the consulting firm, Stephen Moore, commented at the time to the Los Angeles Times: “I think that her attitude is, I’ve come and rescued California, and pretty soon it’s time to pass the baton to someone else and go back to Florida or privatize herself in some way.”
Later, in Illinois, Arduin spent just eight months as Republican Gov. Bruce Rauner’s budget adviser, receiving an estimated $165,000 for her work after failing to come up with a budget the Democratic legislature found palatable. Rauner called Arduin “the smartest state government budget person in America.” (He was subsequently defeated in a reelection bid.)
Just as Arduin’s bids to slash budgets in California and Illinois met resistance outside the governor’s office, many Alaskans are frustrated with her lack of familiarity with their unique state.
The leader of a tribal consortium, Melanie Bahnke, told Arduin not to “use the word ‘our’ when referring to our people, our state and our issues” at an event sponsored by Americans For Prosperity, the Anchorage Daily News noted.
That might sound withering, but by now, Arduin has grown used to such critiques, as she noted to Duke Magazine in 2006.
[ed. Alaska is a complete mess right now. Even the ferries are out of commission due to strikes. All this because a new governor promised to dole out more money each year to residents from the state's Permanent Fund (an oil royalty savings account) while cutting state services. See also: The lunacy of the PFD fight (ADN).]