Tuesday, October 24, 2017

Ticked Off: What We Get Wrong About Lyme Disease

My sister Camilla and I stepped off the passenger ferry onto the dock at Vineyard Haven, Martha’s Vineyard’s main port, with a group that had already begun their party. They giggled, dragging coolers and beach chairs behind them. We competed to see how many items of Nantucket red we could spot.

Not that we were wearing any. Camilla wore shorts with white long underwear underneath, and I wore beige quick-dry hiking pants. Both of us had on sneakers with long white socks. It was late June, perfect beach weather. The water sparkled. But we weren’t headed toward the ocean. We were there to hunt for ticks.

On the island, we hopped in a cab. Camilla looked longingly out the window as we passed the turns for the town beach and Owens Park Beach. The driver pointed out the location of the famous shark attack beach from Jaws. We drove on south to Manuel Correllus State Forest, an unremarkable park in the center of the island and the farthest point from any beach.

Deer ticks, or blacklegged ticks, are poppy-seed sized carriers of Lyme disease. We needed to collect 300 before the last ferry returned to Woods Hole, Massachusetts that night. We each unfurled a drag cloth—a one-meter square section of once-white corduroy attached to a rope—and began to walk, dragging the cloth slowly behind us as if we were taking it for a stroll. The corduroy patch would rise and fall over the leaves and logs in the landscape, moving like a mouse or a chipmunk scurrying through the leaf litter. Ticks, looking for blood, would attach to the cloth. Every 20 meters, we’d stoop to harvest them.

Tick collecting made it to Popular Science’s 2004 list of worst science jobs alongside landfill monitor and anal wart researcher. On cool days, though, sweeping the forest floor, kneeling to pluck ticks from corduroy ridges, the job became rhythmic. I felt strangely close to the forest. As I soon found out, the work got me closer to people, too.

Sometimes hikers would stop by, curious, then repulsed. They would want to confirm the proper way to pull off ticks (with tweezers planted close to the skin, perpendicularly), or to tell us about their diagnoses. Lyme disease isn’t like many of the diseases studied by my friends in the epidemiology department, where I was a doctoral student. No one talks about their grandmother’s syphilis infection, caused by Treponema pallidum, another spirochete bacterium.

But once people heard what Camilla and I were collecting, stories of brushes with ticks and family members’ diagnoses were shared freely. I quickly became the “tick girl.” When I started my dissertation I was preoccupied by the ecological question: How have humans altered the environment and triggered a disease emergence? By the time I finished, I realized that far more interesting were the rich and revealing tick stories shared with us along the way.

Illness makes us talk. “This is true of all forms of pain and suffering,” Arthur Kleinman, an anthropologist and physician at Harvard University, told me. We talk about illness “to seek assistance, care, and in part to convey feelings about fear, anxiety, or sadness.” In his book, The Illness Narratives, Kleinman writes that “patients order their experience of illness … as personal narratives.” These narratives become a part of the experience of being sick. “The personal narrative does not merely reflect illness experience, but rather it contributes to [it].”

The result is a peculiar togetherness. Once, a friend’s mom emailed that she’d just pulled off her first tick of the season, from her pubic hair: “I’m guessing it doesn’t surprise you to hear, Katie, that you came to mind almost immediately when I discovered the little bugger? I’m afraid that ticks and you will be forever linked in my mind.” Naturally some took the motif too far. One creepy grad student thought that, because I was standing in front of a tick poster at an academic conference, I’d want to hear about the time he pulled a tick off his dick.

The country singer Brad Paisley romances the tick: “I’d like to see you out in the moonlight / I’d like to kiss you way back in the sticks / I’d like to walk you through a field of wildflowers / And I’d like to check you for ticks.” I’m with Paisley here. Creeps aside, tick grooming is an act of love. My sister and I were diligent in the tick checks we gave ourselves and each other. Most nights, we’d pull off several at the campsite showers. (...)

The idea that the natural and human exist in separate realms is the very “trouble with wilderness,” the environmental historian William Cronon wrote in his 1995 book Uncommon Ground. The wilderness that we’ve feared, romanticized, and valorized over the last few hundred years, he says, is a fantasy:
[Wilderness] is quite profoundly a human creation—indeed, the creation of very particular human cultures at very particular moments in human history … Wilderness hides its unnaturalness behind a mask that is all the more beguiling because it seems so natural. As we gaze into the mirror it holds up for us, we too easily imagine that what we behold is Nature when in fact we see the reflection of our own unexamined longings and desires.
In the stories told by our doctors, our parks, and the CDC, ticks are invaders. To defend ourselves, we use insect repellent, clothing, and prophylactic antibiotics; fences, signs, and pesticides. “When it comes to pesticides, the environmental toxin par excellence, Lyme patients are often its greatest proponents,” writes Abigail Dumes, an anthropologist at the University of Michigan. We prefer the risk posed by pesticides to the fear of Lyme, Dumes explained to me. They let us become actors instead of victims. By dosing ourselves with pesticides (or antibiotics), we gain control of our risks. Ticks, on the other hand, are uncontrollable. “It’s difficult to live with the idea that there are enormous threats and many can’t be controlled,” Kleinman tells me.

The problem is our defensive barriers aren’t working particularly well. Deer ticks are now established across 45 percent of United States counties. Their range has more than doubled in the last 20 years. Reported cases of Lyme disease have more than tripled since 1995 and the CDC estimates that more than 300,000 Americans fall ill each year. The story of tick-as-invader isn’t particularly helpful—or complete.

by Katharine Walter, Nautilus | Read more:
Image: Katharine Walter

Jan Tarasin, Records series 1991

Up The Irony

There’s no limit to the waves of embarrassment that old metal dudes will rain down on their beleaguered, black-clad fans. Every time a hesher bangs his head, some fogey from the first or second gen kills the buzz: here’s Phil Anselmo doing a white power sign like a dickwad, or how about Gene Simmons’s failed bid to trademark the horns? Or, Ted Nugent, still sucking air? It certainly does suck. More recently, Dee Snider, the one-time frontman of the ’80s glam band Twisted Sister, who made an entire career out of appropriating crossdressing, is pissed about the “new” trend of non-metal fans wearing metal tees.

On October 17, Snider tweeted, “Gotta say, this new trend of non-metal fans wearing vintage metal T’s if [sic] pretty sickening. Metal is not ironic! Dicks.” Where the hell do I even begin? Old Man Snider is so completely out of touch with the culture he was once a part of, that he thinks a) the trend of wearing heavy metal shirts ironically is new, and b) that heavy metal itself is not ironic.

During the mid to late ’90s (heavy metal’s dark ages), every high school emo band on earth had at least one kid who donned an ironic Maiden tee. While no one could expect the lead singer of a band that barely classifies as metal to be aware of this particular phenomenon, the completely unavoidable resurgence of heavy metal in the early to mid 2000s (which is the only reason anyone under the age of 50 might give a damn about what Dee Snider thinks) was led by bands like Mastodon and The Sword, whose stock in trade was equal parts sincerity and irony. They showed a level of care and appreciation for old metal that led them to craft intricate, loud compositions that sounded fresh and exciting compared to whichever snooze fest James Mercer was packaging as a Shins album that year. That, coupled with the fact that what those bands were saying was just fucking funny, made the resurgence so timely and welcome. This balance is the only thing that allowed heavy metal to force its way back into any kind of cultural relevance.

Snider followed his first ill-conceived tweet on the subject with this the next day:

“It’s not just the wearing of our metal T’s, it’s their cherry picking of our style #skulls#metalhorns These are OUR symbols; OUR image.”

And that’s where his rant takes the all-too-predictable turn from stupid to problematic. Who is this OUR that Dee Snider imagines? And who is the THEY? The group that Dee Snider fancies himself a part of is a genre of music that wouldn’t exist had white people not stolen the blues and early rock and roll from people of color. Heavy metal is just one step in a long line of music born of that initial theft. The heavy metal that I know and love does not discriminate. It’s inclusive and open-minded, and it must reckon with a past that includes Pantera proudly waving Confederate flags and Lemmy sporting a straight-up Nazi uniform for half a century. It’s a heavy metal that struggles to turn a profit and can ill afford to alienate any person who wants to buy a shirt and wear it in public. People are being bombed and gunned down at concerts across the world, and no one is referring to them as pop or country music fans. An old white guy crying cultural appropriation reeks of hypocrisy. It’s time to let go of the ‘who is a true metal fan?’ debate and worry about things that actually fucking matter.

by John Dziuban, The Awl | Read more:
Image: uncredited

Nobody Thinks About eBay

Scale is one of those things so many brands want, and eBay is enormous: it has 171 million users, with 1.1 billion listed items at any given time. But it’s also no longer the only game in town. There’s competition from all over, most notably from eBay's great rival to the north, Amazon; Brooklyn-based crafts giant Etsy; and venture-backed consignment sites like The RealReal and Poshmark. Deering may talk of the company’s advancements, but the truth is, eBay has fallen far behind.

It’s dedicated to remaining an online marketplace — nothing more than a platform on which buyers and sellers can interact — a position that’s hard to justify as it’s become less enticing to both kinds of users. It hasn’t invested in warehouses or inventory; it hasn’t introduced competitive shipping programs. It now needs to both differentiate and elevate itself, and then it must communicate all of that to the customer.

These days, 88 percent of postings are “Buy It Now” items, not at all tied to the auction function eBay is known for, and 81 percent of what’s available for sale is new. To eBay, new means unopened, never-used items; this claim is murky, though, as most items are still coming from third-party sellers and not from brands themselves. In fact, eBay has become a haven for flipping, a practice in which users sell in-demand merchandise at exponentially higher prices, further adding to eBay’s sometimes-dubious reputation.

eBay also thinks it’s positioned to acquire Millennial and Gen Z customers who have largely ignored the site. “Younger customers don’t have misperceptions of eBay — they don't have any perceptions,” says Deering. “We’re not even in their awareness at all.”

The company’s research has found that a younger audience wants unique products and “is searching for items that push against conformity.” In this way, Deering believes eBay can be something of a foil to Amazon: “People felt like they were becoming anti-human because Amazon is so habitual, but that isn’t us. If you love Converse, you come to our site because there’s every color, every graffiti-ed version, vintage. You’re not going to get that if you go onto Amazon or into a department store.”

The goal, as eBay’s vice president of merchandising Jay Hanson puts it, is to get customers to think of eBay as the first shopping site they should visit, no matter what they’re looking to buy. But can the once-dominant company actually take that crown from Amazon? Or compete with much nimbler startups that offer white-glove services and more curated and easily navigated shopping experiences? It seems wildly unlikely, but eBay’s determined to try. (...)

In the shadow of competitors big and small, eBay has remained stagnant. Its 2016 net revenue hovered just under $9 billion — a significant figure, but one that's only marginally risen over the last five years. Analysts attribute eBay’s shortcomings to outdated technology and a confounding user experience.

“The eBay site hasn’t really gone through dramatic levels of change since the beginning, and if there was any change, it was subtle and not slick enough,” says Sean Maharaj, a director at global management consulting firm AArete. “The customer interface, the website, it is not on par with some other companies that are coming at this with a new digital strategy. It’s unfriendly and it’s not easy to navigate.”

One of eBay’s main bragging points — that it has 1.1 billion listings — has also become a source of weakness for the company. Organizing that many items is a herculean task, particularly when the bones of the site are now decades old. Greg Portell, a retail partner at global strategy and management consulting firm A.T. Kearney, understands why eBay positioned itself for so long as a marketplace for everything and anything, as opposed to offering a more edited selection of products.

“If you think of successful retailers like T.J. Maxx or Marshall’s, you can justify it,” says Portell. “They aren’t known to be very neat and tidy, and they do extremely well. My guess is eBay was emphasizing this attitude, as opposed to a clean, personalized experience you’d expect from a well-curated store. eBay’s problem now, though, is they are stuck in a middle space where it’s hard to differentiate. You have smaller, niche sites that can curate and provide a less cluttered environment, which is what’s growing right now.”

by Chavie Lieber, Racked |  Read more:
Image: Christie Hemm Klok

Cash Prizes for Bad Corporate Citizenship, Amazon Edition

Everyone in the urban space is busy handicapping the Amazon horserace, to see which city will land Amazon’s HQ2, which promises to be the biggest economic development prize of the 21st century. Amazon’s RFP, issued last week, invites metro areas with a million or more population to submit their entries.

Prominent among them: Show us your incentive packages:
Capital and Operating Costs – A stable and business-friendly environment and tax structure will be high-priority considerations for the Project. Incentives offered by the state/province and local communities to offset initial capital outlay and ongoing operational costs will be significant factors in the decision-making process. 
Incentives – Identify incentive programs available for the Project at the state/province and local levels. Outline the type of incentive (i.e. land, site preparation, tax credits/exemptions, relocation grants, workforce grants, utility incentives/grants, permitting, and fee reductions) and the amount. The initial cost and ongoing cost of doing business are critical decision drivers.
There’s actually very little to add to the speculation about which city has the inside edge. Plenty has been written that makes the most obvious points. Brookings’ Joseph Parilla narrows the list to 20 cities that have the size to accommodate the company. Richard Florida makes a strong case for the top half dozen. The New York Times Upshot has gone so far as to pick a winner (Denver), although their article is actually more helpful for thinking about the winnowing process than handicapping the eventual winner.

A common refrain is that this beauty contest is ultimately revealing as to 21st century corporate decision-making factors. While there’s a lot of detail here, the factor that’s going to make the most difference is the availability of talent. When you’re hiring upwards of 50,000 highly trained workers, as we’ve said before, the location decision is going to be made by the HR department. A city has to have a substantial base of talent–especially software engineers–and be a place that can easily attract and accommodate more. Beyond the availability of talent, it’s likely that analysts are reading too much into the criteria laid out in the RFP. The request for proposals was not drawn up to reveal Amazon’s decision criteria. It was drawn up to solicit the maximum number of credible incentive packages.

If you’ve been around the economic development fraternity for long, you’ll know that this is just the latest in a series of similar high-profile corporate gambits to generate state and local subsidies. Back in the 1980s, states and cities were throwing themselves at GM’s newly minted Saturn division (remember them?), offering up subsidies for the Microelectronics and Computer Technology Corporation (MCC) and submitting bids to be the home of the Superconducting Super Collider. All three of these supposedly world-changing enterprises have since expired or been absorbed into other organizations.

Amazon, which, after all, makes it its business to know the decision preferences of tens or hundreds of millions of customers, is hardly likely to rely on cities for the information to make its decision. In all likelihood, the company already has in mind a preferred site, or perhaps two. The whole point of this exercise is to improve the company’s bargaining position for the location it wants. (...)

Corporations have choices. They could go about their business, and simply choose the best location, the one that makes the greatest business sense, and invest accordingly. Or they can, as Amazon, GE, and dozens of others have done, go through the ritual of pretending to entertain a wide range of proposals, and use the leverage of competing bids to sweat the best possible deal out of their preferred location. The net result of our current approach is to provide giant cash rewards to those who engage in the most cynical behavior. As a result, while Amazon may turn out to be a winner, it may come at the cost of fiscally impoverishing the city it chooses to locate in. The other losers will be all the businesses against which Amazon competes, who are too small to have the leverage to insist on a comparable level of public subsidy for their similar operations.

by Joe Cortright, City Commentary | Read more:
Image: Amazon
[ed. See also: This Is What Really Happens When Amazon Comes to Your Town.]

Monday, October 23, 2017

What the Washington Post/CBS DEA Investigation Tells You About Congress: It’s Really Bad

Recently, the Washington Post and CBS teamed up on an investigation that has now cost Congressman Tom Marino (R-PA) his nomination as the next Drug Czar. It is an incredible report, deeply sourced, with amazing details on how industry worked with Congress to gut DEA’s ability to prosecute drug trafficking abuses and deepened a horrific opioid epidemic in the U.S.

Many watchers have already latched on to the financial ties between industry and government. However, as troubling as they may be, it appears no laws were broken. No bribes were reported. Campaign contributions appear to conform to the letter of the law. Yet something feels clearly corrupt in this story. Which brings us to why the report is devastating on another front: it is a stunning display of institutional incompetence.

Congress as an institution, in a bipartisan fashion, both professionally and politically, failed. And its failure illustrates by far the most common form of influence in today’s Congress. This was not a case where 535 members of Congress were corrupted by a few thousand dollars of campaign contributions. This was a case where 535 members and their staffs didn’t know any better. This investigation did not uncover a crime; it exposed an institution in decline. Congress, in this instance, was unable to prevent the worsening of a national crisis because it didn’t know what it was doing.

Lobbying Influence


The bill at the heart of the investigation is not a major bill. It is short and obscure. If you read it, you would likely have no clue what it does. And it’s a perfect example of where lobbying has the greatest influence.

Contrary to popular belief, lobbyists often do not have their biggest impact on major legislation. The intense scrutiny major legislation receives and the rich information environment in which it is debated means much more competition for lobbyists trying to affect legislation. Multiple studies illustrate that Congress is not a vending machine: money in does not necessarily equal results out. So while lobbyists have an impact on major legislation, it is not often where the lobbying industry thrives. Instead, complex, low-salience issues are where lobbyists wield the most potent influence.

These minor bills comprise the overwhelming bulk of legislation Congress passes each year. Most of their work is on issues you’ve likely never heard of: removing restrictions on land transfers; improving services for older youth in foster care; increasing helium reserves at hospitals. These obscure and generally uncontroversial bills rarely make national headlines but take up the lion's share of Congress’s time.

This was true -- until a few days ago -- of the law that was the subject of the WaPo/CBS investigation. The bill was obscure when Congress passed it, and it remained so when President Obama signed it into law. Unless you have intimate knowledge of the authorizing statutes for the Drug Enforcement Administration, the internal mechanisms of the Department of Justice, and the Controlled Substances Act, you probably had no clue what this bill did. And that’s the point.

One of the investigation’s key characters is Linden Barber, a former DEA official who was the top lawyer at the Office of Diversion Control, which is charged with litigating abuses by industry and distributors. Barber left government for a better-paying job in the private sector. Ultimately, he became a lobbyist who pushed the idea for the legislation on Capitol Hill, finding champions for a bill that would weaken his former office’s ability to enforce the law.

The revolving door is nothing new to federal government, and it’s a fact of life on Capitol Hill. Staff and federal officials frequently leave government for the private sector, bringing with them deep expertise and valuable insider knowledge. This gives former government officials critical information advantages and can enable them to maneuver easily on complex issues most people do not understand.

But this was more than just a case of prior experience giving a lobbyist a leg up. This was a situation where expertise in the private sector far outweighed the expertise in either chamber of Congress, laying bare a knowledge gap between the two that many fear is widening.

The way this bill passed illustrates exactly how large - and how dangerous - this knowledge gap has become.

by Joshua C. Huder, Legbranch.com | Read more:

[ed. The Washington Post does no one any favors by installing a hard paywall (not even themselves, I'd suspect, even if they are making money). It'd be nice if Jeff Bezos' bottom line were a little more nuanced (especially for issues affecting society in general, as the Guardian and NY Times have attempted to do), but it's not. Bezos, and his company Amazon, apparently have no problem using loss leaders if they ultimately result in cornering whatever business category they're focused on, but that doesn't seem to apply to the Washington Post. There's a lot of good journalism being produced there these days, and the company could be a lot more influential if it wanted to be, but it isn't. Because... subscriptions. Hard paywall. Here's another link to the article in question: maybe it'll work, maybe it won't. Probably not, and if so, you'll never know what this story is about.]

Why Those Looking for the Next Crisis May Be Looking in the Wrong Places

Despite, or more accurately because, so many markets are at high levels, often on thin trading volumes, many investors are edgy. Even though markets famously climb a wall of worry, I can’t recall a time when there have been so many skeptical long investors.

For instance, even though the famed FAANG stocks keep racing to even loftier levels, a US stock market crash would be unlikely to do a lot of damage. Unlike the 1929 crash, this rally isn’t fueled mainly by money borrowed from banks. And unlike the dot-com bust, speculative stocks are not being used as a form of payment. Recall that companies that should have known better, such as Lucent (this, BTW, was Carly Fiorina’s doing) and McKinsey, were taking equity instead of cash, meaning as consideration for services. Informed insiders say McKinsey had to write off $200 million of stock it took in lieu of fees; the actual number may be higher given that McKinsey could have discounted its fees. That practice was sufficiently widespread to give the dot-com crash a tad more sting than it might otherwise have had. Even so, there was no blowback to the payment system, and the early 2000s recession was not terrible by historical standards.

This is far from a complete list, but investors are worried about ETFs, Deutsche Bank, festering banking problems in Italy, and China’s debts, as well as a longer than usual list of exogenous risks, including nasty events resulting from increasing hostilities with North Korea, Russia, and Iran, perhaps a nuclear disaster resulting from wild weather, and further down the road, a disorderly Brexit doing more damage to Europe and its not-so-solid banks.

The reason this situation is so striking is that historically, crises that did real damage hurt financial institutions. In the Great Depression, banks all over the world failed, wiping out depositors’ funds and big chunks of the payment system, and the resulting downdraft correctly made the survivors too fearful to lend. In the US, a lot of traditional lending has been displaced by securitization, so investors taking losses or simply getting nervous could damage credit creation.

If one were to step back, and this is hardly a novel thought, the root of investor nervousness is the sustained and extreme intervention by central banks all around the world in financial markets. No one in 2008 would have thought it conceivable that less than a decade later, one quarter of the world economy would have set negative policy interest rates. Even though markets only occasionally pay attention to fundamentals, sustained super low interest rates, by design, have sent asset prices of all kinds into nosebleed territory.

The Fed seemed to be the first to recognize that its monetary experiments had done little for the real economy, save allow for some additional spending via mortgage refis. It had done more to transfer income and wealth to the top 1%, and even more so to the top 0.1%, and enrich banks, all of which are hindrances to long-term growth. Yet Bernanke announced his intention to taper in 2013, and how far has the Fed gotten in getting back to normalcy? The answer is not very. And that’s because central bankers fear that their policies are asymmetrical: they can do more to dampen activity by increasing rates than they can to spur growth by lowering them. As we’ve repeatedly pointed out, businessmen do not go out and expand because money is on sale. They expand when they see commercial opportunity. The exception is in industries where the cost of money is one of the biggest costs of production…such as in financial services and levered speculation.

However, from what I can tell, the Fed’s desire to raise rates is driven by its perception that it needs to have short-term rates meaningfully higher, as in 2% or higher, so as to have room for cuts if the banking system gets wobbly. That is why it keeps treating a labor market that is flaccid, but less terrible than in the past, as robust.

But the potentially more interesting contradiction is in the posture of conservative businessmen. Higher interest rates will hurt their stock portfolios and the value of their homes. It will also hurt fracking, which is very dependent on borrowed money. Yet Republicans are more eager than Democrats to raise interest rates, apparently out of the misguided belief that low interest rates help labor, as opposed to capital (the Fed’s using the state of the labor market as its indicator as to whether to increase interest rates or not no doubt feeds this belief). Similarly, Republicans are far more exercised about the size of the Fed’s balance sheet and want it smaller. Again, there’s no logical reason for this move. The Fed’s assets will liquidate over time. They may not do much additional good sitting there (save the remittance payments back to the Treasury), but they aren’t doing any harm either.

In other words, the varying views about what to do about central bank interest rates and their holdings in many, too many, cases have to do with political aesthetics that often run counter to economic interests. A big reason that conservatives don’t like the Fed’s big balance sheet, even though the Fed is the stalwart friend of banks and investors, is that they still see the Fed as government, and government intervening in the economy offends them, even when it might benefit them. (Mind you, this is not the same as business exploiting government via “public private partnerships” or other approaches where commercial interests have their hand on the steering wheel). (...)

Now you might ask, how does this relate to the original question, that market mavens might be looking for the next crisis in all the wrong places?

The first is that despite widespread worries about a crisis, you don’t need to have a crisis to have a bubble deflate. In the runup to 2008, I expected the unwind of the reckless lending spree to look like Japan’s. Japan’s joint commercial and residential real estate bubbles were much larger relative to GDP than those in the US. Yet instead of a dramatic bust, the economy contracted like a car with no wheels banging down a steep slope. A mini-crisis of sorts did occur in 1997, when the authorities made the mistake of thinking the economy was strong enough to take some tightening, which kicked off a series of financial firm failures. So even if it turns out things do end badly, you can have the real economy suffer without having the financial system have a heart attack.

The second is that with some significant exceptions like Deutsche Bank, the authorities have succeeded in moving risk out of the financial system and more and more onto the backs of investors. That means the rich, but it also means pension funds, insurance companies, endowments, foundations, and sovereign wealth funds. Investors have already taken a hit via super low interest rates; economist Ed Kane estimated that in the US alone, that represented a $300 billion per annum subsidy to banks.

So even if we were to have something crisis-like, as in a sudden ratchet down in asset prices that stuck, it isn’t clear that the damage to critical financial plumbing would be significant.

by Yves Smith, Naked Capitalism |  Read more:
Image: Getty

Lembrou Canela

Sunday, October 22, 2017

After the End of the Startup Era

There’s a weird feeling afoot these days, in the Valley, and in San Francisco. Across the rest of the world — Denver, Santiago, Toronto, Berlin, “Silicon Glen,” “Silicon Alley,” “Silicon Roundabout,” Station F — it seems every city still wants to be a startup hub, dreaming of becoming “the new Silicon Valley.” But in the Valley itself? Here it feels like the golden age of the startup is already over.

Hordes of engineering and business graduates secretly dream of building the new Facebook, the new Uber, the new Airbnb. Almost every big city now boasts one or more startup accelerators, modeled after Paul Graham’s now-legendary Y Combinator. Throngs of technology entrepreneurs are reshaping, “disrupting,” every aspect of our economy. Today’s big businesses are arthritic dinosaurs soon to be devoured by these nimble, fast-growing mammals with sharp teeth. Right?

Er, actually, no. That was last decade. We live in a new world now, and it favors the big, not the small. The pendulum has already begun to swing back. Big businesses and executives, rather than startups and entrepreneurs, will own the next decade; today’s graduates are much more likely to work for Mark Zuckerberg than follow in his footsteps.

The web boom of 1997-2006 brought us Amazon, Facebook, Google, Salesforce, Airbnb, etc., because the Internet was the new new thing, and a handful of kids in garages and dorm rooms could build a web site, raise a few million dollars, and scale to serve the whole world. The smartphone boom of 2007-2016 brought us Uber, Lyft, Snap, WhatsApp, Instagram, Twitter, etc., because the same was true of smartphone apps.

Because we’ve all lived through back-to-back massive worldwide hardware revolutions — the growth of the Internet, and the adoption of smartphones — we erroneously assume another one is around the corner, and once again, a few kids in a garage can write a little software to take advantage of it.

But there is no such revolution en route. The web has been occupied and colonized by big business; everyone already has a smartphone, and big companies dominate the App Store; and, most of all, today’s new technologies are complicated, expensive, and favor organizations that have huge amounts of scale and capital already.

It is no coincidence that seed funding is down in 2017. It is no coincidence that Alphabet, Amazon, Apple, Facebook, and Microsoft have grown from “five big tech companies” to “the five most valuable public companies in the world.” The future belongs to them, and, to a lesser extent, their second-tier ilk.

It is widely accepted that the next wave of important technologies consists of AI, drones, AR/VR, cryptocurrencies, self-driving cars, and the “Internet of Things.” These technologies are, collectively, hugely important and consequential — but they are not remotely as accessible to startup disruption as the web and smartphones were.

AI doesn’t just require top-tier talent; that talent is all but useless without mountains of the right kind of data. And who has essentially all of the best data? That’s right: the abovementioned Big Five, plus their Chinese counterparts Tencent, Alibaba, and Baidu.

Hardware, such as drones and IoT devices, is hard to prototype, generally low-margin, expensive to bring to market, and very expensive to scale. Just ask Fitbit. Or Jawbone. Or Juicero. Or HTC. (However, in fairness, software and services built atop newly emerging hardware are likely an exception to the larger rule here; startups in those niches have far better odds than most others.)

Self-driving cars are even more expensive: like biotech, they’re a capital-intensive battle between huge companies. A few startups may — will — be expensively acquired, but that’s not the same as having a realistic chance of actually becoming major competitors themselves.

AR/VR is already far behind its boosters’ optimistic adoption predictions, and is both an expensive hardware problem and a complex software problem. Magic Leap has raised almost two billion dollars without releasing a product (!), but is by most (admittedly sketchy) accounts struggling. Meanwhile, Microsoft’s HoloLens, Google’s Cardboard / Tango / ARCore, and Apple’s ARKit continue to build successfully on their existing platforms.

Cryptocurrencies aren’t about making startups valuable; they’re about making the currencies themselves, and their decentralized ecosystems, valuable. The market capitalization of Bitcoin vastly exceeds that of any Bitcoin-based startup. The same is true for Ethereum. True believers argue that cryptocurrencies will overturn everything, in time, but read this Twitter thread and see if, like me, you can’t help finding yourself nodding along, even if, like me, you truly want the Internet and its economy to be decentralized.

So where does all this leave tech startups? Struggling, and probably hoping to be acquired by a larger company, ideally one of the Big Five. While some breakout startups will still doubtless arise, they’ll be far rarer than they were during the boom years.

by Jon Evans, TechCrunch | Read more:
Image: Wikimedia Commons

Why I Support Longreads


People often lament the lack of good journalism and diversity in their information diet. I see that as a side effect of living inside filter bubbles, a phenomenon exacerbated by the emergence of the social web. In reality, a lot of good journalism is happening all around us — we just don’t have the ability to find it easily, because of how the information utilities — Facebook, Twitter, and Google — work.

And perhaps that is why, more than ever, we need new ways to find, curate and disseminate good journalism, great writing, and diverse opinions. Email newsletters and specialist blogs are still an effective way to get more brain food. I wouldn’t be able to get through a single day without my friend Jason Hirschhorn’s newsletters. My understanding of IoT would be a lot poorer if it were not for Stacey Higginbotham’s newsletter. And there is Techmeme, which has evolved with the times and become a great resource for technology news.

And then there is Longreads. It is an old-fashioned blog that curates some of the best writing on the Internet. It is not for a quick hit; instead, it is about taking time to consume good, solid journalism. There are no advertisements. And the site is very mobile-friendly. I typically save most of the stories in Pocket and then read them while enjoying my tea.

It is a great resource for non-obvious stories. They work hard to find great journalism worth reading from across the web. And that is not all — they are paying writers to create originals. That is why it is important for me to support them, both with my attention and my dollars. I am a long-time supporter. They are currently running a fundraising drive of $25,000 — and each dollar they raise is matched by another $3 from their corporate benefactor, Automattic. If you love good reads, then it might be a great idea for you to support them — with as little as $5 or as much as you feel like.

From my vantage point, we need more of these curated and focused newsletters and websites. With smarter curation comes better information diet.

by Om Malik |  Read more:
Image: uncredited
[ed. Hey, Om. Maybe you should get out a little more? Check out Duck Soup, it's only been going for what, five or six years now? With no advertisements (but you can donate if you want to). Tell your friends.]

North Korea Is Playing a Longer Game Than the U.S.

If we think through the North Korea nuclear weapons dilemma using game theory, one aspect of the problem deserves more attention, namely the age of the country’s leader, Kim Jong Un: 33. Because peaceful exile doesn’t appear to be an option -- his escaping the country safely would be hard -- Kim needs strategies for hanging on to power for 50 years or more. That’s a tall order, but it helps us understand that his apparently crazy tactics are probably driven by some very reasonable calculations, albeit selfish and evil ones.

It is very difficult to predict the world a half-century out. Fifty years ago, China was in the midst of the Cultural Revolution, and Japan’s rise was not yet so evident. North Korea was possibly still richer than the South, which in 1960 was one of the poorest countries in the world. It’s unlikely anyone had a reasonable inkling of where things would stand today.

So if you are a dictator planning for long-term survival under a wide range of possible outcomes, what might you do? You don’t know who your enemies and your friends will be over those 50 years, so you will choose a porcupine-like strategy and appear prickly to everyone.
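The “porcupine” logic here is essentially a maximin argument: when you cannot predict which future you will face, choose the strategy whose worst case is least bad. A minimal sketch, using entirely invented payoff numbers purely to make the structure of that reasoning concrete:

```python
# Hypothetical maximin illustration of the "porcupine" strategy logic.
# The payoff numbers are invented for illustration only: rows are a
# dictator's possible strategies, columns are unpredictable 50-year
# geopolitical outcomes (e.g., US-dominant, China-dominant, regional chaos).

payoffs = {
    "align with China": (2, 8, 1),
    "align with US":    (8, 1, 2),
    "prickly to all":   (4, 4, 4),
}

def maximin(table):
    """Pick the strategy whose worst-case payoff is highest."""
    return max(table, key=lambda s: min(table[s]))

print(maximin(payoffs))  # -> prickly to all
```

Under this toy table, each alignment strategy has a catastrophic worst case, while the uniformly prickly strategy guarantees a tolerable outcome in every future — which is the shape of the argument the column is making.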

We Americans tend to think of Kim as an irritant to our plans, but his natural enemy in the long run is China. It is easier for North Korea to threaten Chinese cities than American ones with its weapons, and its nuclear status stands in the way of China’s becoming the dominant regional power in East Asia. Chinese public opinion has already turned against North Korea, and leaders wonder whether a more reliable, pro-Chinese alternative to Kim might be installed. Since assuming power, Kim has gone after the generals and family members with the strongest ties to China.

One way to interpret Kim’s spat with U.S. President Donald Trump is that he is signaling to the Chinese that they shouldn’t try to take him down because he is willing to countenance “crazy” retaliation. In this view, Beijing is a more likely target for one of his nukes than is Seattle.

More radically, think of Kim as auditioning for the U.S., Japan, South Korea, and India as a potential buffer against Chinese expansion. If he played his hand more passively and calmly, hardly anyone would think that such a small country had this capacity. By picking a fight with the U.S., he is showing the ability to deter just about anyone.

Another possible scenario from Kim’s perspective is that external pressures and sanctions rise, and North Korea can’t survive as a regional nuclear pariah. If this doesn’t seem likely today, remember we are talking about the next 50 years. So if Kim’s belligerence induces Japan or maybe South Korea to develop nuclear deterrents, that would take some of the pressure off him, as nuclear proliferation would become the regional default. Kim is probably more concerned with sheer survival than with managing other shifts in the balance of power.

by Tyler Cowen, Bloomberg |  Read more:
Image: Korean Central News Agency

Debby Mason, Scad (Horse Mackerel)

Saturday, October 21, 2017

The White-Minstrel Show

Ice-T never received an Academy Award, which makes sense inasmuch as his movies have been for the most part crap. But as an actor, you have to give the man credit: Along with other gangster rappers such as Ice Cube, he turned in such a convincing performance — amplifying negative stereotypes about black men and selling white people back their own Reagan-era racial panic in a highly stylized form — that people still, to this day, believe he was the guy he played on stage. One social-media critic accused him of hypocrisy for having recorded the infamous song “Cop Killer” before going on to a very lucrative career playing a police officer on television. Ice-T gave the man an honest answer: “It’s both acting, homie.”

Acting, indeed.

Pretty good acting, too, across the board in the rap world. Consider the strange evolution of Tupac Shakur, who went from the quiet, effeminate young man seen in this interview — a former acting and ballet student at the Baltimore School for the Arts apparently pointed like a rocket at a career in musical theater — to the “Thug Life” antihero persona that made him famous in a remarkably short period of time. He played tough-guy Roland Bishop in Juice and basically stayed in character for the rest of his public life. As with Ice-T, many of his fans assumed the stage persona was the real man. There’s a whole weird little racial dynamic in there waiting for some doctoral student to sort it out. Nobody expects Anthony Hopkins to eat a census worker.

A theater critic can’t really begrudge a performer for making a living, and Ice-T put on a great show. I do wonder how much damage those performers did by reinforcing and glamorizing criminal stereotypes of black men. And I do mean that I wonder — I do not know. Maybe the act is more obvious if you are the sort of person who is being dramatized or caricatured. (I experience something like that when I hear modern country songs on the radio, all that cheerful alcoholism and casual adultery and ridiculous good-ol’-boy posturing.) It would be weird to describe black men as “acting black,” but whatever they were up to was the opposite of “acting white.”

There’s a certain kind of conservative who loves to talk about “acting white,” i.e., about the legendary social sanction purportedly applied to African Americans who try too hard in school or who speak in an English that is too standard or who have interests and aspirations other than the ones that black people are stereotypically supposed to have. (“Acting white” isn’t a complaint exclusive to African Americans. My friend Jay Nordlinger relates a wonderful story about the American Indian educator Ben Chavis, who once was accused by a sister of “acting white.” His reply: “‘Acting white’ is not enough. I’m acting Jewish. Or maybe Chinese.”) Oh, how we love to knowingly tut-tut about “acting white,” with the obvious implication that black Americans corporately would be a good deal better off if they would do a little more acting white. That sort of thing is not entirely unique to conservatives, of course: Nine-tenths of all social criticism involving the problems of the American underclass consists of nice college graduates and policy professionals of many races and religions wondering aloud why they can’t be more like us, which is why so much social policy is oriented toward trying to get more poor people to go to college, irrespective of whether they want to do so or believe they would benefit from it.

Conservatives have a weakness for that “acting white” business because we are intellectually invested in emphasizing the self-inflicted problems of black America, for rhetorical and political reasons that are too obvious to require much elaboration. It’s a phenomenon that may or may not be exaggerated. John McWhorter argues that it is a real problem, and makes a pretty good case. So did President Barack Obama, who called on the nation to “eradicate the slander that says a black youth with a book is acting white.” I am not sure that a white man from Lubbock, Texas, has a great deal to add to President Obama’s argument there.

But I do have something to say about the subject of white people acting white.

We rarely used to put it in racial terms, unless we were talking about Eminem or the Cash-Me-Ousside Girl or some other white person who has embraced (or affected) some part of black popular culture. With the Trump-era emergence of a more self-conscious form of white-identity politics — especially white working-class identity politics — the racial language comes to the surface more often than it used to. But we still rarely hear complaints about “acting un-white.” Instead, we hear complaints about “elitism.” The parallels to the “acting white” phenomenon in black culture are fairly obvious: When aspiration takes the form of explicit or implicit cultural identification, however partial, with some hated or resented outside group that occupies a notionally superior social position, then “authenticity” is to be found in socially regressive manners, mores, and habits. It is purely reactionary.

The results are quite strange. Republicans, once the party of the upwardly mobile with a remarkable reflex for comforting the comfortable, have written off entire sections of the country — including the bits where most of the people live — as “un-American.” Silicon Valley and California at large, New York City and the hated Acela corridor, and, to some extent, large American cities categorically are sneered at and detested. There is some ordinary partisanship in that, inasmuch as the Democrats tend to dominate the big cities and the coastal metropolitan aggregations, but it isn’t just that. Conservatives are cheering for the failure of California and slightly nonplussed that New York City still refuses to regress into being an unlivable hellhole in spite of the best efforts of its batty Sandinista mayor. Not long ago, to be a conservative on Manhattan’s Upper East Side was the most ordinary thing in the world. Now that address would be a source of suspicion. God help you if you should ever attend a cocktail party in Georgetown, the favorite dumb trope of conservative talk-radio hosts.

We’ve gone from William F. Buckley Jr. to the gentlemen from Duck Dynasty. Why?

American authenticity, from the acting-even-whiter point of view, is not to be found in any of the great contemporary American business success stories, or in intellectual life, or in the great cultural institutions, but in the suburban-to-rural environs in which the white underclass largely makes its home — the world John Mellencamp sang about but understandably declined to live in.

Shake your head at rap music all you like: When’s the last time you heard a popular country song about finishing up your master’s in engineering at MIT?

White people acting white have embraced the ethic of the white underclass, which is distinct from the white working class, which has the distinguishing feature of regular gainful employment. The manners of the white underclass are Trump’s — vulgar, aggressive, boastful, selfish, promiscuous, consumerist. The white working class has a very different ethic. Its members are, in the main, churchgoing, financially prudent, and married, and their manners are formal to the point of icy politeness. You’ll recognize the style if you’ve ever been around it: It’s “Yes, sir” and “No, ma’am,” but it is the formality of soldiers and police officers — correct and polite, but not in the least bit deferential. It is a formality adopted not to acknowledge the superiority of social betters but to assert the equality of the speaker — equal to any person or situation, perfectly republican manners. It is the general social respect rooted in genuine self-respect.

Its opposite is the sneering, leveling, drag-’em-all-down-into-the-mud anti-“elitism” of contemporary right-wing populism. Self-respect says: “I’m an American citizen, and I can walk into any room, talk to any president, prince, or potentate, because I can rise to any occasion.” Populist anti-elitism says the opposite: “I can be rude enough and denigrating enough to drag anybody down to my level.” Trump’s rhetoric — ridiculous and demeaning schoolyard nicknames, boasting about money, etc. — has always been about reducing. Trump doesn’t have the intellectual capacity to duke it out with even the modest wits at the New York Times, hence it’s “the failing New York Times.” Never mind that the New York Times isn’t actually failing and that any number of Trump-related businesses have failed so thoroughly that they’ve gone into bankruptcy; the truth doesn’t matter to the argument any more than it matters whether the fifth-grade bully actually has an actionable claim on some poor kid’s lunch money. It would never even occur to the low-minded to identify with anybody other than the bully. That’s what all that ridiculous stuff about “winning” was all about in the campaign. It is might-makes-right, i.e., the politics of chimpanzee troops, prison yards, kindergartens, and other primitive environments. That is where the underclass ethic thrives — and how “smart people” came to be a term of abuse.

This involves, inevitably, a good deal of fakery.

The man at the center of all this atavistic redneck revanchism is a pampered billionaire real-estate heir from New York City, and it has been something to watch the multi-millionaire populist pundits in Manhattan doing their best impersonations of beer-drinkin’ regular guys from the sticks. I assume Sean Hannity picked up his purported love for country music in the sawdust-floored honky-tonks of . . . Long Island.

As a purely aesthetic enterprise, none of this clears my poor-white-trash cultural radar. I’m reminded of those so-called dive bars in Manhattan that spend $150,000 to make a pricey spot in Midtown look like a Brooklyn kid’s idea of a low-rent roadside bar in Texas. (There’s one that even has Lubbock license plates on the wall. I wonder where they got them — is there some kind of mail-order dive-bar starter kit that comes with taxidermy, Texas license plates, and a few cases of Lone Star? Maybe via Amazon Prime?) The same crap is there — because the same crap is everywhere — but the arrangement isn’t quite right. 

The populist Right’s abandonment of principle has been accompanied by a repudiation of good taste, achievement, education, refinement, and manners — all of which are abominated as signs of effete “elitism.” During the Clinton years, Virtue Inc. was the top-performing share in the Republican political stock exchange. Fortunes were made, books were sold by the ton, and homilies were delivered. The same people today are celebrating Donald Trump — not in spite of his being a dishonest, crude serial adulterer but because of it. His dishonesty, the quondam cardinals of Virtue Inc. assure us, is simply the mark of a savvy businessman, his vulgarity the badge of his genuineness and lack of “political correctness,” and his pitiless abuse of his several wives and children the mark of a genuine “alpha male.” No less a virtue entrepreneur than Bill Bennett dismissed those who pointed out Trump’s endless lies and habitual betrayals as suffering from “moral superiority,” from people on “high horses,” and said that Trump simply is “a guy who says some things awkwardly, indecorously, infelicitously.”

Thus did the author of The Book of Virtues embrace the author of “Grab ’Em By the P***y.” (...)

Ludwig von Mises was as clear-eyed a social critic as he was an economist, and he noted something peculiar about the anti-Semitism of the Nazi era: In the past, minority groups were despised for their purported vices — white American racists considered African Americans lazy and mentally deficient, the English thought the Irish drank too much to be trusted to rule their own country, everybody thought the Gypsies were put on this Earth to spread disease and thievery. But the Jews were hated by the Nazis for their virtues: They were too intelligent, too clever, too good at business, too cosmopolitan, too committed to their own distinctness, too rich, too influential, too thrifty.

Our billionaire-ensorcelled anti-elitists take much the same tack: Anybody with a prestigious job, a good income, an education at a selective university, and no oxy overdoses in the immediate family — and anybody who prefers hearing the New York Philharmonic at Lincoln Center to watching football on television — just doesn’t know what life is like in “the real America” or for the “real men” who live there. No, the “real America,” in this telling, is little more than a series of dead factory towns, dying farms, pill mills — and, above all, victims. There, too, white people acting white echo elements of hip-hop culture, which presents powerful and violent icons of masculinity as hapless victims of American society.

by Kevin D. Williamson, National Review |  Read more:
Image: Jonathan Ernst

What Are We Doing Here?

There is a great deal of questioning now of the value of the humanities, those aptly named disciplines that make us consider what human beings have been, and are, and will be. Sometimes I think they should be renamed Big Data. These catastrophic wars that afflict so much of the world now surely bear more resemblance to the Hundred Years’ War or the Thirty Years’ War or the wars of Napoleon or World War I than they do to any expectations we have had about how history would unfold in the modern period, otherwise known as those few decades we call the postwar.

We have thought we were being cynical when we insisted that people universally are motivated by self-interest. Would God it were true! Hamlet’s rumination on the twenty thousand men going off to fight over a territory not large enough for them all to be buried in, going to their graves as if to their beds, shows a much sounder grasp of human behavior than this. It acknowledges a part of it that shows how absurdly optimistic our “cynicism” actually is. President Obama not long ago set off a kerfuffle among the press by saying that these firestorms of large-scale violence and destruction are not unique to Islamic culture or to the present time. This is simple fact, and it is also fair warning, if we hope to keep our own actions and reactions within something like civilized bounds. This would be one use of history. (...)

I am not speaking here of the usual and obvious malefactors, the blowhards on the radio and on cable television. I am speaking of the mainstream media, therefore of the institutions that educate most people of influence in America, including journalists. Our great universities, with their vast resources, their exhaustive libraries, look like a humanist’s dream. Certainly, with the collecting and archiving that has taken place in them over centuries, they could tell us much that we need to know. But there is pressure on them now to change fundamentally, to equip our young to be what the Fabians used to call “brain workers.” They are to be skilled laborers in the new economy, intellectually nimble enough to meet its needs, which we know will change constantly and unpredictably. I may simply have described the robots that will be better suited to this kind of existence, and with whom our optimized workers will no doubt be forced to compete, poor complex and distractible creatures that they will be still.

Why teach the humanities? Why study them? American universities are literally shaped around them and have been since their founding, yet the question is put in the bluntest form—what are they good for? If, for purposes of discussion, we date the beginning of the humanist movement to 1500, then, historically speaking, the West has flourished materially as well as culturally in the period of their influence. You may have noticed that the United States is always in an existential struggle with an imagined competitor. It may have been the cold war that instilled this habit in us. It may have been nineteenth-century nationalism, when America was coming of age and competition among the great powers of Europe drove world events. Whatever etiology is proposed for it, whatever excuse is made for it, however rhetorically useful it may be in certain contexts, the habit is deeply harmful, as it has been in Europe as well, when the competition involved the claiming and defending of colonies, as well as militarization that led to appalling wars.

The consequences of these things abide. We see and feel them every day. The standards that might seem to make societies commensurable are essentially meaningless, except when they are ominous. Insofar as we treat them as real, they mean that other considerations are put out of account. Who died in all those wars? The numbers lost assure us that there were artists and poets and mathematicians among them, and statesmen, though at best their circumstances may never have allowed them or us to realize their gifts. (...)

A great irony is at work in our historical moment. We are being encouraged to abandon our most distinctive heritage—in the name of self-preservation. The logic seems to go like this: To be as strong as we need to be we must have a highly efficient economy. Society must be disciplined, stripped down, to achieve this efficiency and to make us all better foot soldiers. The alternative is decadence, the eclipse of our civilization by one with more fire in its belly. We are to be prepared to think very badly of our antagonist, whichever one seems to loom at a given moment. It is a convention of modern literature, and of the going-on of talking heads and public intellectuals, to project what are said to be emerging trends into a future in which cultural, intellectual, moral, and economic decline will have hit bottom, more or less.

Somehow this kind of talk always seems brave and deep. The specifics concerning this abysmal future are vague—Britain will cease to be Britain, America will cease to be America, France will cease to be France, and so on, depending on which country happens to be the focus of Spenglerian gloom. The oldest literature of radical pessimism can be read as prophecy. Of course these three societies have changed profoundly in the last hundred years, the last fifty years, and few with any knowledge of history would admit to regretting the change. What is being invoked is the notion of a precious and unnamable essence, second nature to some, in the marrow of their bones, in effect. By this view others, whether they will or no, cannot understand or value it, and therefore they are a threat.

The definitions of “some” and “others” are unclear and shifting. In America, since we are an immigrant country, our “nativists” may be first- or second-generation Americans whose parents or grandparents were themselves considered suspect on these same grounds. It is almost as interesting as it is disheartening to learn that nativist rhetoric can have impact in a country where precious few can claim to be native in any ordinary sense. Our great experiment has yielded some valuable results—here a striking demonstration of the emptiness of such rhetoric, which is nevertheless loudly persistent in certain quarters in America, and which obviously continues to be influential in Britain and Europe.

Nativism is always aligned with an impulse or strategy to shape the culture with which it claims to have this privileged intimacy. It is urgently intent on identifying enemies and confronting them, and it is hostile to the point of loathing toward aspects of the society that are taken to show their influence. In other words, these lovers of country, these patriots, are wildly unhappy with the country they claim to love, and are bent on remaking it to suit their own preferences, which they feel no need to justify or even fully articulate. Neither do they feel any need to answer the objections of those who see their shaping and their disciplining as mutilation. (...)

What are we doing here, we professors of English? Our project is often dismissed as elitist. That word has a new and novel sting in American politics. This is odd, in a period uncharacteristically dominated by political dynasties. Apparently the slur doesn’t stick to those who show no sign of education or sophistication, no matter what their pedigree. Be that as it may. There is a fundamental slovenliness in much public discourse that can graft heterogeneous things together around a single word. There is justified alarm about the bizarre concentrations of wealth that have occurred globally, and the tiny fraction of the wealthiest one percent who have wildly disproportionate influence over the lives of the rest of us. They are called the elite, and so are those of us who encourage the kind of thinking that probably does make certain of the young less than ideal recruits to their armies of the employed.

If there is a point where the two meanings overlap, it would be in the fact that the teaching we do is what in America we have always called liberal education, education appropriate to free people, very much including those old Iowans who left the university to return to the hamlet or the farm. Now, in a country richer than any they could have imagined, we are endlessly told we must cede that humane freedom to a very uncertain promise of employability. It seems most unlikely that any oligarch foresees this choice as being forced on his or her own children. I note here that these criticisms and pressures are not brought to bear on our private universities, though most or all of them receive government money. Elitism in its classic sense is not being attacked but asserted and defended.

If I seem to have conceded an important point in saying that the humanities do not prepare ideal helots, economically speaking, I do not at all mean to imply that they are less than ideal for preparing capable citizens, imaginative and innovative contributors to a full and generous, and largely unmonetizable, national life. America has known long enough how to be a prosperous country, for all its deviations from the narrow path of economic rationalism. Empirically speaking, these errancies are highly compatible with our flourishing economically, if they are not a cause of it, which is more than we can know. The politicians who attack public higher education as too expensive have made it so for electoral or ideological reasons and could undo the harm with the stroke of a pen. They have created the crisis to which they hope to bring their draconian solutions.

by Marilynne Robinson, NYRB | Read more:
Image: Alexis de Tocqueville; portrait by Théodore Chassériau, 1850

Friday, October 20, 2017

Durand Jones & The Indications

Breaking News: Trump Resigns! (Well, Not Yet)

It’s been a bleak decade since President Donald J. Trump put his hand on the Bible eight months ago. After the Charlottesville debacle, former Vice President Al Gore offered Trump a one-word piece of advice: “Resign.” Tony Schwartz, ghostwriter of The Art of the Deal, claimed resignation would come before the end of the year. And Steve Bannon reportedly thinks Trump has just a 30 percent chance of finishing out his term.

While we wait for special counsel Robert Mueller’s investigation into money laundering, bank fraud, foreign influence, election rigging, and hotel-mattress wetting, I asked eight TV and screen writers, astute observers of human behavior, to come up with two scenarios of how Trump will leave the Oval Office. I offered these examples: (...)

Danny Zuker: [Executive producer, Modern Family, five-time Emmy Award winner. The president of the United States once tweeted at him: “Danny—you’re a total loser.”]

Plausible scenario:

I don’t think he’ll leave over collusion, conflicts of interest, or even the release of the pee-pee tape. (Although one can dream.) I think he will ultimately resign because the job is harder than he thought. He’s discovering that he can’t simply put a TRUMP sign on the White House and pretend to be president the way he puts one on a building and pretends to be a builder. He’ll say something like, “Over the last nine months, I took a country where the streets were literally full of sewage and crime and people with accents and turned it into a paradise kingdom that rivals heaven itself. Better than heaven, because we all have guns. So tremendous is my creation that it basically runs itself. No president can rule for more than eight years, and I’ve already squozen a decade’s worth of achievements into my first year—and it’s not even Thanksgiving. So, I’m leaving office to spend more time with my son . . . (Melania whispers in his ear) Barron.”

Writer-enhanced scenario:

Fade in: intelligence briefing. We are close on Trump’s bloated, porcine face, the kind of face that would immediately disqualify a person from judging others’ appearances. He yawns, wipes some KFC extra-crispy batter from his most northern chin. Then he gets an idea. A light-bulb moment. Not a bright light bulb—more like the bulb in that emergency flashlight you find buried in your junk drawer. He stands up and exclaims . . .

TRUMP: I quit.
INTELLIGENCE OFFICER: Wah wah wah wah wah?
TRUMP: I SAID, I QUIT!

He races out of the briefing room and makes his way outside, where we see a HELMETED FIGURE on a motorcycle.
TRUMP: I did it! I QUIT.
The helmeted figure takes off the helmet and we see SARAH PALIN.
SARAH PALIN: Good boy. Hop on.

Trump hops on the back of the hog and the two quitters drive off into the sunset. FADE OUT:

Parting shot:

I’m moving outta here like a bitch. (...)

Megan Amram: [Writer for The Good Place and Silicon Valley]

Plausible scenario:

Donald Trump will be impeached after evidence surfaces that he met with Russians clandestinely on multiple occasions specifically to sabotage Hillary’s run for president. This will occur approximately one week before the election in 2020. By then, cities won’t exist, and the average temperature in America will be 130 degrees Trump (the new nomenclature for Fahrenheit).

Writer-enhanced scenario:

Donald Trump will resign after a secret Russian sex tape surfaces, one that involves Trump sexually harassing his daughter Ivanka. He will then brag that he was the “fastest president ever,” and that he can resign since he’s brought back “all of the jobs. Literally all of them. Look at them—they’re all back now.” He will spend the rest of his days doing exactly what he did in the presidency, playing golf and pretending to drive fire trucks.

Parting shot:

“Ffffffffpllllplplplplplplplppppluuuuuuuuuuugggffffffff.” (This is the sound of Donald Trump publicly shitting himself at a rally, then trying to cover his butt with Mike Pence’s sweater, but the sweater isn’t big enough to cover his big butt, so he slips and falls and can’t get up ’cause he’s covered in his own shit, so he’s pulled off by the Secret Service, never to be seen again.)

Andy Bobrow: [Executive producer, The Last Man on Earth]

Plausible scenario:

I remember learning that when L. Ron Hubbard died, they announced to the rank-and-file Scientologists that he had merely “discarded his body” so he could continue his work on other planes of existence. I’m not being hyperbolic when I say I believe this is how Trump’s impeachment and resignation will go. I think he’ll call it something else, and Congress will happily play along. An impeachment will be called a “Constitutional Hearing,” or a “Congressional Adjustment,” or an “Unholy Witch Hunt.” A resignation will be called an “Executive Realignment,” or a “Presidential Ascension,” or simply a “Nothingburger.” So my most plausible scenario is that something happens that’s not an impeachment, and he does something that’s not a resignation. And he lives many more years acting like he is still president, and the whole country silently agrees to never talk about that one time we had a constitutional crisis and pretended we didn’t.

Writer-enhanced scenario:

A second White House will be built a few blocks from the official White House, and Trump will stay there three days a week. This new White House will be a full replica, but five times bigger and gold.

Parting shot:

This one’s easy. The quote will be “I’m still president.”

I mean, that’s what the NYT headline will be. The full quote will not be so pithy.

“Am I resigning? No. Where did you hear that, by the way? That’s, if you believe that, I’ll sell you a bridge on top of the World Trade Center. Which, terrible deal by the way. Whoever built that, I like buildings that don’t collapse, O.K.? Terrible deal. They got a lot of things (garbled). It’s nuts. And I hear everyone asking ‘is he resigning, is he impeaching?’ I’m not impeaching, O.K.? I’m president. They still call me president, don’t they? Everybody calls me President Trump. You hear it everywhere you go, President Trump this, President Trump that, President Trump, I love you, President Trump, don’t go. So I’m president. It’s silly. It’s dumb (garbled). Mike Pence is a helluva guy. Mike Pence, President Pence if you wanna call him that. Great guy, terrific guy. I also heard there’s gonna be a new vice president, which you can do. A lot of people don’t know that. You can bring the vice president up to president, I just learned this, a lot of people don’t know. And then he can bring up a guy. I don’t know who they’ll choose, but it should be my daughter. Not the ugly one. (Large applause). No, come on. Come on. You’re nasty. So I’m gonna travel and do great things. Dubai. Russia. China. And wherever I go, I’m the president there, too, they love me there and we’re only gonna make it bigger. Maybe I’ll do another TV show, ‘where’s Hillary?’ Has anyone seen her? She’s gone, maybe she’s in jail, I don’t know. They tell me (garbled) and all of this and that. But she’s not in jail and I’m gonna put her in jail. Maybe she’s with ISIS (huge applause). I beat ISIS. ISIS is no longer a threat because of me. But they’re still a threat and I’ll continue to beat them. But as to the question, who’s president? I’m president. They call Obama president and he was never even president. So believe me, I’m still president.” (...)

by Nell Scovell, Vanity Fair | Read more:
Image: Mathieu De Muizon

[ed. See also: "I Hate Everyone in the White House!" Trump Seethes as Advisers Fear the President is "Unraveling". Assuming he was ever raveled in the first place.]

Robert Bowen, Lucky strike - 7- Reef baby
via:

The Social Life of Opioids

In the story of America’s opioid crisis, a recent tripling in prescriptions of the painkillers is generally portrayed as the villain. Researchers and policy makers have paid far less attention to how social losses—including stagnating wages and fraying ties among people—can increase physical and emotional pain, helping drive the current drug epidemic.

But a growing body of work suggests this area needs to be explored more deeply if communities want to address the opioid problem. One study published earlier this year found that for every 1 percent increase in unemployment in the U.S., opioid overdose death rates rose by nearly 4 percent.

Another recent study from researchers at Harvard University and Baylor College of Medicine reported that U.S. counties with the lowest levels of “social capital”—a measure of connection and support that incorporates factors including people’s trust in one another and participation in civic matters such as voting—had the highest rates of overdose deaths. That review, which mined data from across the entire U.S. from 1999 through 2014, showed counties with the highest social capital were 83 percent less likely to be among those with high levels of overdose. Areas with low social capital, in contrast, were the most likely to have high levels of such “deaths of despair,” with overdose alone killing at least 16 people per 100,000.

Overdose is now the nation’s leading cause of death for people in the prime of life. And suicide- and alcohol-related deaths have also risen—most dramatically in regions with the highest levels of economic distress. “It will be hard to address the addiction and overdose crisis without better understanding and addressing the neurobiology linking opioids, pain and social connectedness,” says Sarah Wakeman, medical director of the Substance Use Disorder Initiative at Massachusetts General Hospital and an assistant professor of medicine at Harvard Medical School.

Connecting opioid use to social stress is not a new idea. Forty years ago the late neuroscience pioneer Jaak Panksepp first proposed the now widely accepted hypothesis that our body’s naturally produced opioids—endorphins and closely related enkephalins—are critical to the nurturing bonds that develop between parents and offspring and also between monogamous mates in mammals. Panksepp’s work and that of others showed that blocking one opioid system in the brain—which relies on the mu-opioid receptor—increased the distress calls of infants separated from their mothers in species as varied as dogs, rats, birds and monkeys. Giving an opioid drug (in doses too low to produce sedation) reduced such cries.

Panksepp also observed similarities between maternal love and heroin addiction. In each situation animals would persist in a behavior, despite negative consequences, in order to gain access to solace from the partner—or the drug. But, as Panksepp (who died in April) said in an interview several years ago, major journals rejected his paper in the 1970s because editors said the idea that motherly love was similar to heroin addiction was “too hot to handle.”

Since then, however, data supporting the link between opioids and bonding has only grown. It has been expanded on by researchers including Thomas Insel, former head of the National Institute on Mental Health; Robin Dunbar at the University of Oxford; and Larry Young, professor of psychiatry at Emory University.

Young showed that oxytocin, a hormone previously linked mostly with labor and nursing, is crucial to the formation of pair bonds as well as bonds between parents and infants. “The feelings that infants or adults feel when being nurtured—warmth, calmness and peacefulness—come from a combination of opioids and oxytocin,” he says. “These are the same feelings that people who take opioids report: a feeling of warmth and being nurtured or loved.” When a social bond is formed, oxytocin reconfigures the mu-opioid system so that a loved one’s presence relieves stress and pain—and that person’s absence, or a threat to the relationship, increases distress. (...)

Recent human studies have specifically found that a partner’s presence can reduce pain, and supportive touching such as hugging is linked to activation of mu-opioid receptors in the brain. In addition, a study published last year found that administering an opioid blocker decreased people’s feelings of social connectedness—both when they were in the lab receiving e-mails of support from close friends or relatives and when they were at home during the four days they took the drug—compared with when they took a placebo. And, whereas the drug reduced overall levels of positive emotion, it had a larger effect on positive emotions related to feeling connected and loved.

All of this suggests that recognizing the connections between bonding, stress and pain could be critical to effectively addressing the opioid crisis. “Understanding the biology and commonalities between trusting social relationships and the opioid system can change the way we think about treatment,” Young says, noting that neither the punitive approach of the criminal justice system nor harsh treatment tactics are likely to increase connectedness. In essence, if we want to have less opioid use, we may have to figure out how to have more love.

by Maia Szalavitz, Scientific American | Read more:
Image: Anita Hernadi/Getty Images