Thursday, October 26, 2017

Dallas Killers Club

There were three horrible public executions in 1963. The first came in February, when the prime minister of Iraq, Abdul Karim Qassem, was shot by members of the Ba’ath party, to which the United States had furnished money and training. A film clip of Qassem’s corpse, held up by the hair, was shown on Iraqi television. “We came to power on a CIA train,” said one of the Ba’athist revolutionaries; the CIA’s Near East division chief later boasted, “We really had the Ts crossed on what was happening.”

The second execution came in early November 1963: the president of Vietnam, Ngo Dinh Diem, was shot in the back of the head and stabbed with a bayonet, in a coup that was encouraged and monitored by the United States. President Kennedy was shocked at the news of Diem’s gruesome murder. “I feel we must bear a good deal of responsibility for it,” he said. “I should never have given my consent to it.” But Kennedy sent a congratulatory cable to Henry Cabot Lodge Jr., the ambassador to South Vietnam, who had been in the thick of the action. “With renewed appreciation for a fine job,” he wrote.

The third execution came, of course, later that month, on November 22. I was six when it happened. I wasn’t in school because we were moving to a new house with an ivy-covered tree in front. My mother told me that somebody had tried to kill the president, who was at the hospital. I asked how, and she said that a bullet had hit the president’s head, probably injuring his brain. She used the word “brain.” I asked why, and she said she didn’t know. I sat on a patch of carpeting in an empty room, believing that the president would still get better, because doctors are good and wounds heal. A little while later I learned that no, the president was dead.

Since that day, till very recently, I’ve avoided thinking about this third assassination. Any time I saw the words “Lee Harvey Oswald” or “grassy knoll” or “Jack Ruby,” my mind quickly skipped away to other things. I didn’t go to see Oliver Stone’s JFK when it came out, and I didn’t read DeLillo’s Libra, or Gaeton Fonzi’s The Last Investigation, or Posner’s Case Closed, or any of the dozens of mass-market paperbacks—many of them with lurid black covers and red titles—that I saw reviewed, blamed, praised.

But eventually you have to face up to it somehow: a famous, smiling, waving New Englander, wearing a striped, monogrammed shirt, sitting in a long blue Lincoln Continental next to his smiling, waving wife, has his head blown open during a Texas parade. How could it happen? He was a good-looking person, with an attractive family and an incredible plume of hair, and although he wasn’t a very effective or even, at times, a very well-intentioned president—he increased the number of thermonuclear warheads, more than doubled the budget for chemical and biological weapons, tripled the draft, nearly got us into an end-time war with Russia, and sent troops, napalm, and crop defoliants into Vietnam—some of his speeches were, even so, noble and true and ringingly delivered and permanently inspiring. He was a star; they loved him in Europe. And then suddenly he was just a dead, naked man in a hospital, staring fixedly upward, with a dark hole in his neck. Autopsy doctors were poking their fingers in his wounds and taking pictures and measuring, and burning their notes afterward and changing their stories. “I was trying to hold his hair on,” Jacqueline Kennedy told the Warren Commission when they asked her to describe her experience in the limousine. She saw, she said, a wedge-shaped piece of his skull: “I remember it was flesh colored with little ridges at the top.” The president, the motorcade he rode in, the whole country, had been, to use a postmortem word, “avulsed”—blasted inside out.

Who or what brought this appalling crime into being? Was it a mentally unstable ex-Marine and lapsed Russophile named Oswald, aiming down at the back of Kennedy’s head through leafy foliage from the book depository, all by himself, with no help? Many bystanders and eyewitnesses—including Jean Hill, whose interview was broadcast on NBC about half an hour after the shooting, and Kennedy advisers Kenny O’Donnell and Dave Powers, who rode in the presidential motorcade—didn’t think so: hearing the cluster of shots, they looked first toward a little slope on the north side of Dealey Plaza, and not back at the alleged sniper’s window.

A young surgeon at Parkland Memorial Hospital, Charles Crenshaw, who watched Kennedy’s blood and brains drip into a kick bucket in Trauma Room 1, also knew immediately that the president had been fatally wounded from a location toward the front of the limousine, not from behind it. “I know trauma, especially to the head,” Crenshaw writes in JFK Has Been Shot, published in 1992, republished with updates in 2013. “Had I been allowed to testify, I would have told them”—that is, the members of the Warren Commission—“that there is absolutely no doubt in my mind that the bullet that killed President Kennedy was shot from the grassy knoll area.”

No, the convergent gunfire leads one to conclude that the shooting had to have been a group effort of some kind, a preplanned, coordinated crossfire: a conspiracy. But if it was a group effort, what affiliation united the participants? Did the CIA and its hypermilitaristic confederates—Cold Warrior bitter-enders—engineer it? That’s what Mark Lane, James DiEugenio, Gerald McKnight, and many other sincere, brave, long-time students of the assassination believe. “Kennedy was removed from office by powerful and irrational forces who opposed his revisionist Cuba policy,” writes McKnight in Breach of Trust, a closely researched book about the blind spots and truth-twistings of the Warren Commission. James Douglass argues that Kennedy was killed by “the Unspeakable”—a term from Thomas Merton that Douglass uses to describe a loose confederacy of nefarious plotters who opposed Kennedy’s “turn” towards reconciliatory back-channel negotiation. “Because JFK chose peace on earth at the height of the Cold War, he was executed,” Douglass writes.

This is the message, also, of Oliver Stone’s artful, fictionalized epic JFK: Kennedy shied away from the invasion of Cuba, he wanted us out of Vietnam, he wouldn’t bow to the military-industrial combine, and none of that was acceptable to the hard-liners who surrounded him—so they had him killed. “The war is the biggest business in America, worth $80 billion a year,” Kevin Costner says, in JFK’s big closing speech. “President Kennedy was murdered by a conspiracy that was planned in advance at the highest levels of our government, and it was carried out by fanatical and disciplined cold warriors in the Pentagon and CIA’s covert-operation apparatus.”

Well, there’s no question that the CIA was and is an invasive weed, an eyes-only historical horror show that has, through plausibly deniable covert action, brought generations of instability and carnage into the world. There is no question, either, that under presidents Truman, Eisenhower, and Kennedy, the CIA’s string of pre-Dallas coups d’état—in Africa, in the Middle East, in Southeast Asia, in Latin America—contributed to an international climate of political upheaval and bad karma that made Kennedy’s own violent death a more conceivable outcome. There’s also no question that the CIA enlisted mobsters to kill Castro—Richard Bissell, who did the enlisting, later conceded that it was “a great mistake to involve the Mafia in an assassination attempt”—and no question that the CIA’s leading lights have, for fifty years, distorted and limited the available public record of the Kennedy assassination, doing whatever they could to distance the agency from its demonstrable interest in the accused killer, Oswald. It’s also true, I think, that there were some CIA extremists, fans of “executive action,” including William Harvey and, perhaps, James Jesus Angleton, that orchid-growing Anubis of spookitude, who were secretly relieved that Kennedy was shot, and may even have known in advance that he was probably going to die down south. (“I don’t want to sober up today,” Harvey reportedly told a colleague in Rome. “This is the day the goddamned president is gonna get himself killed!” Harvey also was heard to say: “This was bound to happen, and it’s probably good that it did.”) We are in debt to the CIA-blamers for their five decades of work, often in the face of choreographed media smears. They have brought us closer to the truth. But, having now read less than one-tenth of one percent of the available books on the subject, I believe, with full consciousness that I’m only a newcomer, that they’re barking up the wrong conspiracy. 
I think it was basically a Mafia hit: Kennedy’s death wouldn’t have happened without Carlos Marcello.

The best, saddest, fairest assassination book I’ve read, David Talbot’s Brothers, provides an important beginning clue. Robert Kennedy, who was closer to his brother and knew more about his many enraged detractors than anyone else, told a friend that the Mafia was principally responsible for what happened November 22. In public, for the five years that remained of his life, Bobby Kennedy made no criticisms of the nine-hundred-page Warren Report, which pinned the murder on a solo killer, a “nut” (per Hoover) and “general misanthropic fella” (per Warren Commission member Richard Russell) who had dreams of eternal fame. Attorney General Kennedy said, when reporters asked, that he had no intention of reading the report, but he endorsed it in writing and stood by it. Yet on the very night of the assassination, as Bobby began his descent into a near-catatonic depression, he called one of his organized-crime experts in Chicago and asked him to find out whether the Mafia was involved. And once, when friend and speechwriter Richard Goodwin (who had worked closely with JFK) asked Bobby what he really thought, Bobby replied, “If anyone was involved it was organized crime.”

To Arthur Schlesinger, Bobby was (according to biographer Jack Newfield) even more specific, ascribing the murder to “that guy in New Orleans”—meaning Carlos Marcello, the squat, tough, smart, wealthy mobster and tomato salesman who controlled slot machines, jukebox concessions, narcotics shipments, strip clubs, bookie networks, and other miscellaneous underworldy activities in Louisiana, in Mississippi, and, through his Texas emissary Joe Civello, in Dallas. In the early sixties, the syndicate run by Marcello and his brothers made more money than General Motors; the Marcellos owned judges, police departments, and FBI bureau chiefs. And when somebody failed to honor a debt, they killed him, or they killed someone close to him.

According to an FBI informant, Carlos Marcello confessed to the assassination. Some years before he died in 1993, Marcello said—as revealed by Lamar Waldron in three confusingly thorough books, the latest and best of which is The Hidden History of the JFK Assassination—“Yeah, I had the little son of a bitch killed,” meaning President Kennedy. “I’m sorry I couldn’t have done it myself.” As for Jack Ruby, the irascible strip-club proprietor and minor Marcello operative who silenced Lee Harvey Oswald in the Dallas police station, Bobby Kennedy exclaimed, on looking over the record of Ruby’s pre-assassination phone calls, “The list was almost a duplicate of the people I called before the Rackets Committee.” And then in 1968, Bobby Kennedy himself, having just won the California primary, was shot to death in a hotel kitchen in Los Angeles by an anti-Zionist cipher with gambling debts who had been employed as a groom at the Santa Anita racetrack. The racetrack was controlled by Carlos Marcello’s friend Mickey Cohen. The mob’s palmprints were, it seems, all over the war on the Kennedy brothers. Senator John Kennedy, during the labor-racketeering hearings in 1959, said, “If they’re crooks, we don’t wound them, we kill them.” Ronald Goldfarb, who worked for Bobby Kennedy’s Justice Department, wrote in 1995, “There is a haunting credibility to the theory that our organized crime drive prompted a plan to strike back at the Kennedy brothers.”

Lamar Waldron’s Hidden History is a primary source for a soon-to-be-produced movie, with Robert De Niro reportedly signed to play Marcello and Leonardo DiCaprio in the part of jailhouse informant Jack Van Laningham. Other new books that offer the Mafia-did-it view are Mark Shaw’s The Poison Patriarch—which contains an interesting theory about Ruby’s celebrity lawyer, Melvin Belli, and fingers “Marcello in collusion with Trafficante, while Hoffa cheered from the sidelines”—and Stefano Vaccara’s Carlos Marcello: The Man Behind the JFK Assassination, which has just been translated. “Dallas was a political assassination because it was a Mafia murder,” writes Vaccara, an authority on the Sicilian Mafia. “The Mafia went ahead with the hit once it understood that the power structure or the ‘establishment’ would not be displeased by the possibility.” Burton Hersh, in his astute and effortlessly well-written Bobby and J. Edgar, a revised version of which appeared in 2013, calls the Warren Commission Report a “sloppily executed magic trick, a government-sponsored attempt to stuff a giant wardrobe of incongruous information into a pitifully small valise.” Carlos Marcello, Hersh is convinced, was “the organizing personality behind the murder of John Kennedy.”

by Nicholson Baker, The Baffler |  Read more:
Image: Michael Duffy
[ed. Sorry for all The Baffler articles lately... they've been putting out some really great stuff (lately and pastly). See also: Stephen King's 11/22/63, one of his best.]

Suboxone

I am the last person with a right to complain about Internet articles being too long. But if I did have that right, I think I would exercise it on Dying To Be Free, the Huffington Post’s 20,000-word article on the current state of heroin addiction treatment. I feel like it could have been about a quarter the size without losing much.

It’s too bad that most people will probably shy away from reading it, because it gets a lot of stuff really right.

The article’s thesis is also its subtitle: “There’s a treatment for heroin addiction that actually works; why aren’t we using it?” To save you the obligatory introductory human interest story: that treatment is suboxone. Its active ingredient is the drug buprenorphine, which is kind of like a safer version of methadone. Suboxone is slow-acting, gentle, doesn’t really get people high, and is pretty safe as long as you don’t go mixing it with weird stuff. People on suboxone don’t experience opiate withdrawal and have greatly decreased cravings for heroin. I work at a hospital that’s an area leader in suboxone prescription, I’ve gotten to see it in action, and it’s literally a life-saver.

Conventional heroin treatment is abysmal. Rehab centers aren’t licensed or regulated and most have little interest in being evidence-based. Many are associated with churches or weird quasi-religious groups like Alcoholics Anonymous. They don’t necessarily have doctors or psychologists, and some actively mistrust them. All of this I knew. What I didn’t know until reading the article was that – well, it’s not just that some of them try to brainwash addicts. It’s more that some of them try to cargo-cult brainwashing: they do the sorts of things that sound like brainwashing to them, without really knowing how brainwashing works (assuming it’s even a coherent goal to aspire to). Their concept of brainwashing is mostly just creating a really unpleasant environment, yelling at people a lot, enforcing intentionally over-strict rules, and in some cases even having struggle-session-type-things where everyone in the group sits in a circle, screams at the other patients, and tells them they’re terrible and disgusting. There’s a strong culture of accusing anyone who questions or balks at any of it of just being an addict, or “not really wanting to quit”.

I have no problem with “tough love” when it works, but in this case it doesn’t. Rehab programs make every effort to obfuscate their effectiveness statistics – I blogged about this before in Part II here – but the best guesses by outside observers are that for a lot of them about 80% to 90% of their graduates relapse within a couple of years. Even this paints too rosy a picture, because it excludes the people who gave up halfway through.

Suboxone treatment isn’t perfect, and relapse is still a big problem, but it’s a heck of a lot better than most rehabs. Suboxone gives people their dose of opiate and mostly removes the biological half of addiction. There’s still the psychological half of addiction – whatever it was that made people want to get high in the first place – but people have a much easier time dealing with that after the biological imperative to get a new dose is gone. Almost all clinical trials have found treatment with methadone or suboxone to be more effective than traditional rehab. Even Cochrane Review, which is notorious for never giving a straight answer to anything besides “more evidence is needed”, agrees that methadone and suboxone are effective treatments.

Some people stay on suboxone forever and do just fine – it has few side effects and doesn’t interfere with functioning. Other people stay on it until they reach a point in their lives when they feel ready to come off, then taper down slowly under medical supervision, often with good success. It’s a good medication, and the growing suspicion it might help treat depression is just icing on the cake.

There are two big roadblocks to wider use of suboxone, and both are enraging.

The first roadblock is the #@$%ing government. They are worried that suboxone, being an opiate, might be addictive, and so doctors might turn into drug pushers. So suboxone is possibly the most highly regulated drug in the United States. If I want to give out OxyContin like candy, I have no limits but the number of pages on my prescription pad. If I want to prescribe you Walter-White-level quantities of methamphetamine for weight loss, nothing is stopping me but common sense. But if I want to give even a single suboxone prescription to a single patient, I have to take a special course on suboxone prescribing, and even then I am limited to only being able to give it to thirty patients at a time (eventually rising to one hundred patients when I get more experience with it). The (generally safe) treatment for addiction is more highly regulated than the (very dangerous) addictive drugs it is supposed to replace. Only 3% of doctors bother to jump through all the regulatory hoops, and their hundred-patient limits get saturated almost immediately. As per the laws of supply and demand, this makes suboxone prescriptions very expensive, and guess what social class most heroin addicts come from? Also, heroin addicts often don’t have access to good transportation, which means that if the nearest suboxone provider is thirty miles from their house they’re out of luck. The List Of Reasons To End The Patient Limits On Buprenorphine expands upon and clarifies some of these points.

(In case you think maybe the government just honestly believes the drug is dangerous – nope. You’re allowed to prescribe without restriction for any reason except opiate addiction.)

The second roadblock is the @#$%ing rehab industry. They hear that suboxone is an opiate, and their religious or quasi-religious fanaticism goes into high gear. “What these people need is Jesus and/or their Nondenominational Higher Power, not more drugs! You’re just pushing a new addiction on them! Once an addict, always an addict until they complete their spiritual struggle and come clean!” And so a lot of programs bar suboxone users from participating.

This doesn’t sound so bad given the quality of a lot of the programs. Problem is, a lot of these are closely integrated with the social services and legal system. So suppose somebody’s doing well on suboxone treatment, and gets in trouble for a drug offense. Could be that they relapsed on heroin one time, could be that they’re using something entirely different like cocaine. Judge says go to a treatment program or go to jail. Treatment program says they can’t use suboxone. So maybe they go in to deal with their cocaine problem, and by the time they come out they have a cocaine problem and a heroin problem.

And…okay, time for a personal story. One of my patients is a homeless man who used to have a heroin problem. He was put on suboxone and it went pretty well. He came back with an alcohol problem, and we wanted to deal with that and his homelessness at the same time. There are these organizations called three-quarters houses – think “halfway houses” after inflation – that take people with drug problems and give them an insurance-sponsored place to live. But the catch is you can’t be using drugs. And they consider suboxone to be a drug. So of about half a dozen three-quarters houses in the local area, none of them would accept this guy. I called up the one he wanted to go to, said that he really needed a place to stay, said that without this care he was in danger of relapsing into his alcoholism, begged them to accept. They said no drugs. I said I was a doctor, and he had my permission to be on suboxone. They said no drugs. I said that seriously, they were telling me that my DRUG ADDICTED patient who was ADDICTED TO DRUGS couldn’t go to their DRUG ADDICTION center because he was on a medication for treating DRUG ADDICTION? They said that was correct. I hung up in disgust.

So I agree with the pessimistic picture painted by the article. I think we’re ignoring our best treatment option for heroin addiction and I don’t see much sign that this is going to change in the future.

But the health care system not being very good at using medications effectively isn’t news. I also thought this article was interesting because it touches on some of the issues we discuss here a lot:

by Scott Alexander, Slate Star Codex |  Read more:
[ed. See also: Against Rat Park.]

Wednesday, October 25, 2017


Joan Rabascall, IBM 360, 1967.
via:

I Interviewed at Five Top Companies in Silicon Valley, and Got Five Job Offers

In the five days from July 24th to 28th 2017, I interviewed at LinkedIn, Salesforce Einstein, Google, Airbnb, and Facebook, and got all five job offers.

It was a great experience, and I feel fortunate that my efforts paid off, so I decided to write something about it. I will discuss how I prepared, review the interview process, and share my impressions about the five companies.

How it started

I had been at Groupon for almost three years. It was my first job, and I had been working with an amazing team on awesome projects. We’d been building cool stuff, making an impact within the company, publishing papers and all that. But I felt my learning rate was being annealed (read: slowing down) while my mind was craving more. Also, as a software engineer in Chicago, I was drawn to the many great companies in the Bay Area.

Life is short, and professional life shorter still. After talking with my wife and gaining her full support, I decided to take action and make my first ever career change.

Preparation

Although I’m interested in machine learning positions, the positions at the five companies differed slightly in title and interview process. Three were machine learning engineer roles (LinkedIn, Google, Facebook), one was data engineer (Salesforce), and one was software engineer in general (Airbnb). Therefore I needed to prepare for three different areas: coding, machine learning, and system design.

Since I also have a full time job, it took me 2–3 months in total to prepare. Here is how I prepared for the three areas.

Coding

While I agree that coding interviews might not be the best way to assess all your skills as a developer, there is arguably no better way to tell if you are a good engineer in a short period of time. IMO it is a necessary evil to get you that job.

I mainly used Leetcode and Geeksforgeeks for practicing, but Hackerrank and Lintcode are also good places. I spent several weeks going over common data structures and algorithms, then focused on areas I wasn’t too familiar with, and finally did some frequently seen problems. Due to my time constraints I usually did two problems per day.

Here are some thoughts:
  1. Practice, a lot. There is no way around it.
  2. But rather than doing all 600 problems on Leetcode, cover all types and spend time understanding each problem thoroughly. I did around 70 problems in total and felt that was enough for me. My thought is that if 70 problems isn’t helpful then you may not be doing it right and 700 won’t be helpful either.
  3. Go for the hardest ones. After those the rest all become easy.
  4. If stuck on one problem for over two hours, check out the solutions. More time might not be worth it.
  5. After solving one problem, check out the solutions. I was often surprised by how smart and elegant some solutions are, especially the Python one-liners.
  6. Use a language that you are most familiar with and that is common enough to be easily explained to your interviewer.
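To make the “smart and elegant solutions” point concrete, here is the flavor of answer this kind of practice converges on — a short Python sketch of Kadane’s algorithm for the classic maximum-subarray problem (my own illustrative pick, not necessarily one of the author’s seventy):

```python
def max_subarray(nums):
    """Maximum sum of any contiguous subarray (the classic 'Maximum Subarray' problem)."""
    # Kadane's algorithm: track the best sum ending at the current index,
    # either extending the previous run or starting fresh at this element.
    best = cur = nums[0]
    for x in nums[1:]:
        cur = max(x, cur + x)
        best = max(best, cur)
    return best

print(max_subarray([-2, 1, -3, 4, -1, 2, 1, -5, 4]))  # 6, from [4, -1, 2, 1]
```

The whole thing is a single O(n) pass — exactly the sort of compact solution worth studying after you’ve solved a problem the long way.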
System design

This area is more closely related to the actual working experience. Many questions can be asked during system design interviews, including but not limited to system architecture, object oriented design, database schema design, distributed system design, scalability, etc.

There are many resources online that can help you with the preparation. For the most part I read articles on system design interviews, architectures of large-scale systems, and case studies.

Here are two principles that I found really helpful:
  1. Understand the requirements first, then lay out the high-level design, and finally drill down to the implementation details. Don’t jump to the details right away without figuring out what the requirements are.
  2. There are no perfect system designs. Make the right trade-off for what is needed.
With all that said, the best way to practice for system design interviews is to actually sit down and design a system, i.e. your day-to-day work. Instead of doing the minimal work, go deeper into the tools, frameworks, and libraries you use. For example, if you use HBase, rather than simply using the client to run some DDL and do some fetches, try to understand its overall architecture, such as the read/write flow, how HBase ensures strong consistency, what minor/major compactions do, and where LRU cache and Bloom Filter are used in the system. You can even compare HBase with Cassandra and see the similarities and differences in their design. Then when you are asked to design a distributed key-value store, you won’t feel ambushed.

Many blogs are also a great source of knowledge, such as Hacker Noon and engineering blogs of some companies, as well as the official documentation of open source projects.

The most important thing is to keep your curiosity and modesty. Be a sponge that absorbs everything it is submerged into.

Machine learning

Machine learning interviews can be divided into two aspects, theory and product design.

Unless you have experience in machine learning research or did really well in your ML course, it helps to read some textbooks. Classics such as The Elements of Statistical Learning and Pattern Recognition and Machine Learning are great choices, and if you are interested in specific areas you can read more on those.

Make sure you understand basic concepts such as bias-variance trade-off, overfitting, gradient descent, L1/L2 regularization, Bayes’ theorem, bagging/boosting, collaborative filtering, dimension reduction, etc. Familiarize yourself with common formulas such as Bayes’ theorem and the derivations of popular models such as logistic regression and SVM. Try to implement simple models such as decision trees and K-means clustering. If you put some models on your resume, make sure you understand them thoroughly and can comment on their pros and cons.
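As one illustration of the “implement simple models” advice, here is a minimal K-means sketch in plain Python (my own toy example, not code from the article), using Lloyd’s algorithm on lists of coordinate tuples:

```python
import random

def kmeans(points, k, iters=20):
    """Toy K-means (Lloyd's algorithm) over a list of coordinate tuples."""
    # Initialize centroids by sampling k distinct data points.
    centroids = random.sample(points, k)
    for _ in range(iters):
        # Assignment step: each point joins its nearest centroid's cluster.
        clusters = [[] for _ in range(k)]
        for p in points:
            idx = min(
                range(k),
                key=lambda i: sum((a - b) ** 2 for a, b in zip(p, centroids[i])),
            )
            clusters[idx].append(p)
        # Update step: move each centroid to its cluster mean
        # (keep the old centroid if a cluster ends up empty).
        centroids = [
            tuple(sum(c) / len(cl) for c in zip(*cl)) if cl else centroids[i]
            for i, cl in enumerate(clusters)
        ]
    return centroids
```

Being able to write something like this from scratch — and explain the assignment/update steps, the empty-cluster edge case, and why initialization matters — is exactly the kind of depth interviewers probe for.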

by Xiaohan Zeng, Medium | Read more:
[ed. If you're considering a degree in computer programming/engineering and feel drawn to Silicon Valley, good luck! (... then you have to find a place to live.)]

A helicopter lands on the Pan Am roof
Like a dragonfly on a tomb
And business men in button downs
Press into conference rooms
Battalions of paper minded males
Talking commodities and sales
While at home their paper wives
And paper kids
Paper the walls to keep their gut reactions hid
(lyrics)
   
    ~ Joni Mitchell, Harry's House/Centerpiece (youtube)

Image via:

Tropical Depressions

"I don't know how to be human any more."

On a wretched December afternoon in 2015, as raindrops pattered a planetary threnody on grayed-out streets, five thousand activists gathered around Paris’s Arc de Triomphe, hoping to force world leaders to do something, anything, that would save the future. Ellie was there. But what she remembers most from that afternoon during the UN’s Climate Change Conference wasn’t what happened in the open, in front of cameras and under the sky. As they took the Metro together, activists commiserated, briefly, before the moment of struggle and the need to be brave, over just how hopeless it could sometimes feel. People talked about bafflement, rage, despair; the sense of having discovered a huge government conspiracy to wipe out the human race—but one that everybody knows about and nobody seems willing to stop.

Twenty meters beneath the Paris streets, the Metro became a cocoon, tight and terrified, in which a brief moment of honest release was possible. Eventually someone expressed the psychic toll in words that have stuck with Ellie since. It was a chance remark: “I don’t know how to be human any more.”

Climate change means, quite plausibly, the end of everything we now understand to constitute our humanity. If action isn’t taken soon, the Amazon rainforest will eventually burn down, the seas will fester into sludge that submerges the world’s great cities, the Antarctic Ice Sheet will fragment and wash away, acres of abundant green land will be taken over by arid desert. A 4-degree Celsius rise in global temperatures would, within a century, produce a world as different from the one we have now as ours is from that of the Ice Age. And any humans who survive this human-made chaos would be as remote from our current consciousness as we are from that of the first shamanists ten thousand years ago, who themselves survived on the edges of a remote and cold planet. Something about the magnitude of all this is shattering: most people try not to think about it too much because it’s unthinkable, in the same way that death is always unthinkable for the living. For the people who have to think about it—climate scientists, activists, and advocates—that looming catastrophe evokes a similar horror: the potential extinction of humanity in the future puts humanity into question now. (...)

An Empty World

Many of the climate scientists and activists we’ve spoken with casually talk of their work with a sense of mounting despair and hopelessness, a feeling we call political depression. We’re used to considering and treating depression as an internal, medical condition, something that can be put right with a few chemicals to keep the brain swimming in serotonin; in conceptualizing our more morose turns of mind, modern medicine hasn’t come too far from the ancient idea that a melancholy disposition arises from too much black bile in the body. But when depressives talk about their experiences, they describe depression in terms of a lost relationship to the world. The author Tim Lott writes that depression “is commonly described as being like viewing the world through a sheet of plate glass; it would be more accurate to say a sheet of thick, semi-opaque ice.” A woman going by the pseudonym of Marie-Ange, one of Julia Kristeva’s analysands, describes a world hollowed out and replaced by “a nothingness . . . like invisible, cosmic, crushing antimatter.” In other words, the inward condition of depression is nothing less than a psychic event horizon; the act of staring at a vast gaping absence—of hope, of a future, of the possibility of human life. The depressive peeks into the future that climate change generates. Walter Benjamin, trying to lay out the contours of melancholic experience, saw it there. “Something new emerged,” he wrote: “an empty world.”

Freud diagnoses melancholia as the result of a lost object—a thing, a person, a world—and the fracture of that loss repeats itself within the psyche. It’s the loss that comes first. We do not think of political depression as a personal disorder, the state of being depressed because of political events; rather it’s the interiorization of our objective powerlessness in the world. We all feel, vaguely, that our good intentions should matter, that we should have some power to affect the things around us for the better; political depression is the hopelessness that meets the determination to do something in a society whose systems and instruments are designed to frustrate our ability to act.

But it’s not that, like Kafka’s heroes, we’re facing a vast and inscrutable apparatus whose operation seems to make no sense, trembling in front of a machine. What’s unbearable is that it does make sense; it’s the same logic that governs every second of our lives.

At times, the climate movement has insisted on burying this crushing truth under a relentless optimism: the disaster can be averted, all that’s needed is the political will, and we simply have no time to luxuriate in feeling sad. And all this is true. But as activists have begun to acknowledge, there needs to be room for sadness. As the veteran activist Danni Paffard—arrested three times in climate protests, once narrowly avoiding prison after she shut down a runway at Heathrow Airport—puts it to us, “the climate movement has recognized that this is an existential problem and has created spaces for people to talk things through,” to exist within the sense of grief, to work with political depression instead of repressing it. After all, as the writer Andrew Solomon says, “a lot of the time, what [depressives] are expressing is not illness, but insight, and one comes to think what’s really extraordinary is that most of us know about those existential questions and they don’t distract us very much.” There’s a substantial literature on “depressive realism”—the suspicion that depressed people are actually right. In one 1979 study, Lauren B. Alloy and Lyn Y. Abramson found that, compared to their nondepressed peers, depressed subjects’ “judgements of contingency were surprisingly accurate.”

The depressive is, first of all, one who refuses to forget. In Freud’s account, while mourning is the slow release of emotional ties to something that’s vanished, melancholia is a refusal to let go. It’s not just that climate change is depressing; the determination to stop it has to begin from a depressive conviction: to not just forget that so much has been lost and more is going every day—to keep close to memory. Or as Paffard puts it, “You need to hold what’s at stake in your head enough to remember why it’s important to take action.”

La-La Land

In April this year, the Australian marine biologist Jon Brodie made headlines with his widely publicized despair. In an unprecedented tide, severe coral bleaching had destroyed much of the Great Barrier Reef; for Brodie, what had once been a worst-case scenario took horrifying form. “We’ve given up,” he told the Guardian. “It’s been my life managing water quality, we’ve failed. Even though we’ve spent a lot of money, we’ve had no success.” Brodie had spent decades warning the Australian government—which also funds his efforts—that something like this would happen if serious action wasn’t taken, and he had been repeatedly disappointed as politicians refused to listen.

What do you do after the worst has already happened? He sounds stoic over the phone when we speak to him, as if he’s not fully aware of just how awful everything he says really is. “If you want to see the coral reefs,” he tells us, “go now. It’s got some good bits, but you have to see them now, because they won’t look like that in ten years’ time.”

Hope is difficult. “I work with young people,” Brodie explains. “Even up until five years ago, I felt I could inspire them. But now I have PhD students—I have trouble giving them a feeling that they can still do something. We’re in an era of science denial.” It’s not the inevitability of climate change that’s depressing; rather, it’s precisely the realization that it can be prevented—together with the day-to-day reckoning with the pettiness of what stands in the way. “When I was younger,” Paffard tells us, “I would walk through the City of London and look at people living their everyday lives and think, ‘We’re all just continuing as though everything is normal, as though the world isn’t about to end.’ And that used to freak me out and make me angry. But now it just makes me sad . . . it’s the moments where you let yourself think about it when you get overwhelmed by it.”

For Brodie, political stupidities blend seamlessly into the apocalypse that they create.

When I contrast what we had eighteen years ago to the idiots [in government] today, I feel sad and angry. They keep positing coal as the solution to our energy needs. They’re living in la-la land. In the end, [biological] life will go on. Maybe humans will go on a bit longer, but the Earth will still be here. (...)

Brodie seeks to cope with the loss of the coral reefs by creating an ecosystem within his control: “There is a sense of loss, but I do other things to compensate. I live on a large piece of land and I am growing a forest on it, so that gives me a sense of satisfaction—there are birds and butterflies.” You step back; you find other things; the moments we still have. Faced with the vastness of climate change, people reach for what’s smaller. “I keep myself so busy,” Paffard says, “so I don’t think about it on an existential level.” Lewis, who notes that “it’s very hard to plan long term because we live in a capitalist economy,” and that “people hedge their bets by consuming now and worrying about the future later,” says he resorts to similar strategies of full cognitive immersion in the many shorter-term tasks at hand.

“People on the outside of science think we sit around all day worrying about these big questions, and we don’t. Scientists are thinking about where their next grant is going to come from. You find intellectual stimulation in your work without thinking about the big picture. Recently I caught myself thinking, when the El Niño happened over the last couple of years, which gave us abnormally high temperatures—brilliant! I get to see what abnormally high temperatures do to the tropical forests I’m studying.”

With this response, Lewis says, he shocked himself. There’s an impersonality to the processes that are destroying our planet. He even has a kind of sympathy for fossil-fuel lobbyists: what they do is evil, but it’s hard to separate from the evil that’s everywhere around us. “A lot of people go in trying to change it on the inside and then end up adopting the culture, and don’t change things. Because it’s very difficult . . . There’s all sorts of psychological tricks people play on themselves to allow them to do things that are incredibly antisocial.” It’s difficult for anyone to change things, and the prospects for substantive change can be as hard for government-funded scientists and battle-hardened campaigners as for anyone else. “Would I want to live like someone in Papua New Guinea to avoid climate change?” Brodie wonders. “Probably not.”

Political depression means staring into a vastness, but one without grandeur or the sublime, one that’s almost invisible. When we wake up every morning, it’s just there, seeping into our bones. “I am amazed,” Paffard tells us, “by our inability to engage with things that are scary and bigger than us. It’s the minutiae that keep us going . . . it’s too big for us to hold in our minds.” What can we do? We’re only human.

by Sam Kriss and Ellie Mae O'Hagan, The Baffler |  Read more:
Image: Jacob Magraw

The Globalized Jitters

News Updates from the Edge

All the cool kids have anxiety disorders these days. I’m not claiming that this makes me one of them. Correlation, as we all know, does not imply causation, and I am reliably informed that the cool kids also understand Snapchat, wear floral jumpsuits, and know how to talk to people they fancy without pulling a face like a spaniel on acid. Nevertheless, if depression was the definitive diagnosis of the 1990s, anxiety is the mental health epidemic that makes the modern world what it is: overwhelmed, unstable, and in serious need of a decade-long lie down.

The ubiquity of anxiety disorders would shock anyone who hadn’t watched the news lately and understood quite how much most of us have to worry about. Nearly one in five Americans over the age of thirteen suffers from an anxiety disorder, with women twice as likely to be affected. Depression, anxiety, and related disorders have increased in the decade since the financial crash, and we can’t blame it all on Big Pharma. In June, New York Times writer Alex Williams sized up a slew of anxiety memoirs atop the bestseller lists and noted that “Prozac Nation Is Now the United States of Xanax.”

If dealing with your anxiety is now a lifestyle trend, talking about your anxiety is a publishing trend. Part of the reason that memoirs by professionally neurotic authors are now so bankable is that the last best job of a writer is to make the anxieties of the age beautiful, comprehensible and, if possible, lucrative. It helps that many of them—like Kat Kinsman’s Hi, Anxiety and Andrea Petersen’s On Edge: A Journey Through Anxiety—are also deliciously well-written. Anxiety, unlike depression, is exciting to write about, in part because it is a condition in which absolutely everything is suddenly way too exciting. This, by coincidence, is also a neat description of the geopolitics of the chaotic adolescence of the twenty-first century. In the immortal words of Horse ebooks: “Everything happens so much.”

The problem is both profound and profoundly modern. The problem, specifically, is that a lot of us are pretty freaked out pretty much all the time, and whether or not we’ve good reasons for it, the condition of constant panic is debilitating. Despite the rash of articles suggesting a novel condition known as “Trump-related anxiety,” this is a problem far bigger than the presidency. The gurning batrachian monster that crawled out of the mordant id of mass society to squat in the Oval Office was a symptom of our collective neurosis before he was a cause.

“Each phase of capitalism,” according to the activist collective Plan C,
has a particular affect which holds it together. . . . Today’s public secret is that everyone is anxious. Anxiety has spread from its previous localized locations—such as sexuality—to the whole of the social field. All forms of intensity, self-expression, emotional connection, immediacy, and enjoyment are now laced with anxiety. It has become the linchpin of subordination.

One major part of the social underpinning of anxiety is the multi-faceted omnipresent web of surveillance. The NSA, CCTV, performance management reviews, the Job Centre, the privileges system in the prisons, the constant examination and classification of the youngest schoolchildren. But this obvious web is only the outer carapace. We need to think about the ways in which a neoliberal idea of success inculcates these surveillance mechanisms inside the subjectivities and life-stories of most of the population. . . . We are failing to escape the generalized production of anxiety.

Anxiety may be a logical response to overwhelming stress and insecurity, but it is also a very easy way to keep people isolated, cowed, and compliant. Existing in a state of constant agitation is unpleasant, but it is also useful. Anxious people get things done—at least up to a certain point, at which productivity rapidly plummets in a condition known euphemistically as “burnout,” “stress,” or “complete and utter wall-gnawing, corner-scuttling, gibbering breakdown.”

In his forthcoming book Kids These Days, Malcolm Harris draws on a body of psychological research to observe:
Given what we know about recent changes in the American sociocultural environment, it would be a surprise if there weren’t elevated levels of anxiety among young people. Their lives center around production, competition, surveillance, and achievement in ways that were totally exceptional only a few decades ago. All this striving, all this trying to catch up and stay ahead—it simply has to have psychological consequences. The symptoms of anxiety aren’t just the unforeseen and unfortunate outcome of increased productivity and decreased labor costs; they’re useful. . . . Restlessness, dissatisfaction and instability—which Millennials report experiencing more than generations past—are negative ways of framing the flexibility and self-direction employers increasingly demand. . . . All of these psychopathologies are the result of adaptive developments. (...)

Reality Bites Back

If anything, nearly two decades into the new millennium, everything feels a little too real. In place of predictable consumerist anhedonia, instead of the blithe horizon of cardboard cut-out suburban security, reality has come back with a mouthful of razors ready to rip open the throats of anyone pretending they know what’s coming.

This age of anxiety did not begin with this presidency, nor does it end at the U.S. border, but there is something fundamentally Yankish about the culture of dogged, dead-eyed competition that produces it. It’s what happens when the American Dream becomes a nightmare you can’t wake from, and not just because you haven’t had a proper night’s sleep in years. It’s what happens when a society clings to a defining mythos that celebrates working until you drop, abhors poverty as evidence of moral failure, considers the provision of a basic safety net a pansy European affectation, and continues to call itself free.

Mental health is invariably political, even if the first available solutions are individual. Anxiety keeps us ready for a fight-or-flight response in a society that has all but outlawed both flight and fight. Today’s anxiety memoirists, in particular, are attached to the understanding of anxiety as a disorder with only a tangential relationship to real-world events. There is a focus on “raising awareness” of the condition, as if awareness by itself were any sort of answer. Those of us who live with anxiety have more awareness than we know what to do with, including of how our own brains can ambush us with the sudden black anticipation of imminent death. Knowing that you might at any point be incapacitated by panic is not a restful prospect for anyone who has been there. On top of your deadlines, your debts, and the crawling curiosity about which will fall apart first, your life plans or western civilization, you now also have to deal with being crazy.

But just because you’re crazy doesn’t mean you’re wrong. The problem is not that a very large number of us are very worried almost all the time; it is, rather, that there are an awful lot of things to worry about. This is true both at the micro level—how will you ever pay off the debts you ran up earning those qualifications you were told you needed for that job you can’t get?—and at the macro level, where we peek through our fingers at the wildfires and war refugees on the news and wonder if it’s worth starting any long books. Constant concern may be unhealthy, but it is not illogical. Of course, I would say that. I have an anxiety disorder.

On the other hand, simply recognizing that your fears are abundantly founded doesn’t make you less unwell—and choosing not to manage your anxiety is hardly an efficient way of enacting social change. No reputable shrink can write you a prescription for the psycho-social symptoms of late capitalism in its current form, and structural solutions for chronic despairing precarity are not available in over-the-counter form. But people still need to get up and go to work in the morning, whether or not it terrifies us. So what are we supposed to do?

The Organizing Cure

Well, we can always go shopping. The cultural response to this ambient panic can be measured in the desperate mood of wish-fulfillment we see in the sprawling market for emotionally stabilizing sloganeering and interior design. There is, for instance, a bewildering number of throw-pillows currently on sale begging for “good vibes only.” The aesthetics of pop culture are washed in a frantic blush-pastel color scheme, all seafoam green and soothing “millennial pink” and bare stripped-down minimalism, as if we’re desperate to decorate our box-room apartments like the inside of a psychiatric ward. And then there’s the pop culture meme that refuses to die—the endless regurgitation on T-shirts and tea towels and pocket pill-cutters of the cheesy Blitz-era slogan “keep calm and carry on,” as if either approach were a remotely appropriate response to the mounting crises that are engulfing the common weal. The last thing any of us needs to do is keep calm.

by Laurie Penny, The Baffler |  Read more:
Image: Shreya Chopra

Tuesday, October 24, 2017


The Circleville Herald, Ohio, June 29, 1956
via:

Ticked Off: What We Get Wrong About Lyme Disease

My sister Camilla and I stepped off the passenger ferry onto the dock at Vineyard Haven, Martha’s Vineyard’s main port, with a group that had already begun their party. They giggled, dragging coolers and beach chairs behind them. We competed to see how many items of Nantucket red we could spot.

Not that we were wearing any. Camilla wore shorts with white long underwear underneath, and I wore beige quick-dry hiking pants. Both of us had on sneakers with long white socks. It was late June, perfect beach weather. The water sparkled. But we weren’t headed toward the ocean. We were there to hunt for ticks.

On the island, we hopped in a cab. Camilla looked longingly out the window as we passed the turns for the town beach and Owens Park Beach. The driver pointed out the location of the famous shark attack beach from Jaws. We drove on south to Manuel Correllus State Forest, an unremarkable park in the center of the island and the farthest point from any beach.

Deer ticks, or blacklegged ticks, are poppy-seed-sized carriers of Lyme disease. We needed to collect 300 before the last ferry returned to Woods Hole, Massachusetts, that night. We each unfurled a drag cloth—a one-meter-square section of once-white corduroy attached to a rope—and began to walk, dragging the cloth slowly behind us as if we were taking it for a stroll. The corduroy patch would rise and fall over the leaves and logs in the landscape, moving like a mouse or a chipmunk scurrying through the leaf litter. Ticks, looking for blood, would attach to the cloth. Every 20 meters, we’d stoop to harvest them.

Tick collecting made it to Popular Science’s 2004 list of worst science jobs alongside landfill monitor and anal wart researcher. On cool days, though, sweeping the forest floor, kneeling to pluck ticks from corduroy ridges, the job became rhythmic. I felt strangely close to the forest. As I soon found out, the work got me closer to people, too.

Sometimes hikers would stop by, curious, then repulsed. They would want to confirm the proper way to pull off ticks (with tweezers planted close to the skin, perpendicularly), or to tell us about their diagnoses. Lyme disease isn’t like many of the diseases studied by my friends in the epidemiology department, where I was a doctoral student. No one talks about their grandmother’s syphilis infection, caused by Treponema pallidum, another spirochete bacterium.

But once people heard what Camilla and I were collecting, stories of brushes with ticks and family members’ diagnoses were shared freely. I quickly became the “tick girl.” When I started my dissertation I was preoccupied by the ecological question: How have humans altered the environment and triggered a disease emergence? By the time I finished, I realized that far more interesting were the rich and revealing tick stories shared with us along the way.

Illness makes us talk. “This is true of all forms of pain and suffering,” Arthur Kleinman, an anthropologist and physician at Harvard University, told me. We talk about illness “to seek assistance, care, and in part to convey feelings about fear, anxiety, or sadness.” In his book, The Illness Narratives, Kleinman writes that “patients order their experience of illness … as personal narratives.” These narratives become a part of the experience of being sick. “The personal narrative does not merely reflect illness experience, but rather it contributes to [it].”

The result is a peculiar togetherness. Once, a friend’s mom emailed that she’d just pulled off her first tick of the season, from her pubic hair: “I’m guessing it doesn’t surprise you to hear, Katie, that you came to mind almost immediately when I discovered the little bugger? I’m afraid that ticks and you will be forever linked in my mind.” Naturally, some took the motif too far. One creepy grad student thought that, because I was standing in front of a tick poster at an academic conference, I’d want to hear about the time he pulled a tick off his dick.

The country singer Brad Paisley romances the tick: “I’d like to see you out in the moonlight / I’d like to kiss you way back in the sticks / I’d like to walk you through a field of wildflowers / And I’d like to check you for ticks.” I’m with Paisley here. Creeps aside, tick grooming is an act of love. My sister and I were diligent in the tick checks we gave ourselves and each other. Most nights, we’d pull off several at the campsite showers. (...)

The idea that the natural and human exist in separate realms is the very “trouble with wilderness,” the environmental historian William Cronon wrote in his 1995 book Uncommon Ground. The wilderness that we’ve feared, romanticized, and valorized over the last few hundred years, he says, is a fantasy:
[Wilderness] is quite profoundly a human creation—indeed, the creation of very particular human cultures at very particular moments in human history … Wilderness hides its unnaturalness behind a mask that is all the more beguiling because it seems so natural. As we gaze into the mirror it holds up for us, we too easily imagine that what we behold is Nature when in fact we see the reflection of our own unexamined longings and desires.

In the stories told by our doctors, our parks, and the CDC, ticks are invaders. To defend ourselves, we use insect repellent, clothing, and prophylactic antibiotics; fences, signs, and pesticides. “When it comes to pesticides, the environmental toxin par excellence, Lyme patients are often its greatest proponents,” writes Abigail Dumes, an anthropologist at the University of Michigan. We prefer the risk posed by pesticides to the fear of Lyme, Dumes explained to me. They let us become actors instead of victims. By dosing ourselves with pesticides (or antibiotics), we gain control of our risks. Ticks, on the other hand, are uncontrollable. “It’s difficult to live with the idea that there are enormous threats and many can’t be controlled,” Kleinman tells me.

The problem is our defensive barriers aren’t working particularly well. Deer ticks are now established across 45 percent of United States counties. Their range has more than doubled in the last 20 years. Reported cases of Lyme disease have more than tripled since 1995 and the CDC estimates that more than 300,000 Americans fall ill each year. The story of tick-as-invader isn’t particularly helpful—or complete.

by Katharine Walter, Nautilus | Read more:
Image: Katharine Walter

Jan Tarasin, Records series 1991
via:

Up The Irony

There’s no limit to the waves of embarrassment that old metal dudes will rain down on their beleaguered, black-clad fans. Every time a hesher bangs his head, some fogey from the first or second gen kills the buzz: here’s Phil Anselmo doing a white power sign like a dickwad, or how about Gene Simmons’s failed bid to trademark the horns? Or, Ted Nugent, still sucking air? It certainly does suck. More recently, Dee Snider, the one-time frontman of the ’80s glam band Twisted Sister, who made an entire career out of appropriating crossdressing, is pissed about the “new” trend of non-metal fans wearing metal tees.

On October 17, Snider tweeted, “Gotta say, this new trend of non-metal fans wearing vintage metal T’s if [sic] pretty sickening. Metal is not ironic! Dicks.” Where the hell do I even begin? Old Man Snider is so completely out of touch with the culture he was once a part of, that he thinks a) the trend of wearing heavy metal shirts ironically is new, and b) that heavy metal itself is not ironic.

During the mid to late ’90s (heavy metal’s dark ages), every high school emo band on earth had at least one kid who donned an ironic Maiden tee. While no one could expect the lead singer of a band that barely classifies as metal to be aware of this particular phenomenon, the completely unavoidable resurgence of heavy metal in the early to mid 2000s (which is the only reason anyone under the age of 50 might give a damn about what Dee Snider thinks) was led by bands like Mastodon and The Sword, whose stock in trade was equal parts sincerity and irony. They showed a level of care and appreciation for old metal that led them to craft intricate, loud compositions that sounded fresh and exciting compared to whichever snooze fest James Mercer was packaging as a Shins album that year. That care, coupled with the fact that what those bands were saying was just fucking funny, made the revival feel timely and welcome. This balance is the only thing that allowed heavy metal to force its way back into any kind of cultural relevance.

Snider followed his first ill-conceived tweet on the subject with this the next day:

“It’s not just the wearing of our metal T’s, it’s their cherry picking of our style #skulls#metalhorns These are OUR symbols; OUR image.”

And that’s where his rant takes the all-too-predictable turn from stupid to problematic. Who is this OUR that Dee Snider imagines? And who is the THEY? The group that Dee Snider fancies himself a part of is a genre of music that wouldn’t exist had white people not stolen the blues and early rock and roll from people of color. Heavy metal is just one step in a long line of music born of that initial theft. The heavy metal that I know and love does not discriminate. It’s inclusive and open-minded, and it must reckon with a past that includes Pantera proudly waving confederate flags and Lemmy sporting a straight-up Nazi uniform for half a century. It’s a heavy metal that struggles to turn a profit and can ill afford to alienate any person who wants to buy a shirt and wear it in public. People are being bombed and gunned down at concerts across the world, and no one is referring to them as pop or country music fans. An old white guy crying cultural appropriation reeks of hypocrisy. It’s time to let go of the ‘who is a true metal fan?’ debate and worry about things that actually fucking matter.

by John Dziuban, The Awl | Read more:
Image: uncredited

Nobody Thinks About eBay

One of those things that so many brands want is scale: eBay is enormous. It has 171 million users, with 1.1 billion listed items at any given time. But it’s also no longer the only game in town. There’s competition from all over, most notably from eBay's great rival to the north, Amazon; Brooklyn-based crafts giant Etsy; and venture-backed consignment sites like The Real Real and Poshmark. Deering may talk of the company’s advancements, but the truth is, eBay has fallen far behind.

It’s dedicated to remaining an online marketplace — nothing more than a platform on which buyers and sellers can interact — a position that’s hard to justify as it’s become less enticing to both kinds of users. It hasn’t invested in warehouses or inventory; it hasn’t introduced competitive shipping programs. It now needs to both differentiate and elevate itself, and then it must communicate all of that to the customer.

These days, 88 percent of postings are “Buy It Now” items, not at all tied to the auction function eBay is known for, and 81 percent of what’s available for sale is new. To eBay, new means unopened, never-used items; this claim is murky, though, as most items are still coming from third-party sellers and not from brands themselves. In fact, eBay has become a haven for flipping, a practice in which users sell in-demand merchandise at exponentially higher prices, further adding to eBay’s sometimes-dubious reputation.

eBay also thinks it’s positioned to acquire Millennial and Gen Z customers who have largely ignored the site. “Younger customers don’t have misperceptions of eBay — they don't have any perceptions,” says Deering. “We’re not even in their awareness at all.”

The company’s research has found that a younger audience wants unique products and “is searching for items that push against conformity.” In this way, Deering believes eBay can be something of a foil to Amazon: “People felt like they were becoming anti-human because Amazon is so habitual, but that isn’t us. If you love Converse, you come to our site because there’s every color, every graffiti-ed version, vintage. You’re not going to get that if you go onto Amazon or into a department store.”

The goal, as eBay’s vice president of merchandising Jay Hanson puts it, is to get customers to think of eBay as the first shopping site they should visit, no matter what they’re looking to buy. But can the once-dominant company actually take that crown from Amazon? Or compete with much nimbler startups that offer white-glove services and more curated and easily navigated shopping experiences? It seems wildly unlikely, but eBay’s determined to try. (...)

In the shadow of competitors big and small, eBay has remained stagnant. Its 2016 net revenue hovered just under $9 billion — a significant figure, but one that's only marginally risen over the last five years. Analysts attribute eBay’s shortcomings to outdated technology and a confounding user experience.

“The eBay site hasn’t really gone through dramatic levels of change since the beginning, and if there was any change, it was subtle and not slick enough,” says Sean Maharaj, a director at global management consulting firm AArete. “The customer interface, the website, it is not on par with some other companies that are coming at this with a new digital strategy. It’s unfriendly and it’s not easy to navigate.”

One of eBay’s main bragging points — that it has 1.1 billion listings — has also become a source of weakness for the company. Organizing that many items is a herculean task, particularly when the bones of the site are now decades old. Greg Portell, a retail partner at global strategy and management consulting firm A.T. Kearney, understands why eBay positioned itself for so long as a marketplace for everything and anything, as opposed to offering a more edited selection of products.

“If you think of successful retailers like T.J. Maxx or Marshall’s, you can justify it,” says Portell. “They aren’t known to be very neat and tidy, and they do extremely well. My guess is eBay was emphasizing this attitude, as opposed to a clean, personalized experience you’d expect from a well-curated store. eBay’s problem now, though, is they are stuck in a middle space where it’s hard to differentiate. You have smaller, niche sites that can curate and provide a less cluttered environment, which is what’s growing right now.”

by Chavie Lieber, Racked |  Read more:
Image: Christie Hemm Klok

Cash Prizes for Bad Corporate Citizenship, Amazon Edition

Everyone in the urban space is busy handicapping the Amazon horserace, to see which city will land Amazon’s HQ2, which promises to be the biggest economic development prize of the 21st century. Amazon’s RFP, issued last week, invites metro areas with a million or more population to submit their entries.

Prominent among them: Show us your incentive packages:
Capital and Operating Costs – A stable and business-friendly environment and tax structure will be high-priority considerations for the Project. Incentives offered by the state/province and local communities to offset initial capital outlay and ongoing operational costs will be significant factors in the decision-making process. 
Incentives – Identify incentive programs available for the Project at the state/province and local levels. Outline the type of incentive (i.e. land, site preparation, tax credits/exemptions, relocation grants, workforce grants, utility incentives/grants, permitting, and fee reductions) and the amount. The initial cost and ongoing cost of doing business are critical decision drivers.
There’s actually very little to add to the speculation about which city has the inside edge. Plenty has been written that makes the most obvious points. Brookings’ Joseph Parilla narrows the list to 20 cities that have the size to accommodate the company. Richard Florida makes a strong case for the top half dozen. The New York Times Upshot has gone so far as to pick a winner (Denver), although their article is actually more helpful for thinking about the winnowing process than handicapping the eventual winner.

A common refrain is that this beauty contest is ultimately revealing as to 21st century corporate decision making factors. While there’s a lot of detail here, the factor that’s going to make the most difference is the availability of talent. When you’re hiring upwards of 50,000 highly trained workers, as we’ve said before, the location decision is going to be made by the HR department. A city has to have a substantial base of talent–especially software engineers–and be a place that can easily attract and accommodate more. Beyond the availability of talent, it’s likely that analysts are reading too much into the criteria laid out in the RFP. The request for proposals was not drawn up to reveal Amazon’s decision criteria. It was drawn up to solicit the maximum number of credible incentive packages.

If you’ve been around the economic development fraternity for long, you’ll know that this is just the latest in a series of similar high profile corporate gambits to generate state and local subsidies. Back in the 1980s, states and cities were throwing themselves at GM’s newly minted Saturn division (remember them?), offering up subsidies for the Microelectronics and Computer Technology Corporation (MCC) and submitting bids to be the home of the Superconducting Super Collider. All three of these supposedly world-changing enterprises have since expired or been absorbed into other organizations.

Amazon–which, after all, makes it its business to know the decision preferences of tens or hundreds of millions of customers–is hardly likely to rely on cities for the information to make its decision. In all likelihood, the company already has in mind a preferred site, or perhaps two. The whole point of this exercise is to improve the company’s bargaining position for the location it wants. (...)

Corporations have choices. They could go about their business, and simply choose the best location, the one that makes the greatest business sense, and invest accordingly. Or they can, as Amazon, GE, and dozens of others have done, go through the ritual of pretending to entertain a wide range of proposals, and use the leverage of competing bids to sweat the best possible deal out of their preferred location. The net result of our current approach is to provide giant cash rewards to those who engage in the most cynical behavior. As a result, while Amazon may turn out to be a winner, it may come at the cost of fiscally impoverishing the city it chooses to locate in. The other losers will be all the businesses against which Amazon competes, who are too small to have the leverage to insist on a comparable level of public subsidy for their similar operations.

by Joe Cortright, City Commentary | Read more:
Image: Amazon
[ed. See also: This Is What Really Happens When Amazon Comes to Your Town.]

Monday, October 23, 2017

What the Washington Post/CBS DEA Investigation Tells You About Congress: It’s Really Bad

Recently, the Washington Post and CBS teamed up on an investigation that has now cost Congressman Tom Marino (R-PA) his nomination as the next Drug Czar. It is an incredible report, deeply sourced, with amazing details on how industry worked with Congress to gut DEA’s ability to prosecute drug trafficking abuses and deepened a horrific opioid epidemic in the U.S.

Many watchers have already latched on to the financial ties between industry and government. However, as troubling as they may be, it appears no laws were broken. No bribes were reported. Campaign contributions appear to conform to the letter of the law. Yet something feels clearly corrupt in this story. Which brings us to why the report is devastating on another front: it is a stunning display of institutional incompetence.

Congress as an institution, in a bipartisan fashion, both professionally and politically, failed. And its failure illustrates by far the most common form of influence in today’s Congress. This was not a case where 535 members of Congress were corrupted by a few thousand dollars of campaign contributions. This was a case where 535 members and their staffs didn’t know any better. This investigation did not uncover a crime; it exposed an institution in decline. Congress, in this instance, was unable to prevent the worsening of a national crisis because it didn’t know what it was doing.

Lobbying Influence


The bill at the heart of the investigation is not a major bill. It is short and obscure. If you read it, you would likely have no clue what it does. And it’s a perfect example of where lobbying has the greatest influence.

Contrary to popular belief, lobbyists often do not have their biggest impact on major legislation. The intense scrutiny major legislation receives and the rich information environment in which it is debated means much more competition for lobbyists trying to affect legislation. Multiple studies illustrate that Congress is not a vending machine: money in does not necessarily equal results out. So while lobbyists have an impact on major legislation, it is not often where the lobbying industry thrives. Instead, complex, low-salience issues are where lobbyists wield the most potent influence.

These minor bills comprise the overwhelming bulk of legislation Congress passes each year. Most of their work is on issues you’ve likely never heard of: removing restrictions on land transfers; improving services for older youth in foster care; increasing helium reserves at hospitals. These obscure and generally uncontroversial bills rarely make national headlines but take up the lion's share of Congress’s time.

This was true -- until a few days ago -- of the law that was the subject of the WaPo/CBS investigation. The bill was obscure when Congress passed it, and it remained so when President Obama signed it into law. Unless you have intimate knowledge of the authorizing statutes for the Drug Enforcement Administration, the internal mechanisms of the Department of Justice, and the Controlled Substances Act, you probably had no clue what this bill did. And that’s the point.

One of the investigation’s key characters is Linden Barber, a former DEA official who was the top lawyer at the Office of Diversion Control, which is charged with litigating abuses from industry and distributors. Barber left government for a better paying job in the private sector. Ultimately, he became a lobbyist who pushed the idea for legislation on Capitol Hill, finding champions for a bill that would weaken his former office’s ability to enforce the law.

The revolving door is nothing new to federal government, and it’s a fact of life on Capitol Hill. Staff and federal officials frequently leave government for the private sector, bringing with them deep expertise and valuable insider knowledge. This gives former government officials critical information advantages and can enable them to maneuver easily on complex issues most people do not understand.

But this was more than just a case of prior experience giving a lobbyist a leg up. This was a situation where expertise in the private sector far outweighed the expertise in either chamber of Congress, laying bare a knowledge gap between the two that many fear is widening.

The way this bill passed illustrates exactly how large - and how dangerous - this knowledge gap has become.

by Joshua C. Huder, Legbranch.com | Read more:

[ed. The Washington Post does no one any favors by installing a hard paywall (not even themselves, I'd suspect, even if they are making money). It'd be nice if Jeff Bezos' bottom line were a little more nuanced (especially for issues affecting society in general, as the Guardian and NY Times have attempted to do), but it's not. Bezos, and his company Amazon, apparently have no problem using loss leaders if they ultimately result in cornering whatever business category they're focused on, but that doesn't seem to apply to the Washington Post. There's a lot of good journalism being produced there these days and the company could be a lot more influential if it wanted to be, but it isn't. Because... subscriptions. Hard paywall. Here's another link to the article in question: maybe it'll work, maybe it won't. Probably not, and if so, you'll never know what this story is about.]

Why Those Looking for the Next Crisis May be Looking in the Wrong Places

Despite, or more accurately, because so many markets are at high levels, often on thin trading volumes, many investors are edgy. Even though markets famously climb a wall of worry, I can’t recall a time when there have been so many skeptical long investors.

For instance, even though the famed FAANG stocks keep racing to even loftier levels, a US stock market crash would be unlikely to do a lot of damage. Unlike the 1929 crash, this rally isn’t fueled mainly by money borrowed from banks. And unlike the dot-com bust, speculative stocks are not being used as a form of payment. Recall that companies that should have known better, such as Lucent (this BTW was Carly Fiorina’s doing) and McKinsey, were taking equity instead of cash as consideration for services. Informed insiders say McKinsey had to write off $200 million of stock it took in lieu of fees; the actual number may be higher given that McKinsey could have discounted its fees. That practice was sufficiently widespread to give the dot-com crash a tad more sting than it might otherwise have had. Even so, there was no blowback to the payment system, and the early 2000s recession was not terrible by historical standards.

This is far from a complete list, but investors are worried about ETFs, Deutsche Bank, festering banking problems in Italy, and China’s debts, as well as a longer than usual list of exogenous risks, including nasty events resulting from increasing hostilities with North Korea, Russia, and Iran, perhaps a nuclear disaster resulting from wild weather, and further down the road, a disorderly Brexit doing more damage to Europe and its not-so-solid banks.

The reason this situation is so striking is that historically, crises that did real damage hurt financial institutions. In the Great Depression, banks all over the world failed, wiping out depositors’ funds, big chunks of the payment system, and the resulting downdraft correctly made the survivors too fearful to lend. In the US, a lot of traditional lending has been displaced by securitization, so investors taking losses or simply getting nervous could damage credit creation.

If one were to step back, and this is hardly a novel thought, the root of investor nervousness is the sustained and extreme intervention by central banks all around the world in financial markets. No one in 2008 would have thought it conceivable that less than a decade later, one quarter of the world economy would have set negative policy interest rates. Even though markets only occasionally pay attention to fundamentals, sustained super low interest rates, by design, have sent asset prices of all kinds into nosebleed territory.

The Fed seemed to be the first to recognize that its monetary experiments had done little for the real economy, save allow for some additional spending via mortgage refis. It had done more to transfer income and wealth to the top 1%, and even more so to the top 0.1%, and enrich banks, all of which are hindrances to long-term growth. Yet Bernanke announced his intention to taper in 2013, and how far has the Fed gotten in getting back to normalcy? The answer is not very. And that’s because central bankers fear that their policies are asymmetrical: they can do more to dampen activity by increasing rates than they can to spur growth by lowering them. As we’ve repeatedly pointed out, businessmen do not go out and expand because money is on sale. They expand when they see commercial opportunity. The exception is in industries where the cost of money is one of the biggest costs of production…such as in financial services and levered speculation.

However, from what I can tell, the Fed’s desire to raise rates is driven by its perception that it needs to have short term rates meaningfully higher, as in 2% or higher, so as to have room for cuts if the banking system gets wobbly. That is why it keeps treating a flaccid but less terrible than in the past labor market as robust.

But the potentially more interesting contradiction is in the posture of conservative businessmen. Higher interest rates will hurt their stock portfolios and the value of their homes. It will also hurt fracking, which is very dependent on borrowed money. Yet Republicans are more eager than Democrats to raise interest rates, apparently out of the misguided belief that low interest rates help labor, as opposed to capital (the Fed’s using the state of the labor market as its indicator as to whether to increase interest rates or not no doubt feeds this belief). Similarly, Republicans are far more exercised about the size of the Fed’s balance sheet and want it smaller. Again, there’s no logical reason for this move. The Fed’s assets will liquidate over time. They may not do much additional good sitting there (save the remittance payments back to the Treasury), but they aren’t doing any harm either.

In other words, the varying views about what to do about central bank interest rates and their holdings in many, too many, cases have to do with political aesthetics that often run counter to economic interests. A big reason that conservatives don’t like the Fed’s big balance sheet, even though the Fed is the stalwart friend of banks and investors, is that they still see the Fed as government, and government intervening in the economy offends them, even when it might benefit them. (Mind you, this is not the same as business exploiting government via “public private partnerships” or other approaches where commercial interests have their hand on the steering wheel). (...)

Now you might ask, how does this relate to the original question, that market mavens might be looking for the next crisis in all the wrong places?

The first is that despite widespread worries about a crisis, you don’t need to have a crisis to have a bubble deflate. In the runup to 2008, I expected the unwind of the reckless lending spree to look like that of Japan’s. Japan’s joint commercial and residential real estate bubbles were much larger relative to GDP than those in the US. Yet instead of a dramatic bust, the economy contracted like a car with no wheels banging down a steep slope. A mini-crisis of sorts did occur in 1997, when the authorities made the mistake of thinking the economy was strong enough to take some tightening, which kicked off a series of financial firm failures. So even if it turns out things do end badly, you can have the real economy suffer without having the financial system have a heart attack.

The second is that with some significant exceptions like Deutsche Bank, the authorities have succeeded in moving risk out of the financial system and more and more onto the backs of investors. That means the rich, but it also means pension funds, insurance companies, endowments, foundations, and sovereign wealth funds. Investors have already taken a hit via super low interest rates; economist Ed Kane estimated that in the US alone, that represented a $300 billion per annum subsidy to banks.

So even if we were to have something crisis-like, as in a sudden ratchet down in asset prices that stuck, it isn’t clear that the damage to critical financial plumbing would be significant.

by Yves Smith, Naked Capitalism |  Read more:
Image: Getty via:

Lembrou Canela
via: