Saturday, January 6, 2018


Mick and Keith
via:

What “Affordable Housing” Really Means

When people — specifically market urbanists versus regulation fans — argue about housing affordability on the internet, it seems to me that the two groups are using the concept of "affordable" in different ways.
  1. In one usage, the goal of improving affordability is to make it possible for more people to share in the economic dynamism of a growing, high-income city like Seattle.
  2. In the other usage, the goal of improving affordability is to reduce (or slow the rise of) average rents in an economically dynamic, high-income city like Seattle.
These are both things that a reasonable person could be interested in. But since they are different things, different policies will affect them differently.

The first definition is what market urbanists are talking about. I live in a neighborhood of Washington, DC, that's walkable to much of the central business district, has good transit assets, and though predominantly poor in the very recent past has now become expensive (i.e., it's gentrifying).

If the city changed the zoning to allow for denser construction, the number of housing units available in the neighborhood would increase and thus (essentially by definition) the number of people who are able to afford to live there would go up.

What's not entirely clear is whether a development boom would reduce prices in the neighborhood. I think it's pretty clear that on some scale, "more supply equals lower prices" is true. The extra residents don't materialize out of thin air, after all, so there must be somewhere that demand is eased as a result of the increased development.

But skeptics are correct to note that the actual geography of the price impact is going to depend on a huge array of factors and there are no guarantees here. In particular, there's no guarantee that incumbent low-income residents will be more able to stay in place under a high-development regime than a low-development one.

To accomplish the goals of (2), you really do need regulation — either traditional rent control or some newfangled inclusionary zoning or what have you.

But — critically — (2) doesn't accomplish (1). If you're concerned that we are locking millions of Americans out of economic opportunity by making it impossible for thriving, high-wage metro areas to grow their housing stock rapidly, then simply reducing the pace of rent increases in those areas won't do anything to help. Indeed, there's some possibility that it might hurt by further constraining overall housing supply.

by Matthew Yglesias, Vox | Read more:
Image: Shutterstock

Our Cloud-Centric Future Depends on Open Source Chips

When a Doctor First Handed Me Opioids

On a sunny September morning in 2012, my wife and I returned to our apartment from walking our eldest daughter to her first day of kindergarten. When we entered our home, in the Washington, DC, suburb of Greenbelt, Maryland, I immediately felt that something was off. My Xbox 360 and PlayStation 3 were missing.

My wife ran to the bedroom, where drawers were open, clothing haphazardly strewn about. It was less than a minute before a wave of terror washed over me: My work backpack was gone. Inside that bag were notebooks and my ID for getting into work at NBC News Radio, where I was an editor. But the most important item in my life was in that bag: my prescription bottle of Oxycodone tablets.

“I can’t believe this happened to us,” my wife said.

“They took my pills,” I said.

We repeated those lines to each other over and over, my wife slowly growing annoyed with me. Why didn’t I feel the same sense of violation? Why wasn’t I more upset about the break-in? Oh, but I was. Because they took my pills. The game consoles, the few dollars, and the cheap jewelry they stole would all be replaced. But my pills! They took my fucking pills!

We had to call the police. Not because of the break-in but, rather, so I could have a police report to show my doctor. That was all I could think about. My pills.

How did I become this person? How did I get to a place where the most important thing in my life was a round, white pill of opiate pleasure?
***
Before 2010, I had taken opiates only a few times. In 2007, I went to the emergency room in my hometown of Cleveland, Ohio, because I could not stop vomiting from abdominal pain. Upon my discharge, I was given 15 Percocets, 5 milligrams each. I took them as prescribed, noticed that they made me feel happy, and never gave them another thought.

After I took a reporter job in Orlando, I began to get sick more frequently, requiring several visits to the ER for abdominal pain and vomiting. In September of 2008 I was diagnosed with Crohn’s, an inflammatory bowel disease, and put on a powerful chemotherapy drug called Remicade to quell the symptoms. My primary care doctor, knowing I was in pain, prescribed me Percocet every month. I took them as needed, or whenever I needed a pick-me-up at work. I shared a few with a coworker from time to time. We’d take them, and 20 minutes later, start giggling at each other. I never totally ran out—never took them that often. I never needed an early refill.

In March of 2010, I was hired as the news director of a radio station in Madison, Wisconsin. Before we moved, my doctor in Orlando wrote me a Percocet script for 90 pills to bridge the gap until my new insurance kicked in in Wisconsin—approximately three months' worth. I went through them in four weeks. I spent about a week feeling like I had the flu and then recovered, never once realizing that I was experiencing opiate withdrawal for the first time. Soon after, I set up my primary and GI care with my new insurance, and went back to my one-to-two-pills-per-day Percocet prescription, along with a continuation of my Remicade treatment.

Two months later, while my wife and daughter were visiting family in Cleveland, I developed concerning symptoms. My joints were swollen, I couldn’t bend my elbows, I was dizzy. I went to the ER, where for two days the doctors performed all sorts of tests as my symptoms worsened. Eventually, the rheumatologist diagnosed me with drug-induced lupus from the Remicade. I was prescribed 60 Percocets upon leaving the hospital.

When I went back to my GI doc four weeks later for a refill, he told me he was uncomfortable prescribing pain medication, so he referred me to a pain clinic. I told the physician there how I would get cramps, sharp pains that would sometimes lead to vomiting. Did it hurt when I drove over bumps, or when bending over? Yes, sometimes. I left with a prescription for Oxycodone: one pill every three to four hours. My initial script was for 120 pills. I felt like I hit the jackpot.

by Anonymous, Mother Jones |  Read more:
Image: PeopleImages/Getty

Friday, January 5, 2018

Nine-Enders

You’re Most Likely to Do Something Extreme Right Before You Turn 30... or 40, or 50, or 60...

Red Hong Yi ran her first marathon when she was 29 years old. Jeremy Medding ran his when he was 39. Cindy Bishop ran her first marathon at age 49, Andy Morozovsky at age 59.

All four of them were what the social psychologists Adam Alter and Hal Hershfield call “nine-enders,” people in the last year of a life decade. They each pushed themselves to do something at ages 29, 39, 49, and 59 that they didn’t do, didn’t even consider, at ages 28, 38, 48, and 58—and didn’t do again when they turned 30, 40, 50, or 60.

Of all the axioms describing how life works, few are sturdier than this: Timing is everything. Our lives present a never-ending stream of “when” decisions—when to schedule a class, change careers, get serious about a person or a project, or train for a grueling footrace. Yet most of our choices emanate from a steamy bog of intuition and guesswork. Timing, we believe, is an art.

In fact, timing is a science. For example, researchers have shown that time of day explains about 20 percent of the variance in human performance on cognitive tasks. Anesthesia errors in hospitals are four times more likely at 3 p.m. than at 9 a.m. Schoolchildren who take standardized tests in the afternoon score considerably lower than those who take the same tests in the morning; researchers have found that for every hour after 8 a.m. that Danish public-school students take a test, the effect on their scores is equivalent to missing two weeks of school.

Other researchers have found that we use “temporal landmarks” to wipe away previous bad behavior and make a fresh start, which is why you’re more likely to go to the gym in the month following your birthday than the month before.  (...)

For example, to run a marathon, participants must register with race organizers and include their age. Alter and Hershfield found that nine-enders are overrepresented among first-time marathoners by a whopping 48 percent. Across the entire lifespan, the age at which people were most likely to run their first marathon was 29. Twenty-nine-year-olds were about twice as likely to run a marathon as 28-year-olds or 30-year-olds.

Meanwhile, first-time marathon participation declines in the early 40s but spikes dramatically at age 49. Someone who’s 49 is about three times more likely to run a marathon than someone who’s just a year older.

What’s more, nearing the end of a decade seems to quicken a runner’s pace—or at least motivates them to train harder. People who had run multiple marathons posted better times at ages 29 and 39 than during the two years before or after those ages.

The energizing effect of the end of a decade doesn’t make logical sense to the marathon-running scientist Morozovsky. “Keeping track of our age? The Earth doesn’t care. But people do, because we have short lives. We keep track to see how we’re doing,” he told me. “I wanted to accomplish this physical challenge before I hit 60. I just did.” For Yi, the artist, the sight of that chronological mile marker roused her motivation. “As I was approaching the big three-o, I had to really achieve something in my 29th year,” she said. “I didn’t want that last year just to slip by.”

However, flipping life’s odometer to a nine doesn’t always trigger healthy behavior. Alter and Hershfield also discovered that “the suicide rate was higher among nine-enders than among people whose ages ended in any other digit.” So, apparently, was the propensity of men to cheat on their wives. On the extramarital-affair website Ashley Madison, nearly one in eight men were 29, 39, 49, or 59, about 18 percent higher than chance would predict.
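The Ashley Madison figures are easy to sanity-check. Under a naive assumption (mine, not the article's) that ages are spread evenly across terminal digits, "chance" would predict about one man in ten being a nine-ender; the two quoted numbers imply the study's baseline was close to that:

```python
# Back-of-the-envelope check of the nine-ender figures, assuming a
# roughly uniform distribution of terminal age digits (an assumption
# of this sketch, not a claim from the article).
observed = 1 / 8   # "nearly one in eight men were 29, 39, 49, or 59"
excess = 0.18      # "about 18 percent higher than chance would predict"

implied_baseline = observed / (1 + excess)
print(f"implied chance baseline: {implied_baseline:.3f}")  # ~0.106
```

The implied baseline of roughly 10.6 percent sits near the naive one-in-ten, so the quoted figures are internally consistent.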

“People are more apt to evaluate their lives as a chronological decade ends than they are at other times,” Alter and Hershfield explain. “Nine-enders are particularly preoccupied with aging and meaningfulness, which is linked to a rise in behaviors that suggest a search for or crisis of meaning.”

by Daniel H. Pink, The Atlantic |  Read more:
Image: Mike Segar, Reuters

Politics 101

Thursday, January 4, 2018


Cy Twombly, Untitled 1970
via:

The Rise and Fall of the Blog

New York Times writer Nicholas Kristof was one of the first to start blogging for one of the most well-known media companies in the world. Yet on December 8th, he declared his blog was being shut down, writing, “we’ve decided that the world has moved on from blogs—so this is the last post here.”

The death knell of blogs might seem surprising to anyone who was around during their heyday. Back in 2008, Daniel W. Drezner and Henry Farrell wrote in Public Choice, “Blogs appear to be a staple of political commentary, legal analysis, celebrity gossip, and high school angst.” A Mother Jones writer who “flat out declared, ‘I hate blogs’…also admitted, ‘I gorge myself on these hundreds of pieces of commentary like so much candy.'”

Blogs exploded in popularity fast. According to Drezner and Farrell, in 1999 there were an estimated 50 blogs dotted around the internet. By 2007, a blog tracker estimated there were around seventy million. Yet a popular question today is whether blogs still have any relevance. A quick Google search will yield the suggested results “are blogs still relevant 2016,” “are blogs still relevant 2017,” and “is blogging dead.”

In 2007, the blogosphere may have been crowded, but it was undeniably influential. Blogs were credited with playing a pivotal role in campaign tactics, removing a Mississippi inmate from death row, impeding the sales of arms to Hugo Chavez’s regime, and spurring several other twists and turns in important national events.

Of course, power is in the eye of the beholder, and blogs used to be seen as a powerful indicator of public opinion by the people in power. As Drezner and Farrell put it in their 2008 article, “there is strong evidence that politicians perceive that blogs are a powerful force in American politics. The top five political blogs attract a combined 1.5 million unique visits per day, suggesting that they have far more readers than established opinion magazines such as the New Republic, American Prospect, and Weekly Standard combined.”

Today, writers lament the irrelevance of blogs not just because there are too many of them, but because not enough people are engaging with even the more popular ones. Blogs are still important to those invested in their specific subjects, but not to a more general audience, which is more likely to turn to Twitter or Facebook for a quick news fix or take on current events.

Explains author Gina Bianchini as she advises not starting a blog, “2017 is a very different world than 2007. Today is noisier and people’s attention spans shorter than any other time in history…and things are only getting worse. Facebook counts a ‘view’ as 1.7 seconds and we have 84,600 of those in a day. Your new blog isn’t equipped to compete in this new attention-deficit-disorder Thunderdome.”
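As an aside, the seconds figure in that quote appears to be a transposition: a day holds 86,400 seconds, not 84,600, which also caps how many 1.7-second "views" can fit in one. A quick check:

```python
# Checking the arithmetic behind the quote (its 84,600 looks like a
# transposition of the actual number of seconds in a day).
seconds_per_day = 24 * 60 * 60
max_views = seconds_per_day / 1.7   # 1.7-second Facebook "views"

print(seconds_per_day)    # 86400
print(round(max_views))   # 50824
```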

by Farah Mohammed, JSTOR |  Read more:
Image: iStock
[ed. Maybe people blog for reasons other than simple metrics.]

Raw Water

Step aside, Juicero—and hold my “raw” water.

Last year, Silicon Valley entrepreneur Doug Evans brought us the Juicero machine, a $400 gadget designed solely to squeeze eight ounces of liquid from proprietary bags of fruits and vegetables, which went for $5 to $8 apiece. Though the cold-pressed juice company initially wrung millions from investors, its profits ran dry last fall after journalists at Bloomberg revealed that the pricey pouch-pressing machine was, in fact, unnecessary. The journalists simply squeezed juice out of the bags by hand.

But this didn’t crush Evans. He immediately plunged into a new—and yet somehow even more dubious—beverage trend: “raw” water.

The term refers to unfiltered, untreated, unsterilized water collected from natural springs. In the ten days following Juicero’s collapse, Evans underwent a cleanse, drinking only raw water from a company called Live Water, according to The New York Times. “I haven’t tasted tap water in a long time,” he told the Times. And Evans isn’t alone; he’s a prominent member of a growing movement to “get off the water grid,” the paper reports.

Members are taking up the unrefined drink due to both concern for the quality of tap water and the perceived benefits of drinking water in a natural state. Raw water enthusiasts are wary of the potential for contaminants in municipal water, such as traces of unfilterable pharmaceuticals and lead from plumbing. Some are concerned by harmless additives in tap water, such as disinfectants and fluoride, which effectively reduces tooth decay. Moreover, many believe that drinking “living” water that’s organically laden with minerals, bacteria, and other “natural” compounds has health benefits, such as boosting “energy” and “peacefulness.”

Mukhande Singh (né Christopher Sanborn), founder of Live Water, told the Times that tap water was “dead” water. “Tap water? You’re drinking toilet water with birth control drugs in them,” he said. “Chloramine, and on top of that they’re putting in fluoride. Call me a conspiracy theorist, but it’s a mind-control drug that has no benefit to our dental health.” (Note: There is plenty of data showing that fluoride improves dental health, but none showing water-based mind control.)

by Beth Mole, Ars Technica |  Read more:
Image: Live Water

Wednesday, January 3, 2018


photo: markk
repost

Flip of a Coin

Before antidepressants became mainstream, drugs that treated various symptoms of depression were depicted as “tonics which could ease people through the ups and downs of normal, everyday existence,” write Jeffrey Lacasse, a Florida State University professor specializing in psychiatric medications, and Jonathan Leo, a professor of anatomy at Lincoln Memorial University, in a 2007 paper on the history of the chemical imbalance theory.

In the 1950s, Bayer marketed Butisol (a barbiturate) as “the ‘daytime sedative’ for everyday emotional stress”; in the 1970s, Roche advertised Valium (diazepam) as a treatment for the “unremitting buildup of everyday emotional stress resulting in disabling tension.”

Both the narrative and the use of drugs to treat symptoms of depression transformed after Prozac—the brand name for fluoxetine—was released. “Prozac was unique when it came out in terms of side effects compared to the antidepressants available at the time (tricyclic antidepressants and monoamine oxidase inhibitors),” Anthony Rothschild, psychiatry professor at the University of Massachusetts Medical School, writes in an email. “It was the first of the newer antidepressants with less side effects.”

Even the minimum therapeutic dose of commonly prescribed tricyclics like amitriptyline (Elavil) could cause intolerable side effects, says Hyman. “Also these drugs were potentially lethal in overdose, which terrified prescribers.” The market for early antidepressants, as a result, was small.

Prozac changed everything. It was the first major success in the selective serotonin reuptake inhibitor (SSRI) class of drugs, designed to target serotonin, a neurotransmitter. It was followed by many more SSRIs, which came to dominate the antidepressant market. The variety affords choice, which means that anyone who experiences a problematic side effect from one drug can simply opt for another. (Each antidepressant causes variable and unpredictable side effects in some patients. Deciding which antidepressant to prescribe to which patient has been described as a “flip of a coin.”)

Rothschild notes that all existing antidepressants have similar efficacy. “No drug today is more efficacious than the very first antidepressants such as the tricyclic imipramine,” agrees Hyman. Three decades since Prozac arrived, there are many more antidepressant options, but no improvement in efficacy of treatment.

Meanwhile, as Lacasse and Leo note in a 2005 paper, manufacturers typically marketed these drugs with references to chemical imbalances in the brain. For example, a 2001 television ad for sertraline (another SSRI) said, “While the causes are unknown, depression may be related to an imbalance of natural chemicals between nerve cells in the brain. Prescription Zoloft works to correct this imbalance.”

Another advertisement, this one in 2005, for the drug paroxetine, said, “With continued treatment, Paxil can help restore the balance of serotonin,” a neurotransmitter.

“[T]he serotonin hypothesis is typically presented as a collective scientific belief,” write Lacasse and Leo, though, as they note: “There is not a single peer-reviewed article that can be accurately cited to directly support claims of serotonin deficiency in any mental disorder, while there are many articles that present counterevidence.”

Despite the lack of evidence, the theory has saturated society. In their 2007 paper, Lacasse and Leo point to dozens of articles in mainstream publications that refer to chemical imbalances as the unquestioned cause of depression. One New York Times article on Joseph Schildkraut, the psychiatrist who first put forward the theory in 1965, states that his hypothesis “proved to be right.” When Lacasse and Leo asked the reporter for evidence to support this unfounded claim, they did not get a response. A decade on, there are still dozens of articles published every month in which depression is unquestionably described as the result of a chemical imbalance, and many people explain their own symptoms by referring to the myth.

Meanwhile, 30 years after Prozac was released, rates of depression are higher than ever.
* * *
Hyman responds succinctly when I ask him to discuss the causes of depression: “No one has a clue,” he says.

There’s not “an iota of direct evidence” for the theory that a chemical imbalance causes depression, Hyman adds. Early papers that put forward the chemical imbalance theory did so only tentatively, but, “the world quickly forgot their cautions,” he says.

Depression, according to current studies, has an estimated heritability of around 37%, so genetics and biology certainly play a significant role. Brain activity corresponds with experiences of depression, just as it corresponds with all mental experiences. This, says Horwitz, “has been known for thousands of years.” Beyond that, knowledge is precarious. “Neuroscientists don’t have a good way of separating when brains are functioning normally or abnormally,” says Horwitz.

If depression were a simple matter of adjusting serotonin levels, SSRIs should work immediately, rather than taking weeks to have an effect. And reducing serotonin levels in the brain should create a state of depression, yet research has found that this isn't the case. One drug, tianeptine (a non-SSRI sold under the brand names Stablon and Coaxil across Europe, South America, and Asia, though not the UK or US), has the opposite effect of most antidepressants and decreases levels of serotonin.

This doesn’t mean that antidepressants that affect levels of serotonin definitively don’t work—it simply means that we don’t know if they’re affecting the root cause of depression. A drug’s effect on serotonin could be a relatively inconsequential side effect, rather than the crucial treatment.

by Olivia Goldhill, Quartz |  Read more:
Image: Reuters/Lucy Nicholson
[ed. See also: Sometimes Depression Means Not Feeling Anything At All]

The Biggest Secret

My Life as a New York Times Reporter in the Shadow of the War on Terror

There's no press room at CIA headquarters, like there is at the White House. The agency doesn’t hand out press passes that let reporters walk the halls, the way they do at the Pentagon. It doesn’t hold regular press briefings, as the State Department has under most administrations. The one advantage that reporters covering the CIA have is time. Compared to other major beats in Washington, the CIA generates relatively few daily stories. You have more time to dig, more time to meet people and develop sources.

I started covering the CIA in 1995. The Cold War was over, the CIA was downsizing, and CIA officer Aldrich Ames had just been unmasked as a Russian spy. A whole generation of senior CIA officials was leaving Langley. Many wanted to talk.

I was the first reporter many of them had ever met. As they emerged from their insular lives at the CIA, they had little concept of what information would be considered newsworthy. So I decided to show more patience with sources than I ever had before. I had to learn to listen and let them talk about whatever interested them. They had fascinating stories to tell.

In addition to their experiences in spy operations, many had been involved in providing intelligence support at presidential summit meetings, treaty negotiations, and other official international conferences. I realized that these former CIA officers had been backstage at some of the most historic events over the last few decades and thus had a unique and hidden perspective on what had happened behind the scenes in American foreign policy. I began to think of these CIA officers like the title characters in Tom Stoppard’s play “Rosencrantz and Guildenstern Are Dead,” in which Stoppard reimagines “Hamlet” from the viewpoint of two minor characters who fatalistically watch Shakespeare’s play from the wings. (...)

Success as a reporter on the CIA beat inevitably meant finding out government secrets, and that meant plunging headlong into the classified side of Washington, which had its own strange dynamics.

I discovered that there was, in effect, a marketplace of secrets in Washington, in which White House officials and other current and former bureaucrats, contractors, members of Congress, their staffers, and journalists all traded information. This informal black market helped keep the national security apparatus running smoothly, limiting nasty surprises for all involved. The revelation that this secretive subculture existed, and that it allowed a reporter to glimpse the government’s dark side, was jarring. It felt a bit like being in the Matrix.

Once it became known that you were covering this shadowy world, sources would sometimes appear in mysterious ways. In one case, I received an anonymous phone call from someone with highly sensitive information who had read other stories I had written. The information from this new source was very detailed and valuable, but the person refused to reveal her identity and simply said she would call back. The source called back several days later with even more information, and after several calls, I was able to convince her to call at a regular time so I would be prepared to talk. For the next few months, she called once every week at the exact same time and always with new information. Because I didn’t know who the source was, I had to be cautious with the information and never used any of it in stories unless I could corroborate it with other sources. But everything the source told me checked out. Then after a few months, she abruptly stopped calling. I never heard from her again, and I never learned her identity. (...)

Disclosures of confidential information to the press were generally tolerated as facts of life in this secret subculture. The media acted as a safety valve, letting insiders vent by leaking. The smartest officials realized that leaks to the press often helped them, bringing fresh eyes to stale internal debates. And the fact that the press was there, waiting for leaks, lent some discipline to the system. A top CIA official once told me that his rule of thumb for whether a covert operation should be approved was, “How will this look on the front page of the New York Times?” If it would look bad, don’t do it. Of course, his rule of thumb was often ignored.

For decades, official Washington did next to nothing to stop leaks. The CIA or some other agency would feign outrage over the publication of a story it didn’t like. Officials launched leak investigations but only went through the motions before abandoning each case. It was a charade that both government officials and reporters understood. (...)

One reason that officials didn’t want to conduct aggressive leak investigations was that they regularly engaged in quiet negotiations with the press to try to stop the publication of sensitive national security stories. Government officials seemed to understand that a get-tough approach to leaks might lead to the breakdown of this informal arrangement. (...)

That spring, just as the U.S.-led invasion of Iraq began, I called the CIA for comment on a story about a harebrained CIA operation to turn over nuclear blueprints to Iran. The idea was that the CIA would give the Iranians flawed blueprints, and Tehran would use them to build a bomb that would turn out to be a dud.

The problem was with the execution of the secret plan. The CIA had taken Russian nuclear blueprints it had obtained from a defector and then had American scientists riddle them with flaws. The CIA then asked another Russian to approach the Iranians. He was supposed to pretend to be trying to sell the documents to the highest bidder.

But the design flaws in the blueprints were obvious. The Russian who was supposed to hand them over feared that the Iranians would quickly recognize the errors, and that he would be in trouble. To protect himself when he dropped off the documents at an Iranian mission in Vienna, he included a letter warning that the designs had problems. So the Iranians received the nuclear blueprints and were also warned to look for the embedded flaws.

Several CIA officials believed that the operation had either been mismanaged or at least failed to achieve its goals. By May 2003, I confirmed the story through a number of sources, wrote up a draft, and called the CIA public affairs office for comment.

Instead of responding to me, the White House immediately called Washington Bureau Chief Jill Abramson and demanded a meeting.

The next day, Abramson and I went to the West Wing of the White House to meet with National Security Adviser Condoleezza Rice. In her office, just down the hall from the Oval Office, we sat across from Rice and George Tenet, the CIA director, along with two of their aides.

Rice stared straight at me. I had received information so sensitive that I had an obligation to forget about the story, destroy my notes, and never make another phone call to discuss the matter with anyone, she said. She told Abramson and me that the New York Times should never publish the story. (...)

In the spring of 2004, just as the Plame case was heating up and starting to change the dynamics between the government and the press, I met with a source who told me cryptically that there was something really big and really secret going on inside the government. It was the biggest secret the source had ever heard. But it was something the source was too nervous to discuss with me. A new fear of aggressive leak investigations was filtering down. I decided to stay in touch with the source and raise the issue again.

Over the next few months, I met with the source repeatedly, but the person never seemed willing to divulge what the two of us had begun to refer to as “the biggest secret.” Finally, in the late summer of 2004, as I was leaving a meeting with the source, I said I had to know what the secret was. Suddenly, as we were standing at the source’s front door, everything spilled out. Over the course of about 10 minutes, the source provided a detailed outline of the NSA’s massive post-9/11 domestic spying program, which I later learned was code-named Stellar Wind.

The source told me that the NSA had been wiretapping Americans without search warrants, without court approval. The NSA was also collecting the phone and email records of millions of Americans. The operation had been authorized by the president. The Bush administration was engaged in a massive domestic spying program that was probably illegal and unconstitutional, and only a handful of carefully selected people in the government knew about it.

I left that meeting shocked, but as a reporter, I was also elated. I knew that this was the story of a lifetime.

by James Risen, The Intercept |  Read more:
Image: Elise Swain/Getty. Virginia Lozano for The Intercept
[ed. Must read account of collusion and coercion between the US government and media.]

Tuesday, January 2, 2018

When Mega-Cities Go Global

SAN FRANCISCO - Well before anyone thought of this place as the center of the tech economy, the Bay Area built ships. And it did so with the help of many parts of the country.

Douglas fir trees logged in the Pacific Northwest were turned into lumber schooners here. Steel from the East, brought in by railroad, became merchant vessels. During World War II, workers assembled military ships with parts from across the country: steam turbines from Schenectady, N.Y., and Lester, Pa.; gear winches from Tacoma, Wash.; radio equipment from Newark; compasses from Detroit; generators from Milwaukee.

Most of these links that tied the Bay Area’s prosperity to a web of places far from here have faded. Westinghouse closed the Pennsylvania plant. General Electric downsized in Schenectady. The Milwaukee manufacturer dissolved. The old Bethlehem Shipbuilding yard in San Francisco will soon be redeveloped. And its former parent company, the Bethlehem Steel Corporation in Bethlehem, Pa., went bankrupt in 2001.

The companies that now drive the Bay Area’s soaring wealth — and that represent part of the American economy that’s booming — don’t need these communities in the same way. Google’s digital products don’t have a physical supply chain. Facebook doesn’t have dispersed manufacturers. Apple, which does make tangible things, now primarily makes them overseas.

A changing economy has been good to the region, and to a number of other predominantly coastal metros like New York, Boston and Seattle. But economists and geographers are now questioning what the nature of their success means for the rest of the country. What happens to America’s manufacturing heartland when Silicon Valley turns to China? Where do former mill and mining towns fit in when big cities shift to digital work? How does upstate New York benefit when New York City increases business with Tokyo?

The answers have social and political implications at a time when broad swaths of the country feel alienated from and resentful of “elite” cities that appear from a distance to have gone unscathed by the forces hollowing out smaller communities. To the extent that many Americans believe they’re disconnected from the prosperity in these major metros — even as they use the apps and services created there — perhaps they’re right.

“These types of urban economies need other major urban economies more than they need the standardized production economies of other cities in their country,” said Saskia Sassen, a sociologist at Columbia who has long studied the global cities that occupy interdependent nodes in the world economy. New York, in other words, needs London. But what about Bethlehem, Pa.?

Such a picture, Ms. Sassen said, “breaks a past pattern where a range of smaller, more provincial cities actually fed the rise of the major cities.” Now major cities are feeding one another, and doing so across the globe.

Ram Mudambi, a professor in the Fox School of Business at Temple University, offers an even more unnerving hypothesis, in two parts: The more globally connected a city, the more prosperous it is. And as such cities gain global ties, they may be shedding local ones to the “hinterland” communities that have lost their roles in the modern economy or lost their jobs to other countries.

Richard Longworth, a distinguished fellow with the Chicago Council on Global Affairs, fears that exactly this is happening in Chicago. The metropolitan area long sat at the center of a network of economic links crisscrossing the Midwest. They connected Chicago to Wisconsin mill towns that sent their lumber there, Iowa farmers who supplied the city’s meatpackers, Michigan ice houses that emerged along the railroads transporting that meat to New York.

“These links have been broken,” Mr. Longworth said. Of course, some remain. And antipathy toward prosperous big cities is not a new theme in history. “But this is different: This is deeper,” Mr. Longworth said. “It is also, as far as we can see, permanent, simply because the economy that supported the earlier relationships has gone away and shows no sign of coming back.”

The Rise of Global Cities


For much of the 20th century, wages in poorer parts of the country were rising faster than wages in richer places. Their differences were narrowing, a product of migration between the two and gains from manufacturing that helped lift up regions that were once deeply poor. Then around 1980, according to work by the Princeton researcher Elisa Giannone, that convergence began to stall.

Cities full of highly educated workers like Boston, San Francisco and New York began to pull away. And that pattern, Ms. Giannone finds, has been driven entirely by what’s happening with high-skilled workers: When they cluster together in these places, their wages rise even more. That widens inequality both within wealthy cities and between wealthy regions and poorer ones.

“Big changes have been happening over the last 30 years,” Ms. Giannone said. “Now we’re actually seeing the impact of them.”

Those changes have come from multiple directions — from globalization, from computerization, from the shift in the United States away from manufacturing toward a knowledge and service economy. These trends have buffeted many smaller cities and nonurban areas. The uncomfortable political truth is that they’ve also benefited places like San Francisco and New York.

“The economic base has shifted in a way that highly favors cities — and big cities — because it’s now based on knowledge, on idea exchange, on agglomeration,” said Mark Muro, the policy director of the Metropolitan Policy Program at the Brookings Institution. (...)

For all of the talk of how globalization has cost America manufacturing jobs, it has created American jobs, too — but the high-paying ones have tended to go to such cities.

Ms. Sassen argues that a global economy has created new kinds of needs for companies: accountants specializing in Asian tax law, lawyers expert in European Union regulation, marketers who understand Latin America. Global cities must connect to other global cities to tap these resources, which have become more valuable to them than lumber and steel.

by Emily Badger, NY Times |  Read more:
Image: Todd Heisler/New York Times

AmpliTube 4


[ed. I've been out of electric guitar world for a while, but this is insane. A complete engineering studio. See also: Jon Herington (Steely Dan) demo-ing his signature sounds on AmpliTube.]

Steely Dan

Just When You Thought Democrats Couldn't Get Any More Oblivious...

Theoretically, the left/liberal opposition party should have a lot to offer voters at the moment. After all, the country is presently being run by a cartoon of an evil billionaire, whose stated objective is to make his rich friends richer while eliminating regulations on predatory financial services companies, employers who injure and exploit their employees, and nursing homes that kill their patients. Most people do not support Donald Trump’s agenda: the majority believe that the government ought to guarantee people healthcare coverage and that corporations should not receive a huge tax cut, but Trump’s two major policy pushes have been for the elimination of the government’s role in health care and the reduction of corporate taxes.

Yet somehow, amidst what should be an important political opportunity for the left, the Democratic Party has just received its lowest public approval rating in 25 years of polling. Ratings have been dropping throughout the year and are especially poor among young people. That poses a puzzle, because millennials are actually more liberal than ever, with a greater number now preferring socialism to capitalism. If they despise the Republican agenda, why aren’t they all proud Democrats?

We can get some clues to the answer from Bill Scher’s Politico essay “The Case For a Generic Democrat,” which nicely encapsulates the Democratic obliviousness that is so harming the party’s electoral fortunes. Scher makes the case that the Democrats should be as flavorless and insipid as possible, with no real values beyond platitudes and no real policies beyond opposing Republicans. As far as I can tell, he is quite serious about this. And the fact that there are Democrats who think this way tells us a lot about what is going wrong.

Scher says that Democrats “have been embroiled in a debate over how to fix what went wrong in 2016. Should they tack left or center? Woo white working-class voters with an ambitious economic agenda or double down on the base by blitzing Donald Trump on bigotry? Prioritize health care? Inequality? Oligarchy? Democracy?” But the victory of Doug Jones in Alabama, Scher says, “may have just rendered these debates irrelevant.” After all, Doug Jones did not really do any of this: he didn’t swing to the left or the right. “In fact, he didn’t have any signature policy proposals at all.” Instead, he took “the most pallid Democratic talking points… and campaigned with a pleasant, inoffensive demeanor. He was boring. He was safe. He was Mr. Generic Democrat. And it worked. That should make Democrats think twice about what they should be looking for in a 2020 presidential nominee.” Scher says that Democrats should beware of boldness, because it risks “polarization.” The Democratic candidate should be like Jones and draw as little attention to themselves as possible. Scher points out that polling match-ups between a Generic Democrat and Donald Trump show the Generic Democrat winning. He says that the strategy of being nothing more than a party cipher worked well for Warren Harding in 1920, though Scher admits that Harding is now almost universally regarded as one of the worst presidents in American history. And Scher gives suggestions for candidates who would be the top of the list under his strategy. Number one? Tim Kaine. (...)

If your opponent is discredited by a scandal, running on the platform “I am not discredited by a scandal” may well secure you just enough votes to win. If Trump’s scandals were enough to sink him, the Democrats wouldn’t have to do much to get into office. But we have already seen that Trump’s scandals aren’t enough to sink him. A slew of women accused him of sexual assault in the lead-up to 2016, and Trump got millions more votes than squeaky-clean Mitt Romney. This “I am not my crazy opponent” pitch was exactly what Hillary Clinton ran on in 2016, and it got Donald Trump elected president. Bernie Sanders would have beaten Donald Trump, yet Scher wants to avoid “boldness” and haul Tim Kaine out to lose yet again. (Note that the logic seems to lead inevitably to choosing a white guy, like Gore or Kerry. A person of color might, after all, be “polarizing.”) (...)

I’ve suggested before that it’s a bad idea for Democrats to adopt traditionally Republican rhetoric for reasons of political opportunism. First, and most importantly, it undermines the whole point of left politics: we’re supposed to actually stand for left values, not whatever values are most useful to taunt Republicans with. We should therefore be sincere and consistent in refusing to enter the “Who Loves America More?” patriotism contest, and not adjusting our level of confidence in prosecutors’ integrity based on whether they happen to be prosecuting people we dislike. What disgusts me about Bruni and Scher is that their kind of Democratic politics has no serious underlying principles. In responding to the question “What Should Democrats Stand For?” Scher’s only consideration is what will get Democrats to office. He doesn’t care what they actually do when they’re in office (which is why he likes Warren G. Harding), whether they propose any actual policies or demonstrate any knowledge of how to accomplish anything that will improve human lives. Politics is nothing more than a contest for a few more seats in the legislature, and if the best way to get those is to abandon every hint of a strong moral conviction, well, so much the worse for your moral convictions.

But this kind of thinking is not just unprincipled, it’s also bad strategy even on its own terms. Democrats have run no shortage of boring candidates who sound like Republicans. The “willfully uninspiring” approach to electoral politics seems to have been official party policy for the last eight years, and it has cost Democrats both a lot of state governments and a lot of Congressional seats. The party has failed to recognize the most basic truths about contemporary America: a lot of people are going through unnecessary economic hardships, and the party of the Working People has ceased to represent their interests. Just look at this recent Washington Post article about workers who were laid off from a McDonnell Douglas plant in Tulsa when it closed in 1994. Today, they are well into old age, but many of them are still working, whether as Walmart greeters or Dollar Tree cashiers. (Dollar stores are prospering at the moment because for some unfathomable reason millennials seem to do a lot of their shopping at them.) The Post discusses how the decline of pension plans has meant that many workers now face the prospect of remaining employed well into their final years of life, never retiring, never paying off their mortgages.

What does the Democratic Party have to offer these people? What is it proposing to do to fix this? Even the reworked “populist” messaging the party tried out after 2016 did little more than emphasize “jobs.” But people have jobs, that’s the problem. Unemployment is actually low at the moment, the problem is that many people’s jobs suck, and that they are exhausted and hopeless and debt-ridden. The country needs its pension plans back, but that will require an incredible amount of ambition, since corporations are hardly going to do it willingly. The refusal to be “bold” is also a refusal to actually try to make life better for people.

There is a meme circulating among liberals at the moment that sums up the problem well. It encourages everyone on the left who dislikes the Democratic Party to suck it up and vote for them anyway:

Dear liberals and independents: In 2020 there will be a candidate competing against Donald Trump. It is very likely this candidate (1) isn’t your first choice (2) isn’t 100% ideologically pure (3) has made mistakes in their life (4) might not really excite you all that much (5) has ideas you are uncomfortable with. Please start getting over that shit now…

I like this because it admits that it’s very unlikely the Democrats will nominate someone who is inspiring and who people are actually comfortable voting for. We’ve given up on the possibility before the race has even begun, we’re getting a head start on compromising everything we’re fighting for. (The party’s informal slogan could be “You’ll Eat It And You’ll Like It.”)

by Nathan J. Robinson, Current Affairs |  Read more:
Image: Nick Sirotich

Making China Great Again

For years, China’s leaders predicted that a time would come—perhaps midway through this century—when it could project its own values abroad. In the age of “America First,” that time has come far sooner than expected.

Barack Obama’s foreign policy was characterized as leading from behind. Trump’s doctrine may come to be understood as retreating from the front. Trump has severed American commitments that he considers risky, costly, or politically unappealing. In his first week in office, he tried to ban travellers from seven Muslim-majority countries, arguing that they pose a terrorist threat. (After court battles, a version of the ban took effect in December.) He announced his intention to withdraw the U.S. from the Paris Agreement on climate change and from UNESCO, and he abandoned United Nations talks on migration. He has said that he might renege on the Iran nuclear deal, a free-trade agreement with South Korea, and NAFTA. His proposal for the 2018 budget would cut foreign assistance by forty-two per cent, or $11.5 billion, and it reduces American funding for development projects, such as those financed by the World Bank. In December, Trump threatened to cut off aid to any country that supports a resolution condemning his decision to recognize Jerusalem as the capital of Israel. (The next day, in defiance of Trump’s threat, the resolution passed overwhelmingly.)

To frame his vision of a smaller presence abroad, Trump often portrays America’s urgent task as one of survival. As he put it during the campaign, “At what point do you say, ‘Hey, we have to take care of ourselves’? So, you know, I know the outer world exists and I’ll be very cognizant of that, but, at the same time, our country is disintegrating.”

So far, Trump has proposed reducing U.S. contributions to the U.N. by forty per cent, and pressured the General Assembly to cut six hundred million dollars from its peacekeeping budget. In his first speech to the U.N., in September, Trump ignored its collective spirit and celebrated sovereignty above all, saying, “As President of the United States, I will always put America first, just like you, as the leaders of your countries, will always and should always put your countries first.”

China’s approach is more ambitious. In recent years, it has taken steps to accrue national power on a scale that no country has attempted since the Cold War, by increasing its investments in the types of assets that established American authority in the previous century: foreign aid, overseas security, foreign influence, and the most advanced new technologies, such as artificial intelligence. It has become one of the leading contributors to the U.N.’s budget and to its peacekeeping force, and it has joined talks to address global problems such as terrorism, piracy, and nuclear proliferation.

And China has embarked on history’s most expensive foreign infrastructure plan. Under the Belt and Road Initiative, it is building bridges, railways, and ports in Asia, Africa, and beyond. If the initiative’s cost reaches a trillion dollars, as predicted, it will be more than seven times that of the Marshall Plan, which the U.S. launched in 1947, spending a hundred and thirty billion, in today’s dollars, on rebuilding postwar Europe.

China is also seizing immediate opportunities presented by Trump. Days before the T.P.P. withdrawal, President Xi Jinping spoke at the World Economic Forum, in Davos, Switzerland, a first for a paramount Chinese leader. Xi reiterated his support for the Paris climate deal and compared protectionism to “locking oneself in a dark room.” He said, “No one will emerge as a winner in a trade war.” This was an ironic performance—for decades, China has relied on protectionism—but Trump provided an irresistible opening. China is negotiating with at least sixteen countries to form the Regional Comprehensive Economic Partnership, a free-trade zone that excludes the United States, which it proposed in 2012 as a response to the T.P.P. If the deal is signed next year, as projected, it will create the world’s largest trade bloc, by population.

Some of China’s growing sway is unseen by the public. In October, the World Trade Organization convened ministers from nearly forty countries in Marrakech, Morocco, for the kind of routine diplomatic session that updates rules on trade in agriculture and seafood. The Trump Administration, which has been critical of the W.T.O., sent an official who delivered a speech and departed early. “For two days of meetings, there were no Americans,” a former U.S. official told me. “And the Chinese were going into every session and chortling about how they were now guarantors of the trading system.”

By setting more of the world’s rules, China hopes to “break the Western moral advantage,” which identifies “good and bad” political systems, as Li Ziguo, at the China Institute of International Studies, has said. (...)

Xi Jinping has the kind of Presidency that Donald Trump might prefer. Last fall, he started his second term with more unobstructed power than any Chinese leader since Deng Xiaoping, who died in 1997. The Nineteenth Party Congress, held in October, had the spirit of a coronation, in which the Party declared Xi the “core leader,” an honor conferred only three other times since the founding of the nation (on Mao Zedong, Deng, and Jiang Zemin), and added “Xi Jinping Thought” to its constitution—effectively allowing him to hold power for life, if he chooses. He enjoys total dominion over the media: at the formal unveiling of his new Politburo, the Party barred Western news organizations that it finds troublesome; when Xi appeared on front pages across the country, his visage was a thing of perfection, airbrushed by Party “news workers” to the sheen of a summer peach.

For decades, China avoided directly challenging America’s primacy in the global order, instead pursuing a strategy that Deng, in 1990, called “hide your strength and bide your time.” But Xi, in his speech to the Party Congress, declared the dawn of “a new era,” one in which China moves “closer to center stage.” He presented China as “a new option for other countries,” calling this alternative to Western democracy the zhongguo fang’an, the “Chinese solution.” (...)

When Trump won, the Party “was in a kind of shock,” Michael Pillsbury, a former Pentagon aide and the author of “The Hundred-Year Marathon,” a 2015 account of China’s global ambitions, told me. “They feared that he was their mortal enemy.” The leadership drafted potential strategies for retaliation, including threatening American companies in China and withholding investment from the districts of influential members of Congress.

Most of all, they studied Trump. Kevin Rudd, the former Prime Minister of Australia, who is in contact with leaders in Beijing, told me, “Since the Chinese were stunned that Trump was elected, they were intrinsically respectful of how he could’ve achieved it. An entire battery of think tanks was set to work, to analyze how this had occurred and how Trump had negotiated his way through to prevail.”

Before he entered the White House, China started assembling a playbook for dealing with him. Shen Dingli, a foreign-affairs specialist at Fudan University, in Shanghai, explained that Trump is “very similar to Deng Xiaoping,” the pragmatic Party boss who opened China to economic reform. “Deng Xiaoping said, ‘Whatever can make China good is a good “ism.” ’ He doesn’t care if it’s capitalism. For Trump, it’s all about jobs,” Shen said.

The first test came less than a month after the election, when Trump took a call from Taiwan’s President, Tsai Ing-wen. “Xi Jinping was angry,” Shen said. “But Xi Jinping made a great effort not to create a war of words.” Instead, a few weeks later, Xi revealed a powerful new intercontinental ballistic missile. “It sends a message: I have this—what do you want to do?” Shen said. “Meantime, he sends Jack Ma”—the founder of the e-commerce giant Alibaba—“to meet with Trump in New York, offering one million jobs through Alibaba.” Shen went on, “China knows Trump can be unpredictable, so we have weapons to make him predictable, to contain him. He would trade Taiwan for jobs.”

Inside the new White House, there were two competing strategies on China. One, promoted by Stephen Bannon, then the chief strategist, wanted the President to take a hard line, even at the risk of a trade war. Bannon often described China as a “civilizational challenge.” The other view was associated with Jared Kushner, Trump’s son-in-law and adviser, who had received guidance from Henry Kissinger and met repeatedly with the Chinese Ambassador, Cui Tiankai. Kushner argued for a close, collegial bond between Xi and Trump, and he prevailed.

He and Rex Tillerson, the Secretary of State, arranged for Trump and Xi to meet at Mar-a-Lago on April 7th, for a cordial get-to-know-you summit. To set the tone, Trump presented two of Kushner and Ivanka Trump’s children, Arabella and Joseph, who sang “Jasmine Flower,” a classic Chinese ballad, and recited poetry. While Xi was at the resort, the Chinese government approved three trademark applications from Ivanka’s company, clearing the way for her to sell jewelry, handbags, and spa services in China.

Kushner has faced scrutiny for potential conflicts of interest arising from his China diplomacy and his family’s businesses. During the transition, Kushner dined with Chinese business executives while the Kushner Companies was seeking their investment in a Manhattan property. (After that was revealed in news reports, the firm ended the talks.) In May, Kushner’s sister, Nicole Kushner Meyer, was found to have mentioned his White House position while she courted investors during a trip to China. The Kushner Companies apologized.

During the Mar-a-Lago meetings, Chinese officials noticed that, on some of China’s most sensitive issues, Trump did not know enough to push back. “Trump is taking what Xi Jinping says at face value—on Tibet, Taiwan, North Korea,” Daniel Russel, who was, until March, the Assistant Secretary of State for East Asian and Pacific Affairs, told me. “That was a big lesson for them.” Afterward, Trump conceded to the Wall Street Journal how little he understood about China’s relationship to North Korea: “After listening for ten minutes, I realized it’s not so easy.”

Russel spoke to Chinese officials after the Mar-a-Lago visit. “The Chinese felt like they had Trump’s number,” he said. “Yes, there is this random, unpredictable Ouija-board quality to him that worries them, and they have to brace for some problems, but, fundamentally, what they said was ‘He’s a paper tiger.’ ”

by Evan Osnos, New Yorker |  Read more:
Image: Paul Rogers

Monday, January 1, 2018

Legal Weed Isn’t The Boon Small Businesses Thought It Would Be

The business of selling legal weed is big and getting bigger. North Americans spent $6.7 billion on legal cannabis last year, and some analysts think that with California set to open recreational dispensaries on Jan. 1 and Massachusetts and Canada soon to follow, the market could expand to more than $20.2 billion by 2021. So it’s no surprise that you see eager business people across the country lining up to invest millions of dollars in this green rush.

But here’s a word of warning for those looking to dive head-first into these brand-new legal weed markets: The data behind the first four years of legal pot sales, with drops in retail prices and an increase in well-funded cannabis growing operations, shows a market that increasingly favors big businesses with deep pockets. As legal weed keeps expanding, pot prices are likely to continue to decline, making the odds of running a profitable small pot farm even longer.

Washington offers a cautionary tale for would-be pot producers. The state’s marijuana market, for which detailed information is available to the public, has seen consistent price declines, production consolidating in larger farms, and a competitive marketplace that has forced cannabis processors to shell out for sophisticated technology to create brand new ways to get high.

“A lot of people (in Washington) are surprised, and a lot of people are in denial about the price dropping,” said Steven Davenport, a researcher with the RAND Corporation. “The average price per gram in Washington is about $8, and it’s not clear where the floor is going to be.” (...)

Consolidating Cannabis Farming

When Washington’s regulators set up their market for legal cannabis, they created three tiers of pot producers based on the square footage of each farm. License different sizes of farms, the thinking went, and the market will support a range of small, medium and large producers.

Fast-forward three years, and it appears this thinking was flawed. Big recreational producers have swallowed up most of the market, pushing out the small-scale growers of the black and medical markets. From January through September of this year, the 10 largest farms in Washington harvested 16.79 percent of all the dry weight weed grown in the state, which is more than the share produced by the 500 smallest farms combined (13.12 percent).

Davenport said this consolidation of cannabis farming in Washington is just a harbinger of what’s to come. “I think what has become more clear is the inevitability of pretty large-scale production, and that is really going to start to drive down production costs,” Davenport said.

Current regulations keep pot farms from infinitely expanding, but as legalization marches forward, bigger farms could well be permitted. This summer, regulators in Washington expanded the maximum farm size from 30,000 square feet to 90,000. California plans on capping farms at 1 acre, or 43,560 square feet, when the market first launches. But the state rules do not currently stop farmers from using multiple licenses, which opens the door for larger farms.

What would happen if pot farms could be as large as wheat or corn fields? According to Jonathan Caulkins, a drug-policy researcher at Carnegie Mellon University, 10 reasonably sized farms could conceivably produce the entire country’s supply of tetrahydrocannabinol, pot’s most famous active chemical (usually shortened to THC).

“You can grow all of the THC consumed in the entire country on less than 10,000 acres,” Caulkins said. “A common size for a Midwest farm is 1,000 acres.”

by Lester Black, FiveThirtyEight |  Read more:
Image: Gilles Mingasson