Wednesday, February 24, 2016

The Madness of Airline Élite Status

When you fly a lot for work, as I do, you check your frequent-flier mile balance often, to provide data for competitive commiseration. “Eighteen flights this year already, fourteen hotel nights in eleven different hotels,” a friend e-mailed me, in victory, earlier this month. You also compulsively track your frequent-flier “status” levels, to mark your progress toward becoming a trusty in the prison of weekly air travel. And so, last month, when my United Airlines app told me that my status—as a customer, as a flier, as a man—had changed, I did a delighted double take. United had made me a member of Global Services, its apotheosis of a frequent flier. But even as I tried to remember the advertised perks (free tickets? free back rubs?), I was beginning to sense some symptoms.

My status was good only for 2016, which meant that I would be relegated to a lower level if I didn’t keep up the pace of ticket purchases. So, not twenty minutes after achieving my new status, I found myself calling the Global Services help desk and asking how much it would cost to change a frequent-flier award ticket to a bought one. (Global Services veterans had warned me never to lose the chance to “earn” miles, and instead to use frequent-flier points for other people’s flights.) I then asked my wife for permission to spend five hundred and sixty dollars for a flight that I already had a free ticket for. She told me I was insane. But I wasn’t insane. I knew others similarly afflicted. I had Global Services Maintenance Anxiety Disorder.

GS-MAD afflicts only a small sliver of the frequent-flying élite. As a precondition, you have to be extremely loyal to United, either because you have a soft spot for incessantly played “Rhapsody in Blue” (and I like a Gershwin tune, how about you?) or, more probably, because the airline has a hub near your home. You also have to fly a lot. Global Services is a level above another status tier, Premier 1K, that requires you to fly an annual cumulative distance equal, more or less, to four times the circumference of the earth. With Premier 1K and the Platinum, Gold, and Silver MileagePlus status levels, you can track your progress with each flight. It’s a logical system of inputs and outputs, like dieting, except instead of being rewarded for skipping a fudge nut sundae, you’re credited for flying to Peru. But the diabolical marketing genius of Global Services is that, as St. Paul said of grace, it cannot be earned by works. It is a gift. And God, in this case, is an algorithm of United Airlines.

Absent posted guidelines, road-warrior message boards are filled with speculation about why certain travellers receive Global Services. Is it a measure of dollars spent? Segments flown? Behavior? Maybe United is watching us all, and you weren’t elevated because someone noticed you wiping Doritos dust off your fingers on the armrest in 17C. Maybe United is reading this essay. Maybe by writing this I am committing an unpardonable sin, akin to a Scientologist mafia underboss penning a memoir. Or maybe United will be pleased by the publicity and invite me to an even more secretive status level—Solar System Services, here I come.

by Gary Sernovitz, New Yorker |  Read more:
Image: Kent Nishimura, Denver Post via Getty

The Next Justice? It’s Not Up to Us

[ed. See also: Senate Republicans Lose Their Minds on a Supreme Court Seat]

No sooner was Antonin Scalia dead than Republicans said that his seat should not be filled before the election of a new president. Senator Mitch McConnell said this will let the American people “have a voice” in who the new justice will be. Senator Kelly Ayotte said “Americans deserve an opportunity to weigh in” on the matter. And Senator Ted Cruz, the presidential candidate, Senate Judiciary Committee member, and self-styled guardian of the Constitution, wrote on Twitter, “We owe it to him, [Scalia] & the Nation, for the Senate to ensure that the next President names his replacement.” That is, we owe it to the archetypal originalist, where the Constitution is concerned, to ignore and defy the original Constitution.

One thing the framers of the Constitution set out to prevent was a popular say in who should be a Supreme Court justice. The aim of the document was to ensure there would be an independent judiciary—independent of Congress (by ensuring justices’ salaries), independent of changing administrations (by granting them life tenure), and not subject to popular election. This ideal could not be perfectly reached, and changes in the Constitution have made it even harder to attain. But those who profess an absolute devotion to the Constitution should at least pay it some lip service.

If the framers wanted to let the people “have a say” and “weigh in,” they would have made the appointment or confirmation of the justices come from the one directly democratic part of the system—the popularly elected and short-termed members of the House of Representatives, a body that was designed to read the pulse of the people in a direct and frequent way. Instead, they gave the choice of justices a double baffle of insulation from the public. The president alone has the appointment power—and remember that the president was originally not elected directly by the people but indirectly through electors. Then a second filter was provided by confirmation in the Senate—and the Senate was originally not directly elected but chosen indirectly by state legislatures. The Senate was meant to be a more stable body than the House, its members serving terms that are three times as long and only a third of them up for reelection at a time—not the whole body, as in the House. The Senate was meant to assure other nations that treaties (confirmed by the Senate) and other commitments would be honored for more than a day.

Of course, senators became popularly elected in 1913, by the Seventeenth Amendment. But originalists should at least remember that senators were given their confirmation power because they were not subject to continuing popular approval. An extra fillip of irony is provided now, since some of the conservatives who want to let the people “have a say” in who becomes a justice—including Ted Cruz!—have recently called for revocation of the Seventeenth Amendment, so the people would not have a say in who becomes a senator.

by Garry Wills, NY Review of Books |  Read more:
Image: Honoré Daumier

Tuesday, February 23, 2016

The Echo-Friendly

How a Pink Flower Defeated the World’s Sole Superpower

In October 2001, the U.S. launched its invasion of Afghanistan largely through proxy Afghan fighters with the help of Special Operations forces, American air power, and CIA dollars. The results were swift and stunning. The Taliban was whipped, a new government headed by Hamid Karzai soon installed in Kabul, and the country declared “liberated.”

More than 14 years later, how’d it go? What’s “liberated” Afghanistan like and, if you were making a list, what would be the accomplishments of Washington all these years later? Hmm... at this very moment, according to the latest reports, the Taliban control more territory than at any moment since December 2001. Meanwhile, the Afghan security forces that the U.S. built up and funded to the tune of more than $65 billion are experiencing “unsustainable” casualties, their ranks evidently filled with “ghost” soldiers and policemen -- up to 40% in some places -- whose salaries, often paid by the U.S., are being pocketed by their commanders and other officials. In 2015, according to the U.N., Afghan civilian casualties were, for the seventh year in a row, at record levels. Add to all this the fact that American soldiers, their “combat mission” officially concluded in 2014, are now being sent by the hundreds back into the fray (along with the U.S. Air Force) to support hard-pressed Afghan troops in a situation which seems to be fast “deteriorating.”

Oh, and economically speaking, how did the “reconstruction” of the country work out, given that Washington pumped more money (in real dollars) into Afghanistan in these years than it did into the rebuilding of Western Europe after World War II? Leaving aside the pit of official corruption into which many of those dollars disappeared, the country is today hemorrhaging desperate young people who can’t find jobs or make a living and now constitute what may be the second largest contingent of refugees heading for Europe.

As for that list of Washington’s accomplishments, it might be accurate to say that only one thing was “liberated” in Afghanistan over the last 14-plus years and that was, as TomDispatch regular Alfred McCoy points out today, the opium poppy. It might also be said that, with the opium trade now fully embedded in both the operations of the Afghan government and of the Taliban, Washington’s single and singular accomplishment in all its years there has been to oversee the country’s transformation into the planet’s number one narco-state. McCoy, who began his career in the Vietnam War era by writing The Politics of Heroin, a now-classic book on the CIA and the heroin trade (that the Agency tried to suppress) and who has written on the subject of drugs and Afghanistan before for this site, now offers a truly monumental look at opium and the U.S. from the moment this country’s first Afghan War began in 1979 to late last night.


How a Pink Flower Defeated the World’s Sole Superpower
America’s Opium War in Afghanistan
By Alfred W. McCoy

After fighting the longest war in its history, the United States stands at the brink of defeat in Afghanistan. How can this be possible? How could the world’s sole superpower have battled continuously for 15 years, deploying 100,000 of its finest troops, sacrificing the lives of 2,200 of those soldiers, spending more than a trillion dollars on its military operations, lavishing a record hundred billion more on “nation-building” and “reconstruction,” helping raise, fund, equip, and train an army of 350,000 Afghan allies, and still not be able to pacify one of the world’s most impoverished nations? So dismal is the prospect for stability in Afghanistan in 2016 that the Obama White House has recently cancelled a planned further withdrawal of its forces and will leave an estimated 10,000 troops in the country indefinitely.

Were you to cut through the Gordian knot of complexity that is the Afghan War, you would find that in the American failure there lies the greatest policy paradox of the century: Washington’s massive military juggernaut has been stopped dead in its steel tracks by a pink flower, the opium poppy.

by Alfred W. McCoy, TomDispatch |  Read more:
Image: Wikipedia

Happy All the Time

As biometric tracking takes over the modern workplace, the old game of labor surveillance is finding new forms.

Housed in a triumph of architectural transparency in Cambridge, Massachusetts, is the Media Lab complex at MIT, a global hub of human-machine research. From the outside of its newest construction, you can see clear through the building. Inside are open workspaces, glittering glass walls, and screens, all encouragement for researchers to peek in on one another. Everybody always gets to observe everybody else.

Here, computational social scientist Alex Pentland, known in the tech world as the godfather of wearables, directs a team that has created technology applied in Google Glass, smart watches, and other electronic or computerized devices you can wear or strap to your person. In Pentland’s quest to reshape society by tracking human behavior with software algorithms, he has discovered you don’t need to look through a glass window to find out what a person is up to. A wearable device can trace subliminal signals in a person’s tone of voice, body language, and interactions. From a distance, you can monitor not only movements and habits; you can begin to surmise thoughts and motivations.

In the mid-2000s Pentland invented the sociometric badge, which looks like an ID card and tracks and analyzes the wearer’s interactions, behavior patterns, and productivity. It became immediately clear that the technology would appeal to those interested in a more hierarchical kind of oversight than that enjoyed by the gurus of MIT’s high-tech playgrounds. In 2010 Pentland cofounded Humanyze, a company that offers employers the chance to find out how employee behavior affects their business. It works like this: A badge hanging from your neck embedded with microphones, accelerometers, infrared sensors, and a Bluetooth connection collects data every sixteen milliseconds, tracking such matters as how far you lean back in your chair, how often you participate in meetings, and what kind of conversationalist you are. Each day, four gigabytes’ worth of information about your office behavior is compiled and analyzed by Humanyze. This data, which then is delivered to your supervisor, reveals patterns that supposedly correlate with employee productivity.

Humanyze CEO Ben Waber, a former student of Pentland’s, has claimed to take his cues from the world of sports, where “smart clothes” are used to measure the mechanics of a pitcher’s throw or the launch of a skater’s leap. He is determined to usher in a new era of “Moneyball for business,” a nod to baseball executive Billy Beane, whose data-driven approach gave his team, the Oakland Athletics, a competitive edge. With fine-grained biological data points, Waber promises to show how top office performers behave—what happy, productive workers do.

Bank of America hired Humanyze to use sociometric badges to study activity at the bank’s call centers, which employ more than ten thousand souls in the United States alone. By scrutinizing how workers communicated with one another during breaks, analysts came to the conclusion that allowing people to break together, rather than in shifts, reduced stress. This was indicated by voice patterns picked up by the badge, processed by the technology, and reported on an analyst’s screen. Employees grew happier. Turnover decreased.

The executives at Humanyze emphasize that minute behavior monitoring keeps people content. So far, the company has focused on loaning the badges to clients for limited study periods, but as Humanyze scales up, corporate customers may soon be able to use their own in-house analysts and deploy the badges around the clock.

Workers of the world can be happy all the time.

The optimists’ claim: technologies that monitor every possible dimension of biological activity can create faster, safer, and more efficient workplaces, full of employees whose behavior can be altered in accordance with company goals.

Widespread implementation is already underway. Tesco employees stock shelves with greater speed when they wear armbands that register their rate of activity. Military squad leaders are able to drill soldiers toward peak performance with the use of skin patches that measure vital signs. On Wall Street, experiments are ongoing to monitor the hormones of stock traders, the better to encourage profitable trades. According to cloud-computing company Rackspace, which conducted a survey in 2013 of four thousand people in the United States and United Kingdom, 6 percent of businesses provide wearable devices for workers. A third of the respondents expressed readiness to wear such devices, which are most commonly wrist- or head-mounted, if requested to do so.

Biological scrutiny is destined to expand far beyond on-the-job performance. Workers of the future may look forward to pre-employment genetic testing, allowing a business to sort potential employees based on disposition toward anything from post-traumatic stress disorder to altitude sickness. Wellness programs will give employers reams of information on exercise habits, tobacco use, cholesterol levels, blood pressure, and body mass index. Even the monitoring of brain signals may become an office commonplace: at IBM, researchers bankrolled by the military are working on functional magnetic-resonance imaging, or fMRI, a technology that can render certain brain activities into composite images, turning thoughts into fuzzy external pictures. Such technology is already being used in business to divine customer preferences and detect lies. In 2006 a San Diego start-up called No Lie MRI expressed plans to begin marketing the brain-scanning technology to employers, highlighting its usefulness for employee screening. And in Japan, researchers at ATR Computational Neuroscience Laboratories have a dream-reading device in the pipeline that they claim can predict what a person visualizes during sleep. Ryan Hurd, who serves on the board of the International Association for the Study of Dreams, says such conditioning could be used to enhance performance. While unconscious, athletes could continue to practice; creative types could boost their imaginations.

The masterminds at Humanyze have grasped a fundamental truth about surveillance: a person watched is a person transformed. The man who invented the panopticon—a circular building with a central inspection tower that has a view of everything around it—gleaned this, too. But contrary to most discussions of the “all-seeing place,” the idea was conceived not for the prison, but for the factory.

by Lynn Stuart Parramore, Lapham's Quarterly |  Read more:
Image: Edgar Degas

Yohji Yamamoto Fall 2016

The Seven Habits of Highly Depolarizing People

I didn’t vote for him but he’s my President, and I hope he does a good job.
John Wayne (b. 1907) on the election of John F. Kennedy in 1960


I hope he fails.
Rush Limbaugh (b. 1951) on the election of Barack Obama in 2008


In recent decades, we Americans have become highly practiced in the skills and mental habits of demonizing our political opponents. All our instruments agree that we currently do political polarization very well, and researchers tell us that we’re getting better at it all the time.

For example, Stanford Professor Shanto Iyengar and his colleagues recently found that, when it comes both to trusting other people with your money and evaluating the scholarship applications of high school seniors, Americans today are less friendly to people in the other political party than we are to people of a different race. The researchers conclude that “Americans increasingly dislike people and groups on the other side of the political divide and face no social repercussions for the open expression of these attitudes.” As a result, today “the level of partisan animus in the American public exceeds racial animus.”1 That’s saying something!

But if polarization is all around us, familiar as an old coat, what about its opposite? What would depolarization look and sound like? Would we know it if we saw it, in others or in ourselves? Perhaps most importantly, what are the mental habits that encourage it?

We’re confronted with an irony here. We Americans didn’t necessarily think our way into political polarization, but we’ll likely have to think our way out. A number of big structural and social trends—including the end of the Cold War, the rising importance of cultural issues in our politics, growing secularization, greater racial and ethnic diversity, the shift from the Greatest Generation to Baby Boomers as the nation’s dominant elites, the break-up of the old media system, the increasing ideological coherence of both of our two main political parties, among others—appear to have helped produce our current predicament.

Yet over time, the intellectual habits encouraged by these underlying shifts developed a life and autonomy of their own. They became “baked in,” ultimately forming a new popular wisdom regarding how we judge what is true and decide what is right in public life. The intellectual habits of polarization include binary (Manichaean) thinking, absolutizing one’s preferred values, viewing uncertainty as a weakness, privileging deductive thinking, assuming that one’s opponents are motivated by bad faith, and hesitating to agree on basic facts and the meaning of evidence.

What are the antidotes to these familiar habits? We can recognize the mindset of the polarizer, but how does the depolarizer understand conflict and try to make sense of the world? Here is an attempt to answer these questions, by way of proposing the seven habits of highly depolarizing people.

by David Blankenhorn, American Interest |  Read more:
Image: Shutterstock

Monday, February 22, 2016

A New Breed of Trader on Wall Street: Coders With a Ph.D.

[ed. This hardly seems like news these days what with HFT (high frequency trading) and other technological distortions in the market, but whatever. As always, buyer beware. See also: Good times for Exchange-Traded Funds.]

The mood in the markets may be getting grimmer, but in the booming world of exchange-traded funds, people just want to party.

And so it was last month at the $2.8 trillion industry’s annual jamboree in South Florida, where 2,200 investment advisers and fund salesmen came together for three days of hard drinking and product pitching. Against a backdrop of New Orleans jazz bands and poolside schmooze-fests — some call it spring break for the E.T.F. crowd — one event stood out, though.

It was an invitation-only party (crabs, cocktails and a D.J. on a moonlit dock) thrown by Jane Street, a secretive E.T.F. trading firm that, after years of minting money in the shadows of Wall Street, is now pitching itself to some of the largest institutional investors in the world.

And the message was clear: Jane Street, which barely existed 15 years ago and now trades more than $1 trillion a year, was ready to take on the big boys.

Much of what Jane Street, which occupies two floors of an office building at the southern tip of Manhattan, does is not known. That is by design, as the firm deploys specialized trading strategies to capture arbitrage profits by buying and selling (using its own capital) large amounts of E.T.F. shares.

It’s a risky business.

As the popularity of E.T.F.s has soared — exchange-traded funds now account for a third of all publicly traded equities — the spreads, or margins, have narrowed substantially, making it harder to profit from the difference.

And in many cases, some of the most popular E.T.F.s track hard-to-trade securities like junk bonds, emerging-market stocks and a variety of derivative products, adding an extra layer of risk.

These dangers were brought home last August, when markets were rattled by China’s decision to devalue its currency; some of the largest E.T.F.s sank by 50 percent or more.

While traders at large investment banks watched their screens in horror, at Jane Street, a bunch of Harvard Ph.D.s, wearing flip-flops, shorts and hoodies, swung into action with a wave of buy orders. By the end of the day, the E.T.F. shares had retraced their sharp falls.

“It’s remarkable what they can do,” said Blair Hull, a founder of an electronic trading firm who relies on Jane Street to make a market for his recently started E.T.F. “If you look at who provides this kind of liquidity these days, it’s fewer and fewer firms.”

It is not only Jane Street, of course. Cantor Fitzgerald, the Knight Capital Group and the Susquehanna International Group have all capitalized on the E.T.F. explosion. And as these firms have grown, so has the demand for a new breed of Wall Street trader — one who can build financial models and write computer code but who also has the guts to spot a market anomaly and bet big with the firm’s capital.

In a word, these are not your suit-and-tie bond and stock traders of yore, riding the commuter train into Manhattan. They are, instead, the pick of the global brain crop.

by Landon Thomas, Jr., NY Times |  Read more:
Image: Cole Wilson


The Triumph of the Hard Right

Everybody told everybody early in this year’s presidential campaign (during what was called Trump Summer) that we had never seen anything so sinisterly or hilariously (take your choice) new. But Trump Summer was supposed to mellow into Sane Autumn, and it failed to—and early winter was no saner. People paid to worry in public tumbled over one another in asking what had gone wrong with our politics. Even the chairman of the Republican National Committee, Reince Priebus, joined the worriers. After Mitt Romney lost in 2012, he set up what he called the Growth and Opportunity Project, to reach those who had not voted Republican—young people, women, Latinos, and African-Americans. But its report, once filed, had no effect on the crowded Republican field of candidates in the 2016 race, who followed Donald Trump’s early lead as he treated women and immigrants as equal-opportunity objects of scorn. Now the public worriers were yearning for the “good old days” when there were such things as moderate Republicans. What happened to them?

The current Republican extremism has been attributed to the rise of Tea Party members or sympathizers. Deadlock in Congress is blamed on Republicans’ fear of being “primaryed” unless they move ever more rightward. Endless and feckless votes to repeal Obamacare were motivated less by any hope of ending the program than by a desire to be on record as opposing it, again and again, to avoid the dreaded label RINO (Republican in Name Only).

E.J. Dionne knows that Republican intransigence was not born yesterday, and he has the credentials for saying it because this dependably intelligent liberal tells us, in his new book, that he began as a young Goldwaterite—like Hillary Clinton (or like me). He knows that his abandoned faith sounded themes that have perdured right down to our day. In the 1950s there were many outlets for right-wing discontent—including H.L. Hunt’s Lifeline, Human Events, The Dan Smoot Report, the Fulton Lewis radio show, Willis Carto’s Liberty Lobby, the Manion Forum. In 1955, William F. Buckley founded National Review to give some order and literary polish to this cacophonous jumble. But his magazine had a small audience at the outset. Its basic message would reach a far wider audience through a widely popular book, The Conscience of a Conservative, ghostwritten for Barry Goldwater by Buckley’s brother-in-law (and his coauthor for McCarthy and His Enemies), L. Brent Bozell.

The idea for the book came from Clarence Manion, the former dean of Notre Dame Law School. He persuaded Goldwater to have Bozell, who had been his speechwriter, put his thoughts together in book form. Then Manion organized his own and other right-wing media to promote and give away thousands of copies of the book. Bozell did his part too—he went to a board meeting of the John Birch Society and persuaded Fred Koch (father of Charles and David Koch) to buy 2,500 copies of Conscience for distribution. The book put Goldwater on the cover of Time three years before he ran for president. A Draft Goldwater Committee was already in existence then (led by William Rusher of National Review, F. Clifton White, and John Ashbrook). Patrick Buchanan spoke for many conservatives when he called The Conscience of a Conservative their “New Testament.”

The Goldwater book, Dionne says, had all the basic elements of the Tea Party movement, fully articulated fifty years before the Koch brothers funded the Tea Party through their organizations Americans for Prosperity and Freedomworks. The book painted government as the enemy of liberty. Goldwater called for the elimination of Social Security, federal aid to schools, federal welfare and farm programs, and the union shop. He claimed that the Supreme Court’s Brown v. Board decision was unconstitutional, so not the “law of the land.” He said we must bypass and defund the UN and improve tactical nuclear weapons for frequent use.

It was widely thought, when the book appeared, that its extreme positions would disqualify Goldwater for the presidency, or even for nomination to that office. Yet in 1964 he became the Republican nominee, and though he lost badly, he wrenched from the Democrats their reliably Solid South, giving Nixon a basis for the Southern Strategy that he rode into the White House in the very next election. The Southern Strategy had been elaborated during Nixon’s campaign by Kevin Phillips, a lawyer in John Mitchell’s firm. The plan did not rely merely on Southern racism, but on a deep conviction that, as Phillips put it in a 1968 interview, all politics comes down to “who hates who.” In that interview, Phillips laid out an elaborate taxonomy of hostilities to be orchestrated by Republicans—another predictor of the Tea Party. Dionne argues, with ample illustration decade by decade, that this right-wing populism would remain a Republican orthodoxy, latent or salient, throughout the time he covers.

Joe Scarborough, in a recent book, The Right Path: From Ike to Reagan, How Republicans Once Mastered Politics—and Can Again, claims that moderate conservatism is the real Republican orthodoxy, interrupted at times by “extremists” like Goldwater or the Tea Party. He suggests Dwight Eisenhower as the best model for Republicans to imitate. Yet Scarborough is also an admirer of Buckley, and his thesis does not explain—as Dionne’s thesis does—why Buckley despised Eisenhower. Eisenhower, as the first Republican elected president after the New Deal era of Roosevelt and Truman, was obliged in Buckley’s eyes to dismantle the New Deal programs, or at least to begin the dismantling. Buckley resembled the people today who think the first task of a Republican president succeeding Obama will be to repeal or take apart the Affordable Care Act.

Eisenhower, instead, adhered to the “Modern Republicanism” expounded by the law professor Arthur Larson, which accepted the New Deal as a part of American life. Eisenhower said, “Should any political party attempt to abolish social security, unemployment insurance, and eliminate labor laws and farm programs, you would not hear of that party again in our political history.” It was to oppose that form of Republicanism that Buckley founded National Review in 1955, with a program statement that declared: “Middle-of-the-Road, qua Middle-of-the-Road is politically, intellectually, and morally repugnant.” (...)

The sense of betrayal by one’s own is a continuing theme in the Republican Party (a Fox News poll in September 2015 found that 62 percent of Republicans feel “betrayed” by their own party’s officeholders). (...)

Both Bush presidents were denounced by the Republican right, the first for raising taxes, the second for expanding Medicare’s pharmaceutical support and expanding the government’s role in education—and the two of them for increasing the size and cost of government. Even the sainted Reagan disappointed the hard right with his arms control efforts, his raising (after cutting) taxes, his failure to shrink the government, and his selling of arms to Iran (though that bitterness has been obscured by the clouds of myth and glory surrounding Reagan).

To be on the right is to feel perpetually betrayed. At a time when the right has commanding control of radio and television talk shows, it still feels persecuted by the “mainstream media.” With all the power of the one percent in control of the nation’s wealth, the right feels its influence is being undermined by the academy, where liberals lurk to brainwash conservative parents’ children (the lament of Buckley’s very first book, God and Man at Yale). Dionne shows how the right punishes its own for “selling out” to any moderate departures from its agenda once a person gets into office.

by Garry Wills, NY Review of Books |  Read more:
Image: Robyn Beck/AFP/Getty Images

How Economists Would Wage the War on Drugs

In April, the world’s governments will meet in New York for a special assembly at the United Nations to discuss how to solve the drug problem. Don’t hold your breath: Since the previous such gathering nearly two decades ago, the narcotics industry has done better than ever. The number of people using cannabis and cocaine has risen by half since 1998, while the number taking heroin and other opiates has tripled. Illegal drugs are now a $300 billion world-wide business, and the diplomats of the U.N. aren’t any closer to finding a way to stamp them out.

This failure has a simple reason: Governments continue to treat the drug problem as a battle to be fought, not a market to be tamed. The cartels that run the narcotics business are monstrous, but they face the same dilemmas as ordinary firms—and have the same weaknesses.

In El Salvador, the leader of one of the country’s two big gangs complained to me about the human-resources problems he faced given the high turnover of his employees. (Ironically, his main sources of recruitment were the very prisons that were supposed to reform young offenders.) In Mexican villages, drug cartels provide basic public services and even build churches—a cynical version of the “corporate social responsibility” that ordinary companies use to clean up their images. Mexico’s Zetas cartel expanded rapidly by co-opting local gangsters and taking a cut of their earnings; it now franchises its brand rather like McDonald’s and faces similar squabbles from franchisees over territorial encroachment. Meanwhile, in richer countries, street-corner dealers are being beaten on price and quality by “dark web” sites, much as ordinary shops are being undercut by Amazon.

Soldiers and police officers have done rather poorly at regulating this complex global business. So what would happen if the war on drugs were waged instead by economists?

Take cocaine, which presents one of the great economic puzzles of narcotics. The war against cocaine rests on a simple idea: If you restrict its supply, you force up its price, and fewer people will buy it. Andean governments have thus deployed their armies to uproot the coca bushes that provide cocaine’s raw ingredient. Each year, they eradicate coca plants covering an area 14 times the size of Manhattan, depriving the cartels of about half their harvest. But despite the slashing and burning, the price of cocaine in the U.S. has hardly budged, bobbing between $150 and $200 per pure gram for most of the past 20 years. How have the cartels done it?

In part, with a tactic that resembles Wal-Mart’s. The world’s biggest retailer has sometimes seemed similarly immune to the laws of supply and demand, keeping prices low regardless of shortages and surpluses. Wal-Mart’s critics say that it can do this in some markets because its vast size makes it a “monopsony,” or a monopoly buyer. Just as a monopolist can dictate prices to its consumers, who have no one else to buy from, a monopsonist can dictate prices to its suppliers, who have no one else to sell to. If a harvest fails, the argument goes, the cost is borne by the farmers, not Wal-Mart or its customers.

In the Andes, where coca farmers tend to sell to a single dominant militia, the same thing seems to be happening. Cross-referencing data on coca-bush eradication with local price information shows that, in regions where eradication has created a coca shortage, farmers don’t increase their prices as one might expect. It isn’t that crop eradication is having no effect; the problem is that its cost is forced onto Andean peasants, not drug cartels or their customers.

Even if the price of coca could be raised, it wouldn’t have much effect on cocaine’s street price. The raw leaf needed to make one kilogram of cocaine powder costs about $400 in Colombia; in the U.S., that kilogram retails for around $150,000, once divided into one-gram portions. So even if governments doubled the price of coca leaf, from $400 to $800, cocaine’s retail price would at most rise from $150,000 to $150,400 per kilogram. The price of a $150 gram would go up by 40 cents—not much of a return on the billions invested in destroying crops. Consider trying to raise the price of art by driving up the cost of paint: It would be futile since the cost of the raw material has so little to do with the final price.
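To make that pass-through arithmetic easy to check, here is a minimal sketch in Python using only the article’s round figures; the simple additive cost model (the extra leaf cost passed straight through an otherwise unchanged supply chain) is my assumption, not the article’s.

# Back-of-the-envelope check of the article's numbers.
# Assumption (not from the article): the higher leaf cost is simply added
# to an otherwise unchanged chain of costs and margins.

LEAF_COST_PER_KG = 400          # dollars of coca leaf per kg of cocaine, in Colombia
RETAIL_PRICE_PER_KG = 150_000   # dollars per kg at U.S. street prices, sold by the gram

def retail_if_leaf_costs(multiplier: float) -> float:
    """Retail price per kg if the leaf cost is multiplied and fully passed on."""
    return RETAIL_PRICE_PER_KG + LEAF_COST_PER_KG * (multiplier - 1)

doubled = retail_if_leaf_costs(2.0)
print(round(doubled))                                    # 150400 dollars per kg
print(round((doubled - RETAIL_PRICE_PER_KG) / 1000, 2))  # 0.4 dollars more per gram

Even a doubling at the farm gate adds about forty cents to a $150 gram, which is the article’s point about why crop eradication barely registers at retail.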

by Tom Wainwright, WSJ |  Read more:
Image: Pedro Pardo

The Shame of Wisconsin

Wisconsin is probably the most beautiful of the midwestern farm states. Its often dramatic terrain, replete with unglaciated driftless areas, borders not just the Mississippi River but two great inland seas whose opposite shores are so far away they cannot be glimpsed standing at water’s edge. The world across the waves looks distant to nonexistent, and the oceanic lakes stretch and disappear into haze and sky, though one can take a ferry out of a town called Manitowoc and in four hours get to Michigan. Amid this somewhat lonely serenity, there are the mythic shipwrecks, blizzards, tornadoes, vagaries of agricultural life, industrial boom and bust, and a burgeoning prison economy; all have contributed to a local temperament of cheerful stoicism.

Nonetheless, a feeling of overlookedness and isolation can be said to persist in America’s dairyland, and the idea that no one is watching can create a sense of invisibility that leads to the secrets and labors that the unseen are prone to: deviance and corruption as well as utopian projects, untested idealism, daydreaming, provincial grandiosity, meekness, flight, far-fetched yard decor, and sexting. Al Capone famously hid out in Wisconsin, even as Robert La Follette’s Progressive Party was getting underway. Arguably, Wisconsin can boast the three greatest American creative geniuses of the twentieth century: Frank Lloyd Wright, Orson Welles, and Georgia O’Keeffe, though all three quickly left, first for Chicago, then for warmer climes. (The state tourism board’s campaign “Escape to Wisconsin” has often been tampered with by bumper sticker vandals who eliminate the preposition.)

More recently, Wisconsin is starting to become known less for its ever-struggling left-wing politics or artistic figures—Thornton Wilder, Laura Ingalls Wilder—than for its ever-wilder murderers. The famous late-nineteenth-century “Wisconsin Death Trip,” by which madness and mayhem established the legend that the place was a frigid frontier where inexplicably gruesome things occurred—perhaps due to mind-wrecking weather—has in recent decades seemingly spawned a cast of killers that includes Ed Gein (the inspiration for Psycho), the serial murderer and cannibal Jeffrey Dahmer, and the two Waukesha girls who in 2014 stabbed a friend of theirs to honor their idol, the Internet animation Slender Man.

The new documentary Making a Murderer, directed and written by Laura Ricciardi and Moira Demos, former film students from New York, is about the case of a Wisconsin man who served eighteen years in prison for sexual assault, after which he was exonerated with DNA evidence. He then became a poster boy for the Innocence Project, had his picture taken with the governor, had a justice commission begun in his name—only to be booked again, this time for murder.

Ricciardi and Demos’s rendition of his story will not help rehabilitate Wisconsin’s reputation for the weird. But it will make heroes of two impressive defense attorneys as well as the filmmakers themselves. A long-form documentary in ten parts, aired on Netflix, the ambitious series looks at social class, community consensus and conformity, the limits of trials by jury, and the agonizing stupidities of a legal system descending on more or less undefended individuals (the poor). The film is immersive and vérité—that is, it appears to unspool somewhat in real and spontaneous time, taking the viewer with the camera in unplanned fashion, discovering things as the filmmakers discover them (an illusion, of course, that editing did not muck up). It is riveting and dogged work.

The film centers on the Avery family of Manitowoc County, home to the aforementioned ferry to Michigan. Even though the lake current has eroded some of the beach, causing the sand to migrate clockwise to the Michigan dunes, and the eastern Wisconsin lakeshore has begun to fill forlornly with weeds, it is still a picturesque section of the state. The local denizens, whether lawyers or farmers, speak with the flat a’s, throatily hooted o’s, and incorrect past participles (“had went”) of the region. There is a bit of Norway and Canada in the accent, which is especially strong in Wisconsin’s rural areas and only sometimes changes with education.

The Avery family are the proprietors of Avery’s Auto Salvage, and their property—a junkyard—on the eponymous Avery Road is vast and filled with over a thousand wrecked automobiles. It is a business not unlike farming in that in winter everything is buried in snow and unharvestable. The grandparents, two children, and some grandchildren live—or used to—on an abutting compound that consists of a small house, a trailer, a garage, a car crusher, a barn, a vegetable garden, and a fire pit.

In 1985 Steven Avery, the twenty-three-year-old son of Dolores and Allan Avery, was arrested and convicted of a sexual assault he did not commit. There was no forensic technology for DNA testing in 1985, and he had the misfortune to look much like the actual rapist—blond and young—and the traumatized victim, influenced by the county investigators who had the whole Avery family on their radar, identified him in a line-up as her attacker. Despite having sixteen alibi witnesses, he was found guilty. The actual rapist was allowed to roam free.

After the Wisconsin Innocence Project took on his case, Avery was finally exonerated in 2003. DNA tests showed he was not guilty and that the real attacker was now serving time for another rape. Avery then hired lawyers and sued Manitowoc County and the state of Wisconsin for wrongful imprisonment and for denying his 1995 appeal (a time during which DNA evidence might have set him free), which the state had mishandled, causing him to serve eight more years.

Days after Avery’s release, Manitowoc law enforcement was feeling vulnerable about the 1995 appeal and writing memos, redocumenting the case from eight years earlier. The civil suit was making headway, and only the settlement amount remained to be determined; it was going to be large and would come out of Manitowoc County’s own budget, since the insurance company had denied the county coverage on the claim.

Then, in November 2005, just as crucial depositions were both scheduled and proceeding and Avery stood to receive his money, he was suddenly and sensationally arrested for the murder of a freelance photographer named Teresa Halbach, who had come to Avery’s Auto Salvage on Halloween to photograph a truck for an auto magazine, and whose SUV had been found on the Avery property, as eventually were her scattered and charred remains. Avery had two quasi alibis—his fiancée, to whom he’d spoken at length on the phone the afternoon of Halbach’s disappearance, and his sixteen-year-old nephew, Brendan Dassey, who had just come home from school.

No one but Steven Avery ever came under suspicion, and county investigators circled in strategically. After getting nowhere with the fiancée, they focused on the nephew, who was gentle, learning-disabled, and in the tenth grade; they illegally interrogated him and suggested he was an accomplice. They took a defense witness and turned him into one for the prosecution.

Brendan was then charged with the same crimes as Avery: kidnapping, homicide, mutilation of a corpse. Prodded and bewildered, Brendan had made up a gruesome story about stabbing Halbach and slitting her throat in Avery’s trailer (the victim’s blood and DNA were never found on the premises), a fictional scenario that came, he later said, from the James Patterson novel Kiss the Girls. When asked why he’d said the things he said, he told his mother it was how he always got through school, by guessing what adults wanted him to say, then saying it. In an especially heartbreaking moment during the videotaped interrogation included in the documentary, and after he has given his questioners the brutal murder tale they themselves have prompted and helped tell, Brendan asks them how much longer this is going to take, since he has “a project due sixth hour.”

It is a crazy story. And the film’s double-edged title pays tribute to its ambiguity. Either Steven Avery was framed in a vendetta by Manitowoc County or the years of angry prison time turned him into the killer he had not been before. But the title aside, the documentary is pretty unambiguous in its siding with Avery and his appealing defense team, Jerry Buting and Dean Strang, who are hired with his settlement money as well as money his parents, Dolores and Allan, put up from the family business.

One cannot watch this film without thinking of the adage that law is to justice what medicine is to immortality. The path of each is a little crooked and always winds up wide of the mark.

by Lorrie Moore, NY Review of Books |  Read more:
Image: Netflix