Sunday, August 19, 2018

When You’re Hot, You’re Hot

Career Successes Come in Clusters

In 1905, Albert Einstein was on a roll. Between March and June of that year—which scientists refer to as his annus mirabilis, or “miracle year”—the legendary physicist finished three different papers that would radically change how we think about space and time, and pave the way for quantum physics.

“And that’s not even getting into the summer,” marvels Dashun Wang, an associate professor of management and organizations at Kellogg.

To Wang, Einstein’s big year posed a puzzle. In a previous study looking at the careers of more than 10,000 scientists, Wang had found that the timing of a researcher’s most influential paper was completely random—it was equally likely to come at any given year in their career, be it the beginning, middle, or end. But the concept of a “miracle year” seemed to challenge that conclusion.

“What are the odds, if everything is random?” Wang wondered.

In a new paper, Wang investigates whether “hot-streak” periods like Einstein’s are more than just a lucky coincidence—in science, and in other fields as well. He teamed up with visiting student Lu Liu and Kellogg post-doctoral student Yang Wang, as well as Chaoming Song of the University of Miami, Roberta Sinatra of Central European University, and Lee Giles of Pennsylvania State University.

Looking at the career histories of thousands of scientists, artists, and film directors, the team found evidence that hot streaks are both real and ubiquitous, with virtually everyone experiencing one at some point in their career. While the timing of an individual’s greatest successes is indeed random, their top hits are highly likely to appear in close proximity.

And while more research is needed to determine what causes these bursts of genius, Wang says that the findings shed important new light on the patterns underlying success in all fields, and could be used to improve decisions about tenure, promotions, and hiring.

“If we know where your best work is, then we know very well where your second-best work is, and your third,” he says, “because they’re just around the corner.”

Pinpointing Hot Streaks

Wang nicknamed his earlier paper on the timing of scientific successes “the hope project.” By demonstrating that a scientist’s most-cited paper was equally likely to appear in any given year, his findings refuted the conventional wisdom that researchers in their 50s or 60s were past their prime. The only reason that early-career hits were more common, explains Wang, is because younger scientists tend to be more prolific. “So if you keep producing, maybe your biggest work is yet to come,” he says—hence, plenty of hope.

But what happens after that big hit? Wang could imagine two very different scenarios.

If the timing of every paper a scientist wrote was truly random, then “regression towards the mean” should set in—meaning their next paper was more likely to be average than spectacular.

On the other hand, there were logical reasons to think that one strong paper might beget another. “If I produced a good work, I feel like I learned the trick,” says Wang. “Now I feel like I’m equipped to do another work that’s just as good or even better.”

Working with Liu, a PhD student at Kellogg, Wang began to design an experiment. The idea: “Let’s not just focus on the biggest hit, but let’s also look at the second-biggest hit, and the third-biggest hit,” he explains. Their goal was to determine whether there was some kind of statistically meaningful pattern in the timing of one’s greatest achievements.

Using Google Scholar and Web of Science, Wang and Liu obtained data on research papers published by more than 20,000 scientists, which included the number of citations each paper received in its first ten years.
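
To see what such a test looks like in practice, here is a minimal sketch (not code from the study; the toy citation counts and the three-hit window are illustrative assumptions) of a permutation test asking whether a career’s top hits sit closer together in the publication sequence than random chance would predict:

```python
import random

def streak_span(impacts, top_n=3):
    """Distance, in publication order, between the first and last of the
    top_n highest-impact works in a career."""
    order = sorted(range(len(impacts)), key=lambda i: impacts[i], reverse=True)
    top_positions = sorted(order[:top_n])
    return top_positions[-1] - top_positions[0]

def clustering_p_value(impacts, top_n=3, trials=10_000, seed=0):
    """Permutation test: how often does shuffling the same body of work
    produce top hits at least as tightly clustered as those observed?"""
    rng = random.Random(seed)
    observed = streak_span(impacts, top_n)
    shuffled = list(impacts)
    at_least_as_tight = 0
    for _ in range(trials):
        rng.shuffle(shuffled)
        if streak_span(shuffled, top_n) <= observed:
            at_least_as_tight += 1
    return at_least_as_tight / trials

# Hypothetical career: 20 papers with 10-year citation counts and a mid-career burst.
career = [12, 8, 15, 9, 11, 10, 14, 95, 120, 88, 13, 9, 7, 11, 10, 8, 12, 9, 10, 11]
print(clustering_p_value(career))  # a small value means the top hits cluster tightly
```

A small p-value for a given career means its biggest hits land closer together than a random ordering of the same works would explain, which is the pattern the team reports finding across scientists, artists, and directors.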

Then they decided to expand the analysis beyond academia. “We realized, ‘Gee, if this happens, it might be in all different kinds of careers,’” Wang says.

That idea was inspired by Alejandro González Iñárritu, the director of Birdman and The Revenant, explains Wang. “He won two Oscars back-to-back for best director. And that’s where I realized, this is not just scientists. This story’s much, much bigger.”

So they collected data on the careers of artists and film directors. To gauge success, they obtained the auction prices of artworks produced by 3,480 painters, sculptors, and other artists. And they combed the film database IMDb for the career histories of 6,233 directors, using IMDb ratings to approximate each film’s success.

by Jake J. Smith, Kellogg Insight | Read more:
Image: Michael Meier

It’s So Freaking Hot. Now What?

cwick (Chadwick Matlin, features editor): 🎵 Summertime, and the livin’ is really freaking hot. People are jumping into the sea in Greece to avoid wildfires. The temps are higher than they’ve ever been in Japan. California is dealing with the biggest wildfires it’s ever seen. The heat even has Andean flamingos laying eggs for the first time in 15 years.

The consequences of climate change are growing more undeniable than ever. Which leads me to wonder: What now?

Christie and Maggie, thanks for joining the Slack chat to answer that totally simple question.

christie (Christie Aschwanden, lead science writer): “simple” question 😂

cwick: Editors pride themselves on asking giant questions and demanding simplified answers.

maggiekb (Maggie Koerth-Baker, senior science writer): Glad to be here, Chrad.

Chad

cwick: Off to a great start, Meggie.

maggiekb: I pride myself on my typing skills.

cwick: I want us to try tackling the question in a few different ways:
  1. By talking about the politics of climate change, of course. 
  2. By discussing whether the dire reality of climate change means that scientists’ roles in public discourse ought to change going forward. 
  3. And by answering the question of what comes next: Are all these ecological changes the new normal or just a waypoint on an even more dangerous trajectory? 

christie: This is a lot to chew on. I didn’t realize we were going to be here all day!

cwick: So let’s start with politics — much was made that President Trump pulled the U.S. out of the Paris climate accord. But is there any line to draw between the hot summer we’re having and Trump’s decision?

christie: Well, there’s a line to draw, but it’s not between this hot summer and Trump’s decision. The heat of this summer was set with the emissions we’d already spewed into the atmosphere.

The goal of the Paris climate agreement was to keep the overall average temperature of the planet from rising more than 2 degrees Celsius, and some estimates suggest that we’ve already put out enough greenhouse gases to exceed 1.5 degrees. To keep our emissions within the 2-degree “carbon budget” will require countries to leave 80 percent of coal, 50 percent of gas and 33 percent of oil untouched until at least 2050.

maggiekb: There’s a gap between immediate news and climate consequences. Arguably, one of the big problems with the politics of climate change is that the results and the risks play out on different time scales than the politics.

cwick: Maggie, I think that’s a sharp insight about why political action has been so hard to come by on climate change. (Corporate interests have also played a role, of course.) What’s a politician to do about that dynamic?

maggiekb: Oh, corporate interests have DEFINITELY played a role. But what I think is particularly interesting is how they played a role. One of the things I’ve written about in regard to the Paris agreement is that, 30 or 35 years ago, it probably would have been a bipartisan, no-debate sort of thing. Starting around 1990, environmental legislation became WAY more divisive.

christie: And I would argue that one reason that happened is there were real efforts underway to do something about it, the 1992 Rio Earth Summit, for instance. But then the corporate interests swooped in…

maggiekb: What made the corporate interests successful is that they were able to convince enough conservatives that climate science was a backdoor to oppressive, statist, globalist government.

cwick: Was it just conservatives? Obviously they’ve been more vocal about climate denialism, but Democrats are just as susceptible to corporate influence.

maggiekb: My point here isn’t that conservatives are bad, bad, bad. My point is that we keep having these debates about climate science … while ignoring that what the debate is actually about is political philosophy.

christie: What happened was that climate change became an identity issue. As Dan Kahan at Yale has documented, “What people ‘believe’ about global warming doesn’t reflect what they know; it expresses who they are.”

maggiekb: You can’t show people enough charts to make them believe climate change is a real threat if they feel like accepting what they see in the charts is going to hurt them and their family. And I think that’s the fundamental political problem here. What the corporations did right (for their purposes, not for the planet) is to turn the science (that you can’t argue about) into a proxy for political philosophy (which you can).

christie: And they used the “sound science” strategy to enlist science to muddle the debate. Naomi Oreskes and Erik M. Conway document this in their book, “Merchants of Doubt.”

maggiekb: And so scientists and politicians who care about climate change spend all their effort now trying to explain uncertainty spreads and the greenhouse effect when the conversation that we actually should be having is, “OK, how can we tackle this problem in a way that is philosophically acceptable to the most people?”

cwick: So given all that, what are climate-minded voters in the U.S. to do right now? Should they be focusing on state-level legislation instead of federal?

christie: Chad, a bunch of governors have banded together to take action. What’s happening now is a bottom-up kind of approach that’s happening on local and state levels. Washington Gov. Jay Inslee told Yale Environment 360, “We heard the president wanted to run up the white flag of surrender. We wanted to send a strong message to the world: We’re not going to surrender.” (...)

christie: Everyone has to pitch in, so it’s not at all insignificant that cities are doing that.

Also, when a city or state adopts climate-friendly technology and regulations, that can have a ripple effect. For instance, California’s market is big, so when it makes stricter rules for fuel efficiency, manufacturers are nudged into producing the necessary products to meet them.

When a city looks to improve energy efficiency in its buildings, it creates a market for those products too.

maggiekb: That reminds me of this article, which shows how the growing size of American homes (and the growing number of electrical appliances in them) all but canceled out gains in home energy efficiency.

christie: Oh, yes, the beer fridge problem! You get a new, energy-efficient fridge and then keep the old one in the garage. Or, what happened to me was that I got a very fuel-efficient car, and I felt less guilty driving so I did more of it.

by Chadwick Matlin, Christie Aschwanden, Maggie Koerth-Baker |  FiveThirtyEight |  Read more:
Image: Paul Zinken/Getty
[ed. This seems to be an important point: how substantive issues of fact become matters of political philosophy (and how much corporations, lobbyists, media, the religious industrial complex, etc. exploit that dynamic). See also: U.S. Catholics 'sickened' by sex abuse report, stand by their faith.]

Saturday, August 18, 2018

Hail to the Chief

Soon, according to a June report in The Washington Post, the moment of truth will arrive. Robert Mueller, the special counsel investigating the president, his administration, and his campaign, will deliver his verdict on whether Donald Trump obstructed justice.

On the larger and more complicated question of his campaign’s possible collusion with Russia, Mueller may take longer to issue a second report. But it is widely expected in Washington—which has been wrong about such matters before—that a first report, on obstruction, will drop before Labor Day. Assuming it happens, it will follow shortly after Mueller’s July 13 indictment of twelve Russian military intelligence officers. Those indictments have to do with the larger collusion story, and they suggest that more indictments might well be on the way. Even as Trump gave Putin the benefit of the doubt in Helsinki, a Russian woman, Maria Butina, was charged with trying to illegally influence the 2016 election.

It seems inconceivable that Mueller will absolve the president in that first report. Trump has obstructed justice right in front of our noses, and more than once, either because he doesn’t know what obstruction of justice is or because he knows and doesn’t care. The most notable instance was his interview with Lester Holt of NBC in May 2017, right after he fired FBI Director James Comey. Deputy Attorney General Rod Rosenstein had prepared a letter laying out the president’s reasons for the dismissal. The reasons included, rather laughably, the charge that Comey was unfair to Hillary Clinton in his handling of the probe of her State Department e-mails. Holt asked Trump about the reasons stated in the letter, and eventually Trump acknowledged that they hadn’t a thing to do with it:
I was going to fire Comey knowing there was no good time to do it. And in fact when I decided to just do it, I said to myself, I said, you know, this Russia thing with Trump and Russia is a made-up story.
That is obviously Trump saying, as directly as Trump can say anything, that he fired Comey because of the FBI’s investigation into his campaign’s possible Russia ties. But it’s hardly the only example we know of. Two months before that, in March 2017, he’d berated Attorney General Jeff Sessions in a meeting about Sessions’s earlier decision to recuse himself from the Russia probe and urged him to reverse course. He also made requests to both Director of National Intelligence Dan Coats and National Security Agency director Michael Rogers to issue statements proclaiming that there was no collusion (both refused). There is more along these lines. Arguably every single tweet the president writes about the investigation, attacking Mueller’s “13 Angry Democrats” and denouncing it as an invariably upper-cased Witch Hunt, is an attempt to obstruct justice; if you don’t think so, get yourself placed under federal investigation and try mimicking Trump’s Twitter habits and see what happens to you.

All of this doesn’t begin to detail what Mueller and his team have learned from interviews about what took place in private. It’s a reasonable bet, then, that Mueller will find that Trump and others around him—former press aide Hope Hicks, possibly his son Donald Jr., maybe Jared Kushner, other campaign associates and hangers-on—have lied or tried to quash or in some way compromise the investigation.

If that happens, what comes next? Three days before Trump’s inauguration, the neoconservative Bush administration official Eliot A. Cohen wrote that “this will be a slogging match until the end.” He felt confident, however, that “the institutions will contain him and the laws will restrain him if enough people care about both, and do not yield to fear of him and whatever leverage he tries to exert from his mighty office.”

Of those forty-five words of Cohen’s, the most important is “if.” When Cohen wrote his piece, there may have been reason for optimists to hope that the Republicans who control Congress and the conservative jurists who constitute the majority on the Supreme Court, as well as rank-and-file Republicans, would tire of this vulgar burlesque and would find ways to check Trump, to communicate to him that even a president can’t just do whatever he wants.

But what has actually happened over the last year and a half has been the opposite. Two Republican legislators who have criticized him in a way that bared any teeth, Senators Jeff Flake and Bob Corker, are giving up the fight and retiring, while much of the congressional GOP is instead laying the groundwork for an all-out assault on Mueller when a report hits. The Supreme Court, which will presumably soon have two Trump appointees, is far more political and less independent than the Supreme Court that in 1974 ordered Richard Nixon to hand over his tapes. Trump’s base, as long as he is deporting asylum-seekers and inveighing against knee-taking football players and fake news journalists, grows more and more besotted. And undergirding it all is the Fox News Channel, now a pure propaganda network, from which Republicans take their cues and get their talking points. What will they do when Mueller’s first allegations appear?

It’s worth stepping back here to review quickly the steps by which the Republican Party became this stewpot of sycophants, courtesans, and obscurantists. It’s easy to forget these things, but it’s not as if Trump announced his candidacy in mid-2015 and all this self-abasement suddenly happened. In a May 2015 Washington Post–ABC poll, his favorable-to-unfavorable numbers among Republicans were 23 to 65 percent. Then he announced his candidacy in mid-June, warning us about those Mexican rapists. By mid-July, another Washington Post–ABC News poll gave Trump a 57 percent favorable rating among Republicans, with 40 percent seeing him unfavorably—a big improvement, but still far from Dear Leader territory.

That August brought the first Republican debate, at which Megyn Kelly confronted Trump over his “disparaging comments about women’s looks.” The day after that debate, Trump said that Kelly had “blood coming out of her eyes, blood coming out of her wherever.” The war that resulted between Trump and Fox News foreshadowed his subsequent takeover of the Republican Party as a whole.

Trump had known Rupert Murdoch, Roger Ailes, Bill O’Reilly, and Sean Hannity for years, and occasionally appeared on Fox to natter on about Barack Obama’s birth certificate. Now, however, he grandly announced a boycott of the network and put out a flurry of tweets like this one, which reads strangely (except for the grammatical error) in light of all we know today: “.@oreillyfactor was very negative to me in refusing to to [sic] post the great polls that came out today including NBC. @FoxNews not good for me!” Who knows the extent to which this was all show. Murdoch and Ailes no doubt felt that they had to at least appear to be defending Kelly, their top female star at the time, who has since decamped to NBC (this was months before Ailes was exposed as a serial sexual predator).

It now seems as if what we were witnessing then was really a cautious waltz of alpha-male lions loosed upon an unfamiliar savannah, fighting to determine which one would lead the pride. And Trump clearly won. I’m not sure this qualifies as something for which he deserves credit, but it’s a fact that Trump is the only Republican politician I can think of since the network has been on the air (1996) to take it on and bend it to his will rather than the other way around.

As Trump began piling up primary victories, Republicans started coming around. Some stopped short of endorsing him, but they found ways to signal that they would do nothing to stop him. In late April 2016, Tennessee’s Bob Corker announced his support for Trump. The day before, Trump had given a foreign policy address that Corker praised as “challenging the foreign policy establishment that has been here for so long.” That June, when Trump delivered a racist tirade against the judge (of Mexican heritage) who was presiding over the Trump University case, Senator Lindsey Graham said, “There’ll come a time when the love of country will trump hatred of Hillary.” But for most Republicans—very much including Graham himself, who just three months into Trump’s term announced himself “the happiest dude in America right now” over the administration’s anti-Iran saber-rattling—that time never came.

The release of the Access Hollywood tape in early October 2016 provided another look-in-the-mirror moment for Republicans. More than forty elected Republicans did back away from Trump at that point—a significant number, no doubt, but still a small minority. Big donors like Robert and Rebekah Mercer announced they were sticking with him. The Never-Trumpers, which at the time included those forty, along with a number of conservative writers and intellectuals and conservative TV pundits, stood their ground, but they were overwhelmed and warned by their constituents that they had better fall into line: Trump, Rudy Giuliani, Roger Stone, Julian Assange, and Fox News were now fully in charge of the Republican Party.

None of this was inevitable. I used to argue, in these pages and elsewhere, that the Republicans could have stopped Trump, and I still believe it. Doing so would have required three elements: a bit of leadership from Reince Priebus, then the party chairman and later the easily steamrolled White House chief of staff; an agreement (this was the hard part) among the other major presidential candidates to check their egos and coalesce behind one of them; and a commitment by a few major donors to support that candidate.

But they didn’t do this, and no one stood up to Trump. His only forceful critic was Mitt Romney, who called him “a phony, a fraud” in a scathing speech; but he delivered that speech in March 2016, two days after Trump had swept the Super Tuesday voting, i.e., after he was already well on his way to the nomination. The time for that speech was before the Iowa caucuses. Today, Romney, running for the Senate in Utah, cheerily predicts that Trump will “be reelected solidly.” This is at least the fourth political incarnation of Romney, from the moderate who gave Massachusetts a health care plan in the early 2000s to the “severe” conservative who ran for president in 2012 to the anti-Trump spokesman of two years ago to the capitulator of today.

This is the remarkable thing we have witnessed: the Republican Party has essentially ceased to be a political party in our normal understanding of the term and has instead become an instrument of one man’s will. Fifty years ago, the GOP was an amalgam of different factions that often disagreed among themselves—New England liberals, the heirs of the “Free Soil” moderates, prairie conservatives, Wall Street money people. Then in 1980, the new “movement conservatives” gained the upper hand. Incrementally, they took over. Incrementally, they moved ever more rightward, egged on by the new right-wing media.

All that was bad enough for the country—it led us to a war waged under false pretenses against an “enemy” that hadn’t attacked us and a campaign to dismantle a social compact carved out over the course of a century. But at least through all those phases, the Republican Party remained committed to the basic idea of democratic allocation of power. Since the Civil War, Democrats and Republicans have fought sometimes fiercely over their ideological goals, but they always respected the idea of limits on their power.

No one had come along to suggest that power should be unlimited. But now someone has, and we have learned something very interesting, and alarming, about these “conservatives,” both the rank and file and holders of high office: their overwhelming commitment is not to democratic allocation of power, but to their ideological goals—the annihilation of liberalism, the restoration of a white ethno-nationalist hegemony. They know better than to speak of such things openly, but every once in a while they have allowed a piece of the cat’s anatomy to slip out of the bag, a tail here, a hind leg there. In June 2016, for example, Senate Majority Leader Mitch McConnell said:
For all of his obvious shortcomings, Donald Trump is certainly a different direction, and I think if he is in the White House he’ll have to respond to the right-of-center world which elected him, and the things that we believe in. So I’m comfortable supporting him.
In other words, to McConnell, that “right-of-center world” predated Trump, and on most important questions—taxes, deregulation, cultural issues, and the judges who have the power to nullify so many liberal achievements—Trump would do just what McConnell wanted a Republican president to do.

It has often been written, and I’ve written it myself, that the Republicans have been weak in the face of Trumpism. But I’ve come to think that’s wrong. They’re not weak at all. Most of them are perfectly happy to have become Trump’s vassals. They were waiting for just such a man.

Trump’s popularity among Republicans now stands at close to 90 percent. This is a fairly recent development—since the early part of this year. No doubt it is a function in part of certain accomplishments, notably the tax cut and the reshaping of the courts. But I think it’s tied most directly to the increasing awareness of what a serious threat Mueller poses to the president. Hence the ferocious pushback, orchestrated by Fox. Most nights, if I’m watching Rachel Maddow at 9 PM on MSNBC, I’ll flip over for a few moments to watch Hannity on Fox. If you don’t do this, I recommend that you do. It’s like being transported to a parallel universe. Hours continue to be devoted to why Hillary belongs in jail. The Mueller probe is discussed only for the purpose of telling viewers how corrupt it is.

by Michael Tomasky, NYRB | Read more:
Image: Siegfried Woldhek
[ed. I hesitate to pollute this blog with anything Trump-related but sometimes you just can't avoid it. At least this article lays blame where it's deserved: establishment Republicans and media outlets who debase themselves by continuing to support him. See also: We Know Trump Is Guilty. We’re Having a Hard Time Admitting It.]

The Blaze



[ed. Quite a good video. I wonder what the backstory is?]

Friday, August 17, 2018

How TripAdvisor Changed Travel

Should one be so unlucky as to find oneself, as I did, lying awake in bed in the early hours of the morning in a hostel in La Paz, Bolivia, listening anxiously to the sound of someone trying to force their way into one’s room, one could do worse than to throw a chair under the doorknob as a first line of defence. But this is not what I did. Instead, I held my breath and waited until the intruder, ever so mercifully, abandoned his project and sauntered down the hall. The next morning, when I raised the incident with the hostel employee at the front desk, he said the attempted intrusion had just been an innocent mistake, a misdirected early-morning wake-up call gone wrong, and what was the big deal, anyway? Fuming, I turned to the highest authority in the world of international travel, the only entity to which every hotel, restaurant, museum and attraction in the world is beholden: I left the hostel a bad review on TripAdvisor.

TripAdvisor is where we go to praise, criticise and purchase our way through the inhabited world. It is, at its core, a guestbook, a place where people record the highs and lows of their holiday experiences for the benefit of hotel proprietors and future guests. But this guestbook lives on the internet, where its contributors continue swapping advice, memories and complaints about their journeys long after their vacations have come to an end.

Every month, 456 million people – about one in every 16 people on earth – visit some tentacle of TripAdvisor.com to plan or assess a trip. For virtually every place, there exists a corresponding page. The Rajneeshee Osho International Meditation Resort in Pune, India, has 140 reviews and a 4 out of 5 rating, Cobham Service Station on the M25 has 451 reviews and a rating of 3.5, while Wes Anderson’s fictional Grand Budapest Hotel currently has 358 reviews and a rating of 4.5. (At the top of the page, there is a message from TripAdvisor: “This is a fictional place, as seen in the movie The Grand Budapest Hotel. Please do not try to book a visit here.”)

Over its two decades in business, TripAdvisor has turned an initial investment of $3m into a $7bn business by figuring out how to provide a service that no other tech company has quite mastered: constantly updated information about every imaginable element of travel, courtesy of an ever-growing army of contributors who provide their services for free. Browsing through TripAdvisor’s 660m reviews is a study in extremes. As a kind of mirror of the world and all its wonders, the site can transport you to the most spectacular landmarks, the finest restaurants, the most “adrenaline-pumping” water parks, the greatest “Hop-On Hop-Off Experiences” that mankind has ever devised. Yet TripAdvisor reviews are also a ruthless audit of the earth’s many flaws. For every effusive review of the Eiffel Tower (“Worth the hype at night,” “Perfect Backdrop!”), there is another that suggests it is a blight on the face of the earth (“sad, ugly, don’t bother”; “similar to the lobby of a big Vegas casino, but outside”.)

TripAdvisor is to travel as Google is to search, as Amazon is to books, as Uber is to cabs – so dominant that it is almost a monopoly. Bad reviews can be devastating for business, so proprietors tend to think of them in rather violent terms. “It is the marketing/PR equivalent of a drive-by shooting,” Edward Terry, the owner of a Lebanese restaurant in Weybridge, UK, wrote in 2015. Marketers call a cascade of online one-star ratings a “review bomb”. Likewise, positive reviews can transform an establishment’s fortunes. Researchers studying Yelp, one of TripAdvisor’s main competitors, found that a one-star increase meant a 5-9% increase in revenue. Before TripAdvisor, the customer was only nominally king. After, he became a veritable tyrant, with the power to make or break lives. In response, the hospitality industry has lawyered up, and it is not uncommon for businesses to threaten to sue customers who post negative reviews.

As the so-called “reputation economy” has grown, so too has a shadow industry of fake reviews, which can be bought, sold and traded online. For TripAdvisor, this trend amounts to an existential threat. Its business depends on having real consumers post real reviews. Without that, says Dina Mayzlin, a professor of marketing at the University of Southern California, “the whole thing falls apart”. And there have been moments, over the past several years, when it looked like things were falling apart. One of the most dangerous things about the rise of fake reviews is that they have also endangered genuine ones – as companies like TripAdvisor raced to eliminate fraudulent posts from their sites, they ended up taking down some truthful ones, too. And given that user reviews can go beyond complaints about bad service and peeling wallpaper, to much more serious claims about fraud, theft and sexual assault, their removal becomes a grave problem.

Thus, in promising a faithful portrait of the world, TripAdvisor has, like other tech giants, found itself in the unhappy position of becoming an arbiter of truth, of having to determine which reviews are real and which are fake, which are accurate and which are not, and how free the speech on their platform should be. It is hard to imagine that when CEO Stephen Kaufer and his co-founders were sitting in a pizza restaurant in a suburb of Boston 18 years ago dreaming up tripadvisor.com, they foresaw their business growing so powerful and so large that they would find themselves tangled up in the kinds of problems that vex the minds of the world’s most brilliant philosophers and legal theorists. From the vantage point of 2018, one of the company’s early mottos now seems comically naive: “Get the truth and go.”

Many of the difficult questions the company faces are also questions about the nature of travel itself, about what it means to enter unknown territory, to interact with strangers, and to put one’s trust in them. These are all things that one also does online – it is no coincidence that some of the earliest analogies we once used to talk about the digital world (“information superhighway”, “electronic frontier”) tended to belong to the vocabulary of travel. In this sense, the story of TripAdvisor, one of the least-examined and most relied-upon tech companies in the world, is something like a parable of the internet writ large.
***
The travel guide is an ancient genre, one that has never been far removed from the questions that trouble TripAdvisor LLC. For nearly all of human history, people have wanted to know everything about where they were going before they got there. The Greek geographer Pausanias is often credited with authoring the first travel guide, his Description of Greece, sometime in the second century AD. Over 10 books, he documented the sights and stories of his native land. Of Lake Stymphalia, in Corinth, for example, Pausanias writes: “In the Stymphalian territory is a spring, from which the emperor Hadrian brought water to Corinth … at one time man-eating birds bred on it, which Heracles is said to have shot down.” Today, on TripAdvisor, Lake Stymphalia gets a meagre rating of 3.5, below the average of 4: “It is more like a swampy marshland … there isn’t really anywhere to chill out and relax so we didn’t stay long,” writes one reviewer. Beneath this review, and beneath all TripAdvisor reviews, is a disclaimer: “This review is the subjective opinion of a TripAdvisor member and not of TripAdvisor LLC.”

When TripAdvisor was founded, in 2000 – six years after Amazon, four years before Facebook and Yelp – consumer reviews were still thought of as a risky endeavour for businesses, a losing bet. Amazon first allowed customers to post reviews in 1995, but it was a controversial move that some critics derided as retail suicide. When TripAdvisor launched, it did so as a simple aggregator of guidebook reviews and other established sources, keeping its distance from the unpredictable world of crowd-sourced content.

Kaufer envisaged TripAdvisor as an impartial referee, providing “reviews you can trust”, as one of its former taglines promised. But as an experiment, in February 2001, he and his partners created a way for consumers to post their own reviews. The first-ever review was of the Captain’s House Inn, on Cape Cod, which received four “bubbles”. (TripAdvisor uses “bubbles” rather than stars to evaluate companies to avoid confusing its ratings with more conventional luxury hotel ratings.)

Soon, Kaufer noticed that users were gravitating away from expert opinion and towards the crowdsourced reviews, so he abandoned his original concept and began focusing exclusively on collecting original consumer input. He hoped that selling ads on the site would be enough to keep the company afloat, but when it became clear that this wasn’t bringing in enough money, his team shifted to a new model. From late 2001, every time a visitor clicked on a link to a given hotel or restaurant, TripAdvisor would charge the business a small fee for the referral. Within three months, the company was making $70,000 a month, and in March 2002, it broke even. “I think they call it a pivot now,” Kaufer said in 2014. “I called it running for my life back then.”

By 2004, TripAdvisor had 5 million unique monthly visitors. That year, Kaufer sold TripAdvisor to InterActiveCorp (IAC), the parent company of the online travel company Expedia, for $210m in cash, but stayed on as CEO. It seemed like a good deal at the time – as Kaufer told Harvard Business School’s student newspaper in 2013, none of the founders were previously wealthy, so the windfall was a “life-changing event”. But he eventually regretted selling out so early on: “In hindsight, this was the stupidest move I ever made!”

For the next few years, TripAdvisor continued to grow, hiring more than 400 new employees around the world, from New Jersey to New Delhi. By 2008, it had 26 million monthly unique visitors and a yearly profit of $129m; by 2010, it was the largest travel site in the world. To cement its dominance, TripAdvisor began buying up smaller companies that focused on particular elements of travel. Today, it owns 28 separate companies that together encompass every imaginable element of the travel experience – not just where to stay and what to do, but also what to bring, how to get there, when to go, and whom you might meet along the way. Faced with such competition, traditional guidebook companies have struggled to keep up. In 2016, Fodor’s, one of the most established American travel guide companies, was bought by a company called Internet Brands.

Over time, hoteliers largely accepted that TripAdvisor wasn’t going away, even as they watched it turn their industry upside down. “The online world has changed pretty much every industry, but hospitality beyond recognition,” Peter Ducker, chief executive of the Institute of Hospitality, told me. “For a long time when [TripAdvisor] first came out, hoteliers didn’t like it. We didn’t want to air our dirty laundry in public,” he said. Now, though, “hotels have learned that a) it’s not going away, so get over it, and b) you can use it to your advantage … They use good TripAdvisor ratings in their marketing materials, because to a lot of the public, that means more than a star rating, more than a government accreditation. It transcends borders.”
***
By 2011, TripAdvisor was drawing 50 million monthly visitors, and its parent company, IAC, decided that the time had come to spin it out as a separate, publicly traded entity. Its IPO was valued at $4bn, but in December, on the first day of trading, shares fell. TripAdvisor was in new and uncertain territory, and no one knew how the company would fare on its own.

TripAdvisor had become a tech giant, but its leadership did not quite realise that yet. The year it went public was the final year that TripAdvisor published its annual lists of the “Top 10 Dirtiest Hotels” in the US and Europe. A couple of months before the IPO, Kenneth Seaton, owner of what had been voted “America’s dirtiest hotel” (the Grand Resort Hotel & Convention Center, in Pigeon Forge, Tennessee), filed a lawsuit against TripAdvisor for defamation, claiming $10m in damages. The suit was tossed out in 2012, after the judge ruled that any review posted to TripAdvisor is an opinion and therefore protected under the first amendment. Seaton appealed, but the original verdict was upheld on the grounds that the use of the word “dirtiest” could not count as defamation as it was no more than “rhetorical hyperbole”. TripAdvisor won the legal battle, but it still decided to scrub the “dirtiest” list from its site. “We want to stay more on the positive side,” Kaufer told the New York Times.

In 2012, the media behemoth Liberty Interactive purchased $300m in TripAdvisor shares. TripAdvisor had become an established giant of the travel industry, an inevitable part of even the most cursory vacation planning. As the company sought to clean up its public profile, its audience grew, but so did the pressure to turn a profit. “When [platforms] start to commercialise, it changes the DNA,” says Rachel Botsman, a lecturer at Oxford University’s Saïd Business School who has chronicled the rise of the reputation economy. “When that happens, it’s a problem.” Many of the website’s most loyal users feel most aggrieved by the way the site has changed.

by Linda Kinstler, The Guardian |  Read more:
Image: AFP/Getty/Guardian Design

Arizona Students' Stand on Gun Control Switches to Voter Registration

Four months ago, hundreds of Arizona students staged a die-in on the floor of their state Capitol to protest for stricter gun laws.

Now, many of those same students are working on a new campaign: registering their high school classmates to vote, with the goal of voting out the politicians who have blocked the passage of gun safety laws.

“This entire thing is led by mostly kids who can’t vote yet,” said Jordan Harb, 17, one of the organizers of March for Our Lives Arizona, a group of teenage gun violence prevention advocates running a statewide voter registration program.

Harb himself will not be old enough to vote this November. But that has not stopped him and his fellow teenage activists from leading an intensive campaign to shift the balance of power in the midterm elections.

To vote National Rifle Association-backed candidates out of office, a coalition of gun violence prevention groups has launched a $1.75m campaign to register 50,000 young voters before this November’s midterm elections. Part of that money is going to nearly a dozen local groups, including March for Our Lives Phoenix, who are working to register 18- and 19-year-olds to vote.

Lower youth voter turnout in midterm elections tends to favor Republican candidates, who have blocked the passage of stricter federal gun control laws for decades. But gun violence prevention activists are trying to change that dynamic by bringing a wave of young voters to the polls.

The Our Lives Our Vote campaign is backed by Everytown for Gun Safety and Giffords, two gun violence prevention groups, and NextGen America, an advocacy group founded by the billionaire Tom Steyer, a major Democratic donor. The coalition says it has registered 27,000 voters through online and mail-in voter registration drives, focusing on 10 states where National Rifle Association-backed politicians are on the ballot. It’s now dedicating $600,000 to local groups organizing voter registration drives, including two groups run by high school students.

Since the 1999 Columbine high school shooting, and then the 2012 Sandy Hook elementary school shooting, schools across the United States added drills to prepare students for how to respond if an attacker with a gun targeted their school.

The school shooting at Marjory Stoneman Douglas high school in Parkland, Florida, this February, which left 17 people dead, sparked an unprecedented wave of youth gun control protests across the country. Thousands of schools nationwide held walkouts to protest against government inaction on preventing school shootings. The March for Our Lives, organized by student survivors from Parkland, Florida, sparked hundreds of rallies and marches worldwide, some with tens or even hundreds of thousands of participants.

After the Parkland shooting, students who had grown up with “active shooter” drills as a normal part of their lives had suddenly had enough.

by Lois Beckett, The Guardian |  Read more:
Image: Evelyn Hockstein
[ed. It's the only way.]

Wednesday, August 15, 2018

Remembering Anthony Bourdain as Only His Fixers Could

Michiko Zentoh was Anthony Bourdain’s first fixer. A freelance television producer in Japan, she worked with Bourdain on the initial two episodes of his first series, A Cook’s Tour, which were set in Tokyo and the onsen towns of Atami and Yugawara. It was 2000, and Bourdain was no longer working the same kind of schedule at New York’s Les Halles brasserie as he had before writing his best-selling Kitchen Confidential. Yet in those early shows it’s clear he still thinks of himself as a chef first, expertly evaluating a piece of bluefin and remarking on how much he’d like to get an octopus he sees at Tsukiji Fish Market back into the kitchen. What Zentoh remembers most from those days is his enthusiasm. “He told me, ‘I feel like I won the lottery,’” she recalls. “He spent so many years never leaving the kitchen and now he was traveling the world.”

Bourdain’s enthusiasm is evident in those early episodes. The characteristic intonation is there, but his voice seems an octave or two higher, and as he delights in a kaiseki meal or struggles through a bowl of mucilaginous nattō, there’s a sweetness to his demeanor, a naïveté, that belies the confidence of later years. He’s the quintessential innocent abroad—eager for new experiences but left vulnerable by them, too. On-screen, he admits to feeling intimidated, not only by the sumo wrestlers whose practice sessions he attends but even by the bullet train, where the crew shot him eating a bento lunch of eel. “He was very modest, very cautious about protocol,” Zentoh says. At one point she corrected his bowl handling, gently suggesting that he stop using both palms to cup it. “He asked me at every step, ‘Am I doing it right?’ He was the opposite of arrogant.”

He was also the opposite of profligate. Although at age 44 Bourdain was able, he said, to open a savings account for the first time in his life with the proceeds from Kitchen Confidential, budgets during A Cook’s Tour remained tight. Bourdain traveled in the same van as the rest of the small team, and their accommodations, if not dives, weren’t exactly posh. Zentoh recalls staying in a hotel with rooms so tiny Bourdain barely had room for luggage. “That’s why the geisha in the second episode are so old,” she says. “We couldn’t afford younger ones.”

Behind every bite of Moroccan sheep testicle or sip of high-octane Georgian chacha that Anthony Bourdain took on-screen was a fixer like Zentoh. Before the start of any shoot, from Reykjavík to Congo, the chef turned television star’s production company, Zero Point Zero, hired a local—usually a freelance journalist, or producer—to suggest segment ideas, set up shoots, get permissions, act as Bourdain’s interpreter, and occasionally appear on camera. These fixers may not have written the scripts or edited the footage, but they ultimately played a significant role in what viewers saw on-screen. And because, for the few days or weeks that a shoot lasted, most were also thrust into this suddenly intimate relationship with someone they knew only from TV, they possess a view onto the man that few share.

When news spread in early June that Bourdain had committed suicide at age 61, the shock, rippling across social media, felt seismic. It wasn’t just that he was so influential a figure, though countless viewers learned to eat—lustfully and catholically—from him, and there are legions of chefs today who were drawn to the profession, for better and for worse, by the pirate-ship approach to the kitchen he so vividly described. Nor was it simply the fact of his celebrity, though after nearly two decades spent crisscrossing the globe for his television series, he was recognized on the street everywhere from Beijing to Buenos Aires. It wasn’t even the confounding tragedy of his suicide, that he might choose to end a life so seemingly enviable. Rather, the thing that made his death so terribly traumatic to so many was the loss of connection. It was the loss of a real, if fleeting, sense that Bourdain somehow found time and space for an actual human moment with every person who ever cooked him a meal or even interrupted one to ask for a selfie.

For those who fixed for him, it was so often more than just a moment. Fixing is among the lowest jobs on the production hierarchy, and yet Bourdain not only treated his fixers well, but engaged with them, soliciting their insight into whatever place and people he had landed among that week and gradually coming to call several of them friends. Though most of them never met one another, they formed a sort of unspoken international network, these people who helped Bourdain know the world more deeply and who, in turn, were shaped by his way of experiencing it.

When Matt Walsh began working for No Reservations in 2005, Bourdain’s enthusiasm and curiosity were the first qualities the fixer noticed. An American journalist living in Hong Kong, Walsh had seen A Cook’s Tour, recognized the similarities between the emerging star’s New Jersey heritage and his own Long Island roots, and decided he wanted to have the kind of fun Bourdain seemed to be having. He pitched himself to No Reservations’ producers and was soon leading Bourdain to a roast-duck restaurant in Beijing and a family meal in Chengdu. “It was all new to him, and he was really hungry,” says Walsh. “He wanted to see it all, do it all, taste it all.”

And imbibe it all. Bourdain made no secret of his predilections. “The Tony we used to work with back then was always laughing and drinking. We got loaded all the time,” says Walsh. “By the end of some nights we were all a little slurry.”

His fixers from those early years recall Bourdain as especially happy when he was having the kind of experience that allowed him to connect with a place and its people. After the Khmer Rouge largely destroyed Cambodia’s train system, locals used what they called lorries or norries—basically a platform on wheels, outfitted with a rudimentary engine and a hand brake—to travel the rails in areas where there were no roads. On a shoot there in 2010, the crew took one out for a meal with a family in the rice fields. “It was pouring rain, but it didn’t matter,” Walsh recalls. After “riding back through those electric-green rice paddies, having smoked a lot of weed, with the wind [from] going 30 kilometers an hour—the sensation of all that. I looked at Tony and the expression on his face was exactly what I was feeling: it doesn’t get better than this.”

The lorry trip exemplifies the kind of authentic experience that Bourdain craved and that he attempted to bring to his show. For No Reservations’ second season, Zentoh was charged with coming up with a segment that took the crew to Japan’s Kiso Valley. The only dates available for the shoot fell during Obon, a holiday typically celebrated with family, but the fixer managed to wrangle an invitation with the latest three generations of the family that cares for the country’s sacred hinoki trees. “Tony started drinking shochu and sake with the head of the family,” Zentoh recalls. “After a while, he turned to us and said, ‘Forget about the shoot. I don’t care. I just want to drink with this guy. I want to be 100 percent there.’ That’s why people liked him—he showed up.”

He was also utterly authentic in his own responses. “Tony didn’t do fake,” Zentoh says. “He really would eat what was on the plate, drink what was in the glass.” He would try anything, but if he didn’t like, say, a bite of dried sea-cucumber liver that elicited an “I don’t need to try that again,” he wouldn’t pretend otherwise.

No Reservations gave Bourdain the space to express not only his political and social beliefs, but his artistic passions as well. Lucio Mollica first worked with Bourdain on the Naples episode that aired in 2011. By then, the crew had already produced a Rome episode intended as an homage to Fellini. In Naples, he wanted to shoot in the neighborhood where the film Gomorrah, released a couple of years earlier, had been set. “He wasn’t only a fine connoisseur of Italian cuisine, but of Italian culture, and Italian cinema,” Mollica says. “His knowledge of that was amazing.”

Yet as he made aspects of the show more closely in his own image, others slipped from him. As the crew grew, they increasingly had the budget to stay in nicer hotels. The pressure to produce had increased, too. “As the budget got bigger, the amount of content that was needed grew as well, and we had so little time,” Zentoh says. “It was a brutal schedule for the production team. The whole experience was like a goose being made into foie gras. Tony had no time to digest anything–not the food or the experience.”

At the time Bourdain was well on his way to becoming internationally famous. “I met him about halfway into this journey,” Mollica says. “He wasn’t so famous in Italy then.” Still, the Italian fixer glimpsed a hint of what Bourdain was losing during that first shoot. “It was a Sunday in Naples, and all the places we wanted to bring him were closed. Finally someone asked the driver, ‘Where are you eating?’ And he said, ‘My mom’s house.’ So we all went there, to the driver’s mom’s house, this tiny apartment in the historic part of town. Tony came over when lunch was ready, and stayed for three hours. She made ragù. We had been eating in these fantastic restaurants up and down the beautiful Amalfi Coast. But that was the happiest I saw him.”

In 2012, Bourdain announced he was moving from the Travel Channel to CNN to launch Parts Unknown. By all accounts, he was giddily excited about the opportunities the new show and the network’s resources would afford him; within the first few years, he would shoot episodes in Libya, Tanzania, and Iran. But even to a new fixer such as Alex Roa, a local producer who worked with Bourdain on shoots in Mexico City, Oaxaca, and Cuernavaca in 2014, it was evident that the demands—and the constant attention—were weighing on him. “I think it was not only the demands of the job, but also the intensity of it, the constant traveling and being away–in that moment–from his daughter,” says Roa. “Every episode demanded so much of him, because that was the way he was.”

By then, the eating was the least of it. “He told me that food is just a way to get into people’s bodies and minds,” Roa recalls. “It was a way to talk to someone, to get them to go deeper.” The more superficial food-porn stuff was losing its allure. In Oaxaca, when a director wanted to shoot Tony buying and eating tamales, he was frustrated, Roa says. “He just said, ‘That’s horrible. Do you know how many times I’ve done this before?’” In Mexico City a chef at the Four Seasons hotel where Bourdain was staying so wanted to cook for him that he sent word he was going to close a room of the restaurant for him; Bourdain’s response, according to Roa, was a polite but conversation-ending “No thanks.”

Were the fame, the pressure, and weariness from all that travel—and all that food—getting to him? Bourdain remained the consummate professional. “We had to ask his driver to delay and make detours so that he wouldn’t show up too early,” the fixer says. But he didn’t seem to be having as much fun. “He only went out with us one night during the whole 10 days,” Roa recalls. “Otherwise, he would just show up for a call, do the shoot, and go straight back to the hotel. He’d stay in and order room service.”

by Lisa Abend, Vanity Fair |  Read more:
Image: William Mebane

Should I do a PhD?

There are lots of good reasons for deciding to do a PhD. Deepening your knowledge of a subject you love is an excellent one. Wondering what to do with the next three years of your life and finding out your university will pay you to stay isn’t so bad either. But seeing it as a fast track to a cushy academic job probably shouldn’t be one of them.

PhDs are often glamourised in popular culture. If you grew up watching Friends, you might recall Ross Geller celebrating getting tenure at New York University. Getting tenure in a US university means you are virtually impossible to fire. Your university trusts in your intellectual brilliance to the extent that it’s willing to give you total academic freedom to research what you want. In short, it sounds like a dream.

Unfortunately, that’s exactly what it is. If Ross were a real person and not a fictional character, he wouldn’t have been celebrating getting tenure at about 30 years old – unless he were a palaeontology prodigy. Instead, he’d be on his first or second postdoc, possibly in underpaid, insecure employment. He would also probably be so busy writing research grant applications he’d have no time to hang around in a coffee shop. If – in a decade’s time – he eventually secured a permanent academic position, he’d be one of only 3.5% of his science PhD cohort who did.

The problem with the academic dream is that the pipeline is broken. Employing lots of PhD students is a great deal for universities – they’re a source of inexpensive academic labour for research and teaching. But it’s not such a great deal for the students themselves. The oversupply of PhDs perpetuates the illusion that there are a lot of academic jobs around. There aren’t – and competition for the few that there are is fierce.

The oversupply of early career researchers means they often feel exploited by their universities. According to the University and College Union, which represents lecturers, more than three-quarters of junior academics are on precarious or zero-hours contracts. Meanwhile, competition for research funding and power-imbalanced relationships between supervisors and junior researchers can make labs and libraries ripe for bullying.

The result, according to recent research from the Royal Society and the Wellcome Trust, is that academia is one of the worst careers for stress. Nearly four in 10 academics have reported experiencing mental health conditions.

So why do so many intelligent people who would probably do fantastically well in alternative careers put themselves through this? Because being an academic can be one of the world’s best jobs. You might get to push the boundaries of knowledge in an area you’re passionate about, work in international teams comprising the world’s greatest minds, and produce work with visible social impact – whether that’s through lecturing students or seeing your research inform policy.

But is it worth it for the majority of PhD students, who’ll never become academics? In some countries, such as the US and Germany, PhDs are increasingly seen not just as a conveyor belt to an academic job, but as an important high-level qualification that leads to a diverse range of careers. In certain industries in the UK, such as science and pharmaceuticals, demand for PhD graduates is growing as their emphasis on research increases.

But at present, a PhD qualification isn’t essential for most jobs. In some industries, a PhD might even set you back, as business leaders often see them as driving a largely pointless three-year wedge between an undergraduate degree and an entry-level position. This is often compounded by unhelpful careers advice from academic supervisors uninterested in the world outside academia.

But doing a PhD in most cases might not hinder your career either. And, if you’re an undergraduate, you certainly won’t be the only one to drift into a three-year stipend while you work out what comes next. Even if you’re not willing to slog it out in pursuit of a professorship, in some subjects more than others, there’s evidence of an earnings premium. In 2010, 3.5 years after graduation, 72% of doctoral graduate respondents were earning more than £30,000 compared with 22% of first-degree graduates.

by Rachel Hall, The Guardian | Read more:
Image: Alamy

Elizabeth Warren Has a Plan to Save Capitalism

Elizabeth Warren has a big idea that challenges how the Democratic Party thinks about solving the problem of inequality.

Instead of advocating for expensive new social programs like free college or health care, she’s introducing a bill Wednesday, the Accountable Capitalism Act, that would redistribute trillions of dollars from rich executives and shareholders to the middle class — without costing a dime.

Warren’s plan starts from the premise that corporations that claim the legal rights of personhood should be legally required to accept the moral obligations of personhood.

Traditionally, she writes in a companion op-ed for the Wall Street Journal, “corporations sought to succeed in the marketplace, but they also recognized their obligations to employees, customers and the community.” In recent decades they stopped, in favor of a singular devotion to enriching shareholders. And that’s what Warren wants to change.

The new energy on the left is all about making government bigger and bolder, an ideal driven by a burgeoning movement toward democratic socialism. It’s inspired likely 2020 Democratic contenders to draw battle lines around how far they’d go to change the role of government in American life.

Warren supports expanding many of the programs in play, and she’s voted to do so. But the rollout of her bill suggests that as she weighs whether to get into the presidential race, she’ll focus on how to prioritize workers in the American economic system while leaving businesses as the primary driver of it.

Warren wants to eliminate the huge financial incentives that entice CEOs to flush cash out to shareholders rather than reinvest in businesses. She wants to curb corporations’ political activities. And for the biggest corporations, she’s proposing a dramatic step that would ensure workers and not just shareholders get a voice on big strategic decisions.

Warren hopes this will spur a return to greater corporate responsibility, and bring back some other aspects of the more egalitarian era of American capitalism post-World War II — more business investment, more meaningful career ladders for workers, more financial stability, and higher pay.

As much as Warren’s proposal is about ending inequality, it’s also about saving capitalism.

The Accountable Capitalism Act — real citizenship for corporate persons

The conceit tying together Warren’s ideas is that if corporations are going to have the legal rights of persons, they should be expected to act like decent citizens who uphold their fair share of the social contract and not act like sociopaths whose sole obligation is profitability — as is currently conventional in American business thinking.

Warren wants to create an Office of United States Corporations inside the Department of Commerce and require any corporation with revenue over $1 billion — only a few thousand companies, but a large share of overall employment and economic activity — to obtain a federal charter of corporate citizenship.

The charter tells company directors to consider the interests of all relevant stakeholders — shareholders, but also customers, employees, and the communities in which the company operates — when making decisions. That could concretely shift the outcome of some shareholder lawsuits but is aimed more broadly at shifting American business culture out of its current shareholders-first framework and back toward something more like the broad ethic of social responsibility that took hold during WWII and continued for several decades.

Business executives, like everyone else, want to have good reputations and be regarded as good people but, when pressed about topics of social concern, frequently fall back on the idea that their first obligation is to do what’s right for shareholders. A new charter would remove that crutch, and leave executives accountable as human beings for the rights and wrongs of their own decisions.

More concretely, United States Corporations would be required to allow their workers to elect 40 percent of the membership of their board of directors.

Warren also tacks on a couple of more modest ideas. One is to limit corporate executives’ ability to sell shares of stock that they receive as pay — requiring that such shares be held for at least five years after they were received, and at least three years after a share buyback. The aim is to disincentivize stock-based compensation in general as well as the use of share buybacks as a tactic for executives to maximize their own pay.

The other proposal is to require corporate political activity to be authorized specifically by both 75 percent of shareholders and 75 percent of board members (many of whom would be worker representatives under the full bill), to ensure that corporate political activity truly represents a consensus among stakeholders, rather than C-suite class solidarity.

It’s easy to imagine the restrictions on corporate political activity and some curbs on stock sales shenanigans becoming broad consensus points for congressional Democrats, and even part of a 2019 legislative agenda if the midterms go well. But the bigger ideas about corporate governance would be a revolution in American business practice to undo about a generation’s worth of shareholder supremacy.

The rise of shareholder capitalism

The conceptual foundations of the current version of American capitalism are found in Milton Friedman’s well-titled 1970 New York Times Magazine article “The Social Responsibility of Business Is to Increase its Profits.”

Friedman meant this provocative thesis quite literally. In his view, which has since become the dominant perspective in American law and finance, corporate shareholders should be understood to own the company and its executives should be seen as their hired help. The shareholders, as individuals, can obviously have a variety of goals they favor in life. But their common goal is to maximize the value of their shares.

Therefore, for executives to set aside shareholder profits in pursuit of some other goal like environmental protection, racial justice, community stability, or simple common decency would be a form of theft. If reformulating your product to be more addictive or less healthy increases sales, then it’s not only permissible but actually required to do so. If closing a profitable plant and outsourcing the work to a low-wage country could make your company even more profitable, then it’s the right thing to do.

Friedman allows that executives are obligated to follow the law — an important caveat — establishing a conceptual framework in which policy goals should be pursued by the government, while businesses pursue the prime business directive of profitability.

One important real-world complication that Friedman’s article largely neglects is that business lobbying does a great deal to determine what the laws are. It’s all well and good, in other words, to say that businesses should follow the rules and leave worrying about environmental externalities up to the regulators. But in reality, polluting companies invest heavily in making sure that regulators underregulate — and it seems to follow from the doctrine of shareholder supremacy that if lobbying to create bad laws is profitable for shareholders, corporate executives are required to do it.

On the flip side, an investor-friendly policy regime was supposed to supercharge investment, creating a more prosperous economy for everyone. The question is whether that’s really worked out.

The economics of shareholder supremacy

(...) Since 80 percent of the value of the stock market is owned by about 10 percent of the population and half of Americans own no stock at all, this has been a huge triumph for the rich. Meanwhile, CEO pay has soared as executive compensation has been redesigned to incentivize shareholder gains, and the CEOs have delivered. Gains for shareholders and greater inequality in pay have led to a generation of median compensation lagging far behind economy-wide productivity, with higher pay mostly captured by a relatively small number of people rather than being broadly shared.

Investment, however, has not soared. In fact, it’s stagnated.

Whether one sees this as a cause or a consequence of poor growth outcomes is up for debate, but the Warren view is that, fundamentally, shareholder supremacy causes poor economic performance by starving the business sector of funds that would otherwise be used to invest in equipment or training, or simply to pay people more and increase their purchasing power.

But while on an optimistic view, stakeholder capitalism would produce stronger long-run growth and higher living standards for the vast majority of the population, there’s no getting around the fact that Warren’s proposal would be bad — really bad — for rich people. That’s a fight her team says she welcomes. (...)

In exchange, the laboring majority would make important gains.

Most obviously, the large share of the private sector workforce that is employed by companies with more than $1 billion in revenue would gain a measure of democratic control over the future of their workplace. That wouldn’t make tough business decisions around automation, globalization, scheduling, family responsibilities, etc. go away, but it would ensure that the decisions are made with a balanced set of interests in mind.

Studies from Germany’s experience with codetermination indicate that it leads to less short-termism in corporate decision-making and much higher levels of pay equality, while other studies demonstrate positive results on productivity and innovation.

One intuitive way of thinking about the proposal is that under the American system of shareholder supremacy, an executive increases his pay by finding ways to squeeze workers as hard as possible — kicking out the surplus to shareholders and then watching his stock-linked compensation soar. That’s brought America to the point where CEOs make more than 300 times as much as rank-and-file workers at big companies.

by Matthew Yglesias, Vox |  Read more:
Image: Chip Somodevilla/Getty Images
[ed. I can't wait to give her my vote.]

Tuesday, August 14, 2018

Pat Metheny (feat. Anna Maria Jopek and Pedro Aznar)


What the Year 2050 Has in Store for Humankind

Part one: Change is the only constant

Humankind is facing unprecedented revolutions, all our old stories are crumbling and no new story has so far emerged to replace them. How can we prepare ourselves and our children for a world of such unprecedented transformations and radical uncertainties? A baby born today will be thirty-something in 2050. If all goes well, that baby will still be around in 2100, and might even be an active citizen of the 22nd century. What should we teach that baby that will help him or her survive and flourish in the world of 2050 or of the 22nd century? What kind of skills will he or she need in order to get a job, understand what is happening around them and navigate the maze of life?

Unfortunately, since nobody knows how the world will look in 2050 – not to mention 2100 – we don’t know the answer to these questions. Of course, humans have never been able to predict the future with accuracy. But today it is more difficult than ever before, because once technology enables us to engineer bodies, brains and minds, we can no longer be certain about anything – including things that previously seemed fixed and eternal.

A thousand years ago, in 1018, there were many things people didn’t know about the future, but they were nevertheless convinced that the basic features of human society were not going to change. If you lived in China in 1018, you knew that by 1050 the Song Empire might collapse, the Khitans might invade from the north, and plagues might kill millions. However, it was clear to you that even in 1050 most people would still work as farmers and weavers, rulers would still rely on humans to staff their armies and bureaucracies, men would still dominate women, life expectancy would still be about 40, and the human body would be exactly the same. Hence in 1018, poor Chinese parents taught their children how to plant rice or weave silk, and wealthier parents taught their boys how to read the Confucian classics, write calligraphy or fight on horseback – and taught their girls to be modest and obedient housewives. It was obvious these skills would still be needed in 1050.

In contrast, today we have no idea how China or the rest of the world will look in 2050. We don’t know what people will do for a living, we don’t know how armies or bureaucracies will function, and we don’t know what gender relations will be like. Some people will probably live much longer than today, and the human body itself might undergo an unprecedented revolution thanks to bioengineering and direct brain-computer interfaces. Much of what kids learn today will likely be irrelevant by 2050.

At present, too many schools focus on cramming information. In the past this made sense, because information was scarce, and even the slow trickle of existing information was repeatedly blocked by censorship. If you lived, say, in a small provincial town in Mexico in 1800, it was difficult for you to know much about the wider world. There was no radio, television, daily newspapers or public libraries. Even if you were literate and had access to a private library, there was not much to read other than novels and religious tracts. The Spanish Empire heavily censored all texts printed locally, and allowed only a dribble of vetted publications to be imported from outside. Much the same was true if you lived in some provincial town in Russia, India, Turkey or China. When modern schools came along, teaching every child to read and write and imparting the basic facts of geography, history and biology, they represented an immense improvement.

In contrast, in the 21st century we are flooded by enormous amounts of information, and even the censors don’t try to block it. Instead, they are busy spreading misinformation or distracting us with irrelevancies. If you live in some provincial Mexican town and you have a smartphone, you can spend many lifetimes just reading Wikipedia, watching TED talks, and taking free online courses. No government can hope to conceal all the information it doesn’t like. On the other hand, it is alarmingly easy to inundate the public with conflicting reports and red herrings. People all over the world are but a click away from the latest accounts of the bombardment of Aleppo or of melting ice caps in the Arctic, but there are so many contradictory accounts that it is hard to know what to believe. Besides, countless other things are just a click away, making it difficult to focus, and when politics or science look too complicated it is tempting to switch to funny cat videos, celebrity gossip or porn.

In such a world, the last thing a teacher needs to give her pupils is more information. They already have far too much of it. Instead, people need the ability to make sense of information, to tell the difference between what is important and what is unimportant, and above all to combine many bits of information into a broad picture of the world.

In truth, this has been the ideal of western liberal education for centuries, but up till now even many western schools have been rather slack in fulfilling it. Teachers allowed themselves to focus on shoving data while encouraging pupils “to think for themselves”. Due to their fear of authoritarianism, liberal schools had a particular horror of grand narratives. They assumed that as long as we give students lots of data and a modicum of freedom, the students will create their own picture of the world, and even if this generation fails to synthesise all the data into a coherent and meaningful story of the world, there will be plenty of time to construct a good synthesis in the future. We have now run out of time. The decisions we will take in the next few decades will shape the future of life itself, and we can take these decisions based only on our present world view. If this generation lacks a comprehensive view of the cosmos, the future of life will be decided at random.

Part two: The heat is on

Besides information, most schools also focus too much on providing pupils with a set of predetermined skills such as solving differential equations, writing computer code in C++, identifying chemicals in a test tube or conversing in Chinese. Yet since we have no idea how the world and the job market will look in 2050, we don’t really know what particular skills people will need. We might invest a lot of effort teaching kids how to write in C++ or how to speak Chinese, only to discover that by 2050 AI can code software far better than humans, and a new Google Translate app enables you to conduct a conversation in almost flawless Mandarin, Cantonese or Hakka, even though you only know how to say “Ni hao”.

So what should we be teaching? Many pedagogical experts argue that schools should switch to teaching “the four Cs” – critical thinking, communication, collaboration and creativity. More broadly, schools should downplay technical skills and emphasise general-purpose life skills. Most important of all will be the ability to deal with change, to learn new things and to preserve your mental balance in unfamiliar situations. In order to keep up with the world of 2050, you will need not merely to invent new ideas and products – you will above all need to reinvent yourself again and again.

For as the pace of change increases, not just the economy, but the very meaning of “being human” is likely to mutate. In 1848, the Communist Manifesto declared that “all that is solid melts into air”. Marx and Engels, however, were thinking mainly about social and economic structures. By 2048, physical and cognitive structures will also melt into air, or into a cloud of data bits.

In 1848, millions of people were losing their jobs on village farms, and were going to the big cities to work in factories. But upon reaching the big city, they were unlikely to change their gender or to add a sixth sense. And if they found a job in some textile factory, they could expect to remain in that profession for the rest of their working lives.

By 2048, people might have to cope with migrations to cyberspace, with fluid gender identities, and with new sensory experiences generated by computer implants. If they find both work and meaning in designing up-to-the-minute fashions for a 3D virtual-reality game, within a decade not just this particular profession, but all jobs demanding this level of artistic creation might be taken over by AI. So at 25, you introduce yourself on a dating site as “a twenty-five-year-old heterosexual woman who lives in London and works in a fashion shop.” At 35, you say you are “a gender-non-specific person undergoing age-adjustment, whose neocortical activity takes place mainly in the NewCosmos virtual world, and whose life mission is to go where no fashion designer has gone before”. At 45, both dating and self-definitions are so passé. You just wait for an algorithm to find (or create) the perfect match for you. As for drawing meaning from the art of fashion design, you are so irrevocably outclassed by the algorithms that looking at your crowning achievements from the previous decade fills you with embarrassment rather than pride. And at 45, you still have many decades of radical change ahead of you.

Please don’t take this scenario literally. Nobody can really predict the specific changes we will witness. Any particular scenario is likely to be far from the truth. If somebody describes to you the world of the mid-21st century and it sounds like science fiction, it is probably false. But then if somebody describes to you the world of the mid-21st century and it doesn’t sound like science fiction – it is certainly false. We cannot be sure of the specifics, but change itself is the only certainty.

Such profound change may well transform the basic structure of life, making discontinuity its most salient feature. From time immemorial, life was divided into two complementary parts: a period of learning followed by a period of working. In the first part of life you accumulated information, developed skills, constructed a world view, and built a stable identity. Even if at 15 you spent most of your day working in the family’s rice field (rather than in a formal school), the most important thing you were doing was learning: how to cultivate rice, how to conduct negotiations with the greedy rice merchants from the big city and how to resolve conflicts over land and water with the other villagers. In the second part of life you relied on your accumulated skills to navigate the world, earn a living, and contribute to society. Of course, even at 50 you continued to learn new things about rice, about merchants and about conflicts, but these were just small tweaks to well-honed abilities.

By the middle of the 21st century, accelerating change plus longer lifespans will make this traditional model obsolete. Life will come apart at the seams, and there will be less and less continuity between different periods of life. “Who am I?” will be a more urgent and complicated question than ever before.

by Yuval Noah Harari, Wired |  Read more:
Image: Britt Spencer
[ed. See also: Get with the Programme]