Saturday, December 12, 2015

Lexington Lab Band



[ed. Is this the world's greatest cover band or what? I like tondr's guitar lessons on YouTube.]

Jack Barnosky, third penance
via:

Adapting to Climate Change

Yesterday, Thomas Schelling gave a seminar on climate change here at the Center for Study of Public Choice. Schelling’s main argument was that lots of resources are going into predicting and understanding climate change but very little thought or resources are going into planning for adaptation.

If Washington, DC, Boston and Manhattan are to remain dry, for example, we are almost certainly going to need flood control efforts on the level of the Netherlands. It takes twenty years just to come up with a plan and figure out how to pay for these kinds of projects, let alone to actually implement them, so it’s not too early to begin planning for adaptation even if we don’t expect to need these adaptations for another forty or fifty years. So far, however, nothing is being done. Climate deniers think planning for adaptation is a waste and many climate change proponents think planning for adaptation is giving up.

Schelling mentioned a few bold ideas. We can protect every city on the Mediterranean from Marseilles to Alexandria to Tel Aviv or we could dam the Strait of Gibraltar. Damming the strait would be the world’s largest construction project–by far–yet by letting the Mediterranean evaporate somewhat it could also generate enough hydro-electric power to replace perhaps all of the fossil fuel stations in Europe and Africa.

Schelling didn’t mention it but in the 1920s German engineer Herman Sörgel proposed such a project, calling it Atlantropa (more here). In addition to power, damming the strait would open up a huge swath of valuable land. Gene Roddenberry and Philip K. Dick were fans but needless to say the idea never got very far. A cost-benefit analysis, however, might show that despite the difficulty, damming the strait would be cheaper than trying to save Mediterranean cities one by one. But, as Schelling argued, no one is thinking seriously about these issues.

I argued that capital depreciates so even many of our buildings, the longest-lived capital, will need to be replaced anyway. Here, for example, is a map showing the age of every building in New York City. A large fraction, though by no means all, are less than one hundred years old. If we let the areas most under threat slowly deteriorate the cost of moving inland won’t be as high as one might imagine–at least if the water rises slowly (not guaranteed!). Schelling agreed that this was the case for private structures but he doubted that we would be willing to let the White House go.

by Alex Tabarrok, Marginal Revolution | Read more:
Image: via:

Setting the Default

I recently did couples therapy with two gay men who’d gotten married a year or so ago. Since then one of them, let’s call him Adam, decided he was bored with his sex life and went to a club where they did some things I will not describe here. His husband, let’s call him Steve, was upset by what he considered infidelity, and they had a big fight. Both of them wanted to stay together for the sake of the kids (did I mention they adopted some kids?) but this club thing was a pretty big deal, so they decided to seek professional help.

Adam made the following proposal: he knew Steve was not very kinky, so Adam would go do his kinky stuff at the club, with Steve’s knowledge and consent. That way everyone could get what they wanted. Sure, it would involve having sex with other people, but it didn’t mean anything, and it was selfish for a spouse to assert some kind of right to “control” the other spouse anyway.

Steve made the following counterproposal: no. He liked monogamy and fidelity and it would make him really jealous and angry to think of Adam going out and having sex with other people, even in a meaningless way. He argued that if Adam didn’t like monogamy, maybe he shouldn’t have proposed entering into a form of life that has been pretty much defined by its insistence on monogamy for the past several thousand years and then sworn adherence to that form of life in front of everyone they knew. If Adam hadn’t liked monogamy, he had ample opportunity to avoid it before he had bound his life together with Steve’s. Now he was stuck.

Adam gave the following counterargument: yeah, marriage usually implies remaining monogamous, but that was all legal boilerplate. He had wanted to get married to symbolize his commitment to Steve – commitment that he still had! – and he hadn’t realized he was interested in fetish stuff at the time or else he would have brought it up.

Steve gave the following countercounterargument: okay, this is all very sad, but now we are stuck in this position, and clearly only one of the two people could get their preference satisfied, and given the whole marriage-implies-monogamy thing, it seemed pretty clear that that person should be him.

So then of course they both turned to me for advice.

by Scott Alexander, Slate Star Codex |  Read more:
Image:  via:

The German War: A Nation Under Arms, 1939-45

[ed. I just finished reading Anthony Doerr's All the Light We Cannot See, a novel with a similar theme - the average French and German's reaction to, and ultimately, participation in the Second World War. It made me think again about the issue of free will vs. determinism and how a person's moral perspective and/or character could be subsumed (or elevated) by the momentum of larger forces - forces that determine one's fate long before they are felt.]

Most Germans did not want war in 1939. When it came, following Hitler’s invasion of Poland, there was no euphoria and flag-waving, as there had been in 1914, but dejection; the people were downcast, one diarist noted. The mood soon lifted, as the Third Reich overran its neighbours, but most Germans still hoped for a quick conclusion. As Nicholas Stargardt points out in his outstanding history of Germany during the second world war, the Nazi regime was most popular “when it promised peace, prosperity and easy victories”. And yet, German troops continued to fight an ever more protracted battle, with ever more brutality, while the home front held tight. Even when it was clear that all was lost, there was no collapse or uprising, as in 1918. Why?

There are two easy answers. After the war, many Germans claimed to have been cowed by an omnipotent terror apparatus. More recently, some historians have argued the opposite: the Nazi regime was buoyed by fervent support, with ordinary Germans backing Hitler to the end. Stargardt dismisses both answers convincingly. Domestic terror alone, though ever-present, did not ensure the war’s continuation. Neither did popular enthusiasm for nazism. Of course there was significant support for Hitler’s regime, at least as long as the campaign went well. “God couldn’t have sent us a better war,” one soldier wrote to his wife in summer 1940, as the Wehrmacht routed France. But opinion was fickle, fluctuating with the fortunes of war.

Grumbling about rationing and shortages began within weeks, and never ceased, even as the regime alleviated hardships at home through the ruthless exploitation of occupied Europe (midway through the war, almost 30% of Germany’s meat came from abroad). There was plenty of resentment, too, about the privileges of the Nazi elite, which gorged itself on delicacies as ordinary Germans chewed “cutlets” made from cabbage. As a popular joke had it: “When will the war end?” “When Göring fits into Goebbels’s trousers”. Resentment of the regime grew as allied bombs rained on Germany, displacing millions and killing more than 400,000. German civilians criticised their leaders for the porous air defences, and they also turned on each other. Evacuees from the cities complained about the “simple and stupid” peasants who hosted them, while the farmers accused the new arrivals of laziness and loose morals. Back in the urban centres, locals were relieved when they were spared because a different German city was hit instead. The supposedly unified Nazi “national community” was just a fiction.

Despite this lack of national cohesion and the growing war fatigue, Germans kept fighting. Most important, Stargardt suggests, were their feelings of “patriotic defiance”, arising less from fanatical nazism than familial bonds. They had to win the war at any cost, soldiers believed, to protect their loved ones and to make Germany impregnable. “Your father is away,” one soldier lectured his teenage son in 1942, “and is helping to prepare a better future for you, so that you don’t have to do it later yourselves.” Even Germans appalled by the genocidal war waged in their name rallied around their country. Their determination was fuelled by Nazi propaganda, which insisted that this was a defensive war, provoked by Germany’s enemies, and warned that defeat would mean the annihilation of the fatherland. This campaign, based on “strength through fear” (as a British commentator quipped), hit home. As another soldier wrote to his wife just weeks before the final surrender: “If we go to the dogs, then everything goes to the dogs.”

Propaganda and popular opinion are just two key themes in Stargardt’s sweeping history, which takes in almost everything, from battles to religion and entertainment. And although the focus is on wartime Germany, we also see the suffering the war brought to the rest of Europe: pulverised cities, ravaged countryside, countless victims. Crucially, the death and destruction wrought by the German conquerors was not hidden from the population back home. Germans knew that the regime relied on pillage and plunder, bolstering the war effort with raw materials and slave labour from across Europe. And they knew that huge numbers of Jews were murdered in the east.

Historians have long debunked the postwar myth of German ignorance about the Holocaust, and Stargardt presents further evidence that the genocide was an open secret. News spread via German soldiers and officials who witnessed massacres, or participated in them. “The Jews are being completely exterminated,” a policeman wrote in August 1941 to his wife in Bremen. Nazi propaganda also dropped heavy hints, creating a sense of societal complicity: in autumn 1941, for instance, the Nazi party displayed posters across the country, emblazoned with Hitler’s threat that a world war would lead to the “destruction of the Jewish race in Europe”. Ordinary Germans watched the deportations of their Jewish neighbours and purchased their abandoned property at bargain prices. Later on, the authorities distributed the belongings of Jews among bombed-out Germans, though this triggered new complaints about Nazi bigwigs grabbing the best bits and “laying their Aryan arses in the Jewish beds after they have exterminated the Jews”, as one employee in a Bavarian factory exclaimed. There was some popular unease about the genocide, and it came into the open during the intense allied bombing, in a rather twisted manner: many ordinary Germans bought into the Nazi propaganda picture of Jews pulling the strings in Britain and the USA, and understood the air raids as payback for the antisemitic pogroms and mass murders. In this way, writes Stargardt, the Germans “mixed anxieties about their culpability with a sense of their own victimhood”.

by Nikolaus Wachsmann, The Guardian | Read more:
Image: Popperfoto/Getty Images

Thursday, December 10, 2015

What Your Microbiome Wants for Dinner

Let’s admit it. Few of us like to think, much less talk, about our colons. But you might be surprised at the importance of what gets into your colon and what goes on inside it. This little-loved part of our bodies is actually less an onboard garbage can and more like the unlikeliest medicine chest.

There is abundant medical evidence that diet greatly influences health, and new science is showing us why this is so. It is also showing us that advocates of trendy paleo and vegan diets are missing the big picture of how our omnivorous digestive system works.

Your colon is the home for much of your microbiome—the community of microbial life that lives on and in you. In a nutshell, for better and worse, what you eat feeds your microbiome. And what they make from what you eat can help keep you healthy or foster chronic disease.

To gain an appreciation of the human colon and the role of microbes in the digestive tract as a whole, it helps to follow the metabolic fate of a meal. But, first, a word about terms. We’ll refer to the digestive tract as the stomach, small intestine, and colon. While the colon is indeed called the “large intestine,” this is a misnomer of sorts. It is no more a large version of the small intestine than a snake is a large earthworm.

The stomach might better be called a dissolver, the small intestine an absorber, and the colon a transformer. These distinct functions help explain why microbial communities of the stomach, small intestine, and colon are as different from one another as a river and a forest. Just as physical conditions like temperature, moisture, and sun strongly influence the plant and animal communities that one sees on a hike from a mountain peak to the valley below, the same holds true along the length of the digestive tract.

Imagine you are at a Fourth of July barbecue. You saunter over to the grill to take a look at the fare. The pork ribs look great so you spear a few and add a heap of homemade sauerkraut on the side. You grab a handful of corn chips and a few pieces of celery. The vegetable skewers look good too, so you add one to the pile on your plate. And what would the Fourth of July be without macaroni salad and pie?

You lift a rib to your mouth and start gnawing. A forkful of sauerkraut mingles well with the meat and you crunch your way through another mouthful. The macaroni squishes between your teeth, but the celery takes some chewing. It all slips down the hatch and lands in the acid vat of your stomach where gastric acids start dissolving the bits of food. On the pH scale, where 7 is neutral and lower values are more acidic, the stomach is impressive. Its acidity ranges from 1 to 3. Lemon juice and white vinegar are about a 2.

After the stomach acids work over your meal, the resultant slurry drops into the top of the small intestine. Right away bile from the liver shoots in and starts working over the fats, breaking them down. Pancreatic juices also squirt into the small intestine to join the digestive party. Your Fourth of July feast is now on its way to full deconstruction into the basic types of molecules—simple and complex carbohydrates (sugars), fats, and proteins. In general, there is an inverse relationship between the size and complexity of these molecules and their fate in the digestive tract. Smaller molecules, primarily the simple sugars that compose the refined carbohydrates in the macaroni, pie crust, and chips are absorbed relatively quickly. Larger or more complex molecules take longer to break down and are absorbed in the lower reaches of the small intestine.

The sausage-like loops of the small intestine provide an entirely different type of habitat for your microbiota than the stomach. Acidity drops off rapidly and, in combination with all the nutrients, the abundance of bacteria shoots up to 10,000 times more than that in the stomach. But conditions still aren’t ideal for bacteria in the small intestine. It’s too much like a flooding river. And understandably so, considering that about seven quarts of bodily fluids, consisting of saliva, gastric and pancreatic juices, bile, and intestinal mucus flow through it every day. And that’s not including the two additional quarts of whatever other liquids you consume. The rushing swirl of fluids entrains food molecules and bacteria and carries them rapidly downstream. The constant motion means that nothing stays put for long, so bacteria can’t really settle in and contribute much to digestion.

By the middle to lower reaches of your small intestine, the fats, proteins, and some of the carbohydrates in the Fourth of July slurry are sufficiently broken down for absorption and pass into the bloodstream through the intestinal wall. Notice we said some of the carbohydrates. A good amount of them aren’t broken down at all. These complex carbohydrates, what your doctor calls fiber, have a completely different fate than simple carbohydrates.

They drop, undigested, into the slough-like environment of the colon. With a neutral pH of about 7, the colon is a paradise for bacteria compared to the acid vat of the stomach or the churning rapids of the small intestine, where the pH is slightly lower.

Deep within the safety of our inner sanctum, communities of microbial alchemists use our colon as a transformative cauldron in which to ferment the fiber-rich complex carbohydrates we can’t digest. But it takes the right microbes. For example, Bacteroides thetaiotaomicron makes over 260 enzymes that break apart complex carbohydrates. In contrast, the human genome codes for a paltry number. We can only make about 20 enzymes to break down complex carbohydrates.

by David R. Montgomery and Anne Biklé, Nautilus | Read more:
Image: Courtesy of the authors

Golf's Iconoclast Comes Clean

Next year, golf is returning to the Olympics for the first time in more than a century – and a Vandyke-bearded bipolar alcoholic who sometimes covers PGA tournaments while dressed like a pirate will be doing the play-by-play.

"I've never been sure about the whole drug-testing aspect of the Olympics," says David Feherty, 57, a former European Tour player from Northern Ireland whose training regimen once included weed, cocaine and a daily dose of 40 Vicodin and two and a half bottles of whiskey. "If they come up with a drug that helps you play golf better, I am going to be so pissed – I looked for that for years."

In the staid world of pro golf, Feherty is a smart, funny wild card whose cult celebrity is transcending the sport. He covers PGA tournaments while describing a player as having "a face like a warthog stung by a wasp" on live TV, does standup, writes bestselling novels and hosts a Golf Channel show where he gets guests like Bill Clinton and Larry David to open up about their games and lives. Feherty's secret? Sober since 2005, he's now got nothing to hide. "One of the advantages of having a fucked-up life is that other people are more comfortable telling you about theirs," he says. "I see from a different side of the street than most people."

Born on the outskirts of Belfast, Feherty turned pro at 18 and quickly embraced the European Tour's hard-living lifestyle. In 1986, after winning the Scottish Open in Glasgow, he went on a bender and awoke two days later on a putting green 150 miles away – alongside Led Zeppelin's road manager, with no recollection of getting there or what happened to his silver trophy. Once while playing in the Swedish Open, he went out for a drink and arose the next day in Denmark. "After that, I always kept $600 in my wallet," he says, "because that's exactly what it cost me to get back to the golf club just in time to miss my starting time."

After a middling pro career, he became a PGA Tour commentator in 1997, eventually moving to Dallas, raising a family, getting diagnosed with bipolar disorder and sobering up. An insomniac who still struggles with depression – "I get overwhelmed by sadness several times a day and spend a lot of time in tears" – Feherty has managed to achieve success by channeling his restlessness into his work. "I now take 14 pills a day – antidepressants, mood stabilizers and amphetamines," he says. "The Adderall is enough to tear most people off the ceiling, but I can take a nap."

For Feherty, 2016 will be a turning point. After 19 years working as a commentator for CBS, he'll move to NBC – a transition that allows him to take his talent beyond the fairways. In addition to the Olympics, he'll cover the international Ryder Cup and other tournaments while continuing to host his talk show – and is even looking to conquer new sports.

"Remember Fred Willard in Best in Show?" he asks. "If there's a place somewhere for a golf analyst where no technical knowledge is required, I would love to jump in – I just want to be challenged again."

As he prepares for the next chapter in his improbable career, Feherty spoke to Rolling Stone about partying like a rock star, cultivating his rumpled mystique and changing the face of golf.

A lot of musicians are also avid golfers – why do you think that is?

So many musicians play golf, especially people in rock & roll, but most of them use golf as an alternative to drugs and alcohol. I think for addicts, spare time is their worst enemy. And you know, golf takes up time – actually it's one of the problems with the game, but it works in our favor.

by Stayton Bonner, Rolling Stone |  Read more:
Image: Chris Condon/PGA/Getty

A Colorblind Constitution: What Abigail Fisher’s Affirmative Action Case Is Really About

[ed. From earlier this year - this case is actually being heard right now. See also: Supreme Court Justices’ Comments Don’t Bode Well for Affirmative Action]

The Supreme Court on Monday announced that it would again hear Fisher v. Texas, an affirmative action case in which a white woman claims she was denied admission to the University of Texas because of her race. In 2013, the Court ruled narrowly on the case, requiring the federal appeals court that had ruled against the woman, Abigail Fisher, to re-examine her arguments. Last year, the appeals court again decided against Fisher, affirming that race could be one of the factors considered in trying to diversify the student body at the university.

Months ago, Linda Greenhouse, the Supreme Court expert, asked of the Fisher case: “What will the court do? Let the latest Fifth Circuit opinion, with its endorsement of race-conscious admissions, stand unreviewed? Or plunge back into the culture wars with a case that sorely tested collegial relations among the justices two years ago and that promises to be at least as challenging a second time around?”

The court has now chosen its path. It will re-engage.

In 2013, ProPublica published what became one of the most provocative analyses of the Fisher case. It highlighted an overlooked, deeply ironic fact about the case: when one actually looked at Fisher’s arguments, she had not been denied admission because of her race, but rather because of her inadequate academic achievements. Read that analysis, originally published March 18, 2013, below.

Original story:

When the NAACP began challenging Jim Crow laws across the South, it knew that, in the battle for public opinion, the particular plaintiffs mattered as much as the facts of the case. The group meticulously selected the people who would elicit both sympathy and outrage, who were pristine in form and character. And they had to be ready to step forward at the exact moment when both public sentiment and the legal system might be swayed.

That's how Oliver Brown, a hard-working welder and assistant pastor in Topeka, Kan., became the lead plaintiff in the lawsuit that would obliterate the separate but equal doctrine. His daughter, whose third-grade innocence posed a searing rebuff to legal segregation, became its face.

Nearly 60 years after that Supreme Court victory, which changed the nation, conservatives freely admit they have stolen that page from the NAACP's legal playbook as they attempt to roll back many of the civil rights group's landmark triumphs.

In 23-year-old Abigail Noel Fisher they've put forward their version of the perfect plaintiff to challenge the use of race in college admissions decisions.

Publicly, Fisher and her supporters, chief among them the conservative activist who conceived of the case, have worked to make Fisher the symbol of racial victimization in modern America. As their narrative goes, she did everything right. She worked hard, received good grades, and rounded out her high school years with an array of extracurricular activities. But she was cheated, they say, her dream snatched away by a university that closed its doors to her because she had been born the wrong color: white.

The daughter of suburban Sugar Land, Texas, played the cello. Since the second grade, she said, she dreamed of carrying on the family tradition by joining her sister and father among the ranks of University of Texas at Austin alumni.

And the moment for her to lend her name to the lawsuit might never be riper: The Supreme Court has seated its most conservative bench since the 1930s. The Court is expected to issue a decision any week now in what is considered one of the most important civil rights cases in years.

On a YouTube video posted by Edward Blum, a 1973 University of Texas graduate whose nonprofit organization is bankrolling the lawsuit, she is soft-spoken, her strawberry blond hair tucked behind one ear. Not even a swipe of lip gloss adorns her girlish face.

"There were people in my class with lower grades who weren't in all the activities I was in, who were being accepted into UT, and the only other difference between us was the color of our skin," she says. "I was taught from the time I was a little girl that any kind of discrimination was wrong. And for an institution of higher learning to act this way makes no sense to me. What kind of example does it set for others?"

It's a deeply emotional argument delivered by an earnest young woman, one that's been quoted over and over again.

Except there's a problem. The claim that race cost Fisher her spot at the University of Texas isn't really true.

In the hundreds of pages of legal filings, Fisher's lawyers spend almost no time arguing that Fisher would have gotten into the university but for her race.

If you're confused, it is no doubt in part because of how Blum, Fisher and others have shaped the dialogue as the case worked its way to the country's top court.

Journalists and bloggers have written dozens of articles on the case, including profiles of Fisher and Blum. News networks have aired panel after panel about the future of affirmative action. Yet for all the front-page attention, angry debate and exchanges before the justices, some of the more fundamental elements of the case have been little reported.

Race probably had nothing to do with the University of Texas's decision to deny admission to Abigail Fisher.

by Nikole Hannah-Jones, ProPublica | Read more:
Image: Susan Walsh/AP

Wednesday, December 9, 2015

In Texting, Punctuation Conveys Different Emotions. Period.

[ed. See also: What’s Really Hot on Dating Sites? Proper Grammar.]

Technology is changing language, period

The use of a period in text messages conveys insincerity, annoyance and abruptness, according to a new study from the State University of New York at Binghamton. Omitting it better communicates the conversational tone of a text message, the study says.

As with any study by university researchers, though, it’s not that simple. The study found that some punctuation expresses sincerity. An exclamation point is viewed as the most sincere. (I overuse exclamation points!)

“It’s not simply that including punctuation implies a lack of sincerity,” said the study’s lead author, Celia Klin, an associate professor of psychology at Binghamton. “There’s something specific about the use of the period.”

by Christina Passariello, WSJ |  Read more:
Image: via:

The Man Who Would Make the World a Prettier Place

Matthew Moneypenny (his real name) is Hollywood handsome, with a dressed-down wardrobe of Saint Laurent and an easy patter somewhere between pitchman and showman.

He laughs loudly and has a favorite table at Sant Ambroeus in the West Village, as well as a preferred room at the Chateau Marmont in West Hollywood, Calif. He looks like what industry types call “talent,” but Mr. Moneypenny, 46, isn’t talent (though talented). In the congested little world of fashion image-making, he is talent’s agent, the bargaining power behind the throne.

Mr. Moneypenny’s job is to secure high-revenue deals for top-tier images and image-makers. In effect, he said with a practiced twinkle over cookies at a Sant Ambroeus corner table, “to make the world a prettier place.”

Mr. Moneypenny is the president and chief executive of Trunk Archive, a photography licensing agency, whose back-catalog images run in magazines and on product packaging, are loaded as smartphone backdrops and hang on hotel walls. In Trunk’s trunk are images by hundreds of photographers, including many of fashion’s marquee names: Annie Leibovitz, Bruce Weber, Arthur Elgort and Patrick Demarchelier.

Mr. Moneypenny has built Trunk into a digital, long-tail boutique of stylish imagery, making the photographers (and the company) significant amounts of money in the process. They are high-end images for high-end prices.

“If Corbis and Getty are Kmart and Walmart,” Mr. Moneypenny said, ticking off two of the larger stock-photo agencies, “we’re Bergdorf Goodman.”

Having made a success of reselling existing images, Mr. Moneypenny is now getting into the business of creating new ones.

Fueled by investment capital from Waddell & Reed, which has taken positions in companies like Richemont and LVMH, Mr. Moneypenny has spent the last two years quietly buying creative agencies. The result is Great Bowery, a group that will include Trunk Archive and 11 other assignment and licensing agencies under its umbrella.

Great Bowery is a mega-agency, one whose ambition is to rebalance the scales, empowering those who make fashion’s imagery — photographers, fashion stylists, hair and makeup artists and set designers — and checking, implicitly, the powerful and increasingly integrated luxury companies and media conglomerates that have traditionally commissioned their work.

He sees it as nothing less than the fashion analogue of the rise of the agency system in Hollywood, which unseated the film studios as the sole kingmakers and deal-brokers.

“If one can say three pillars were originally music, television and film, I would argue that fashion is now the fourth pillar,” Mr. Moneypenny said. “Twenty years ago, it was the socialite on the Upper East Side or the resident of Mayfair or Beverly Hills that was aware of what was coming down the Chanel or Dior runway. There’s so much more interest about the creativity that comes out of this world.” (...)

Most of the agencies now under the aegis of Great Bowery, like CLM, M.A.P, Management & Artists and Streeters, are unfamiliar to the public. So, too, are many of the artists they represent.

Their work is not. You have seen it in the glossy spreads of fashion magazines, the ad campaigns that precede them, the billboards and bus shelters luxury companies commandeer and the videos that run on their Instagram accounts and websites.

If they are not, on the whole, household names the way CAA clients like Julia Roberts, George Clooney and Madonna are, Mr. Moneypenny is betting that they can be. Their rates can already run Hollywood stratospheric (“It’s not unusual for a highly talented artist in our world to generate a seven-figure annual income,” he said), and their visibility is rising to match. Is Pat McGrath, the most in-demand of the runway makeup artists, say, primed to become a name-brand megastar?

“Some designers are big famous stars,” Mr. Baron said. “Also photographers are, and stylists are, and models are. Everybody’s kind of important. It’s become popular culture.”

by Matthew Schneier, NY Times |  Read more:
Image: Damon Winter/The New York Times

How The Big Short Hollywood-ized the Financial Collapse


The Big Short—based on Michael Lewis’s account of the 2008 financial crisis and directed by Will Ferrell’s writing partner, Adam McKay—is a ruthless takedown of Wall Street disguised as a snarky Hollywood romp. In the movie, a group of renegade brokers and traders bet against the housing market and make a killing while the rest of the finance world weeps. Fun! The way those renegades did it, however, presents a very big filmmaking problem: How do you dramatize a series of financial trades so convoluted, so abstruse, that even the people in on the deals didn’t always understand what was going on?

by Claire Suddath, Bloomberg | Read more:
Image: The Big Short

Sniff 'n' The Tears

Forget Sexy: Cutting-Edge Design Gives Taiwan's Giant Bicycles the Edge

Among professional riders and cycling magazine reviewers, the Propel, which retails in the U.S. for $2,200-$9,000 depending on the model, is more than a high-performance racing bicycle. It’s an engineering marvel.

It’s so light you can lift it with one hand. It’s so fast, promising to shave 12 to 36 seconds off race time over 40 kilometers (25 miles), that it was picked by German rider John Degenkolb for a final sprint in this year’s Tour de France.

The Propel, named Cycling Plus Magazine's "Bike of the Year" both this year and last, isn't the handiwork of prestige Italian or North American brands such as Cannondale, Colnago, Pinarello or Cervélo. It's made by Taiwan's Giant Manufacturing Co., the biggest bike manufacturer in the world, better known until recently as a contract manufacturer for Trek, Scott and other bikes—not for its high-end, carbon-fiber racing bicycles.

"I think Giant’s technical prowess and abilities are amongst the very best in the whole industry," said Warren Rossiter, senior technical editor for road for the London-based magazine group that publishes Cycling Plus and Bikeradar, and whose team tests more than 200 bikes a year. "Giant may lack the cachet of historic Italian or American innovators like Cannondale, but for those in the know, the Giant brand represents truly cutting-edge design and technology."

But while serious enthusiasts now recognize Giant's engineering and design chops, casual riders haven't always—some even spray-paint over Giant's logo on the frame. So to improve its image overseas, Giant is planning an expansion in the U.S., from the 125 bike shops now offering Giant bikes as at least half of their inventory to 155 by the end of next year, adding to the almost 1,000 stores that carry Giant bikes in lesser proportions with other models.

Still, Giant wants consumers to know that its selling strategy is based on quality, not flash. "Tony" Lo Hsiang-an, chief executive officer for three decades, said he realizes Giant is "not as sexy as some of the brands." In a 90-minute interview in a bike workroom at headquarters in Taiwan's west coast city of Taichung, he said the brand's image is improving because innovation speaks for itself.

"Strategy wise, we have no intention to become just a very fancy brand," he said. "Our root is still technology and quality. Everything we do, we must have very good reasons why we do that. I think some brands, they are more marketing, more talk, but I believe ours should be real."

by Sheridan Prasso and Cindy Wang, Bloomberg | Read more:
Image: Maurice Tsui

Nizo Yamamoto
The Girl Who Leapt Through Time
via:

Tuesday, December 8, 2015


Ayano Imai
, "The Town Mouse & The Country Mouse"
via:

What If?

What if Adolf Hitler’s paintings had been acclaimed, rather than met with faint praise, and he had gone into art instead of politics? Have you ever wondered whether John F Kennedy would have such a shining reputation if he had survived his assassination and been elected to a second term? Or how the United States might have fared under Japanese occupation? Or what the world would be like if nobody had invented the airplane?

If you enjoy speculating about history in these counterfactual terms, there are many books and movies to satisfy you. The counterfactual is a friend to science-fiction writers and chatting partygoers alike. Yet ‘What if?’ is not a mode of discussion you’ll commonly hear in a university history seminar. At some point in my own graduate-school career, I became well-acculturated to the idea that counterfactualism was (as the British historian E P Thompson wrote in 1978) ‘Geschichtwissenschlopff, unhistorical shit.’

‘“What if?” is a waste of time’ went the headline to the Cambridge historian Richard Evans’ piece in The Guardian last year. Surveying the many instances of public counterfactual discourse in the anniversary commemorations of the First World War, Evans wrote: ‘This kind of fantasising is now all the rage, and threatens to overwhelm our perceptions of what really happened in the past, pushing aside our attempts to explain it in favour of a futile and misguided attempt to decide whether the decisions taken in August 1914 were right or wrong.’ It’s hard enough to do the reading and research required to understand the complexity of actual events, Evans argues. Let’s stay away from alternative universes.

But hold on a minute. In October 2015, when asked if, given the chance, he would kill the infant Hitler, the US presidential candidate Jeb Bush retorted with an enthusiastic: ‘Hell yeah, I would!’ Laughter was a first response: what a ridiculous question! And didn’t Bush sound a lot like his brash ‘Mission Accomplished’ brother George W just then? When The New York Times Magazine had asked its readers to make the same choice, only 42 per cent responded with an equally unequivocal ‘Yes’. And as The Atlantic’s thoughtful piece on the question by Matt Ford illustrated, in order to truly answer this apparently silly hypothetical, you have to define your own beliefs about the nature of progress, the inherent contingency of events, and the influence of individuals – even very charismatic ones – on the flow of historical change. These are big, important questions. If well-done counterfactuals can help us think them through, shouldn’t we allow what-ifs some space at the history table?

One reason professional historians disdain counterfactuals is that they swing so free from the evidence. The work of academic historical writing depends on the marshalling of primary and secondary sources, and the historian is judged on her interpretations of the evidence that’s available. Did she try hard enough to find the kind of evidence that would answer her questions? Does she extrapolate too much meaning from a scanty partial archive? Does she misunderstand the meaning of the evidence, in historical context? Or should she have taken another related group of sources into account? For the professional historian, these sources are not incidental to interpreting history; they are the lifeblood of doing so. In a counterfactual speculation, the usual standards for the use of evidence are upended, and the writer can find herself far afield from the record – a distance that leaves too much room for fancy and interpretation, making a supposedly historical argument sound more and more like fiction.

What is worse, counterfactual speculations spring naturally from deeply conservative assumptions about what makes history tick. Like bestselling popular histories, counterfactuals usually take as their subjects war, biography or an old-school history of technology that emphasises the importance of the inventor. (This is part of why Evans termed counterfactualism ‘a form of intellectual atavism’.) Popular counterfactuals dwell on the outcomes of military conflicts (the Civil War and the Second World War are disproportionately popular), or ponder what would have happened if a leader with the fame of Hitler had (or, in some cases, hadn’t) been assassinated. These kinds of counterfactual speculations assign an overwhelming importance to political and military leaders – a focus that seems regressive to many historians who consider historical events as the result of complicated social and cultural processes, not the choices of a small group of ‘important’ people.

The ‘wars and great men’ approach to history not only appears intellectually bankrupt to many historians, it also excludes all those whose voices from the past historians have laboured to recover in recent decades. Women – as individuals, or as a group – almost never appear, and social, cultural, and environmental history are likewise absent. Evans, for his part, thinks this is because complex cultural topics are not easy to understand through the simplifying lens of the ‘what if’. He uses that resistance as evidence against the validity of the practice itself: ‘You seldom find counterfactuals about topics such as the transition from the classical sensibility to the Romantic at the end of the 18th century, or the emergence of modern industry, or the French revolution, because they’re just too obviously complicated to be susceptible of simplistic “what-if” speculation.’

Despite all these criticisms, a few historians have recently been making persuasive arguments that counterfactualism can be good – for readers, for students, and for writers. Historical speculation, they say, can be a healthy exercise for historians looking to think hard about their own motives and methods. Counterfactuals, if done well, can force a super-meticulous look at the way historians use evidence. And counterfactuals can encourage readers to think about the contingent nature of history – an exercise that can help build empathy and diminish feelings of national, cultural, and racial exceptionalism. Was the US always destined (as its 19th-century ideologues believed) to occupy the middle swath of the North American continent, from sea to shining sea? Or is its national geography the result of a series of decisions and compromises – some of which, if reversed, could have led to a different outcome? The latter view leaves more space for analysis, more chance to examine how power worked during expansion; it’s also the realm of counterfactuals.

One of the fundamental premises of the new pro-counterfactualists is this: just as there are good and bad ways to write standard histories, so too there are good and bad ways to put together a counterfactual. The historian Gavriel Rosenfeld at Fairfield University in Connecticut is working on an edited collection of Jewish alternative histories, and maintains a blog called the Counterfactual History Review, where he aggregates and analyses examples of counterfactualism in public discourse, many of which relate to the Nazi period: Amazon’s recent adaptation of Philip K Dick’s novel The Man in the High Castle (1962); the US presidential candidate Ben Carson’s argument that the Holocaust could have been prevented if Jewish people were better armed; and, yes, the ‘Killing Baby Hitler’ kerfuffle. Rosenfeld argues that a counterfactual’s point of departure from the actual timeline has to be plausible; in other words, it’s much more productive, analytically speaking, to speculate about a situation that was likely to come about, than one that is completely improbable. He also cites a ‘minimal rewrite rule’ that asks the speculator to think about only one major point of divergence, and not to assume two or more big changes in an alternative timeline.

The historian Timothy Burke at Swarthmore College in Pennsylvania teaches a seminar on the topic, and wrote on his blog about a class project in which he gave groups of students counterfactual scenarios ('Mary Wollstonecraft does not die after the birth of her daughter but in fact lives into old age'; 'Native American societies have robust resistance to Old World diseases at the time of contact with Europeans in the 15th century') and asked them to game out the scenario in stages. The experience shows students how to use both direct and contextual evidence from our own timeline to support counterfactual assertions. A good counterfactual scenario must be generated with attention to what's actually known – about the setting, the time, or the people involved. The closer the counterfactual can hew to actual historical possibility, the more plausible it can be judged to be. The end result should be a counterfactual that is relatively close to the given historical record, and offers a new way to think about the period under discussion. Looked at this way, the exercise of constructing a counterfactual has real pedagogical value. In order to do it well, students must figure out what factors matter in writing history, argue for the importance of the factors they've chosen to discuss, and deploy the most helpful existing evidence. It's a tall order, and pretty far from idle speculation.

by Rebecca Onion, Aeon |  Read more:
Image: Crowds cheer Hitler's Austrian election campaign, April 1938. Photo by LIFE/Getty

Stanley Clarke Band Feat. Hiromi


Yoko, Neko and Mom
via: