Tuesday, December 15, 2015


Daniel Egneus
via:

Marion Fayolle
via:

Your Face Is Covered in Mites, and They're Full of Secrets

When you look in the mirror, you’re not just looking at you—you’re looking at a whole mess of face mites. Yeah, you’ve got ‘em. Guaranteed. The little arachnids have a fondness for your skin, shoving their tubular bodies down your hair follicles, feeding on things like oil or skin cells or even bacteria. The good news is, they don’t do you any harm. The better news is, they’ve got fascinating secrets to tell about your ancestry.

New research out today in the Proceedings of the National Academy of Sciences reveals four distinct lineages of the face mite Demodex folliculorum that correspond to different regions of the world. African faces have genetically distinct African mites, Asian faces have Asian mites, and so too do Europeans and Latin Americans have their own varieties. Even if your family moved to a different continent long ago, your forebears passed down their brand of mites to their children, who themselves passed them on down the line.

Looking even farther back, the research also hints at how face mites hitchhiked on early humans out of Africa, evolving along with them into lineages specialized for certain groups of people around the planet. It seems we’ve had face mites for a long, long while, passing them back and forth between our family members and love-ahs with a kiss—and a little bit of face-to-face skin contact.

Leading the research was entomologist Michelle Trautwein of the California Academy of Sciences, who with her colleagues scraped people’s faces—hey, there are worse ways to make a living—then analyzed the DNA of all the mites they’d gathered. “We found four major lineages,” says Trautwein, “and the first three lineages were restricted to people of African, Asian, and Latin American ancestry.”

The fourth lineage, the European variety, is a bit different. It’s not restricted—it shows up in the three other groups of peoples. But Europeans tend to have only European mites, not picking up the mites of African, Asian, or Latin American folks. (It should be noted that the study didn’t delve into the face mites of all the world’s peoples. The researchers didn’t test populations like Aboriginal Australians, for instance, so there may be still more lineages beyond the four.)

So what’s going on here? Well, ever since Homo sapiens radiated out of Africa, those four groups of people have evolved in their isolation in obvious ways, like developing darker or lighter skin color. But more subtly, all manner of microorganisms have evolved right alongside humans. And with different skin types come different environments for tiny critters like mites.

by Matt Simon, Wired |  Read more:
Image: USDA

Monday, December 14, 2015

Why Are There So Many Mattress Stores?

Dear Cecil:

How do mattress stores manage to stay in business? They're all over the place, but the average adult buys a mattress only once every five to ten years. With high overhead and infrequent purchases, how do they stick around? (This question was inspired by a friend, Bethany.)

— Not Bethany


Cecil replies:

I see your query, NB, and raise you. To my mind, it’s not just about how these stores manage to stay in business: the question is, moreover, how are there so goddamn many of them — particularly right now? Where I live, in Chicago, entire blocks are all but overrun with the places, which frankly don’t do much for a street’s aesthetics. In June a Texas Monthly article described the worrisome proliferation of mattress stores in Houston, where the venerably groovy Montrose neighborhood has become known as “the Mattrose” on account of all the new sleep shops. An April headline in the Northwest Indiana Times asked, apropos the town of Schererville, “Why the heck are so many mattress stores opening?” So: you and I aren’t the only ones wondering. What gives?

One thing that jars about this state of affairs is that, in the age of Amazon, there’s something very old-economy about mattress stores, beyond their relentlessly cheesy look. No one goes to bookstores to buy books anymore, right? Well, not exactly. A 2014 report by the consulting firm A.T. Kearney found that despite the digital hype, overall a full 90 percent of retail transactions still take place in physical stores. And according to an investor presentation by industry giant Mattress Firm, dedicated mattress stores account for 46 percent of total mattress sales, handily beating out furniture stores (35 percent) and department stores (5 percent) for the largest share of the market.

So mattress delivery by drone is still a ways off. But again, these stores aren’t just surviving, they’re flourishing — that market share has more than doubled in the last 20 years. Why open a mattress store when there’s another just down the street? Turns out the economics make perfect sense:

Running a mattress store doesn’t cost much.

by Cecil Adams, The Straight Dope | Read more:
Image: via:

How to build a better PhD

“Since 1977, we've been recommending that graduate departments partake in birth control, but no one has been listening,” said Paula Stephan to more than 200 postdocs and PhD students at a symposium in Boston, Massachusetts, in October this year.

Stephan is a renowned labour economist at Georgia State University in Atlanta who has spent much of her career trying to understand the relationships between economics and science, particularly biomedical science. And the symposium, 'Future of Research', discussed the issue to which Stephan finds so many people deaf: the academic research system is generating progeny at a startling rate. “In biomedicine,” said Stephan, “we are definitely producing many more PhDs than there is demand for them in research positions.”

The numbers show newly minted PhDs flooding out of the academic pipeline. In 2003, 21,343 science graduate students in the United States received a doctorate. By 2013, this had increased by almost 41% — and the life sciences showed the greatest growth. That trend is mirrored elsewhere. According to a 2014 report looking at the 34 countries that make up the Organisation for Economic Co-operation and Development, the proportion of people who leave tertiary education with a doctorate has doubled from 0.8% to 1.6% over the past 17 years.

Not all of these students want to pursue academic careers — but many do, and they find it tough because there has been no equivalent growth in secure academic positions. The growing gap between the numbers of PhD graduates and available jobs has attracted particular attention in the United States, where students increasingly end up stuck in lengthy, insecure postdoctoral research positions. Although the unemployment rate for people with science doctorates is relatively low, in 2013 some 42% of US life-sciences PhD students graduated without a job commitment of any kind, up from 28% a decade earlier. “But still students continue to enrol in PhD programmes,” Stephan wrote in her 2012 book How Economics Shapes Science. “Why? Why, given such bleak job prospects, do people continue to come to graduate school?”

One reason is that there is little institutional incentive to turn them away. Faculty members rely on cheap PhD students and postdocs because they are trying to get the most science out of stretched grants. Universities, in turn, know that PhD students help faculty members to produce the world-class research on which their reputations rest. “The biomedical research system is structured around a large workforce of graduate students and postdocs,” says Michael Teitelbaum, a labour economist at Harvard Law School in Cambridge, Massachusetts. “Many find it awkward to talk about change.”

But there are signs that the issue is becoming less taboo. In September, a group of high-profile US scientists (Harold Varmus, Marc Kirschner, Shirley Tilghman and Bruce Alberts, colloquially known as 'the Quartet') launched Rescuing Biomedical Research, a website where scientists can make recommendations on how to 'fix' different aspects of the broken biomedical research system in the United States — the PhD among them. “How can we improve graduate education so as to produce a more effective scientific workforce, while also reducing the ever-expanding PhD workforce in search of biomedical research careers?” the site asks.

Nature put a similar question to 33 PhD students, scientists, postdocs and labour economists and uncovered a range of opinions on how to build a better PhD system, from small adjustments to major overhauls. All agreed on one thing: change is urgent. “Academia really is going to have to be dragged kicking and screaming into the twenty-first century,” says Gary McDowell, a postdoctoral fellow at Tufts University in Medford, Massachusetts, and a leader of the group behind the Future of Research symposium. The renovation needs to happen now, says Jon Lorsch, director of the US National Institute of General Medical Sciences in Bethesda, Maryland. “We need to transform graduate education within five years. It's imperative. There's a lot at stake for scientists, and hence for science.”

by Julie Gould, Nature |  Read more:
Image: Oliver Munday

[ed. Take a moment and imagine a million-watt Trump marquee over the front of the White House.]
via:

Sunday, December 13, 2015

What Happens When Computers Learn to Read Books?

In Kurt Vonnegut's classic novel Cat's Cradle, the character Claire Minton has the most fantastic ability: simply by reading the index of a book, she can deduce almost every biographical detail about the author. From scanning a sample of text in the index, she is able to figure out with near certainty that a main character in the book is gay (and therefore unlikely to marry his girlfriend). Claire Minton knows this because she is a professional indexer of books.

And that's what computers are today -- professional indexers of books.

Give a computer a piece of text from the 1950s, and based on the frequency of just fifteen words, the machine will be able to tell you whether the author is white or black. That's the claim from two researchers at the University of Chicago, Hoyt Long and Richard So, who deploy complicated algorithms to examine huge bodies of text. They feed the machine thousands of scanned novels' worth of data, which it analyzes for patterns in the language -- frequency, presence, absence and combinations of words -- and then they test big questions about literary style.
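
[ed. For the curious, here is a rough sketch of what this kind of word-frequency classification looks like in code. The toy corpus, the placeholder word list, and the choice of scikit-learn's logistic regression below are illustrative assumptions -- the article doesn't say which fifteen words or which algorithm the researchers actually used.]

# Toy illustration: represent each text by counts of a few common words,
# then fit a simple linear classifier on those counts.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Stand-in corpus: in the real study, each item would be a full scanned novel.
texts = [
    "the river ran past the old house and the fields beyond",
    "a quiet morning in the city, the streets still empty",
    "she walked to the market and bought bread and fish",
    "the train rattled north through the dark and the rain",
]
labels = ["group_a", "group_a", "group_b", "group_b"]  # hypothetical author groups

# Count occurrences of a small, fixed vocabulary in each text.
vocabulary = ["the", "and", "a", "in", "to"]  # placeholders, not the study's words
vectorizer = CountVectorizer(vocabulary=vocabulary)
X = vectorizer.fit_transform(texts)

# Fit the classifier and report accuracy on the training texts themselves
# (a real analysis would hold out data or cross-validate).
clf = LogisticRegression(max_iter=1000)
clf.fit(X, labels)
print("training accuracy:", clf.score(X, labels))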

"The machine can always -- with greater than a 95 percent accuracy -- separate white and black writers," So says. "That's how different their language is."

This is just an example. The group is digging deeper on other questions of race in literature but isn't ready to share the findings yet. Minority writers represent only a tiny fraction of American literature's canonical texts. The researchers hope that by shining a spotlight on unreviewed, unpublished or forgotten authors -- now easier to identify with digital tools -- or by simply approaching popular texts with different examination techniques, they can shake up conventional views on American literature. The tools are far from perfect, but scholars across the digital humanities are increasingly training big computers on big collections of text to answer and pose new questions about the past.

"We really need to consider rewriting American literary history when we look at things at scale," So says.

Who Made Who


A culture's corpus of celebrated literature functions like its Facebook profile. Mob rule curates what to teach future generations and does so with certain biases. It's not an entirely nefarious scheme. According to Dr. So, people can only process about 200 books. We can only compare a few at a time. So all analysis is reductive. The novel changed our relationship with complicated concepts like superiority or how we relate to the environment. Yet we needed to describe -- and communicate -- those huge shifts with mere words.

In machine learning, algorithms process reams of data on a particular topic or question. This eventually allows a computer to recognize certain patterns, whether that means spotting tumors, cycles in the weather or a quirk of the stock market. Over the last decade this has given rise to the digital humanities, where professors with large corpuses of text -- or any data, really -- use computers to develop hard metrics for areas that might previously have been seen as more abstract. (...)

Mark Algee-Hewitt's group in Stanford's English department used machines to examine paragraph structure in 19th century literature. We all know that in most literature, when the writer moves to a new paragraph, the topic of the paragraph will change. That's English 101.

But Algee-Hewitt says they also found something that surprised them: whether a paragraph had a single topic or several was not governed by the paragraph's length. One might think that a long paragraph would cover lots of ground. That wasn't the case. Topic variance within a paragraph has more to do with the story's genre and setting than with its length.

Now they are looking for a pattern by narrative type.

"The truth is that we really don't know that much about the American novel because there's so much of it, so much was produced," says So. "We're finding that with these tools, we can do more scientific verification of these hypotheses. And frankly we often find that they're incorrect."

The Blind Men and The Elephant

But a computer can't read. In a human sense. Words create sentences, paragraphs, settings, characters, feelings, dreams, empathy and all the intangible bits in between. A computer simply detects, counts and follows the instructions provided by humans. No machine on earth understands Toni Morrison's Beloved.

At the same time, no human can examine, in any way, 10,000 books at a time. We're in this funny place where people assess the fundamental unit of literature (the story) while a computer assesses all the units in totality. The disparity -- that gap -- between what a human can understand and what a machine can understand is one of the root disagreements, among others, in academia when it comes to methodology around deploying computers to ask big questions about history.

Does a computer end up analyzing literature, itself or those who coded the question?

by Caleb Garling, Priceonomics | Read more:
Image: uncredited

[ed. Nice day to stay inside and watch a little football. (Go Seahawks!)]
via:

Saturday, December 12, 2015

Lexington Lab Band



[ed. Is this the world's greatest cover band or what? I like tondr's guitar lessons on YouTube.]

Jack Barnosky, third penance
via:

Adapting to Climate Change

Yesterday, Thomas Schelling gave a seminar on climate change here at the Center for Study of Public Choice. Schelling’s main argument was that lots of resources are going into predicting and understanding climate change but very little thought or resources are going into planning for adaptation.

If Washington, DC, Boston and Manhattan are to remain dry, for example, we are almost certainly going to need flood control efforts on the level of the Netherlands. It takes twenty years just to come up with a plan and figure out how to pay for these kinds of projects, let alone to actually implement them, so it’s not too early to begin planning for adaptation even if we don’t expect to need these adaptations for another forty or fifty years. So far, however, nothing is being done. Climate deniers think planning for adaptation is a waste and many climate change proponents think planning for adaptation is giving up.

Schelling mentioned a few bold ideas. We could protect every city on the Mediterranean from Marseilles to Alexandria to Tel Aviv, or we could dam the Strait of Gibraltar. Damming the strait would be the world’s largest construction project–by far–yet by letting the Mediterranean evaporate somewhat it could also generate enough hydro-electric power to replace perhaps all of the fossil fuel stations in Europe and Africa.

Schelling didn’t mention it but in the 1920s the German engineer Herman Sörgel proposed such a project, calling it Atlantropa (more here). In addition to power, damming the strait would open up a huge swath of valuable land. Gene Roddenberry and Philip K. Dick were fans but needless to say the idea never got very far. A cost-benefit analysis, however, might show that despite the difficulty, damming the strait would be cheaper than trying to save Mediterranean cities one by one. But, as Schelling argued, no one is thinking seriously about these issues.

I argued that capital depreciates, so even many of our buildings, the longest-lived capital, will need to be replaced anyway. Here, for example, is a map showing the age of every building in New York City. A large fraction, though by no means all, are less than one hundred years old. If we let the areas most under threat slowly deteriorate, the cost of moving inland won’t be as high as one might imagine–at least if the water rises slowly (not guaranteed!). Schelling agreed that this was the case for private structures but he doubted that we would be willing to let the White House go.

by Alex Tabarrok, Marginal Revolution | Read more:
Image: via:

Setting the Default

I recently did couples therapy with two gay men who’d gotten married a year or so ago. Since then one of them, let’s call him Adam, decided he was bored with his sex life and went to a club where they did some things I will not describe here. His husband, let’s call him Steve, was upset by what he considered infidelity, and they had a big fight. Both of them wanted to stay together for the sake of the kids (did I mention they adopted some kids?) but this club thing was a pretty big deal, so they decided to seek professional help.

Adam made the following proposal: he knew Steve was not very kinky, so Adam would go do his kinky stuff at the club, with Steve’s knowledge and consent. That way everyone could get what they wanted. Sure, it would involve having sex with other people, but it didn’t mean anything, and it was selfish for a spouse to assert some kind of right to “control” the other spouse anyway.

Steve made the following counterproposal: no. He liked monogamy and fidelity and it would make him really jealous and angry to think of Adam going out and having sex with other people, even in a meaningless way. He argued that if Adam didn’t like monogamy, maybe he shouldn’t have proposed entering into a form of life that has been pretty much defined by its insistence on monogamy for the past several thousand years and then sworn adherence to that form of life in front of everyone they knew. If Adam hadn’t liked monogamy, he had ample opportunity to avoid it before he had bound his life together with Steve’s. Now he was stuck.

Adam gave the following counterargument: yeah, marriage usually implies remaining monogamous, but that was all legal boilerplate. He had wanted to get married to symbolize his commitment to Steve – commitment that he still had! – and he hadn’t realized he was interested in fetish stuff at the time or else he would have brought it up.

Steve gave the following countercounterargument: okay, this is all very sad, but now we are stuck in this position, and clearly only one of the two people could get their preference satisfied, and given the whole marriage-implies-monogamy thing, it seemed pretty clear that that person should be him.

So then of course they both turned to me for advice.

by Scott Alexander, Slate Star Codex |  Read more:
Image:  via:

The German War: A Nation Under Arms, 1939-45

[ed. I just finished reading Anthony Doerr's All the Light We Cannot See, a novel with a similar theme - the average French or German citizen's reaction to, and ultimately participation in, the Second World War. It made me think again about the issue of free will vs. determinism and how a person's moral perspective and/or character could be subsumed (or elevated) by the momentum of larger forces - forces that determine one's fate long before they are felt.]

Most Germans did not want war in 1939. When it came, following Hitler’s invasion of Poland, there was no euphoria and flag-waving, as there had been in 1914, but dejection; the people were downcast, one diarist noted. The mood soon lifted, as the Third Reich overran its neighbours, but most Germans still hoped for a quick conclusion. As Nicholas Stargardt points out in his outstanding history of Germany during the second world war, the Nazi regime was most popular “when it promised peace, prosperity and easy victories”. And yet, German troops continued to fight an ever more protracted battle, with ever more brutality, while the home front held tight. Even when it was clear that all was lost, there was no collapse or uprising, as in 1918. Why?

There are two easy answers. After the war, many Germans claimed to have been cowed by an omnipotent terror apparatus. More recently, some historians have argued the opposite: the Nazi regime was buoyed by fervent support, with ordinary Germans backing Hitler to the end. Stargardt dismisses both answers convincingly. Domestic terror alone, though ever-present, did not ensure the war’s continuation. Neither did popular enthusiasm for nazism. Of course there was significant support for Hitler’s regime, at least as long as the campaign went well. “God couldn’t have sent us a better war,” one soldier wrote to his wife in summer 1940, as the Wehrmacht routed France. But opinion was fickle, fluctuating with the fortunes of war.

Grumbling about rationing and shortages began within weeks, and never ceased, even as the regime alleviated hardships at home through the ruthless exploitation of occupied Europe (midway through the war, almost 30% of Germany’s meat came from abroad). There was plenty of resentment, too, about the privileges of the Nazi elite, which gorged itself on delicacies as ordinary Germans chewed “cutlets” made from cabbage. As a popular joke had it: “When will the war end?” “When Göring fits into Goebbels’s trousers”. Resentment of the regime grew as allied bombs rained on Germany, displacing millions and killing more than 400,000. German civilians criticised their leaders for the porous air defences, and they also turned on each other. Evacuees from the cities complained about the “simple and stupid” peasants who hosted them, while the farmers accused the new arrivals of laziness and loose morals. Back in the urban centres, locals were relieved when they were spared because a different German city was hit instead. The supposedly unified Nazi “national community” was just a fiction.

Despite this lack of national cohesion and the growing war fatigue, Germans kept fighting. Most important, Stargardt suggests, were their feelings of “patriotic defiance”, arising less from fanatical nazism than familial bonds. They had to win the war at any cost, soldiers believed, to protect their loved ones and to make Germany impregnable. “Your father is away,” one soldier lectured his teenage son in 1942, “and is helping to prepare a better future for you, so that you don’t have to do it later yourselves.” Even Germans appalled by the genocidal war waged in their name rallied around their country. Their determination was fuelled by Nazi propaganda, which insisted that this was a defensive war, provoked by Germany’s enemies, and warned that defeat would mean the annihilation of the fatherland. This campaign, based on “strength through fear” (as a British commentator quipped), hit home. As another soldier wrote to his wife just weeks before the final surrender: “If we go to the dogs, then everything goes to the dogs.”

Propaganda and popular opinion are just two key themes in Stargardt’s sweeping history, which takes in almost everything, from battles to religion and entertainment. And although the focus is on wartime Germany, we also see the suffering the war brought to the rest of Europe: pulverised cities, ravaged countryside, countless victims. Crucially, the death and destruction wrought by the German conquerors was not hidden from the population back home. Germans knew that the regime relied on pillage and plunder, bolstering the war effort with raw materials and slave labour from across Europe. And they knew that huge numbers of Jews were murdered in the east.

Historians have long debunked the postwar myth of German ignorance about the Holocaust, and Stargardt presents further evidence that the genocide was an open secret. News spread via German soldiers and officials who witnessed massacres, or participated in them. “The Jews are being completely exterminated,” a policeman wrote in August 1941 to his wife in Bremen. Nazi propaganda also dropped heavy hints, creating a sense of societal complicity: in autumn 1941, for instance, the Nazi party displayed posters across the country, emblazoned with Hitler’s threat that a world war would lead to the “destruction of the Jewish race in Europe”. Ordinary Germans watched the deportations of their Jewish neighbours and purchased their abandoned property at bargain prices. Later on, the authorities distributed the belongings of Jews among bombed-out Germans, though this triggered new complaints about Nazi bigwigs grabbing the best bits and “laying their Aryan arses in the Jewish beds after they have exterminated the Jews”, as one employee in a Bavarian factory exclaimed. There was some popular unease about the genocide, and it came into the open during the intense allied bombing, in a rather twisted manner: many ordinary Germans bought into the Nazi propaganda picture of Jews pulling the strings in Britain and the USA, and understood the air raids as payback for the antisemitic pogroms and mass murders. In this way, writes Stargardt, the Germans “mixed anxieties about their culpability with a sense of their own victimhood”.

by Nikolaus Wachsmann, The Guardian | Read more:
Image: Popperfoto/Getty Images

Thursday, December 10, 2015

What Your Microbiome Wants for Dinner

Let’s admit it. Few of us like to think, much less talk, about our colons. But you might be surprised at the importance of what gets into your colon and what goes on inside it. This little-loved part of our bodies is actually less an onboard garbage can and more like the unlikeliest medicine chest.

There is abundant medical evidence that diet greatly influences health, and new science is showing us why this is so. It is also showing us that advocates of trendy paleo and vegan diets are missing the big picture of how our omnivorous digestive system works.

Your colon is the home for much of your microbiome—the community of microbial life that lives on and in you. In a nutshell, for better and worse, what you eat feeds your microbiome. And what they make from what you eat can help keep you healthy or foster chronic disease.

To gain an appreciation of the human colon and the role of microbes in the digestive tract as a whole, it helps to follow the metabolic fate of a meal. But, first, a word about terms. We’ll refer to the digestive tract as the stomach, small intestine, and colon. While the colon is indeed called the “large intestine,” this is a misnomer of sorts. It is no more a large version of the small intestine than a snake is a large earthworm.

The stomach might better be called a dissolver, the small intestine an absorber, and the colon a transformer. These distinct functions help explain why microbial communities of the stomach, small intestine, and colon are as different from one another as a river and a forest. Just as physical conditions like temperature, moisture, and sun strongly influence the plant and animal communities that one sees on a hike from a mountain peak to the valley below, the same holds true along the length of the digestive tract.

Imagine you are at a Fourth of July barbecue. You saunter over to the grill to take a look at the fare. The pork ribs look great so you spear a few and add a heap of homemade sauerkraut on the side. You grab a handful of corn chips and a few pieces of celery. The vegetable skewers look good too, so you add one to the pile on your plate. And what would the Fourth of July be without macaroni salad and pie?

You lift a rib to your mouth and start gnawing. A forkful of sauerkraut mingles well with the meat and you crunch your way through another mouthful. The macaroni squishes between your teeth, but the celery takes some chewing. It all slips down the hatch and lands in the acid vat of your stomach where gastric acids start dissolving the bits of food. On the pH scale, where 7 is neutral and lower values are more acidic, the stomach is impressive. Its acidity ranges from 1 to 3. Lemon juice and white vinegar are about a 2.

After the stomach acids work over your meal, the resultant slurry drops into the top of the small intestine. Right away bile from the liver shoots in and starts working over the fats, breaking them down. Pancreatic juices also squirt into the small intestine to join the digestive party. Your Fourth of July feast is now on its way to full deconstruction into the basic types of molecules—simple and complex carbohydrates (sugars), fats, and proteins. In general, there is an inverse relationship between the size and complexity of these molecules and their fate in the digestive tract. Smaller molecules, primarily the simple sugars that compose the refined carbohydrates in the macaroni, pie crust, and chips are absorbed relatively quickly. Larger or more complex molecules take longer to break down and are absorbed in the lower reaches of the small intestine.

The sausage-like loops of the small intestine provide an entirely different type of habitat for your microbiota than the stomach. Acidity drops off rapidly and, in combination with all the nutrients, the abundance of bacteria shoots up to 10,000 times more than that in the stomach. But conditions still aren’t ideal for bacteria in the small intestine. It’s too much like a flooding river. And understandably so, considering that about seven quarts of bodily fluids, consisting of saliva, gastric and pancreatic juices, bile, and intestinal mucus, flow through it every day. And that’s not including the two additional quarts of whatever other liquids you consume. The rushing swirl of fluids entrains food molecules and bacteria and carries them rapidly downstream. The constant motion means that nothing stays put for long, so bacteria can’t really settle in and contribute much to digestion.

By the middle to lower reaches of your small intestine, the fats, proteins, and some of the carbohydrates in the Fourth of July slurry are sufficiently broken down for absorption and pass into the bloodstream through the intestinal wall. Notice we said some of the carbohydrates. A good amount of them aren’t broken down at all. These complex carbohydrates, what your doctor calls fiber, have a completely different fate than simple carbohydrates.

They drop, undigested, into the slough-like environment of the colon. With a neutral pH of about 7, the colon is a paradise for bacteria compared to the acid vat of the stomach or the churning rapids of the small intestine, where the pH is slightly lower.

Deep within the safety of our inner sanctum, communities of microbial alchemists use our colon as a transformative cauldron in which to ferment the fiber-rich complex carbohydrates we can’t digest. But it takes the right microbes. For example, Bacteroides thetaiotaomicron makes over 260 enzymes that break apart complex carbohydrates. In contrast, the human genome codes for a paltry number. We can only make about 20 enzymes to break down complex carbohydrates.

by David R. Montgomery and Anne Biklé, Nautilus | Read more:
Image: Courtesy of the authors

Golf's Iconoclast Comes Clean

Next year, golf is returning to the Olympics for the first time in more than a century – and a Vandyke-bearded bipolar alcoholic who sometimes covers PGA tournaments while dressed like a pirate will be doing the play-by-play.

"I've never been sure about the whole drug-testing aspect of the Olympics," says David Feherty, 57, a former European Tour player from Northern Ireland whose training regimen once included weed, cocaine and a daily dose of 40 Vicodin and two and a half bottles of whiskey. "If they come up with a drug that helps you play golf better, I am going to be so pissed – I looked for that for years."

In the staid world of pro golf, Feherty is a smart, funny wild card whose cult celebrity is transcending the sport. He covers PGA tournaments while describing a player as having "a face like a warthog stung by a wasp" on live TV, does standup, writes bestselling novels and hosts a Golf Channel show where he gets guests like Bill Clinton and Larry David to open up about their games and lives. Feherty's secret? Sober since 2005, he's now got nothing to hide. "One of the advantages of having a fucked-up life is that other people are more comfortable telling you about theirs," he says. "I see from a different side of the street than most people."

Born on the outskirts of Belfast, Feherty turned pro at 18 and quickly embraced the European Tour's hard-living lifestyle. In 1986, after winning the Scottish Open in Glasgow, he went on a bender and awoke two days later on a putting green 150 miles away – alongside Led Zeppelin's road manager, with no recollection of getting there or what happened to his silver trophy. Once while playing in the Swedish Open, he went out for a drink and arose the next day in Denmark. "After that, I always kept $600 in my wallet," he says, "because that's exactly what it cost me to get back to the golf club just in time to miss my starting time."

After a middling pro career, he became a PGA Tour commentator in 1997, eventually moving to Dallas, raising a family, getting diagnosed with bipolar disorder and sobering up. An insomniac who still struggles with depression – "I get overwhelmed by sadness several times a day and spend a lot of time in tears" – Feherty has managed to achieve success by channeling his restlessness into his work. "I now take 14 pills a day – antidepressants, mood stabilizers and amphetamines," he says. "The Adderall is enough to tear most people off the ceiling, but I can take a nap."

For Feherty, 2016 will be a turning point. After 19 years working as a commentator for CBS, he'll move to NBC – a transition that allows him to take his talent beyond the fairways. In addition to the Olympics, he'll cover the international Ryder Cup and other tournaments while continuing to host his talk show – and is even looking to conquer new sports.

"Remember Fred Willard in Best in Show?" he asks. "If there's a place somewhere for a golf analyst where no technical knowledge is required, I would love to jump in – I just want to be challenged again."

As he prepares for the next chapter in his improbable career, Feherty spoke to Rolling Stone about partying like a rock star, cultivating his rumpled mystique and changing the face of golf.

A lot of musicians are also avid golfers – why do you think that is?

So many musicians play golf, especially people in rock & roll, but most of them use golf as an alternative to drugs and alcohol. I think for addicts, spare time is their worst enemy. And you know, golf takes up time – actually it's one of the problems with the game, but it works in our favor.

by Stayton Bonner, Rolling Stone |  Read more:
Image: Chris Condon/PGA/Getty

A Colorblind Constitution: What Abigail Fisher’s Affirmative Action Case Is Really About

[ed. From earlier this year - this case is actually being heard right now. See also: Supreme Court Justices’ Comments Don’t Bode Well for Affirmative Action]

The Supreme Court on Monday announced that it would again hear Fisher v. Texas, an affirmative action case in which a white woman claims she was denied admission to the University of Texas because of her race. In 2013, the Court ruled narrowly on the case, requiring the federal appeals court that had ruled against the woman, Abigail Fisher, to re-examine her arguments. Last year, the appeals court again decided against Fisher, affirming that race could be one of the factors considered in trying to diversify the student body at the university.

Months ago, Linda Greenhouse, the Supreme Court expert, asked of the Fisher case: “What will the court do? Let the latest Fifth Circuit opinion, with its endorsement of race-conscious admissions, stand unreviewed? Or plunge back into the culture wars with a case that sorely tested collegial relations among the justices two years ago and that promises to be at least as challenging a second time around?”

The court has now chosen its path. It will re-engage.

In 2013, ProPublica published what became one of the most provocative analyses of the Fisher case. It highlighted an overlooked, deeply ironic fact: when one actually looked at Fisher's arguments, she had not been denied admission because of her race, but rather because of her inadequate academic achievements. Read that analysis, originally published March 18, 2013, below.

Original story:

When the NAACP began challenging Jim Crow laws across the South, it knew that, in the battle for public opinion, the particular plaintiffs mattered as much as the facts of the case. The group meticulously selected the people who would elicit both sympathy and outrage, who were pristine in form and character. And they had to be ready to step forward at the exact moment when both public sentiment and the legal system might be swayed.

That's how Oliver Brown, a hard-working welder and assistant pastor in Topeka, Kan., became the lead plaintiff in the lawsuit that would obliterate the separate but equal doctrine. His daughter, whose third-grade innocence posed a searing rebuff to legal segregation, became its face.

Nearly 60 years after that Supreme Court victory, which changed the nation, conservatives freely admit they have stolen that page from the NAACP's legal playbook as they attempt to roll back many of the civil rights group's landmark triumphs.

In 23-year-old Abigail Noel Fisher they've put forward their version of the perfect plaintiff to challenge the use of race in college admissions decisions.

Publicly, Fisher and her supporters, chief among them the conservative activist who conceived of the case, have worked to make Fisher the symbol of racial victimization in modern America. As their narrative goes, she did everything right. She worked hard, received good grades, and rounded out her high school years with an array of extracurricular activities. But she was cheated, they say, her dream snatched away by a university that closed its doors to her because she had been born the wrong color: White.

The daughter of suburban Sugar Land, Texas, played the cello. Since the second grade, she said, she dreamed of carrying on the family tradition by joining her sister and father among the ranks of University of Texas at Austin alumni.

And the moment for her to lend her name to the lawsuit might never be riper: The Supreme Court has seated its most conservative bench since the 1930s. The Court is expected to issue a decision any week now in what is considered one of the most important civil rights cases in years.

On a YouTube video posted by Edward Blum, a 1973 University of Texas graduate whose nonprofit organization is bankrolling the lawsuit, she is soft-spoken, her strawberry blond hair tucked behind one ear. Not even a swipe of lip gloss adorns her girlish face.

"There were people in my class with lower grades who weren't in all the activities I was in, who were being accepted into UT, and the only other difference between us was the color of our skin," she says. "I was taught from the time I was a little girl that any kind of discrimination was wrong. And for an institution of higher learning to act this way makes no sense to me. What kind of example does it set for others?"

It's a deeply emotional argument delivered by an earnest young woman, one that's been quoted over and over again.

Except there's a problem. The claim that race cost Fisher her spot at the University of Texas isn't really true.

In the hundreds of pages of legal filings, Fisher's lawyers spend almost no time arguing that Fisher would have gotten into the university but for her race.

If you're confused, it is no doubt in part because of how Blum, Fisher and others have shaped the dialogue as the case worked its way to the country's top court.

Journalists and bloggers have written dozens of articles on the case, including profiles of Fisher and Blum. News networks have aired panel after panel about the future of affirmative action. Yet for all the front-page attention, angry debate and exchanges before the justices, some of the more fundamental elements of the case have been little reported.

Race probably had nothing to do with the University of Texas's decision to deny admission to Abigail Fisher.

by Nikole Hannah-Jones, ProPublica | Read more:
Image: Susan Walsh/AP

Wednesday, December 9, 2015

In Texting, Punctuation Conveys Different Emotions. Period.

[ed. See also: What’s Really Hot on Dating Sites? Proper Grammar.]

Technology is changing language, period

The use of a period in text messages conveys insincerity, annoyance and abruptness, according to a new study from the State University of New York at Binghamton. Omitting the period better communicates the conversational tone of a text message, the study says.

As with any study by university researchers, though, it’s not that simple. The study found that some punctuation expresses sincerity. An exclamation point is viewed as the most sincere. (I overuse exclamation points!)

“It’s not simply that including punctuation implies a lack of sincerity,” said the study’s lead author, Celia Klin, an associate professor of psychology at Binghamton. “There’s something specific about the use of the period.”

by Christina Passariello, WSJ |  Read more:
Image: via: