Monday, June 6, 2016
The Art of Pivoting
It’s June 2016 and I’m packing my bags to move back to Germany after 12 years of academic research at the University of Cambridge and surrounding institutes, like the famous MRC Laboratory of Molecular Biology, forge of Nobel Prizes and home to eminent scientists like Watson & Crick, Sanger, Perutz, the ones you know from Jeopardy or biochemistry textbooks. I had come from a Max-Planck-Institute in Germany, where I had previously completed a life science PhD in slightly under three years. When I started my degree there in 2001, I had been the fastest student to fulfil the requirements for the Diplom in biology at my home university — and already had two peer-reviewed publications in my pocket. You may see the trajectory: success, efficiency, coming from good places, going to good places; the basic ingredients for a successful academic career.
My wife and I had moved to Cambridge in 2004 to each do a brief postdoc abroad. Spice up the CV a bit, meet interesting people before settling down with a normal job back in the home country, that sort of stuff. The work I did was advanced, using technology that few people in Europe outside Cambridge had access to at the time, but it was not revolutionary. However, combining experimental molecular biology with computational analysis of large biological datasets had just seen its first great successes, and with my coding skills I was a man in demand. Publications are the number one currency for climbing the academic ladder and, by 2007, I had accumulated enough credit, both in terms of scientific output and reputation in the field, that I seriously considered an academic career for life.
Here, it may need to be explained to everyone who hasn’t spent time in academia why seriously considered is the appropriate phrase. It was a conscious decision for the long game. It’s the Tour de France or Iron Man of a career. You have to believe that you can do it and secure a position against all odds and fierce competition. You have to be in it to win it. Chances are that you’re not going to make it, a fear that’s constantly present, but there’s normally no-one you know whom you could ask what life on the other side looks like, because failed academics (an arrogant view I myself held for a long time about those not making it) tend to disappear, ashamed and silent. Or they take normal, inglorious jobs. According to my wife, who left academia when our second child was on the way, you’ve got to be “stupid enough to commit to that”, given that academic salaries are poor compared even to entry-level industry positions, the workload is bigger, quite similar to that of running a start-up, and the so-called academic freedom is these days reduced to framing your interests into what funding bodies consider worth supporting.
Speaking of start-ups: 90% of them fail. That’s a slightly better success rate than getting into the game that allows you to fight for a permanent academic position in the first place. In my cohort of Royal Society University Research Fellows, the success rate of obtaining a salary whilst building up a team was about 3%. What happens to the others who want to do science in academia? I’m sure many would not mind staying postdoctoral scientists forever and pursuing research in support of some other principal investigator (PI, a research group leader), but the system doesn’t cater well for that career track. Up is the only way. If you can’t make it to the group leader level, chances are that sooner or later you’re running out of funding. That’s because at the postgraduate level, especially after the financial crisis, there is a rather limited amount of money in the system that allows employment which resembles a regular job. Ambition, ego or an almost unreasonable love for the subject is the key driver for everyone else. Money is dished out competitively, and of course it’s considered an honour to bring your own salary to work unsocial hours for a rising star or established hot-shot. This sees many PhD-level researchers leave academia sooner or later.
This isn’t necessarily a bad thing. It’s just not what many of them had envisaged when they started their journey in university because they were hoping to do independent research in an academic setting. (...)
It’s a common joke that academics have a problem with time management because of their inability to say no. Everyone higher up the food chain tells young investigators to say no. No to teaching. No to committees. No to administrative duties. “Concentrate on your science, because that’s what you’re going to be assessed on.” At the same time, it’s very clear that if the choice is between two candidates, the better departmental citizen is more likely to be successful. In fact, my good citizenship was explicitly spelled out in my Head of Department’s recommendation letter to the Royal Society, even as he pointed out to me that I might want to consider taking on fewer activities.
The rules of departmental citizenship are not written down anywhere. It’s just what you hear between the lines: comments about the poor performer who failed to submit his part for a communal bid, or the raised eyebrow about some lazy bastard who refused to teach. Unless the system actively discourages anyone with the ambition to secure a permanent post from taking on additional responsibilities, unestablished PIs are going to pour themselves into research, teaching, administration, outreach, you name it — at 110% of what’s healthy.
Add three little kids into that mix, and it may become clear why over time I’ve acquired a collection of meds vast enough to run a burn-out clinic. (...)
Five years into my Fellowship, I felt more and more like a chased rabbit. Work was not about science anymore, work had become that abstract thing you need to do in order to secure a post. Also, with all the activities I agreed to do and to participate in, the time I actually spent doing my own hands-on research had become marginal. While my research group was at its peak and, from the outside, I looked like a very successful scientist, my job and my attitude towards it had completely changed. I began to hate my job.
Running a prolific computational biology research team at the University of Cambridge, I imagined it would be easy to switch into a management role in pharmaceutical R&D. I sent a few applications and had a few telephone conversations, but it very soon emerged that I did not have the relevant qualifications (that is, I had no business experience) to successfully run a group in industry. My wife explained to me that I had long passed the point of no return: just as you have to earn your stripes in academia to be trusted with directing research, you have to have industrial project experience and considerable domain knowledge about drug development to be trusted with an R&D team. My most realistic chance would be a more technical role, at least to start with.
Swallowing my pride, I applied for Senior Scientist positions, or, as I thought of it, I applied to become a compute monkey for someone with a lot less academic credibility. However, while next-generation sequencing, gene expression analysis, pathway reconstruction and pipeline development were all happening in my own research group, I was clearly not the one who knew the nitty-gritty of their implementation anymore. The interviews were humiliating. “What’s your favourite Bioconductor package for RNA-seq?” — “Uh, I’d have to ask my PhD student for that.” “How do you force the precise calculation of p-values in kruskal.test?” — “I’d google it!”. Needless to say, I didn’t get a single offer.
by Boris Adryan, Medium | Read more:
Image: uncredited
How Americans Came to Die in the Middle East
The writing of this historical synopsis began yesterday, Memorial Day. It is an attempt by this former artillery officer, with a father buried in a veterans cemetery, to understand why brave Americans were sent to their deaths in the Middle East and are still dying there.
The hope is that we finally can learn from history and not keep repeating the same mistakes.
It’s important to stick to the facts, since the history of the Middle East already has been grossly distorted by partisan finger-pointing and by denial and cognitive dissonance among the politicians, foreign policy experts (in their own minds), and media blowhards and literati on the left and right, who now claim that they had nothing to do with grievous policy mistakes that they had once endorsed.
The key question, as in all history, is where to begin the history lesson.
We could go all the way back to religious myths, especially the ones about Moses and the Ten Commandments and about Mohammed and his flying horse. Or on a related note, we could go back to the schism that took place between Shia and Sunni Muslims in the seventh century. Such history is relevant, because American soldiers have been foolishly inserted in the middle of the competing myths and irreconcilable schism, but without the inserters acknowledging the religious minefields and steering clear of them.
We also could go back to the First World War and the defeat of the Ottoman Empire, when France and Britain carved up the Middle East into unnatural client states, when Arabs were given false promises of self-determination, when American geologists masqueraded as archeologists as they surreptitiously surveyed for oil, and when the United States joined Saudi Arabia at the hip through the joint oil venture of Aramco.
Another starting point could be 1948, when the United States, under the lead of President Truman, supported the formal establishment of the Jewish State of Israel, thus reversing the longstanding opposition to Zionism by many (most?) American and European Jews and non-Jews. One can endlessly debate the plusses and minuses of our alliance with Israel, as well as the morality of Israel’s violent founding and the violent Palestinian resistance. But it’s undeniable that the alliance has led many Muslims to put a target on Uncle Sam’s back.
Still another starting point could be the 1953 coup d’état against the democratically elected Iranian Prime Minister Mohammad Mosaddegh, orchestrated by the CIA in conjunction with the Brits. The coup was triggered when Mosaddegh demanded an audit of the books of the Anglo-Iranian Oil Company, a British company known today as BP. He threatened nationalization when the British refused to allow the audit. He was replaced by the Shah of Iran, who was seen by many Iranians and Arabs as a puppet of the United States. (Ironically, during the Second World War, Great Britain and the Soviet Union had occupied Iran and deposed an earlier shah.)
It’s considered unpatriotic to ask how my fellow Americans would feel if the tables had been turned and Iranians had deposed an American president and replaced him with their lackey. Therefore, I won’t ask.
It also would be unpatriotic to ask how we’d feel if Iranians had shot down one of our passenger jets, as we had shot down one of theirs in 1988 as it was crossing the Persian Gulf to Dubai from Tehran. Again, I’m not asking.
Anyway, let’s return to the Shah. Starting with President Nixon and continuing with President Carter, the USA sold weapons to the Shah worth billions of dollars. There was even an agreement to sell nuclear reactors to him. Those weapons would later be used by Iran against the U.S. in the Persian Gulf after we had sided with Saddam Hussein in his war against Iran.
At a state dinner in Tehran on December 31, 1977, the Shah toasted President Carter. Carter responded effusively, saying that Iran was “an island of stability in one of the more troubled areas of the world.” He went on to say: “This is a great tribute to you, Your Majesty, and to your leadership and to the respect and the admiration and love which your people give to you.”
Actually, most Iranians hated the Shah. Just over a year later, on January 16, 1979, the unpopular Shah fled into exile after losing control of the country to Shiite cleric Ayatollah Ruhollah Khomeini and his Iranian Revolution.
Then in October of that year, Carter allowed the Shah to come to the USA for medical treatment. Responding with rage, Iranian students stormed the U.S. embassy in Tehran and took embassy personnel hostage, in a hostage drama that would last 444 days, including a failed attempt to rescue the hostages that left dead American soldiers and burnt helicopters in Iran. The drama ended on the day that Carter left office.
But none of the above events is where our history of American lives lost in the Middle East should begin. It should begin in the summer of 1979, with a report written by a low-level Defense Department official by the name of Paul Wolfowitz. His “Limited Contingency Study” assessed the political, geopolitical, sectarian, ethnic, and military situation in the Middle East and recommended a more active American involvement in the region, including possible military intervention to blunt the Soviet Union’s influence, protect our access to oil, and thwart the ambitions of Iraq under its dictator, Saddam Hussein.
Wolfowitz would later become a deputy to Defense Secretary Donald Rumsfeld under the presidency of George W. Bush.
Note that Wolfowitz’s paper was written long before 9/11 and long before the toppling of Saddam Hussein in the Second Gulf War after he was accused of having weapons of mass destruction.
Until the Wolfowitz report, the USA had taken a rather passive and indirect role in the Middle East, placing it secondary to other geopolitical matters and using proxies and intelligence “spooks” to protect its interests in the region. Of course this low-level interference in the affairs of other nations was not seen as low level by the targets of the actions. To use common vernacular, it pissed them off, just as it would have pissed us off if the roles had been reversed. But again, it’s unpatriotic to consider the feelings of others, especially if they are seen as the enemy, or backwards, or religious zealots.
Strategic and tactical thinking began to change with the Wolfowitz paper. Plans started to be developed for military action to replace more benign approaches. Eventually, the plans indeed resulted in military actions, ranging from full-scale war to bombing from the air to drone warfare, in such places as Lebanon, Afghanistan, Iraq, Kuwait, Libya, Syria, Yemen, Pakistan, and Somalia (the locale of “Blackhawk Down”), with side actions outside of the Middle East in Bosnia and Kosovo.
In each case the American military performed admirably and often exceptionally; the same cannot be said of Defense Department analysts, of Congress and the White House, of the press on the left and right, or of the public at large—most of whom got caught up in the passions of the moment, didn’t understand the cultures they were dealing with, and didn’t think through the unintended consequences of military actions in lands where Western concepts of justice, fairness, equality, tolerance, pluralism, religious freedom, diversity, and multiculturalism were as foreign and out of place as an American tourist wearing flip-flops and shorts in a mosque.
by Craig Cantoni, Mish Talk | Read more:
Image: via:
A New Theory Explains How Consciousness Evolved
Ever since Charles Darwin published On the Origin of Species in 1859, evolution has been the grand unifying theory of biology. Yet one of our most important biological traits, consciousness, is rarely studied in the context of evolution. Theories of consciousness come from religion, from philosophy, from cognitive science, but not so much from evolutionary biology. Maybe that’s why so few theories have been able to tackle basic questions such as: What is the adaptive value of consciousness? When did it evolve and what animals have it?
The Attention Schema Theory (AST), developed over the past five years, may be able to answer those questions. The theory suggests that consciousness arises as a solution to one of the most fundamental problems facing any nervous system: Too much information constantly flows in to be fully processed. The brain evolved increasingly sophisticated mechanisms for deeply processing a few select signals at the expense of others, and in the AST, consciousness is the ultimate result of that evolutionary sequence. If the theory is right—and that has yet to be determined—then consciousness evolved gradually over the past half billion years and is present in a range of vertebrate species.
Even before the evolution of a central brain, nervous systems took advantage of a simple computing trick: competition. Neurons act like candidates in an election, each one shouting and trying to suppress its fellows. At any moment only a few neurons win that intense competition, their signals rising up above the noise and impacting the animal’s behavior. This process is called selective signal enhancement, and without it, a nervous system can do almost nothing.
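[ed. A toy illustration, not from the article: selective signal enhancement is essentially a winner-take-all computation, and a few lines of code (with made-up numbers and an assumed inhibition rule) show how mutual suppression lets the strongest signal rise above the noise.]

import numpy as np

# Toy sketch of selective signal enhancement as lateral inhibition: each signal
# is suppressed in proportion to the combined strength of its competitors, so
# after enough rounds only the strongest input stays nonzero. The inhibition
# rule, the rates and the inputs are illustrative assumptions only.
def compete(signals, inhibition=0.05, steps=60):
    s = np.asarray(signals, dtype=float)
    for _ in range(steps):
        rivals = s.sum() - s                             # combined strength of each signal's competitors
        s = np.clip(s - inhibition * rivals, 0.0, None)  # suppress, but never below zero
    return s

inputs = [0.90, 0.85, 0.30, 0.20, 0.10]  # five signals "shouting" at once
print(compete(inputs))                   # the strongest signal survives; the rest are driven to zero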
by Michael Graziano, The Atlantic | Read more:
Image: Chris Helgren / Reuters
Saturday, June 4, 2016
We Have No Idea What Aging Looks Like
My friend Deborah from college loves to tell this story: One of the first times we hung out, we started talking about her solo travels to Burma and assorted other spots in Southeast Asia. I was 19 years old, and like most 19-year-olds, nearly all my friends were people I met through school in some fashion, meaning that virtually all my friends were people within a two-year age range of myself (four years max, though given the dynamics of high school and even collegiate hierarchies, anything more than two years was a stretch). But as she was regaling me with her thrilling tales, I realized she couldn’t have traveled so extensively if she were my age, and it dawned on me that I was talking to someone Older.
I’d heard you weren’t supposed to ask people how old they were—what if they were Old?!—but I couldn’t help myself. I asked her how old she was, and she told me, and, according to her, I gasped, fluttered my hand to my chest, and said, “But you look so good!”
Deborah was 26.
I turn 40 this week, and this story, which was embarrassing to me the first time she told it—she had the good sense to wait to relay it to me until I was in my 30s and therefore old enough to appreciate it—has now become hilarious. It’s hilarious that I thought 26 was shockingly old, and that I thought 26 would be old enough to show signs of aging in a way that would be detrimental to one’s conventional beauty. (In fact, it seems that would be anything over 31, if we’re going by sheer numbers here—and while I’m tempted to call bullshit on that, given that people may be more satisfied with their looks the older they get, I also know that age 31 was probably when I looked objectively my best.)
We still don’t really know what aging looks like. Certainly younger people don’t, and everyone reading this is younger than someone. I used to be vaguely flattered when younger people would express surprise when I’d mention my age, until I recalled my own response to Deborah’s ancient 26. It wasn’t that I knew what 26 looked like and that she looked younger than that; it was that I had no idea what looking 26 might actually entail, just that it was older than what I’d been led to believe was the height of my own attractiveness, and that therefore the fact that she looked great at 26 meant she was an outlier and therefore warranted a cry of “But you look so good!” When a younger person tells me I “don’t look 40”—or, my favorite, that I’m “well preserved” (!), I accept it with grace but always wonder if they’ll later recall that moment with their own embarrassment. Because I do look 40, and I’m not particularly “preserved.” They just have no idea what 40 looks like, and it’s not their fault. Until it was within eyeshot, I didn’t know myself.
What we consider older (or younger) is always in relation to ourselves. Older was once my 26-year-old friend; now that my circle of friends has loosened beyond the age constrictions of school and I have friends in their 50s, even people in their 60s don’t seem so old to me. My parents, once hopelessly old to me, I now see as—I can’t say young, but when I wanted to talk about Mad Men with them, my mother said they were saving television for “deep retirement.” Meaning not the retirement they’re in now—my father retired from paid work nearly 10 years ago, and my mother retired from homemaking as well, a feminist arrangement I adore—but a later form of retirement, when they’re too frail to travel extensively as they’re doing now. That is: When they’re Old.
There’s a particular sort of human-interest news piece that takes a person over 70 who is doing something—anything, really—and treats the fact that they are not sewn into a La-Z-Boy as a small miracle. We are supposed to find this inspiring, and I suppose it is. But it is not unique. The fact that younger folk still regard active elderly people as outliers says little about them, and everything about us. We expect old people to curl up and—well, die, I suppose (though our society is still so scared shitless of death that we spend 28 percent of our Medicare dollars in the last six months of life). So when they don’t, we’re surprised, even though we shouldn’t be. There are indeed old people who spend their days mostly watching television and complaining about their aches, but there are young people who do that too. My grandmother, who turns 90 next month, teaches line dancing lessons at her retirement home. I’m proud of her. She is not an outlier.
This idea that old people—whatever each of us considers to be old—are outliers for not fitting into what we expect of them goes double for beauty. That makes a sort of sense, given that the hallmarks of beauty are so closely associated with youth, so when a woman of a certain age still has some of those hallmarks, it is remarkable. Except: It’s not, not really, given that so much of the attention we do give to famous older women has less to do with their beauty and more with their grooming. Take the case of Helen Mirren, whom the media has long crowned as the sexy senior (which started happening 15 years ago, incidentally, back when she was the same age Julia Louis-Dreyfus is now). She’s a lovely woman, and exceptionally accomplished, but the attention paid to her sex appeal after age 50 has largely been about her refusal to style herself in a matronly fashion. (I don’t know enough about celebrity fashion to say for sure, but I’m guessing that she ushered in today’s era, when celebrities over 50 aren’t afraid to show some skin, and look great in it.) When I walk through this city, I see a lot of older women who groom themselves just as beautifully, and I’m not just talking about the Iris Apfels of the world. I’m talking my gym buddy Lynn, whose loose bun and oh-so-slightly-off-the-shoulder tees echo her life as a dancer; I’m talking my neighbor Dorothy, whose loose movie-star curls fall in her face when she talks; I’m talking real women you know, who take care of themselves, and who may or may not have the bone structure of Carmen Dell’Orefice but who look pretty damn good anyway. Part of the joke of Amy Schumer’s sublime “Last Fuckable Day” sketch was the fact that all of the women in it were perfectly good-looking. We know that women don’t shrivel up and die after 50, but we’re still not sure how to truly acknowledge it, so we continue to rely on outdated conversations about aging. I mean, the opening slide of that Amy Schumer sketch is: “Uncensored: Hide Your Mom.”
There’s a paradox built into acknowledging older women’s beauty: By calling attention to both their appearance and their age, we continue to treat older women who continue an otherwise unremarkable level of grooming as exceptions. That’s not to say that we shouldn’t do so; Advanced Style, for example, is near-radical in its presentation of older women, and I’d hate for it to become just…Style. And I absolutely don’t want to say that we should start sweeping older women back under the male gaze; escaping that level of scrutiny is one of the benefits of growing older. I’m also aware of the folly of using the way we talk about celebrities as a stand-in for how we talk about age more generally—the only people whose ages we collectively examine are famous people, whose ages only come up for discussion in regard to looks if we’re all like A) Wow, that person doesn’t look that old (Cicely Tyson, 91), or B) Wow, that person looks way older than that (Ted Cruz, 45). Nobody is like, Wow, Frances McDormand is 58? And she looks it too! Still, celebrities are a useful comparison point for how our notions of age are changing, even if the ways we talk about it aren’t. Anne Bancroft was 36 when she was cast as Mrs. Robinson. A selection of women who are 36 today: Zooey Deschanel, Laura Prepon, Mindy Kaling, Rosamund Pike, Claire Danes. Kim Kardashian turns 36 in October. Can you imagine any of these people being cast as a scandalously older woman today?
by Autumn Whitefield-Madrano, New Inquiry | Read more:
Image: uncredited
Muhammad Ali (January 1942 - June 2016)
[ed. The Greatest. See also: The Outsized Life of Muhammad Ali.]
Friday, June 3, 2016
Bots are awesome! Humans? Not so much.
[ed. wtf?]
In the past few days my personal resume bot has exchanged over 24,000 messages via Facebook Messenger and SMS. It’s chatted with folks from every industry and has introduced me to people at Facebook, Microsoft, and Google — plus a half dozen small, compelling teams.
What I learned about humans and AI while sifting through those conversations is fascinating and also a little disturbing.
I’ve distilled that data into useful nuggets you should consider before jumping on the bot bandwagon.
The Backstory of #EstherBot
Earlier this week I built and launched EstherBot, a personal resume bot that can tell you about my career, interests, and values. It shot to the #2 spot on Product Hunt and my Medium post about why and how I built it spread like wildfire – racking up over 1k recommends. (Get instructions for building your own free bot here.)
EstherBot speaks to the current zeitgeist. The era of messaging has arrived along with a botpocalypse, but few people have seen examples that go beyond the personal assistant, travel butler, or shopping concierge. To some, those feel like solutions for the 1% rather than the 99%.
EstherBot is relatable and understandable. The idea is simple — the resume hasn’t really changed that much in the digital age. While you’re producing all this information about yourself in the way that you use social media, your resume doesn’t actively seek out opportunities that you might be interested in. Your resume doesn’t constantly learn and get better by observing you. Instead, you have to do all this manual work, just like you used to. Why?
There’s a ton of data that could be used to connect you to better opportunities. Data including hobbies, values, location preferences, multimedia samples of your work. On and on. A resume simply can’t hold all of that, but a bot can.
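[ed. For illustration only: the article does not show EstherBot’s code, so the sketch below is a hypothetical keyword-matching resume bot with placeholder profile fields, just to make concrete the idea that a bot can hold structured data a paper resume cannot and answer questions against it.]

# Hypothetical sketch of a keyword-matching "resume bot"; the profile entries
# are placeholders (not Esther Crawford's actual data) and the matching rule
# is an assumption, not EstherBot's implementation.
PROFILE = {
    "career":    "Placeholder: roles, companies and dates would go here.",
    "hobbies":   "Placeholder: hobbies and side projects.",
    "values":    "Placeholder: what matters to you in a team and a job.",
    "location":  "Placeholder: cities or remote-work preferences.",
    "portfolio": "Placeholder: links to multimedia work samples.",
}

def reply(message: str) -> str:
    """Return the first profile entry whose topic word appears in the message."""
    text = message.lower()
    for topic, answer in PROFILE.items():
        if topic in text:
            return answer
    return "Try asking about: " + ", ".join(PROFILE)

print(reply("What are your values?"))      # matches the "values" entry
print(reply("Any location preferences?"))  # matches the "location" entry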
by Esther Crawford, Chatbots Magazine | Read more:
Image: uncredited
Fraying at the Edges
It began with what she saw in the bathroom mirror. On a dull morning, Geri Taylor padded into the shiny bathroom of her Manhattan apartment. She casually checked her reflection in the mirror, doing her daily inventory. Immediately, she stiffened with fright.
Huh? What?
She didn’t recognize herself.
She gazed saucer-eyed at her image, thinking: Oh, is this what I look like? No, that’s not me. Who’s that in my mirror?
This was in late 2012. She was 69, in her early months getting familiar with retirement. For some time she had experienced the sensation of clouds coming over her, mantling thought. There had been a few hiccups at her job. She had been a nurse who climbed the rungs to health care executive. Once, she was leading a staff meeting when she had no idea what she was talking about, her mind like a stalled engine that wouldn’t turn over.
“Fortunately I was the boss and I just said, ‘Enough of that; Sally, tell me what you’re up to,’” she would say of the episode.
Certain mundane tasks stumped her. She told her husband, Jim Taylor, that the blind in the bedroom was broken. He showed her she was pulling the wrong cord. Kept happening. Finally, nothing else working, he scribbled on the adjacent wall which cord was which.
Then there was the day she got off the subway at 14th Street and Seventh Avenue unable to figure out why she was there.
So, yes, she had had inklings that something was going wrong with her mind. She held tight to these thoughts. She even hid her suspicions from Mr. Taylor, who chalked up her thinning memory to the infirmities of age. “I thought she was getting like me,” he said. “I had been forgetful for 10 years.”
But to not recognize her own face! To Ms. Taylor, this was the “drop-dead moment” when she had to accept a terrible truth. She wasn’t just seeing the twitches of aging but the early fumes of the disease.
She had no further issues with mirrors, but there was no ignoring that something important had happened. She confided her fears to her husband and made an appointment with a neurologist. “Before then I thought I could fake it,” she would explain. “This convinced me I had to come clean.”
In November 2012, she saw the neurologist who was treating her migraines. He listened to her symptoms, took blood, gave her the Mini Mental State Examination, a standard cognitive test made up of a set of unremarkable questions and commands. (For instance, she was asked to count backward from 100 in intervals of seven; she had to say the phrase: “No ifs, ands or buts”; she was told to pick up a piece of paper, fold it in half and place it on the floor beside her.)
He told her three common words, said he was going to ask her them in a little bit. He emphasized this by pointing a finger at his head — remember those words. That simple. Yet when he called for them, she knew only one: beach. In her mind, she would go on to associate it with the doctor, thinking of him as Dr. Beach.
He gave a diagnosis of mild cognitive impairment, a common precursor to Alzheimer’s disease. The first label put on what she had. Even then, she understood it was the footfall of what would come. Alzheimer’s had struck her father, a paternal aunt and a cousin. She long suspected it would eventually find her.
Every 67 seconds, with monotonous cruelty, Alzheimer’s takes up residence in another American. Degenerative and incurable, it is democratic in its reach. People live with it about eight to 10 years on average, though some people last for 20 years. More than five million Americans are believed to have it, two-thirds of them women, and now Ms. Taylor would join them.
The disease, with its thundering implications, moves in worsening stages to its ungraspable end. That is the familiar face of Alzheimer’s, the withered person with the scrambled mind marooned in a nursing home, memories sealed away, aspirations for the future discontinued. But there is also the beginning, the waiting period.
That was Geri Taylor. Waiting.
Right now, she remained energized, in control of her life, the silent attack on her brain not yet in full force. But what about next week? Next month? Next year? The disease would be there then. And the year after. And forever. It has no easy parts. It nicks away at you, its progress messy and unpredictable.
“The beginning is like purgatory,” she said one day. “It’s kind of a grace period. You’re waiting for something. Something you don’t want to come. It’s like a before-hell purgatory.”
by N.R. Kleinfield, NY Times | Read more:
Image: Michael Kirby Smith
13, Right Now
She slides into the car, and even before she buckles her seat belt, her phone is alight in her hands. A 13-year-old girl after a day of eighth grade.
She says hello. Her au pair asks, “Ready to go?”
She doesn’t respond, her thumb on Instagram. A Barbara Walters meme is on the screen. She scrolls, and another meme appears. Then another meme, and she closes the app. She opens BuzzFeed. There’s a story about Florida Gov. Rick Scott, which she scrolls past to get to a story about Janet Jackson, then “28 Things You’ll Understand If You’re Both British and American.” She closes it. She opens Instagram. She opens the NBA app. She shuts the screen off. She turns it back on. She opens Spotify. Opens Fitbit. She has 7,427 steps. Opens Instagram again. Opens Snapchat. She watches a sparkly rainbow flow from her friend’s mouth. She watches a YouTube star make pouty faces at the camera. She watches a tutorial on nail art. She feels the bump of the driveway and looks up. They’re home. Twelve minutes have passed.
Katherine Pommerening’s iPhone is the place where all of her friends are always hanging out. So it’s the place where she is, too. She’s on it after it rings to wake her up in the mornings. She’s on it at school, when she can sneak it. She’s on it while her 8-year-old sister, Lila, is building crafts out of beads. She sets it down to play basketball, to skateboard, to watch PG-13 comedies and sometimes to eat dinner, but when she picks it back up, she might have 64 unread messages.
Now she’s on it in the living room of her big house in McLean, Va., while she explains what it’s like to be a 13-year-old today.
“Over 100 likes is good, for me. And comments. You just comment to make a joke or tag someone.”
The best thing is the little notification box, which means someone liked, tagged or followed her on Instagram. She has 604 followers. There are only 25 photos on her page because she deletes most of what she posts. The ones that don’t get enough likes, don’t have good enough lighting or don’t show the coolest moments in her life must be deleted.
“I decide the pictures that look good,” she says. “Ones with my friends, ones that are a really nice-looking picture.”
Somewhere, maybe at this very moment, neurologists are trying to figure out what all this screen time is doing to the still-forming brains of people Katherine’s age, members of what’s known as Generation Z. Educators are trying to teach them that not all answers are Googleable. Counselors are prying them out of Internet addictions. Parents are trying to catch up by friending their kids on Facebook. (P.S. Facebook is obsolete.) Sociologists, advertisers, stock market analysts – everyone wants to know what happens when the generation born glued to screens has to look up and interact with the world.
Right now, Katherine is still looking down.
“See this girl,” she says, “she gets so many likes on her pictures because she’s posted over nine pictures saying, ‘Like all my pictures for a tbh, comment when done.’ So everyone will like her pictures, and she’ll just give them a simple tbh.”
A tbh is a compliment. It stands for “to be heard” or “to be honest.”
Katherine tosses her long brown hair behind her shoulder and ignores her black lab, Lucy, who is barking to be let out.
“It kind of, almost, promotes you as a good person. If someone says, ‘tbh you’re nice and pretty,’ that kind of, like, validates you in the comments. Then people can look at it and say ‘Oh, she’s nice and pretty.’ ”
by Jessica Contrera, Washington Post | Read more:
Image: Victoria Milko
Thursday, June 2, 2016
The NFL’s Brewing Information War
When every important decision-maker in the NFL shuffled into a hotel conference room in Boca Raton, Florida, in March for the league’s annual meeting, the scene was initially predictable: Most of the head coaches wore golf shirts, and most everyone had found a way to make the meetings double as a family vacation. The atmosphere was festive, and the pools were full of league titans. It did not appear to be the setting for the opening salvo of a war over the future of the NFL, but that’s what it became.
The meeting shifted toward discussing whether coaches should be allowed to watch game film on the sidelines during contests, a practice never before allowed. According to multiple people who were in the room at the time, Ron Rivera, the head coach of the defending NFC champion Carolina Panthers, stood up and asked what the point of coaching was if, after preparing all week, video would be readily available on the sidelines anyway.
“Where does it end?” Rivera said this week in an interview with The Ringer. “Can you get text messages or go out there with an iPhone and figure out where to go? What are we creating? I know there are millennial players, but this is still a game created 100 years ago.”
Rivera’s stance was among the most notable scenes during a spring in which it became increasingly clear that technology’s role in football has created a divide. On one side, there are coaches who have an old-school view of the craft; on the other are the coaches, executives, and owners who anticipate the sport undergoing the same sort of data revolution that most industries experienced long ago.
The NFL could have the technological capabilities to make a sideline look like a Blade Runner reboot. But it already has a mountain of data — it’s just that the mountain is largely inaccessible. In an effort to facilitate progress, league officials in Boca Raton pitched the NFL’s latest data technology: a system that would allow franchises to view player-tracking data for all 32 teams. If implemented, the technology would enable clubs to monitor every movement on the field for the first time, yielding raw data on player performance. For example: A team concerned about a slow cornerback could actually find out how much slower he is than Antonio Brown, who, according to data shared on a 2015 Thursday Night Football broadcast, posted a maximum speed of 21.8 mph during the season.
The proposal for teams to have access to all raw player-tracking data did not make it past the league’s Competition Committee, a group of team executives, owners, and coaches, according to an NFL official. Certain coaches griped about what might happen if other teams or the public had access to this data, and the committee told team representatives that it was too much, too soon, preventing the matter from reaching the teams for a vote at the March meeting.
“In other industries it is crazy to think you are going to limit innovation just to protect the people who aren’t ready,” said Brian Kopp, president of North America for Catapult Sports, which says it has deals with 19 NFL teams to provide practice data, but not game data. “Let’s make it all equally competitive, which is: You don’t figure it out, you start losing and you lose your job.”
Rivera, who’s coached the Panthers since 2011 and serves on a subcommittee of the NFL Competition Committee, said that introducing too much technology could “take the essence” out of the sport.
“I want to get beat on the field. I don’t want to get beat because someone used a tool or technology — that is not coaching at that point,” Rivera said. “I work all week, I’m preparing and kicking your ass. All of the sudden you see a piece of live video and you figure out, ‘Oh crap, that’s what he’s doing.’ And how fair is that?”
Two seasons ago, some NFL players began wearing two tiny chips in their shoulder pads during games. The program expanded to all players this past season, when Zebra Technologies, the company that produces the chips, also outfitted every stadium with receivers that decipher all movements on the field, measuring everything from player speed to how open a pass-catcher manages to get on a given play.
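[ed. A rough sense of how raw tracking samples become a headline figure like the 21.8 mph quoted above: given timestamped field coordinates, peak speed is simply the largest distance-over-time step between consecutive samples. The sketch below is a minimal illustration under that assumption; the sample points and the 0.1-second interval are invented, and this is not Zebra Technologies' actual pipeline.

# Minimal sketch: estimating a player's peak speed from timestamped
# (t, x, y) tracking samples. Coordinates assumed to be in yards and
# timestamps in seconds; the data below is made up for illustration.
from math import hypot

YARDS_PER_MILE = 1760.0
SECONDS_PER_HOUR = 3600.0

def peak_speed_mph(samples):
    """Return the highest speed (in mph) observed between consecutive samples."""
    top = 0.0
    for (t0, x0, y0), (t1, x1, y1) in zip(samples, samples[1:]):
        dt = t1 - t0
        if dt <= 0:
            continue  # ignore duplicate or out-of-order timestamps
        yards_per_second = hypot(x1 - x0, y1 - y0) / dt
        top = max(top, yards_per_second * SECONDS_PER_HOUR / YARDS_PER_MILE)
    return top

# Invented route sampled every 0.1 s (t in seconds, x and y in yards):
route = [(0.0, 10.0, 20.0), (0.1, 10.9, 20.3), (0.2, 11.85, 20.65), (0.3, 12.85, 21.0)]
print(round(peak_speed_mph(route), 1))  # about 21.7 mph over this tiny window

In practice the per-sample positions are noisy, so a real system would smooth the track before differencing rather than trust a single 0.1-second step.]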
If you’re suddenly worried that you’re the only one missing out on crucial pieces of football analysis, rest assured, you’re not: Aside from a few nuggets sprinkled into television broadcasts, fans don’t have access to most league-wide data. More alarmingly, teams don’t have access to any league-wide data during the season, and, according to league officials, didn’t even get their own data from the 2015 season until three weeks ago.
Though football enthusiasts often praise the NFL for being forward-thinking, it has actually lagged behind other professional leagues amid an otherwise widespread analytics revolution, with a player-tracking section on NBA.com and MLB allowing the public to access its PITCHf/x data for research and modeling purposes. While NFL teams have hired analysts for front-office roles and external parties have created websites aimed at tracking advanced statistics for fans and media, when league employees actually started to pitch head coaches on “Next Gen” statistics and technological advancements about four years ago, they were stunned at the reception.
by Kevin Clark, The Ringer | Read more:
Image: Getty
Facial Recognition Will Soon End Your Anonymity
Nearly 250 million video surveillance cameras have been installed throughout the world, and chances are you’ve been seen by several of them today. Most people barely notice their presence anymore — on the streets, inside stores, and even within our homes. We accept the fact that we are constantly being recorded because we expect this to have virtually no impact on our lives. But this balance may soon be upended by advancements in facial recognition technology.
Soon anybody with a high-resolution camera and the right software will be able to determine your identity. That’s because several technologies are converging to make this accessible. Recognition algorithms have become far more accurate, the devices we carry can process huge amounts of data, and there are massive databases of faces now available on social media that are tied to our real names. As facial recognition enters the mainstream, it will have serious implications for your privacy.
A new app called FindFace, recently released in Russia, gives us a glimpse into what this future might look like. Made by two 20-something entrepreneurs, FindFace allows anybody to snap a photo of a passerby and discover their real name — already with 70% reliability. The app allows people to upload photos and compare faces to user profiles from the popular social network Vkontakte, returning a result in a matter of seconds. According to an interview in the Guardian, the founders claim to already have 500,000 users and have processed over 3 million searches in the two months since they’ve launched.
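[ed. For readers wondering how an app goes from a snapshot to a name in seconds: the usual recipe is to convert each face into a numeric "embedding" vector and look up the nearest stored vector in a database of labelled profile photos. The sketch below is a generic illustration under that assumption, not FindFace's actual code; embed_face is a hypothetical placeholder for a real face-embedding model, and the vectors are invented.

# Minimal sketch of photo-to-name lookup: embed the query face, then return
# the labelled profile whose embedding is most similar to it.
import numpy as np

def embed_face(image_path):
    """Hypothetical placeholder: a real system would run a neural network here
    and return, say, a 128-dimensional face embedding for the image."""
    raise NotImplementedError("plug in a face-embedding model")

def best_match(query, database, threshold=0.6):
    """Return the profile name whose stored embedding has the highest cosine
    similarity to `query`, or None if nothing clears the threshold."""
    best_name, best_score = None, threshold
    q = query / np.linalg.norm(query)
    for name, vec in database.items():
        score = float(np.dot(q, vec / np.linalg.norm(vec)))
        if score > best_score:
            best_name, best_score = name, score
    return best_name

# Toy demonstration with invented 4-dimensional "embeddings":
db = {"profile_A": np.array([0.9, 0.1, 0.0, 0.4]),
      "profile_B": np.array([0.1, 0.8, 0.5, 0.0])}
print(best_match(np.array([0.85, 0.15, 0.05, 0.35]), db))  # prints profile_A

Scale that lookup to hundreds of millions of profile photos with an approximate-nearest-neighbour index and you have the core of a service like the one described here.]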
What’s particularly unsettling are the use cases they advocate: identifying strangers to send them dating requests, helping government security agencies to determine the identities of dissenters, and allowing retailers to bombard you with advertisements based on what you look at in stores.
While there are reasons to be skeptical of their claims, FindFace is already being deployed in questionable ways. Some users have tried to identify fellow riders on the subway, while others are using the app to reveal the real names of porn actresses against their will. Powerful facial recognition technology is now in the hands of consumers to use how they please.
It’s not just Russians who have to worry about the implications of ubiquitous facial recognition. Whenever a technology becomes cheap and powerful, it begins to show up in the unlikeliest of places.
by Tarun Wadhwa, MarketWatch | Read more:
Image: via:
A Fishy Business
The story of the world’s best fish sauce begins, like so many others, with a son who just wanted to make his mother happy. Cuong Pham and his parents came to the United States from Saigon as refugees in 1979. They settled in northern California, where Cuong eventually became an engineer, spending 16 years with Apple. His mother, however, could never find the fish sauce (nuoc mam in Vietnamese) she remembered from Vietnam. Cuong’s family owned a fish-sauce factory; his uncle would bring his mother 20-litre cans of specially selected, just-for-family nuoc mam. In America she had to settle for commercial fish sauce, often the saltier Thai variety (designed, in the words of Andrea Nguyen, a cookbook author and proprietor of the indispensable Viet World Kitchen website, “for the lusty highs and lows of Thai food, [not] the rolling hills and valleys of Viet food”). So Cuong did what any son would do: he started his own fish-sauce company.
Fish sauce – the liquid produced from anchovies salted and left to ferment in the heat for months – has long repelled most Western palates. That is starting to change. It adds a savoury depth to soups and stocks that salt alone cannot provide. If soy sauce is a single trumpet played at full blast, fish sauce is a dozen bowed double-basses; and Cuong’s fish sauce is without parallel. And while he may have made it for expats like his mother, chefs across the Pacific and in Europe have grown to love it. (...)
It forms the chief protein source for millions, and is as central to the diverse cuisines of mainland South-East Asia as olive oil is to southern Italian and Levantine food. It goes by different names: nam pla in Thailand, tuk trey in Cambodia and patis in the Philippines. A similar condiment called garum featured in ancient Roman cuisine, and indeed south-west Italy still produces small amounts of colatura di alici, an anchovy liquid similar to nuoc mam. In other parts of South-East Asia, notably Myanmar and Cambodia, people eat fermented-fish pastes, which tend to be more assertive – often used as a central ingredient rather than a flavouring. These products wring value from abundant, tiny fish too small to eat on their own; like pickling, fish sauces preserve a bountiful harvest’s nutrition.
Fish sauce can repel first-timers: it often has an intensely fishy odour, especially in the cheaper varieties, with a rubbishy edge. But its flavour rounds and mellows with cooking. Eventually it becomes addictive, essential: I’m about as Vietnamese as a bagel, and I can’t imagine my kitchen without it. You can build a marinade for nearly any grilled thing – meat, fish or vegetable – around its umami sturdiness. Greens stir-fried with garlic, nuoc mam and a squeeze of lime or splash of white wine make a happy light lunch, served over steamed rice. Mix it with lime juice, sugar, water and perhaps some sliced chillies or chopped garlic, and it becomes nuoc cham, a dip that makes everything taste better (it pairs especially, if unconventionally, well with soft, watery fruits such as pineapple and strawberry).
by Jon Fasman, 1843/The Economist | Read more:
Image: Quinn Ryan Mattingly
$4.5 Billion to Zero
Elizabeth Holmes of Theranos: From $4.5 Billion To Nothing
Image: Glen Davis/Forbes
[ed. You think you had a rough day...?!]