Friday, January 8, 2016
Human-Animal Chimeras Are Gestating on U.S. Research Farms
Braving a funding ban put in place by America’s top health agency, some U.S. research centers are moving ahead with attempts to grow human tissue inside pigs and sheep with the goal of creating hearts, livers, or other organs needed for transplants.
The effort to incubate organs in farm animals is ethically charged because it involves adding human cells to animal embryos in ways that could blur the line between species.
Last September, in a reversal of earlier policy, the National Institutes of Health announced it would not support studies involving such “human-animal chimeras” until it had reviewed the scientific and social implications more closely.
The agency, in a statement, said it was worried about the chance that animals’ “cognitive state” could be altered if they ended up with human brain cells.
The NIH action was triggered after it learned that scientists had begun such experiments with support from other funding sources, including from California’s state stem-cell agency. The human-animal mixtures are being created by injecting human stem cells into days-old animal embryos, then gestating these in female livestock. (...)
The experiments rely on a cutting-edge fusion of technologies, including recent breakthroughs in stem-cell biology and gene-editing techniques. By modifying genes, scientists can now easily change the DNA in pig or sheep embryos so that they are genetically incapable of forming a specific tissue. Then, by adding stem cells from a person, they hope the human cells will take over the job of forming the missing organ, which could then be harvested from the animal for use in a transplant operation.
“We can make an animal without a heart. We have engineered pigs that lack skeletal muscles and blood vessels,” says Daniel Garry, a cardiologist who leads a chimera project at the University of Minnesota. While such pigs aren’t viable, they can develop properly if a few cells are added from a normal pig embryo. Garry says he’s already melded two pigs in this way and recently won a $1.4 million grant from the U.S. Army, which funds some biomedical research, to try to grow human hearts in swine.
Because chimeras could provide a new supply of organs for needy patients and also lead to basic discoveries, researchers including Garry say they intend to press forward despite the NIH position. In November, he was one of 11 authors who published a letter criticizing the agency for creating “a threat to progress” that “casts a shadow of negativity” on their work.
The worry is that the animals might turn out to be a little too human for comfort, say, ending up with human reproductive cells, patches of people hair, or just higher intelligence. “We are not near the island of Dr. Moreau, but science moves fast,” NIH ethicist David Resnik said during the agency’s November meeting. “The specter of an intelligent mouse stuck in a laboratory somewhere screaming ‘I want to get out’ would be very troubling to people.”
Hiromitsu Nakauchi, a stem-cell biologist at Stanford University, began trying to make human-sheep chimeras this year. He says that so far the contribution by human cells to the animals’ bodies appears to be relatively small. “If the extent of human cells is 0.5 percent, it’s very unlikely to get thinking pigs or standing sheep,” he says. “But if it’s large, like 40 percent, then we’d have to do something about that.”
by Antonio Regalado, MIT Technology Review | Read more:
Image: Ping Zhu
So Long, and Thanks for All the Fish
Squeezed between the Sumida river and the Ginza shopping district, Tsukiji is creaking at the seams. Some 60,000 people work under its leaky roof, and hundreds of forklifts, carrying everything from sea urchins to whale meat, careen across bumpy floors. The site’s owner, the city government, wants it moved.
That is unpopular. Traders resent being yanked to a sterile new site to the south. The new market is being built on a wharf whose soil is contaminated by the toxic effluent from a former gasworks. The clean-up and negotiations delayed the move for over a decade.
The final blow was Tokyo’s successful bid to host the 2020 Olympics. A new traffic artery will cut through Tsukiji, transporting visitors to the games’ venues. Part of the site will become a temporary press centre, says Yutaka Maeyasui, the executive in charge of shifting the market. “Our time is up,” he says, glancing around his decrepit office. The site has become too small, old and crowded. An earthquake could bring the roof down.
by The Economist | Read more:
Image: uncredited
Thursday, January 7, 2016
After Capitalism
Where we're going we don't need roads
How will it end? For centuries even the most sanguine of capitalism’s theorists have thought it not long for this world. Smith, Ricardo, and Mill pointed to a “falling rate of profit” linked to inevitable declines in agricultural productivity. Marx applied the same concept to industrial production, suggesting that the tendency to replace workers with machines would lead to a chronic and insurmountable lack of demand. Sombart saw the restive adventurousness of capitalism as the key to its success—and, ultimately, its failure: though the appearance of new peripheries had long funneled profits back to the center, the days of “stout Cortez” had ended and there would one day be no empires or hinterlands to subdue.
Schumpeter was the gloomiest of all. He opened a chapter titled “Can Capitalism Survive?” (in his Capitalism, Socialism, and Democracy) with the definitive answer, “No. I do not think it can.” Inspired by Marx, he imagined that the very success of capitalism—the creation of large enterprises through continuous innovation—would lead to profound fatigue as innovation came to be merely routine, and the bourgeoisie turned its attention toward the banalities of office life: “Success in industry and commerce requires a lot of stamina, yet industrial and commercial activity is essentially unheroic in the knight’s sense—no flourishing of swords about it, not much physical prowess, no chance to gallop the armored horse into the enemy, preferably a heretic or heathen — and the ideology that glorifies the idea of fighting for fighting’s sake and of victory for victory’s sake understandably withers in the office among all the columns of figures.” He foresaw a world in which intellectuals, a marginalized and unhappy lot, would turn their discontent into politics and lead the discontented castoffs of capitalism toward socialism.
These predictions, however, failed to describe what was actually happening with capitalism in the 20th century. By the 1980s people had turned toward a different proposition of Schumpeter’s: that competition “from the new commodity, the new technology, the new source of supply, the new type of organization” was the source of dynamism in a swiftly growing economy. For Schumpeter, the crises of capitalism were signs not of the system’s debility but of its secret health. Business cycles were zesty, violent guarantees of continued growth. Monopolies were only temporary and could be broken up by the “perennial gale of creative destruction.” When in the 1960s and ’70s the otherwise impregnable position of American industry was broken by competition from Germany and Japan, Schumpeter seemed prescient. The response of corporations in the 1980s—enormous mergers, leveraged buyouts, union busting, corporate raiding, mass layoffs, and upward redistribution of wealth—seemed almost to be taking his words as prescriptive.
But while the economy has been dynamic, it has not been healthy. Several crashes later, the gloom has returned, and the signs of autumn are once again most recognizable in the pronouncements of free-market capitalism’s erstwhile boosters. In the past year, many have taken up Larry Summers’s remark that we have entered a period of “secular stagnation,” marked by persistent and slow growth worldwide. Fiscal austerity is general, taxes remain low, and debt levels continue to rise—which means that Western countries, by selling treasury bonds to the rich through capital markets, are actually paying their elites in bond yields to avoid having to go through the politically impossible process of taxing them. Absent any political recourse to countercyclical fiscal policy, central banks in the US, the Eurozone, and Japan have kept interest rates low and pumped trillions of dollars of fiat money into the financial system, keeping banks and dot-com companies liquid and driving the rich to put their money into the condos now flooding Manhattan, all while leaving median wages pleasantly low. It’s kept things humming along, but not much more than that. Fear courses through the veins of the free-marketers, who recognize that all is not well with the system they love.
One form that such worry takes is that robots are coming to take our jobs. From The Second Machine Age to Rise of the Robots, a new wave of technofuturists predicts that most manufacturing and a good deal of white-collar work in “services” can and will be subject to automation. The special force of the technofuturists’ predictions today lies in the fact that many of us read their work on devices we carry in our pockets that have already destroyed jobs, or at least made them more precarious, at newspapers, record companies, travel agencies, taxi services, and even casinos. The statistics they purvey are worrying, among them the fact that the share of workers in global manufacturing is on the decline. China’s share peaked in the 1990s at 15 percent and has decreased since. Dani Rodrik calls this process “premature deindustrialization”: the ability of more and more developing countries to “skip” the usual stages of capital accumulation (mass industrialization accompanied by adding workers in services) by replacing more workers with machines and moving others into services.
The surprise is that a number of prominent left intellectuals have begun to view the idea of automation with equanimity, even optimism. Most prominent among them are the accelerationists, whose widely circulated “Manifesto for an Accelerationist Politics” is the inspiration for a new book, Inventing the Future, by the manifesto’s original authors Nick Srnicek and Alex Williams. Their motto seems to be “I for one welcome our new robot overlords”—for the principle of “accelerationism” is that automation is likely to become general, and so the left needs once and for all to cease imagining that blue-collar unionism and socialist parties will drive us toward communism.
The accelerationists insist that the future will be one in which, thanks to computer-assisted advances in automation, wage labor is a condition guaranteed to very few, and “surplus populations,” already large, will dominate the planet. Prior socialists imagined that victory would come through the workplace; the accelerationists argue that, in the future, the workplace won’t exist in anything like the form we have now, and in any case it will have very few permanent workers. Assuming this position, they ask: What would be the social vision appropriate to a jobless future? What, after the end of working-class socialist dreams, should the left propose?
by The Editors, N+1 | Read more:
Image: Derek Paul Boyle, Salt and Pennies, 2015
A Sad State of Affairs - Most Americans Are One Paycheck Away From the Street
Whenever I see one of these stories about how little Americans have available for an emergency, my blood starts to boil. I understand that poor people making $25,000 per year are forced to live paycheck to paycheck. But when 63% of all Americans can’t handle a $500 emergency, and 46% of households making over $75,000 can’t handle a $500 emergency, then they are just plain stupid, frivolous, and incapable of distinguishing between wants and needs. Delayed gratification is a trait almost non-existent among Americans today.
The first thing that infuriates me is the assumption that a $500 car repair or house repair is an unexpected emergency. It’s a fucking living expense. It’s not a fucking surprise. Your car will need new tires every few years. That’s $500 or more. Your hot water heater, air conditioner, roof, windows, etc. will need to be replaced. Everyone gets sick. That is not unexpected. Anyone who lives their life as if these expenses are a shocking surprise is a blithering idiot. And this country is crawling with blithering idiots.
So the majority of Americans can’t handle a $500 expense, but for the last two years there have been 35 million new cars “sold” to blithering idiots on credit or leases. Even though they have no money, they decide it’s a brilliant idea to commit to a 7-year payment of $300 to $500 per month on an asset that declines in value rapidly. Morons abound. These are the same people who must have their Starbucks coffee every day. These math-challenged boobs could defer buying a Starbucks coffee every day, save the $3, and accumulate $750 of emergency savings in one year. (...)
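[ed. A quick back-of-the-envelope check of that latte math, as a minimal sketch in Python. The $3 coffee and the $750 total come from the post; the 250- and 365-day counts are my illustrative assumptions, not the author's figures.]

```python
# Back-of-the-envelope check of the "skip the Starbucks" arithmetic above.
# Assumed: $3 per skipped coffee (from the post); the 250- and 365-day
# counts are illustrative assumptions, not figures from the post.

PRICE_PER_COFFEE = 3.00  # dollars saved per skipped coffee

for days in (250, 365):
    saved = PRICE_PER_COFFEE * days
    print(f"{days} skipped coffees/year -> ${saved:,.0f} in emergency savings")

# Output:
# 250 skipped coffees/year -> $750 in emergency savings   (the post's figure)
# 365 skipped coffees/year -> $1,095 in emergency savings
```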
Quentin Fottrell: Most Americans are one paycheck away from the street:
Approximately 63% of Americans have no emergency savings for things such as a $1,000 emergency room visit or a $500 car repair, according to a survey released Wednesday of 1,000 adults by personal finance website Bankrate.com, up slightly from 62% last year. Faced with an emergency, they say they would raise the money by reducing spending elsewhere (23%), borrowing from family and/or friends (15%) or using credit cards to bridge the gap (15%).
This lack of emergency savings could be a problem for millions of Americans. More than four in 10 Americans either experienced a major unexpected expense over the past 12 months or had an immediate family member who had an unexpected expense, Bankrate found. (The survey didn’t specify the impact of that expense.) “Without emergency savings, you may not have money to cover needed home repairs,” says Signe-Mary McKernan, senior fellow and economist at the Urban Institute, a nonprofit organization that focuses on social and economic policy. “Similarly, without emergency savings, people could raid their retirement account.”
The findings are strikingly similar to two other reports, one a U.S. Federal Reserve survey of more than 4,000 adults released in 2014. “Savings are depleted for many households after the recession,” it found. Among those who had savings prior to 2008, 57% said they’d used up some or all of their savings in the Great Recession and its aftermath. And another survey of 1,000 adults released last year by personal finance website GOBankingRates.com found that most Americans (62%) have less than $1,000 in their savings account (although that doesn’t include retirement or other investment accounts).
Why aren’t people saving? Millions of Americans are struggling with student loans, medical bills and other debts, says Andrew Meadows, a San Francisco-based producer of “Broken Eggs,” a documentary about retirement. Central bankers hiked their short-term interest rate target last month to a range of 0.25% to 0.50% from near-zero, but that’s still a small return for savings left in bank accounts. Indeed, personal savings rates as a percentage of disposable income dropped from 11% in December 2012 to 4.6% in August 2015, according to the Bureau of Economic Analysis, and now hover at 5.5%.
More money and education can help. The latest Bankrate survey found that savings increased with income and education: Just 46% of the highest-income households ($75,000-plus per year) and 52% of college graduates lack enough savings to cover a $500 car repair or $1,000 emergency room visit. And while those figures could still be lower, Americans are willing to cut back on at least some expenses when money is tight: 58% say they’re “very/somewhat” likely to cut back on eating out or to decrease their cable bill, 41% are likely to spend less on coffee at places like Starbucks, and 39% will seek out lower-cost cellphone bills.
by Jim Quinn, The Burning Platform, and Quentin Fottrell, Market Watch | Read more:
Image: Shutterstock

Saturday, January 2, 2016
Friday, January 1, 2016
What Was Volkswagen Thinking?
One day in 1979, James Burke, the chief executive of Johnson & Johnson, summoned more than 20 of his key people into a room, jabbed his finger at an internal document, and proposed destroying it.
The document was hardly incriminating. Entitled “Our Credo,” its plainspoken list of principles—including a higher duty to “mothers, and all others who use our products”—had been a fixture on company walls since 1943. But Burke was worried that managers had come to regard it as something like the Magna Carta: an important historical document, but hardly a tool for modern decision making. “If we’re not going to live by it, let’s tear it off the wall,” Burke told the group, using the weight of his office to force a debate. And that is what he got: a room full of managers debating the role of moral duties in daily business, and then choosing to resuscitate the credo as a living document.
Three years later, after reports emerged of a deadly poisoning of Tylenol capsules in Chicago-area stores, Johnson & Johnson’s reaction became the gold standard of corporate crisis response. But the company’s swift decisions—to remove every bottle of Tylenol capsules from store shelves nationwide, publicly warn people not to consume its product, and take a $100 million loss—weren’t really decisions. They flowed more or less automatically from the signal sent three years earlier. Burke, in fact, was on a plane when news of the poisoning broke. By the time he landed, employees were already ordering Tylenol off store shelves.
On the face of it, you’d be hard-pressed to find an episode less salient to the emissions-cheating scandal at Volkswagen—a company that, by contrast, seems intent on poisoning its own product, name, and future. But although the details behind VW’s installation of “defeat devices” in its vehicles are only beginning to trickle out, the decision process is very likely to resemble a bizarro version of Johnson & Johnson’s, with opposite choices every step of the way.
The sociologist Diane Vaughan coined the phrase the normalization of deviance to describe a cultural drift in which circumstances classified as “not okay” are slowly reclassified as “okay.” In the case of the Challenger space-shuttle disaster—the subject of a landmark study by Vaughan—damage to the crucial O‑rings had been observed after previous shuttle launches. Each observed instance of damage, she found, was followed by a sequence “in which the technical deviation of the [O‑rings] from performance predictions was redefined as an acceptable risk.” Repeated over time, this behavior became routinized into what organizational psychologists call a “script.” Engineers and managers “developed a definition of the situation that allowed them to carry on as if nothing was wrong.” To clarify: They were not merely acting as if nothing was wrong. They believed it, bringing to mind Orwell’s concept of doublethink, the method by which a bureaucracy conceals evil not only from the public but from itself.
If that comparison sounds overwrought, consider the words of Denny Gioia, a management professor at Penn State who, in the early 1970s, was the coordinator of product recalls at Ford. At the time, the Ford Pinto was showing a tendency to explode when hit from behind, incinerating passengers. Twice, Gioia and his team elected not to recall the car—a fact that, when revealed to his M.B.A. students, goes off like a bomb. “Before I went to Ford I would have argued strongly that Ford had an ethical obligation to recall,” he wrote in the Journal of Business Ethics some 17 years after he’d left the company. “I now argue and teach that Ford had an ethical obligation to recall. But, while I was there, I perceived no strong obligation to recall and I remember no strong ethical overtones to the case whatsoever.”
What, Gioia the professor belatedly asked, had Gioia the auto executive been thinking? The best answer, he concluded, is that he hadn’t been. Executives are bombarded with information. To ease the cognitive load, they rely on a set of unwritten scripts imported from the organization around them. You could even define corporate culture as a collection of scripts. Scripts are undoubtedly efficient. Managers don’t have to muddle through each new problem afresh, Gioia wrote, because “the mode of handling such problems has already been worked out in advance.” But therein lies the danger. Scripts can be flawed, and grow more so over time, yet they discourage active analysis. Based on the information Gioia had at the time, the Pinto didn’t fit the criteria for recall that his team had already agreed upon (a clearly documentable pattern of failure of a specific part). No further thought necessary.
Sometimes a jarring piece of evidence does intrude, forcing a conscious reassessment. For Gioia, it was the moment he saw the charred hulk of a Pinto at a company depot known internally as “The Chamber of Horrors.” The revulsion it evoked gave him pause. He called a meeting. But nothing changed. “After the usual round of discussion about criteria and justification for recall, everyone voted against recommending recall—including me.”
The most troubling thing, says Vaughan, is the way scripts “expand like an elastic waistband” to accommodate more and more divergence. Morton-Thiokol, the NASA contractor charged with engineering the O-rings, requested a teleconference on the eve of the fatal Challenger launch. After a previous launch, its engineers had noticed O-ring damage that looked different from damage they’d seen before. Suspecting that cold was a factor, the engineers saw the near-freezing forecast and made a “no launch” recommendation—something they had never done before. But the data they faxed to NASA to buttress their case were the same data they had earlier used to argue that the space shuttle was safe to fly. NASA pounced on the inconsistency. Embarrassed and unable to overturn the script they themselves had built in the preceding years, Morton-Thiokol’s brass buckled. The “no launch” recommendation was reversed to “launch.”
“It’s like losing your virginity,” a NASA teleconference participant later told Vaughan. “Once you’ve done it, you can’t go back.” If you try, you face a credibility spiral: Were you lying then or are you lying now?
by Jerry Useem, The Atlantic | Read more:
Image: Justin Renteria
Eating ‘Healthy-ish’
Abstinence, we are usually told around this time of year, makes the heart grow stronger. It’s why Dry January, which started in the green and pleasantly alcoholic land of Britain a few years ago before reaching the U.S., is increasingly being touted as a good and worthy thing to do, and why so many people are currently making plans to remove whole food groups from their diet: carbs, fat, Terry’s Chocolate Oranges. The key to health, books and websites and dietitians and former presidents reveal, is a process of elimination. It’s going without. It’s getting through the darkest, coldest month of the year without so much as a snifter of antioxidant-rich Cabernet.
The problem with giving things up, though, is that inevitably it creates a void in one’s diet that only Reese’s Pieces and a family-sized wheel of brie can fill. Then there’s the fact that so many abstinence-espousing programs require spending money on things: on Whole30 cookbooks and Weight Watchers memberships and $10 bottles of bone broth. For a process that supposedly involves cutting things out, there seems to be an awful lot to take in.
This, Michael Pollan posits, is the problem with food: It’s gotten extraordinarily complicated. The writer and sustainable-eating advocate has written several books on how the simple business of eating has become a minefield in which earnest Westerners try to tiptoe around gooey, genetically engineered sugar bombs without setting off an explosion of calories, corn sugar, and cancer. In Defense of Food, published in 2008, offers a “manifesto” for eaters (i.e. humans) that’s breathtaking in its seven-word simplicity: Eat Food. Not Too Much. Mostly Plants. This mantra is repeated once more in a documentary based on the book that airs Wednesday night on PBS, and it’s felt in the January issue of Bon Appetit, which is based almost entirely around the concept of “healthy-ish” eating: “delicious, comforting home cooking that just happens to be kinda good for you.”
Healthy-ish, as a concept, isn’t new. In fact, it’s the food industry’s equivalent of your mom telling you to finish your broccoli before you dive into the Twinkies, only dressed up with a sexy hyphenated coverline and some mouthwatering photos of chicken seared in a cast-iron skillet. “Healthy-ish” shouldn’t feel revolutionary. By its very definition it’s something of a big old foodie shrug—an acknowledgment that if we can’t all subsist on steamed fish and vegetables all of the time, we can at least offset the steak dinner for having salad for lunch. It is, as per Pollan at least, a philosophy that everything is best enjoyed in moderation, including moderation.
So why does it feel so subversive?
The reason, as explained by both manifestations of In Defense of Food, is that industries upon industries, even entire religions, have been predicated on the premise that eating (certain things) is bad and will kill you. The documentary draws on years of food-related quackery to illustrate how ingrained fearing food is.

The Good Times
Thursday, December 31, 2015
The Sacred Child
[ed. I've been reposting a few things from 2015 over the last few days. Here's something from a bit earlier. Given the ubiquity of phone cameras these days it's an issue to think about.]
Goa, India, 2009. A shimmering white beach. Clear blue water, a cloudless sky. The rush of waves and a constant din from jet skis. Behind us: rust-coloured sand, skinny cows browsing among trash and dry bushes.
I'm lounging on the sun bed with a mystery novel and keeping half an eye on my three-year-old daughter, who is sitting in pink swimming pants and playing with a bucket and spade. She is blonde, blue-eyed and unbelievably cute. People here stare at her, ensorcelled, love-struck, touching her hair, pointing at her. The other day the restaurant waiter - stoned? - approached and bit her tenderly on her yummy upper arm. And above all, they want to take her picture. In this country headed headlong into the future - the little dirt track back to the hotel that we walked when we arrived a week ago has already been tarred over with asphalt - every Indian seems to have a camera phone. Often they ask me, or more rarely my wife, civilly if they may take a picture. Having been brought up on Swedish school pedagogics, I relay the question to my daughter: "Is it OK for you if they take your picture?" I guess I think it's her decision.
A well-dressed slender Indian man in white pants and shirt wanders past on the beach. He smiles and coos at the playing Swedish child and takes out his cell phone. My sister-in-law is already there, asks my daughter, who says no. The man pays no attention, takes the pictures anyway.
My daughter is clearly stressed and uneasy with the situation, the strange man who stands before her with his phone portraying her, laughing lightly. My sister-in-law tells him off sharply: "Please! No!" He pays no mind, takes some more pictures.
I run down to the water and confront the man. "You respect my daughter!" I yell repeatedly. He apologises, looks nervous, says something in Hindi that I don't understand and points at his phone, as if showing that hey, he just took some pictures, what's the harm? He hurries away.
One of the beach guards soon catches up with him and takes the phone, clearly in order to flip through the photo folder. The man, by now visibly sweating and piteous, explains and gesticulates to the grim guard. Apparently there is nothing on the phone to suggest that the man is a sex tourist or pedophile, as he soon gets his phone back and slips off.
I sit back heavily on the sun bed. Conflicting emotions. I feel indignant and aggrieved - dammit, I should have thrown that phone into the sea, would have served that perv right. Uncertain - OK, he shouldn't have done that, but what if he's really just an everyday Indian guy who loves to see European kids on the beach and wanted a lovely holiday souvenir? Is that really such a big deal?
No more strangers take any pictures of my daughter on the trip. I quit asking her to decide. I just say no, categorically. Her image becomes untouchable. Her likeness becomes sacred.
I should perhaps begin with the disclaimer we all seem forced to start with when we talk about this issue. To wit: I hate everything about child molestation. I hate pedophiles, child porn, all the dirt and darkness and nauseating shit those awful people do. I have two little daughters and I'm prepared to kill or die to protect them against that kind of evil.
This is not actually an essay on child pornography, at least not if we take that to mean images of children being sexually abused, images that could not exist unless children had been violated, defiled, victimised. But in 2011, in Sweden, that is not the definition of child pornography. Instead there is a boundary zone between images that are OK (legitimate though potentially provocative) and such that are a crime to produce, disseminate and possess. That gray zone raises a number of difficult questions about children, art, society and sexuality. Those questions have rarely been more topical than today, and they touch upon the most personal, forbidden and sacred of issues.
Biddick Hall, north-east England, 1976. This time the three-year-old's name is Rosie Bowdrey. Photographer Robert Mapplethorpe is a guest at the wealthy family's garden party, the sun beats down and he takes innumerable pictures. Rosie has been swimming and runs around in the nude; her mother hurriedly gets the child into a dress. She sits down, a little huffily, on a stone bench. Mapplethorpe takes a picture, probably using his new Hasselblad. Then the skirt comes off again.
Thirty-four years later, this picture is considered the single most controversial work in Mapplethorpe's oeuvre. We're dealing with an artist who, later in life, took pictures of BDSM, of coprophagy, sexually charged images of African American men, pictures of himself with a bull whip up his posterior. But the picture where the genitals of a three-year-old can be made out is worse. Wherever "Rosie" has been shown, it has soon been taken down again, most recently in November 2010 at Bukowski's fine-arts auction house in Stockholm.
It makes no difference that Rosie's mother, Lady Beatrix Nevill, signed a release for the image, stating that she does not find it pornographic and that she wants it to be exhibited. It makes no difference that Rosie Bowdrey herself, now an adult, has said that she is proud of the picture, that she can't see how anyone would find it pornographic, and that she wants it to be exhibited. It makes no difference that nothing suggests that Mapplethorpe, who incidentally was gay, had any sexual interest in little girls.
Who is eroticising the child in the picture? The photographer - or the viewer?
Because at the same time: isn't there something erotic about that image? Or what? About the large luminous eyes, about the sullen mouth with its slightly drooping corners? Something like posing, provocative, that we recognise from a thousand sexually explicit or implicit pictures of adult women? Or what? What do you think?
People in art circles rarely condemn a work of art; more commonly one will encounter a "permissive" attitude to the sphere of aesthetics where anything smacking of censorship will be loudly decried. Thus it is interesting to note mystery novelist Mons Kallentoft writing on his blog that the image goes "way, way across the boundary to child porn" and noting with pleasure that this time "the alarm bells" had worked. "It's never ever right to eroticise a child, not even for the most self-aggrandising, priggish artistic purposes", he added. When I reach Kallentoft on the phone he is at first happy to develop his thoughts further.
"The girl in the picture can't choose, she's being watched. There are people on Earth who get turned on by pictures like these, and that constitutes abuse against her no matter how you shake it. Nobody has that right."
But as an adult, the girl in that picture has said that she doesn't view it as pornographic?
"It doesn't work that way. That's like saying that with consent, we're allowed to do whatever we like to each other, and we might as well sign contracts permitting others to murder us ... That picture is child porn and exhibiting it to the public is wrong! I mean sure, OK, you can keep it to yourself in your home."
So would the image be acceptable if it sat in somebody's photo album - where pictures of nude kids are pretty common?
Our interview takes a left turn here. Mons Kallentoft is very upset by my question, or by my matter-of-fact and slightly impersonal way of phrasing it. He asks me if I have experienced any sexual abuse against children. Before I can answer, he angrily declares that he isn't willing to intellectualise this issue further and abruptly ends our conversation.
I feel bad about this, like a cynical and superficial asshole. Somebody who is happy to sit in a comfy desk chair under pleasant lighting with a cup of tea and soft music in the background, writing about this issue as if it were all about aesthetics - while in fact we're talking about children's lives being ruined, children being violated and defiled in unimaginable ways. Do we even have the right to a lukewarm analytical attitude regarding an issue where the stakes are so high?
I don't want to use a fellow human being and colleague's emotional reaction as a rhetorical tool or pedagogical example, but Kallentoft's reaction really shows me how fraught, personal and painful this issue can be. And suddenly I also think I have gained a deeper understanding of how devout Christians or Muslims feel about pictures such as Elisabeth Ohlson Wallin's Ecce Homo or Lars Vilks's Mohammed cartoons. It's such a gross violation that it's impossible to speak rationally about it, a violation that can only get worse when some uncomprehending, disrespectful bastard asks why you feel violated.
Suddenly I understand better how difficult it is to get anywhere when it comes to things that touch the depths of our souls. How much really is at stake.
by Jens Liljestrand, Aardvarcheology | Read more:
Image: via:
Repost
Goa, India, 2009. A shimmering white beach. Clear blue water, a cloudless sky. The rush of waves and a constant din from jet skis. Behind us: rust-coloured sand, skinny cows browsing among trash and dry bushes.
I'm lounging on the sun bed with a mystery novel and keeping half an eye on my three-year-old daughter, who is sitting in pink swimming pants and playing with a bucket and spade. She is blonde, blue-eyed and unbelievably cute. People here stare at her, ensorcelled, love-struck, touching her hair, pointing at her. The other day the restaurant waiter - stoned? - approached and bit her tenderly on her yummy upper arm. And above all, they want to take her picture. In this country headed headlong into the future - the little dirt track back to the hotel that we walked when we arrived a week ago has already been tarred over with asphalt - every Indian seems to have a camera phone. Often they ask me, or more rarely my wife, civilly if they may take a picture. Having been brought up on Swedish school pedagogics, I relay the question to my daughter: "Is it OK for you if they take your picture?" I guess I think it's her decision.

My daughter is clearly stressed and uneasy with the situation, the strange man who stands before her with his phone portraying her, laughing lightly. My sister in law tells him off sharply, "Please! No!". He pays no mind, takes some more pictures.
I run down to the water and confront the man. "You respect my daughter!" I yell repeatedly. He apologises, looks nervous, says something in Hindi that I don't understand and points at his phone, as if showing that hey, he just took some pictures, what's the harm? He hurries away.
One of the beach guards soon catches up with him and takes the phone, clearly in order to flip through the photo folder. The man, by now visibly sweating and piteous, explains and gesticulates to the grim guard. Apparently there is nothing on the phone to suggest that the man is a sex tourist or pedophile, as he soon gets his phone back and slips off.
I sit back heavily on the sun bed. Conflicting emotions. I feel indignant and aggrieved - dammit, I should have thrown that phone into the sea, would have served that perv right. Uncertain - OK, he shouldn't have done that, but what if he's really just an everyday Indian guy who loves to see European kids on the beach and wanted a lovely holiday souvenir? Is that really such a big deal?
No more strangers take any pictures of my daughter on the trip. I quit offering her to decide. I just say no, categorically. Her image becomes untouchable. Her likeness becomes sacred.
I should perhaps begin with the disclaimer we all seem forced to start with when we talk about this issue. To wit: I hate everything about child molestation. I hate pedophiles, child porn, all the dirt and darkness and nauseating shit those awful people do. I have two little daughters and I'm prepared to kill or die to protect them against that kind of evil.
This is not actually an essay on child pornography, at least not if we take that to mean images of children being sexually abused - images that could not exist unless children had been violated, defiled, victimised. But in 2011, in Sweden, that is not the definition of child pornography. Instead there is a boundary zone between images that are OK (legitimate, though potentially provocative) and those that are a crime to produce, disseminate and possess. That gray zone raises a number of difficult questions about children, art, society and sexuality. Those questions have rarely been more topical than today, and they touch upon the most personal, forbidden and sacred of issues.
Biddick Hall, north-east England, 1976. This time the three-year-old's name is Rosie Bowdrey. Photographer Robert Mapplethorpe is a guest at the wealthy family's garden party, the sun beats down and he takes innumerable pictures. Rosie has been swimming and runs around in the nude; her mother hurriedly gets the child into a dress. She sits down, a little huffily, on a stone bench. Mapplethorpe takes a picture, probably using his new Hasselblad. Then the dress comes off again.
Thirty-four years later, this picture is considered the single most controversial work in Mapplethorpe's oeuvre. We're dealing with an artist who, later in life, took pictures of BDSM and coprophagy, sexually charged images of African American men, pictures of himself with a bullwhip up his posterior. But the picture where the genitals of a three-year-old can be made out is worse. Wherever "Rosie" has been shown, it has soon been taken down again, most recently in November 2010 at Bukowski's fine-arts auction house in Stockholm.
It makes no difference that Rosie's mother, Lady Beatrix Nevill, signed a release for the image, stating that she does not find it pornographic and that she wants it to be exhibited. It makes no difference that Rosie Bowdrey herself, now an adult, has said that she is proud of the picture, that she can't see how anyone would find it pornographic, and that she wants it to be exhibited. It makes no difference that nothing suggests that Mapplethorpe, who incidentally was gay, had any sexual interest in little girls.
Who is eroticising the child in the picture? The photographer - or the viewer?
Because at the same time: isn't there something erotic about that image? Or what? About the large luminous eyes, about the sullen mouth with its slightly drooping corners? Something like posing, provocative, that we recognise from a thousand sexually explicit or implicit pictures of adult women? Or what? What do you think?
People in art circles rarely condemn a work of art; more commonly one will encounter a "permissive" attitude to the sphere of aesthetics where anything smacking of censorship will be loudly decried. Thus it is interesting to note mystery novelist Mons Kallentoft writing on his blog that the image goes "way, way across the boundary to child porn" and noting with pleasure that this time "the alarm bells" had worked. "It's never ever right to eroticise a child, not even for the most self-aggrandising, priggish artistic purposes", he added. When I reach Kallentoft on the phone he is at first happy to develop his thoughts further.
"The girl in the picture can't choose, she's being watched. There are people on Earth who get turned on by pictures like these, and that constitutes abuse against her no matter how you shake it. Nobody has that right."
But as an adult, the girl in that picture has said that she doesn't view it as pornographic?
"It doesn't work that way. That's like saying that with consent, we're allowed to do whatever we like to each other, and we might as well sign contracts permitting others to murder us ... That picture is child porn and exhibiting it to the public is wrong! I mean sure, OK, you can keep it to yourself in your home."
So would the image be acceptable if it sat in somebody's photo album - where pictures of nude kids are pretty common?
Our interview takes a left turn here. Mons Kallentoft is very upset by my question, or by my matter-of-fact and slightly impersonal way of phrasing it. He asks me if I have experienced any sexual abuse against children. Before I can answer, he angrily declares that he isn't willing to intellectualise this issue further and abruptly ends our conversation.
I feel bad about this, like a cynical and superficial asshole - somebody who is happy to sit in a comfy desk chair under pleasant lighting, with a cup of tea and soft music in the background, writing about this issue as if it were all about aesthetics, while in fact we're talking about children's lives being ruined, children being violated and defiled in unimaginable ways. Do we even have the right to a lukewarm analytical attitude regarding an issue where the stakes are so high?
I don't want to use a fellow human being and colleague's emotional reaction as a rhetorical tool or pedagogical example, but Kallentoft's reaction really shows me how fraught, personal and painful this issue can be. And suddenly I think I have also gained a deeper understanding of how devout Christians or Muslims feel about pictures such as Elisabeth Ohlson Wallin's Ecce Homo or Lars Vilks's Mohammed cartoons. It's such a gross violation that it's impossible to speak rationally about it - a violation that can only get worse when some uncomprehending, disrespectful bastard asks why you feel violated.
Suddenly I understand better how difficult it is to get anywhere when it comes to things that touch the depths of our souls. How much really is at stake.
by Jens Liljestrand, Aardvarchaeology | Read more:
Image: via
Repost
Bob Dylan's MusiCares Person of the Year Speech
I learned lyrics and how to write them from listening to folk songs. And I played them, and I met other people that played them back when nobody was doing it. Sang nothing but these folk songs, and they gave me the code for everything that's fair game, that everything belongs to everyone.
For three or four years all I listened to were folk standards. I went to sleep singing folk songs. I sang them everywhere, clubs, parties, bars, coffeehouses, fields, festivals. And I met other singers along the way who did the same thing and we just learned songs from each other. I could learn one song and sing it next in an hour if I'd heard it just once.
If you sang "John Henry" as many times as me -- "John Henry was a steel-driving man / Died with a hammer in his hand / John Henry said a man ain't nothin' but a man / Before I let that steam drill drive me down / I'll die with that hammer in my hand."
If you had sung that song as many times as I did, you'd have written "How many roads must a man walk down?" too.
Big Bill Broonzy had a song called "Key to the Highway." "I've got a key to the highway / I'm booked and I'm bound to go / Gonna leave here runnin' because walking is most too slow." I sang that a lot. If you sing that a lot, you just might write,
Georgia Sam he had a bloody nose
Welfare Department they wouldn’t give him no clothes
He asked poor Howard where can I go
Howard said there’s only one place I know
Sam said tell me quick man I got to run
Howard just pointed with his gun
And said that way down on Highway 61
You'd have written that too if you'd sang "Key to the Highway" as much as me.
"Ain't no use sit 'n cry / You'll be an angel by and by / Sail away, ladies, sail away." "I'm sailing away my own true love." "Boots of Spanish Leather" -- Sheryl Crow just sung that.
"Roll the cotton down, aw, yeah, roll the cotton down / Ten dollars a day is a white man's pay / A dollar a day is the black man's pay / Roll the cotton down." If you sang that song as many times as me, you'd be writing "I ain't gonna work on Maggie's farm no more," too.
I sang a lot of "come all you" songs. There's plenty of them. There's way too many to be counted. "Come along boys and listen to my tale / Tell you of my trouble on the old Chisholm Trail." Or, "Come all ye good people, listen while I tell / the fate of Floyd Collins a lad we all know well / The fate of Floyd Collins, a lad we all know well."
"Come all ye fair and tender ladies / Take warning how you court your men / They're like a star on a summer morning / They first appear and then they're gone again." "If you'll gather 'round, people / A story I will tell / 'Bout Pretty Boy Floyd, an outlaw / Oklahoma knew him well."
If you sung all these "come all ye" songs all the time, you'd be writing, "Come gather 'round people where ever you roam, admit that the waters around you have grown / Accept that soon you'll be drenched to the bone / If your time to you is worth saving / And you better start swimming or you'll sink like a stone / The times they are a-changing."
You'd have written them too. There's nothing secret about it. You just do it subliminally and unconsciously, because that's all enough, and that's all I sang. That was all that was dear to me. They were the only kinds of songs that made sense. (...)
All these songs are connected. Don't be fooled. I just opened up a different door in a different kind of way. It's just different, saying the same thing. I didn't think it was anything out of the ordinary.
by Bob Dylan, LA Times | Read more:
Image: YouTube
Permission to Fail
[ed. At least the title is accurate. Jeezus. I don't mean to take it out on poor Michelle here because there are so many, many other posts like this on the internet (*cough*, Medium); she just happened to have the unfortunate luck of popping up on my screen today. I have to say, if I see one more angst-ridden, pseudo-motivational article about some young person wrestling with a new career or business (oh, sorry... "startup") I'm going to scream. Just STFU! Please. Do your job, learn what you can and leave it at that. None of this hand-holding and "be brave" posturing will get you anywhere, except maybe with your co-workers, who now have a better idea about your insecurities and can act accordingly. Do you think the heavy rollers at Sequoia Capital have time for this, or even care?]

“Now that we’re on this crazy success trajectory, the degree of stress and the degree of doubt and the degree of second-guessing hasn’t been reduced at all,” he said. “In many respects, it’s actually worse now because there’s more at stake. […] I think I wake up every day and look in the mirror and say, ‘We’ve almost certainly fucked this up completely.‘” - Stewart Butterfield, CEO, Slack
One thing I’ve noticed is that it’s easy to write and to be transparent when things are going great. It’s harder when the roller coaster dips downward, when your stomach feels like it’s in freefall. Way back in January, our team experienced one of those dips. Up until this point, for 24 straight months, Keen IO’s business had grown 5-15% - every month! We were embarrassingly confident, and we felt unstoppable. It was coming off of this incredible rise that reality hit and we got to experience a little more of that roller coaster everyone was talking about.
I wrote the following company-wide memo during that time, when we got to learn what it was like to fall a little bit.
On Tue, Feb 24, 2015 at 5:55 PM, Michelle Wetzler wrote:
Hey everybody - I hope you don’t mind me sharing a relatively raw piece of writing. It started out as a sort-of blog post, but then I realized it’s really a letter to you all. Feeling a bit brave right now and clicking send. Hope it’s helpful.
by Michelle Wetzler, Keen | Read more:
Image: Imgur