Friday, November 22, 2013

Evolution's Other Narrative

During a recent meal with a friend who happens to be a successful engineer, I found myself drawn, as usual, into debate. Although our theological and political views diverge, he and I customarily find common ground in scientific epistemology. However, this time the topic was whether intelligent design should be taught in high schools. When I expressed incredulity at his support for teaching intelligent design, he said, “Brad, just look around us—survival of the fittest can’t be all that’s going on here, and I think it is important to respect people’s sensitivity to that.”

I reminded my friend that, because intelligent design argues for supernatural causes of natural phenomena, teaching it would undermine rational inquiry, together with students’ ability to eventually make the kind of scientific breakthroughs we are enjoying today. I pointed to the U.S. National Institutes of Health’s Human Microbiome Project as an example, which is revealing how human health suffers when the health of the trillions of microorganisms with which we’ve coevolved suffers. My friend’s simplistic interpretation of evolution as “survival of the fittest” left him ignorant even of the possibility of projects like this, which are based on evolutionary considerations of symbiosis. Evidently, educators—and certainly evolutionary specialists themselves—must broadcast a more nuanced story of evolutionary theory. Otherwise, future scientists and projects that inform better approaches to human health and global ecology will be sabotaged before they even emerge.

Science education has failed to overcome entrenched cultural ideals rooted not only in religion, but also in political philosophy. For those like my engineer friend trying to comprehend how magnificent structures of life emerge by means of “survival of the fittest,” skepticism is understandable. Popular appreciation for life’s complexity has far outpaced the popular interpretation of the evolutionary source of that complexity, which has remained stuck in 1864, when Herbert Spencer coined the phrase “survival of the fittest.”

When it comes to the story of evolutionary science, people know the name Charles Darwin, but most do not know the names Ivan Wallin or Lynn Margulis—two more recent, groundbreaking evolutionary theorists. Over the past several decades, these and other researchers have revealed that organisms’ cooperation and interdependence contribute more to evolution than competition. Symbiogenesis—the emergence of a new species through the evolutionary interdependence of two or more species—is at least as important in the history of life as survival of the fittest. Such insight has failed to gain traction in American minds—including those of American scientists—because of cultural history traceable back through the popularization of Adam Smith’s individualist philosophy. (...)

According to Margulis, the evolving relationships between microscopic organisms and other micro- and macroscopic organisms are the essence of the history of life. Despite scientists’ mid-century focus on eukaryotic life (organisms with larger cells featuring a bounded nucleus and organelles), the most prolific type of organism on Earth, bacteria, is prokaryotic (an organism without a bounded nucleus). Virtually all eukaryotic forms of life have formed symbiotic associations with prokaryotic bacteria. Margulis was among the first Western scientists to attempt to popularize this fact. She spent virtually her entire career laboring to bring this mostly microscopic form of evolution to the macroscopic focus of her readers.

Margulis’s research in microbiology equipped her to verify and expand on Wallin’s symbiosis-centered theory. In 1966 she attempted to publish a summary of her perspectives on the evolution of complex life forms in “The Origin of Mitosing Eukaryotic Cells,” only to be rejected by more than a dozen scientific journals. When her article was finally published by the Journal of Theoretical Biology, criticism ensued. Nonetheless, the further Margulis pushed her symbiotic evolutionary theory, the more convinced she became that the emergence of eukaryotic cells a billion and a half years ago—a major evolutionary transition in the history of life—was the result of symbiogenesis.

In Margulis’s view, out of prokaryotic–prokaryotic symbiosis emerged eukaryotes. Out of prokaryotic–eukaryotic symbiosis emerged more competitive eukaryotes. And out of eukaryotic–eukaryotic symbiosis emerged multicellular life. The classic image of evolution, the tree of life, almost always shows only diverging branches; a banyan tree, with both diverging and converging branches, would be a better image. To this day, many scientists and most laypeople remain ignorant of this way of imagining evolution, which profoundly constricts how they imagine themselves.

by Bradford Harris, American Scientist |  Read more:
Image: Endosymbiosis: Homage to Lynn Margulis, by Shoshanah Dubiner

Senate Limits Use of the Filibuster

[ed. There'll be lots of commentary on this (for example: here and here). The Republicans have been abusing the filibuster for a long time. Nice to see some sanity finally restored. And for those who say wait until the shoe's on the other foot, just look at the record.]

The Senate approved the most fundamental alteration of its rules in more than a generation on Thursday, ending the minority party’s ability to filibuster most presidential nominees in response to the partisan gridlock that has plagued Congress for much of the Obama administration.

Furious Republicans accused Democrats of a power grab, warning them that they would deeply regret their action if they lose control of the Senate next year and the White House in years to come. Invoking the Founding Fathers and the meaning of the Constitution, Republicans said Democrats were trampling the minority rights the framers intended to protect. But when the vote was called, Senator Harry Reid, the majority leader who was initially reluctant to force the issue, prevailed 52 to 48.

Under the change, the Senate will be able to cut off debate on executive and judicial branch nominees with a simple majority rather than rounding up a supermajority of 60 votes. The new precedent established by the Senate on Thursday does not apply to Supreme Court nominations or legislation itself.

It represented the culmination of years of frustration over what Democrats denounced as a Republican campaign to stall the machinery of Congress, stymie President Obama’s agenda, and block his picks to cabinet posts and federal judgeships by insisting that virtually everything the Senate approves must be done by a supermajority.

After repeatedly threatening to change the filibuster, Mr. Reid decided to follow through when Republicans refused this week to back down from their effort to keep Mr. Obama from filling any of three vacancies on the most powerful appeals court in the country.

This was the final straw for some Democratic holdouts against limiting the filibuster, providing Mr. Reid with the votes he needed to impose a new standard certain to reverberate through the Senate for years.

“There has been unbelievable, unprecedented obstruction,” Mr. Reid said as he set in motion the steps for the vote on Thursday. “The Senate is a living thing, and to survive it must change as it has over the history of this great country. To the average American, adapting the rules to make the Senate work again is just common sense.”

Republicans accused Democrats of irreparably damaging the character of an institution that in many ways still operates as it did in the 19th century, and of disregarding the constitutional prerogative of the Senate as a body of “advice and consent” on presidential nominations.

“You think this is in the best interest of the United States Senate and the American people?” asked the Republican leader, Senator Mitch McConnell, sounding incredulous.

“I say to my friends on the other side of the aisle, you’ll regret this. And you may regret it a lot sooner than you think,” he added.

Mr. Obama applauded the Senate’s move. “Today’s pattern of obstruction, it just isn’t normal,” he told reporters at the White House. “It’s not what our founders envisioned. A deliberate and determined effort to obstruct everything, no matter what the merits, just to refight the results of an election is not normal, and for the sake of future generations we can’t let it become normal.”

by Jeremy W. Peters, NY Times |  Read more:
Image: Stephen Crowley/The New York Times

Thursday, November 21, 2013

We Do Not Need Another Cat

We are down a cat. It's still too upsetting to talk about (rural life, tentative open window policy, probable coyote, lifetime of horrible, horrible guilt for not sticking to indoor cat guns), but we used to have two cats, the correct number of cats, and now we have one cat. An indoor cat.

And we can't really face the idea of acquiring a second cat, because a) we're having a human baby in a few months, and b) First Cat never really liked having a second cat, and now that she's Only Cat, she's super-stoked about it and prances around like she owns the place, and c) that would involve formally admitting that Second Cat is gone for good.

But, you know, I read the shelter intake emails every morning, even though Second Cat has been almost certainly deceased for a month now, and so I literally page through dozens of pictures of homeless cats on a daily basis, and it makes me feel like a ghoul. Even though, bless 'em, homeless cats usually put on great bitchface for the camera, you know? The dogs have that plaintive "where's my mommy?" thing going, and the cats are all "get that out of my face. I don't need you! I don't need anyone!"

And you start thinking, maybe an elderly boy cat? Just some big orangey lump? But then First Cat is all, "I tolerated Second Cat because she was from the same Brooklyn feral cluster as me. We were basically sisters. Don't push your luck. Did you see what I did to the stuffed bobcat you bought for your nieces?"

I don't know.

by Nicole Cliffe, The Hairpin |  Read more:
Image: uncredited
[ed. Repost]

The Next Housing Crash

The iconic American single-family home was a housing model for a different era, when Baby Boomers were raising their kids, gas was cheap, and suburbia beckoned. But today, Boomers are preparing to retire and downsize, while many of their children are eager to live elsewhere, trading home ownership for rentals, suburbs for cities, two-car garages for more-compact living. Can the housing market alter course in time to accommodate everyone? And what will happen to the Boomers’ dream homes in the suburbs if no one lines up to buy them?


by Emily Badger, The Atlantic |  Read more:
Image: The Atlantic

They're Watching You at Work

[ed. My father used to be a corporate psychologist so I spent many hours with block diagrams and Rorschach tests as a child. I wonder what he thought they revealed?]

The application of predictive analytics to people’s careers—an emerging field sometimes called “people analytics”—is enormously challenging, not to mention ethically fraught. And it can’t help but feel a little creepy. It requires the creation of a vastly larger box score of human performance than one would ever encounter in the sports pages, or that has ever been dreamed up before. To some degree, the endeavor touches on the deepest of human mysteries: how we grow, whether we flourish, what we become. Most companies are just beginning to explore the possibilities. But make no mistake: during the next five to 10 years, new models will be created, and new experiments run, on a very large scale. Will this be a good development or a bad one—for the economy, for the shapes of our careers, for our spirit and self-worth? Earlier this year, I decided to find out. (...)

Consider Knack, a tiny start-up based in Silicon Valley. Knack makes app-based video games, among them Dungeon Scrawl, a quest game requiring the player to navigate a maze and solve puzzles, and Wasabi Waiter, which involves delivering the right sushi to the right customer at an increasingly crowded happy hour. These games aren’t just for play: they’ve been designed by a team of neuroscientists, psychologists, and data scientists to suss out human potential. Play one of them for just 20 minutes, says Guy Halfteck, Knack’s founder, and you’ll generate several megabytes of data, exponentially more than what’s collected by the SAT or a personality test. How long you hesitate before taking every action, the sequence of actions you take, how you solve problems—all of these factors and many more are logged as you play, and then are used to analyze your creativity, your persistence, your capacity to learn quickly from mistakes, your ability to prioritize, and even your social intelligence and personality. The end result, Halfteck says, is a high-resolution portrait of your psyche and intellect, and an assessment of your potential as a leader or an innovator.

When Hans Haringa heard about Knack, he was skeptical but intrigued. Haringa works for the petroleum giant Royal Dutch Shell—by revenue, the world’s largest company last year. For seven years he’s served as an executive in the company’s GameChanger unit: a 12-person team that for nearly two decades has had an outsize impact on the company’s direction and performance. The unit’s job is to identify potentially disruptive business ideas. Haringa and his team solicit ideas promiscuously from inside and outside the company, and then play the role of venture capitalists, vetting each idea, meeting with its proponents, dispensing modest seed funding to a few promising candidates, and monitoring their progress. They have a good record of picking winners, Haringa told me, but identifying ideas with promise has proved to be extremely difficult and time-consuming. The process typically takes more than two years, and less than 10 percent of the ideas proposed to the unit actually make it into general research and development.

When he heard about Knack, Haringa thought he might have found a shortcut. What if Knack could help him assess the people proposing all these ideas, so that he and his team could focus only on those whose ideas genuinely deserved close attention? Haringa reached out, and eventually ran an experiment with the company’s help.

Over the years, the GameChanger team had kept a database of all the ideas it had received, recording how far each had advanced. Haringa asked all the idea contributors he could track down (about 1,400 in total) to play Dungeon Scrawl and Wasabi Waiter, and told Knack how well three-quarters of those people had done as idea generators. (Did they get initial funding? A second round? Did their ideas make it all the way?) He did this so that Knack’s staff could develop game-play profiles of the strong innovators relative to the weak ones. Finally, he had Knack analyze the game-play of the remaining quarter of the idea generators, and asked the company to guess whose ideas had turned out to be best.
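The procedure Haringa describes is, in essence, a standard holdout evaluation: build profiles from the labeled three-quarters of the group, then predict outcomes for the withheld quarter. A minimal sketch of that idea, using invented trait data and a simple nearest-profile rule (every name, number, and the classification method here is an illustrative assumption, not a detail from the article):

```python
import random
from statistics import mean

random.seed(0)

# Toy stand-in for game-play feature vectors: each "idea generator"
# has six numeric traits and a label for whether their idea advanced.
def make_person(successful):
    base = 0.7 if successful else 0.3
    traits = [random.gauss(base, 0.15) for _ in range(6)]
    return traits, successful

people = [make_person(i % 4 == 0) for i in range(1400)]

# The split Haringa describes: profiles are built from three-quarters
# of the labeled group, and the remaining quarter is held out.
split = int(len(people) * 0.75)
train, holdout = people[:split], people[split:]

# Build a "profile" (per-trait mean) for strong and weak innovators.
def centroid(group):
    return [mean(t[i] for t, _ in group) for i in range(6)]

strong = centroid([p for p in train if p[1]])
weak = centroid([p for p in train if not p[1]])

# Each held-out person is predicted by whichever profile is closer.
def predict(traits):
    dist = lambda c: sum((a - b) ** 2 for a, b in zip(traits, c))
    return dist(strong) < dist(weak)

accuracy = mean(predict(t) == label for t, label in holdout)
print(f"holdout accuracy: {accuracy:.2f}")
```

Knack's actual models are surely far richer than a nearest-centroid rule, but the validation logic is the same: if the profiles learned from the labeled group also sort the unseen group correctly, the predictor has demonstrated real signal.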

When the results came back, Haringa recalled, his heart began to beat a little faster. Without ever seeing the ideas, without meeting or interviewing the people who’d proposed them, without knowing their title or background or academic pedigree, Knack’s algorithm had identified the people whose ideas had panned out. The top 10 percent of the idea generators as predicted by Knack were in fact those who’d gone furthest in the process. Knack identified six broad factors as especially characteristic of those whose ideas would succeed at Shell: “mind wandering” (or the tendency to follow interesting, unexpected offshoots of the main task at hand, to see where they lead), social intelligence, “goal-orientation fluency,” implicit learning, task-switching ability, and conscientiousness. Haringa told me that this profile dovetails with his impression of a successful innovator. “You need to be disciplined,” he said, but “at all times you must have your mind open to see the other possibilities and opportunities.”

What Knack is doing, Haringa told me, “is almost like a paradigm shift.” It offers a way for his GameChanger unit to avoid wasting time on the 80 people out of 100—nearly all of whom look smart, well-trained, and plausible on paper—whose ideas just aren’t likely to work out. If he and his colleagues were no longer mired in evaluating “the hopeless folks,” as he put it to me, they could solicit ideas even more widely than they do today and devote much more careful attention to the 20 people out of 100 whose ideas have the most merit.

Haringa is now trying to persuade his colleagues in the GameChanger unit to use Knack’s games as an assessment tool. But he’s also thinking well beyond just his own little part of Shell. He has encouraged the company’s HR executives to think about applying the games to the recruitment and evaluation of all professional workers. Shell goes to extremes to try to make itself the world’s most innovative energy company, he told me, so shouldn’t it apply that spirit to developing its own “human dimension”?

by Don Peck, The Atlantic |  Read more:
Image: Peter Yang

Arturo Agostino, Vorticella
via:

Foo Fighters

Stuxnet's Secret Twin

Three years after it was discovered, Stuxnet, the first publicly disclosed cyberweapon, continues to baffle military strategists, computer security experts, political decision-makers, and the general public. A comfortable narrative has formed around the weapon: how it attacked the Iranian nuclear facility at Natanz, how it was designed to be undiscoverable, how it escaped from Natanz against its creators' wishes. Major elements of that story are either incorrect or incomplete.

That's because Stuxnet is not really one weapon, but two. The vast majority of the attention has been paid to Stuxnet's smaller and simpler attack routine -- the one that changes the speeds of the rotors in a centrifuge, which is used to enrich uranium. But the second and "forgotten" routine is about an order of magnitude more complex and stealthy. It qualifies as a nightmare for those who understand industrial control system security. And strangely, this more sophisticated attack came first. The simpler, more familiar routine followed only years later -- and was discovered in comparatively short order.

With Iran's nuclear program back at the center of world debate, it's helpful to understand with more clarity the attempts to digitally sabotage that program. Stuxnet's actual impact on the Iranian nuclear program is unclear, if only for the fact that no information is available on how many controllers were actually infected. Nevertheless, forensic analysis can tell us what the attackers intended to achieve, and how. I've spent the last three years conducting that analysis -- not just of the computer code, but of the physical characteristics of the plant environment that was attacked and of the process that this nuclear plant operates. What I've found is that the full picture, which includes the first and lesser-known Stuxnet variant, invites a re-evaluation of the attack. It turns out that it was far more dangerous than the cyberweapon that is now lodged in the public's imagination.

In 2007, an unidentified person submitted a sample of code to the computer security site VirusTotal. It later turned out to be the first variant of Stuxnet -- at least, the first one that we're aware of. But that was only realized five years later, with the knowledge of the second Stuxnet variant. Without that later and much simpler version, the original Stuxnet might still be sleeping in the archives of anti-virus researchers, unidentified as one of the most aggressive cyberweapons in history. We now know that the code contained a payload for severely interfering with the system designed to protect the centrifuges at the Natanz uranium-enrichment plant.

Stuxnet's later, and better-known, attack tried to cause centrifuge rotors to spin too fast and at speeds that would cause them to break. The "original" payload used a different tactic. It attempted to overpressurize Natanz's centrifuges by sabotaging the system meant to keep the cascades of centrifuges safe. (...)

Natanz's cascade protection system relies on Siemens S7-417 industrial controllers to operate the valves and pressure sensors of up to six cascades, or groups of 164 centrifuges each. A controller can be thought of as a small embedded computer system that is directly connected to physical equipment, such as valves. Stuxnet was designed to infect these controllers and take complete control of them in a way that previous users had never imagined -- and that had never even been discussed at industrial control system conferences.

A controller infected with the first Stuxnet variant actually becomes decoupled from physical reality. Legitimate control logic only "sees" what Stuxnet wants it to see. Before the attack sequence executes (which is approximately once per month), the malicious code is kind enough to show operators in the control room the physical reality of the plant floor. But that changes during attack execution.

One of the first things this Stuxnet variant does is take steps to hide its tracks, using a trick straight out of Hollywood. Stuxnet records the cascade protection system's sensor values for a period of 21 seconds. Then it replays those 21 seconds in a constant loop during the execution of the attack. In the control room, all appears to be normal, both to human operators and any software-implemented alarm routines.
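The record-and-replay trick is easy to illustrate. The sketch below is a loose Python reconstruction for clarity only; the real attack ran as logic on Siemens S7 controllers, and all values here are invented:

```python
import itertools

RECORD_SECONDS = 21

def sensor_stream():
    """Stand-in for live pressure-sensor readings, one per second."""
    for t in itertools.count():
        yield 100.0 + (t % 7) * 0.5  # arbitrary benign-looking values

def replayed_view(live_readings, attack_active):
    """What the control room 'sees': live data normally, a constant
    loop of 21 seconds of prerecorded data while the attack runs."""
    recorded = [next(live_readings) for _ in range(RECORD_SECONDS)]
    loop = itertools.cycle(recorded)
    while True:
        real = next(live_readings)  # actual plant state, hidden during attack
        yield next(loop) if attack_active() else real

# During the attack, operators see only the stale 21-second loop,
# no matter what the real pressure values are doing.
view = replayed_view(sensor_stream(), attack_active=lambda: True)
first_cycle = [next(view) for _ in range(RECORD_SECONDS)]
second_cycle = [next(view) for _ in range(RECORD_SECONDS)]
print(first_cycle == second_cycle)  # the replayed loop repeats exactly
```

The essential point survives the simplification: once the displayed values are decoupled from the live sensors, neither human operators nor automated alarm logic watching that display can detect the sabotage.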

by Ralph Langner, FP |  Read more:
Image: uncredited

A Case for Life Panels

[ed. See also: How Doctors Die: Showing Others the Way]

At the beginning of 2012, my mother was 95 years old. She lived in an assisted-living center, with hospice care, in a hospital bed 24/7. She was hollow-eyed and emaciated. Though she had moments of clarity, she was confused, anxious and uncomfortable. Her quality of life was minimal, at best. And the cost to keep her in this condition had risen to close to $100,000 a year.

Three years earlier, when she was completely rational, my mother told me that while she had lived a full and rewarding life, she was ready to go. By 2012, when her life was more punishment than reward, she did not have the mental faculties to reaffirm her desire, nor was there a legal way to carry out her decision. Even if my mother had been living in one of the states like Oregon, Washington or Vermont that have “death with dignity” statutes on their books, the fact that she lacked mental competency to request an assisted death by 2012 almost certainly would have ruled out any possibility that the state would have granted her wish.

Nor would it have been an option to move her to one of the few countries that have removed the legal perils of a decision to end one’s life. It was hard enough to get my mother from her bed to her chair. How would I have transported her to the Netherlands?

No, there is only one solution to this type of situation, for anyone who may encounter it in the future. What is needed here, I suggest, is not a death panel. It’s a “life panel” with the legal authority to ensure that my mother’s request to end her own life, on her own terms, would be honored. (...)

Some of my clients are extremely realistic about the crushing expenses they could face in their final years. Others are more sanguine. When I tell them that their money is unlikely to last through their 90s, they say: “Well, that’s O.K. I don’t plan to live past 85, anyway.” I have a standard answer in these cases. I say: “Yes, you expect to die at 85, but what if you’re unlucky? What if you live to 95?” At that point, I tell them about my mother. Then we get down to work.

Occasionally, people tell me that their end dates are guaranteed. They are saving pills that will put them out of their misery, or they have made “arrangements” with friends. For all their planning, my clients do not realize that when the time comes, they may be too sick or demented to carry out their do-it-yourself strategies.

And so we come back to the life panel. Who is on it? Certainly, a doctor would be involved. After all, we laymen might feel guilty about making decisions that would hasten the end of a life, but under current law in most states, doctors would be guilty — of murder. On a life panel, a doctor would be held blameless. And I would have no problem adding a medical ethicist and a therapist.

Most important, I think the individual should be allowed to nominate panelists who are likely to understand the person’s wishes: family members, close friends, a person with whom they share religious beliefs.

This may seem like a reach, but in fact we already come quite close to this now. As any financial planner will tell you, everyone needs a living will. This is a legal document that instructs a surrogate or a medical center on the level of life-prolonging or palliative care you want if you become unable to make medical decisions.

But legal documents go only so far. Doctors I have asked about this issue know firsthand the uncertainties of deciding when a person has lost medical decision-making capacity. Nor is it possible to write out instructions for every possible medical eventuality.

A life panel might not be the perfect solution, but neither is draining a family’s resources to support a joyless existence in a hospital bed.

by Bob Goldman, NY Times |  Read more:
Image: Federica Bordoni

Wednesday, November 20, 2013

All The Selves We Have Been

It is when we are young that we are most obviously busy with the project of trying to construct a self we hope the world will appreciate, monitoring and rearranging the impressions we make upon others. Yet as we age, most of us are still trying to hold on to some sense of who and what we are, however hard this may become for those who start to feel increasingly invisible. Everywhere I look nowadays I see older people busily engaged with the world and eager, just as I am, to relate to others, while also struggling to shore up favored ways of seeing ourselves. However, the world in general is rarely sympathetic to these attempts, as though the time had come, or were long overdue, for the elderly to withdraw altogether from worrying about how they appear to others. In my view, such a time never comes, which means finding much better ways of affirming old age than those currently available. (...)

Aging encompasses so much, and yet most people’s thoughts about it embrace so little. Against the dominant fixation, for instance, I write not primarily about aging bodies, with their rising demands, frequent embarrassments, and endless diversities—except that of course our bodies are there, in every move we make, or sometimes fail to complete. I have little to say, either, about the corrosions of dementia. It is telling nowadays how often those who address the topic of aging alight on dementia—often, paradoxically, in criticism of others who simply equate aging with decline, while doing just this themselves. For the faint-hearted, I need to point out that although the incidence of dementia will indeed accelerate in the age group now headed towards their nineties, even amongst the very oldest it will not predominate—though this information hardly eliminates our fear of such indisputable decline.

Conversely, I do not make, or not in quite the usual way, an exploration of those many narratives of resilience, which suggest that with care of the self, diligent monitoring, and attention to spiritual concerns we can postpone aging itself, at least until those final moments of very old age. On this view, we can stay healthy, fit and “young”—or youngish—performing our yoga, practicing Pilates, eating our greens, avoiding hazards and spurning envy and resentment. It is true, we may indeed remain healthy, but we will not stay young. “You are only as old as you feel,” though routinely offered as a jolly form of reassurance, carries its own disavowal of old age.

Aging faces, aging bodies, as we should know, are endlessly diverse. Many of them are beautifully expressive, once we choose to look—those eyes rarely lose their luster, when engrossed. However, I am primarily concerned with the possibilities for and impediments to staying alive to life itself, whatever our age. This takes me first of all to the temporal paradoxes of aging, and to enduring ways of remaining open and attached to the world.

As we age, changing year on year, we also retain, in one manifestation or another, traces of all the selves we have been, creating a type of temporal vertigo and rendering us psychically, in one sense, all ages and no age. “All ages and no age” is an expression once used by the psychoanalyst Donald Winnicott to describe the wayward temporality of psychic life, writing of his sense of the multiple ages he could detect in those patients once arriving to lie on the couch at his clinic in Hampstead in London. Thus the older we are the more we encounter the world through complex layerings of identity, attempting to negotiate the shifting present while grappling with the disconcerting images of the old thrust so intrusively upon us. “Live in the layers, / not on the litter,” the North American poet, Stanley Kunitz, wrote in one of his beautiful poems penned in his seventies. (...)

“I don’t feel old,” elderly informants repeatedly told the oral historian Paul Thompson. Their voices echo the words he’d read in his forays into published autobiography and archived interviews. Similarly, in the oral histories collected by the writer Ronald Blythe, an eighty-four-year-old ex-schoolmaster reflects: “I tend to look upon other old men as old men—and not include myself… My boyhood stays imperishable and is such a great part of me now. I feel it very strongly—more than ever before.”

“How can a 17-year-old, like me, suddenly be 81?” the exactingly scientific developmental biologist Lewis Wolpert asks in the opening sentences of his book on the surprising nature of old age, wryly entitled You’re Looking Very Well. Once again, this keen attachment to youth tells us a great deal about the stigma attending old age: “you’re looking old” would never be said, except to insult. On the one hand there can be a sense of continuous fluidity, as we travel through time; on the other, it is hard to ignore those distinct positions we find ourselves in as we age, whatever the temptation. I have been finding, however, that it becomes easier to face up to my own anxieties about aging after surveying the radical ambiguities in the speech or writing of others thinking about the topic, especially when they do so neither to lament nor to celebrate old age, but simply to affirm it as a significant part of life. This is the trigger for the words that follow, as I assemble different witnesses to help guide me through the thoughts that once kept me awake at night, pondering all the things that have mattered to me and wondering what difference aging makes to my continuing ties to them.

by Lynne Segal, Guernica |  Read more:
Image: from Flickr via Abode of Chaos

Paul Gauguin, The Meal. Musée d'Orsay, Paris

Most Lives Are Lived by Default

Jamie lives in a large city in the midwest. He’s a copywriter for an advertising firm, and he’s good at it.

He’s also good at thinking of reasons why he ought to be happy with his life. He has health insurance, and now savings. A lot of his friends have neither. His girlfriend is pretty. They never fight. His boss has a sense of humor, doesn’t micromanage, and lets him go early most Fridays.

On most of those Fridays, including this one, instead of taking the train back to his suburban side-by-side, he walks to a downtown pub to meet his friends. He will have four beers. His friends always stay longer.

Jamie’s girlfriend Linda typically arrives on his third beer. She greets them all with polite hugs, Jamie with a kiss. He orders his final beer when she orders her only one. They take a taxi home, make dinner together, and watch a movie on Netflix. When it’s over they start a second one and don’t finish it. They have sex, then she goes to wash her face and brush her teeth. When she returns, he goes.

There was never a day Jamie sat down and decided to be a copywriter living in the Midwest. A pair of lawyers at his ex-girlfriend’s firm took him out one night when he was freshly laid-off from writing for a tech magazine, bought him a hundred dollars worth of drinks and gave him the business card of his current boss. It was a great night. That was nine years ago.

His friends are from his old job. White collar, artsy and smart. If one of the five of them is missing at the pub on Friday, they’ll have lunch during the week.

Jamie isn’t unhappy. He’s bored, but doesn’t quite realize it. As he gets older his boredom is turning to fear. He has no health problems but he thinks about them all the time. Cancer. Arthritis. Alzheimer’s. He’s thirty-eight, fit, has no plans for children, and when he really thinks about the course of his life he doesn’t quite know what to do with himself, except on Fridays.

In two months he and Linda are going to Cuba for ten days. He’s looking forward to that right now.

***

A few weeks ago I asked everyone reading to share their biggest problem in life in the comment section. I’ve done this before — asking what’s going on with you — and every time I do I notice two things.

The first thing is that everyone has considerable problems. Not simply occasional tough spots, but the type of issue that persists for years or decades. The kind that becomes a theme in life, that feels like part of your identity. By the sounds of it, it’s typical among human beings to feel like something huge is missing.

The other thing is that they tend to be one of the same few problems: lack of human connection, lack of personal freedom (due to money or family situations), lack of confidence or self-esteem, or lack of self-control.

The day-to-day feel and quality of each of our lives sits on a few major structures: where we live, what we do for a living, what we do with ourselves when we’re not at work, and which people we spend most of our time with.

by David Cain, Raptitude |  Read more:
Image: uncredited

Craft Transit

At the 2013 Walking Summit early this month in Washington, DC, I spent a lot of time looking at other people’s shoes.

My interest in footwear-as-fashion borders on nil, but I was curious about locomotion. I saw a lot of sensible, flat-heeled shoes on women, and some efficient Tevas and Hi-Tecs on men. But also quite a few painful and pointy dress shoes on both sexes, all inappropriate for walking more than to the nearest Starbucks. I tried not to judge, but, well, what can I say?

I spent two days at the summit listening, learning, and chatting with advocates for walking. It brought together a diverse crowd of nearly 400 people: urban planners, doctors, transit advocates, public health professionals, recreational trail directors, and people who blog and write about getting around. They talked about how much we walk, why we don’t do more of it, where we walk, how to get people walking more.

As at conferences everywhere, these discussions were decked out with splashy statistics. Many came from a newly released survey about American attitudes toward walking, which had been commissioned by health care provider Kaiser Permanente (the muscle behind the summit). Seventy-nine percent of Americans, for instance, agree that they “should probably walk more.” And 66 percent believe that distracted drivers are a problem in their neighborhoods.

But one statistic really caught my attention: 72 percent of respondents think walking “is cool.”

Seriously? I suspect a finger on the scale. Because walking has long been the antithesis of cool. Walking is what the elderly do in malls. Walking is what the poor do because they can’t afford righteous wheels, or even bus fare. Walking is what a baseball player does, with a limp, when he’s hit by a ball — it’s the opposite of a home run. And race walkers? They may have set back walking by several generations with their alarmingly wobbly, hip-gimballing walk. The Facebook page “Walking is Cool?” It has a total of seven “likes.”

Walking as a cool activity is hobbled by a number of obstructions. For instance, those who crusade for walking often scare the common people with exclamation points. “Fun you say? Yes, fun!” enthuses a web site advocating walking, posted under a heading reading “Why Not Walk?!” Many walking advocates appear to use keyboards lacking the basic period. You could lose an eye on all their punctuation. True believers scare people.

This is compounded by a persistent belief — at least among many I’ve spoken with — that walking is quite possibly the most boring activity anyone can engage in. Washing dishes by hand is preferable. It’s no coincidence that a synonym for “boring” is “pedestrian.” One young woman — who has evidently been so traumatized by exclamation points that she can no longer employ any punctuation whatsoever — recently groused on an online forum: “I try and try but I can't stand it its too boring I tried listening to songs on my iPod and even walking with a friend but its no use I just don't like walking… but the thing is I want to walk but can’t.”

In my experience, many others share her view that walking may be good, but leads to a slow death by boredom. The only cure? Take two automobiles and call me in the morning.

Running isn’t saddled with this baggage. This is in part because when you run briskly down a city street, all rustly in your nylon, it conveys that you’re a can-do person with a busy life, although not too busy to take care of The Big Dog. In contrast, when someone walks past, they’re invisible, or if they’re walking a bit faster than normal, one may note them only to assume they’ve missed their bus. Also, running has cool accessories that convey social status and tech savviness. Last summer, for instance, Adidas introduced Springblade, “the first running shoe with individually tuned blades engineered to help propel runners forward with one of the most effective energy returns in the industry.” I assume they couldn’t call it “Bladerunner” because of trademark issues, which is too bad. I don’t even run and I want a pair.

Same with biking — cool and expensive equipment is abundant, including jerseys in colors garish enough to be seen from the orbiting space station. Of course, the dork-helmet remains one of our generation’s unresolved problems, but great minds are at work on this.

How to overcome walking’s dull reputation?

by Wayne Curtis, The Smart Set |  Read more:
Image: Wayne Curtis