Thursday, November 21, 2013

We Do Not Need Another Cat

We are down a cat. It's still too upsetting to talk about (rural life, tentative open window policy, probable coyote, lifetime of horrible, horrible guilt for not sticking to our indoor-cat guns), but we used to have two cats, the correct number of cats, and now we have one cat. An indoor cat.

And we can't really face the idea of acquiring a second cat, because a) we're having a human baby in a few months, and b) First Cat never really liked having a second cat, and now that she's Only Cat, she's super-stoked about it and prances around like she owns the place, and c) that would involve formally admitting that Second Cat is gone for good.

But, you know, I read the shelter intake emails every morning, even though Second Cat has almost certainly been deceased for a month now, and so I literally page through dozens of pictures of homeless cats on a daily basis, and it makes me feel like a ghoul. Even though, bless 'em, homeless cats usually put on great bitchface for the camera, you know? The dogs have that plaintive "where's my mommy?" thing going, and the cats are all "get that out of my face. I don't need you! I don't need anyone!"

And you start thinking, maybe an elderly boy cat? Just some big orangey lump? But then First Cat is all, "I tolerated Second Cat because she was from the same Brooklyn feral cluster as me. We were basically sisters. Don't push your luck. Did you see what I did to the stuffed bobcat you bought for your nieces?"

I don't know.

by Nicole Cliffe, The Hairpin |  Read more:
Image: uncredited
[ed. Repost]

The Next Housing Crash

The iconic American single-family home was a housing model for a different era, when Baby Boomers were raising their kids, gas was cheap, and suburbia beckoned. But today, Boomers are preparing to retire and downsize, while many of their children are eager to live elsewhere, trading home ownership for rentals, suburbs for cities, two-car garages for more-compact living. Can the housing market alter course in time to accommodate everyone? And what will happen to the Boomers’ dream homes in the suburbs if no one lines up to buy them?


by Emily Badger, The Atlantic |  Read more:
Image: The Atlantic

They're Watching You at Work

[ed. My father used to be a corporate psychologist so I spent many hours with block diagrams and Rorschach tests as a child. I wonder what he thought they revealed?]

The application of predictive analytics to people’s careers—an emerging field sometimes called “people analytics”—is enormously challenging, not to mention ethically fraught. And it can’t help but feel a little creepy. It requires the creation of a vastly larger box score of human performance than one would ever encounter in the sports pages, or that has ever been dreamed up before. To some degree, the endeavor touches on the deepest of human mysteries: how we grow, whether we flourish, what we become. Most companies are just beginning to explore the possibilities. But make no mistake: during the next five to 10 years, new models will be created, and new experiments run, on a very large scale. Will this be a good development or a bad one—for the economy, for the shapes of our careers, for our spirit and self-worth? Earlier this year, I decided to find out. (...)

Consider Knack, a tiny start-up based in Silicon Valley. Knack makes app-based video games, among them Dungeon Scrawl, a quest game requiring the player to navigate a maze and solve puzzles, and Wasabi Waiter, which involves delivering the right sushi to the right customer at an increasingly crowded happy hour. These games aren’t just for play: they’ve been designed by a team of neuroscientists, psychologists, and data scientists to suss out human potential. Play one of them for just 20 minutes, says Guy Halfteck, Knack’s founder, and you’ll generate several megabytes of data, exponentially more than what’s collected by the SAT or a personality test. How long you hesitate before taking every action, the sequence of actions you take, how you solve problems—all of these factors and many more are logged as you play, and then are used to analyze your creativity, your persistence, your capacity to learn quickly from mistakes, your ability to prioritize, and even your social intelligence and personality. The end result, Halfteck says, is a high-resolution portrait of your psyche and intellect, and an assessment of your potential as a leader or an innovator.

When Hans Haringa heard about Knack, he was skeptical but intrigued. Haringa works for the petroleum giant Royal Dutch Shell—by revenue, the world’s largest company last year. For seven years he’s served as an executive in the company’s GameChanger unit: a 12-person team that for nearly two decades has had an outsize impact on the company’s direction and performance. The unit’s job is to identify potentially disruptive business ideas. Haringa and his team solicit ideas promiscuously from inside and outside the company, and then play the role of venture capitalists, vetting each idea, meeting with its proponents, dispensing modest seed funding to a few promising candidates, and monitoring their progress. They have a good record of picking winners, Haringa told me, but identifying ideas with promise has proved to be extremely difficult and time-consuming. The process typically takes more than two years, and less than 10 percent of the ideas proposed to the unit actually make it into general research and development.

When he heard about Knack, Haringa thought he might have found a shortcut. What if Knack could help him assess the people proposing all these ideas, so that he and his team could focus only on those whose ideas genuinely deserved close attention? Haringa reached out, and eventually ran an experiment with the company’s help.

Over the years, the GameChanger team had kept a database of all the ideas it had received, recording how far each had advanced. Haringa asked all the idea contributors he could track down (about 1,400 in total) to play Dungeon Scrawl and Wasabi Waiter, and told Knack how well three-quarters of those people had done as idea generators. (Did they get initial funding? A second round? Did their ideas make it all the way?) He did this so that Knack’s staff could develop game-play profiles of the strong innovators relative to the weak ones. Finally, he had Knack analyze the game-play of the remaining quarter of the idea generators, and asked the company to guess whose ideas had turned out to be best.
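[ed. For readers curious about the mechanics, here is a minimal sketch of the holdout design described above: fit a model on the three-quarters of contributors whose outcomes were disclosed, then rank the withheld quarter and check the top of the list against reality. Knack's actual features and model are proprietary, so the synthetic game-play data and the logistic regression below are assumed stand-ins, not their method.]

```python
# Sketch of the GameChanger-style holdout experiment (assumptions noted above).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical per-contributor game-play features; in reality these would be
# engineered from raw event logs (hesitation times, action sequences, etc.).
n = 1400
X = rng.normal(size=(n, 6))
y = (X @ rng.normal(size=6) + rng.normal(size=n) > 1.0).astype(int)  # 1 = idea advanced

# Three-quarters disclosed for model-building, one quarter held out blind.
X_train, X_hold, y_train, y_hold = train_test_split(X, y, test_size=0.25, random_state=0)

model = LogisticRegression().fit(X_train, y_train)

# Rank the held-out contributors and inspect the predicted top 10 percent,
# mirroring the "top 10 percent of idea generators" result reported below.
scores = model.predict_proba(X_hold)[:, 1]
top = np.argsort(scores)[::-1][: max(1, len(scores) // 10)]
print("actual success rate among predicted top 10%:", y_hold[top].mean())
```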

When the results came back, Haringa recalled, his heart began to beat a little faster. Without ever seeing the ideas, without meeting or interviewing the people who’d proposed them, without knowing their title or background or academic pedigree, Knack’s algorithm had identified the people whose ideas had panned out. The top 10 percent of the idea generators as predicted by Knack were in fact those who’d gone furthest in the process. Knack identified six broad factors as especially characteristic of those whose ideas would succeed at Shell: “mind wandering” (or the tendency to follow interesting, unexpected offshoots of the main task at hand, to see where they lead), social intelligence, “goal-orientation fluency,” implicit learning, task-switching ability, and conscientiousness. Haringa told me that this profile dovetails with his impression of a successful innovator. “You need to be disciplined,” he said, but “at all times you must have your mind open to see the other possibilities and opportunities.”

What Knack is doing, Haringa told me, “is almost like a paradigm shift.” It offers a way for his GameChanger unit to avoid wasting time on the 80 people out of 100—nearly all of whom look smart, well-trained, and plausible on paper—whose ideas just aren’t likely to work out. If he and his colleagues were no longer mired in evaluating “the hopeless folks,” as he put it to me, they could solicit ideas even more widely than they do today and devote much more careful attention to the 20 people out of 100 whose ideas have the most merit.

Haringa is now trying to persuade his colleagues in the GameChanger unit to use Knack’s games as an assessment tool. But he’s also thinking well beyond just his own little part of Shell. He has encouraged the company’s HR executives to think about applying the games to the recruitment and evaluation of all professional workers. Shell goes to extremes to try to make itself the world’s most innovative energy company, he told me, so shouldn’t it apply that spirit to developing its own “human dimension”?

by Don Peck, The Atlantic |  Read more:
Image: Peter Yang

Arturo Agostino, Vorticella
via:

Foo Fighters

Stuxnet's Secret Twin

Three years after it was discovered, Stuxnet, the first publicly disclosed cyberweapon, continues to baffle military strategists, computer security experts, political decision-makers, and the general public. A comfortable narrative has formed around the weapon: how it attacked the Iranian nuclear facility at Natanz, how it was designed to be undiscoverable, how it escaped from Natanz against its creators' wishes. Major elements of that story are either incorrect or incomplete.

That's because Stuxnet is not really one weapon, but two. The vast majority of the attention has been paid to Stuxnet's smaller and simpler attack routine -- the one that changes the speeds of the rotors in a centrifuge, which is used to enrich uranium. But the second and "forgotten" routine is about an order of magnitude more complex and stealthy. It qualifies as a nightmare for those who understand industrial control system security. And strangely, this more sophisticated attack came first. The simpler, more familiar routine followed only years later -- and was discovered in comparatively short order.

With Iran's nuclear program back at the center of world debate, it's helpful to understand with more clarity the attempts to digitally sabotage that program. Stuxnet's actual impact on the Iranian nuclear program is unclear, if only for the fact that no information is available on how many controllers were actually infected. Nevertheless, forensic analysis can tell us what the attackers intended to achieve, and how. I've spent the last three years conducting that analysis -- not just of the computer code, but of the physical characteristics of the plant environment that was attacked and of the process that this nuclear plant operates. What I've found is that the full picture, which includes the first and lesser-known Stuxnet variant, invites a re-evaluation of the attack. It turns out that it was far more dangerous than the cyberweapon that is now lodged in the public's imagination.

In 2007, an unidentified person submitted a sample of code to the computer security site VirusTotal. It later turned out to be the first variant of Stuxnet -- at least, the first one that we're aware of. But that was only realized five years later, with the knowledge of the second Stuxnet variant. Without that later and much simpler version, the original Stuxnet might still today sleep in the archives of anti-virus researchers, unidentified as one of the most aggressive cyberweapons in history. We now know that the code contained a payload for severely interfering with the system designed to protect the centrifuges at the Natanz uranium-enrichment plant.

Stuxnet's later, and better-known, attack tried to cause centrifuge rotors to spin too fast and at speeds that would cause them to break. The "original" payload used a different tactic. It attempted to overpressurize Natanz's centrifuges by sabotaging the system meant to keep the cascades of centrifuges safe. (...)

Natanz's cascade protection system relies on Siemens S7-417 industrial controllers to operate the valves and pressure sensors of up to six cascades, or groups of 164 centrifuges each. A controller can be thought of as a small embedded computer system that is directly connected to physical equipment, such as valves. Stuxnet was designed to infect these controllers and take complete control of them in a way that previous users had never imagined -- and that had never even been discussed at industrial control system conferences.

A controller infected with the first Stuxnet variant actually becomes decoupled from physical reality. Legitimate control logic only "sees" what Stuxnet wants it to see. Before the attack sequence executes (which is approximately once per month), the malicious code is kind enough to show operators in the control room the physical reality of the plant floor. But that changes during attack execution.

One of the first things this Stuxnet variant does is take steps to hide its tracks, using a trick straight out of Hollywood. Stuxnet records the cascade protection system's sensor values for a period of 21 seconds. Then it replays those 21 seconds in a constant loop during the execution of the attack. In the control room, all appears to be normal, both to human operators and any software-implemented alarm routines.
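[ed. The record-and-replay trick is simple enough to show in a few lines. The real payload ran as injected code on the Siemens S7-417 controllers, not in Python, and read_sensors() and the one-sample-per-second rate below are invented stand-ins; this is only a toy illustration of the concept.]

```python
# Toy illustration of the 21-second record-and-replay concealment described above.
import itertools
import time

RECORD_SECONDS = 21

def read_sensors():
    """Hypothetical stand-in for sampling the cascade protection system's sensors."""
    return {"header_pressure": 2.7, "isolation_valve": "open"}

def record_window(seconds=RECORD_SECONDS):
    """Capture a window of normal readings, one sample per second."""
    frames = []
    for _ in range(seconds):
        frames.append(read_sensors())
        time.sleep(1)
    return frames

def replayed_view(frames):
    """What operators and alarm routines see while the attack runs: the same
    recorded window cycled indefinitely in place of live values."""
    return itertools.cycle(frames)

if __name__ == "__main__":
    stale = replayed_view(record_window())
    for _ in range(5):
        print(next(stale))  # looks normal, regardless of what the plant is doing
```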

by Ralph Langner, FP |  Read more:
Image: uncredited

A Case for Life Panels

[ed. See also: How Doctors Die: Showing Others the Way]

At the beginning of 2012, my mother was 95 years old. She lived in an assisted-living center, with hospice care, in a hospital bed 24/7. She was hollow-eyed and emaciated. Though she had moments of clarity, she was confused, anxious and uncomfortable. Her quality of life was minimal, at best. And the cost to keep her in this condition had risen to close to $100,000 a year.

Three years earlier, when she was completely rational, my mother told me that while she had lived a full and rewarding life, she was ready to go. By 2012, when her life was more punishment than reward, she did not have the mental faculties to reaffirm her desire, nor was there a legal way to carry out her decision. Even if my mother had been living in one of the states like Oregon, Washington or Vermont that have “death with dignity” statutes on their books, the fact that she lacked mental competency to request an assisted death by 2012 almost certainly would have ruled out any possibility that the state would have granted her wish.

Nor would it have been an option to move her to one of the few countries that have removed the legal perils of a decision to end one’s life. It was hard enough to get my mother from her bed to her chair. How would I have transported her to the Netherlands?

No, there is only one solution to this type of situation, for anyone who may encounter it in the future. What is needed here, I suggest, is not a death panel. It’s a “life panel” with the legal authority to ensure that my mother’s request to end her own life, on her own terms, would be honored. (...)

Some of my clients are extremely realistic about the crushing expenses they could face in their final years. Others are more sanguine. When I tell them that their money is unlikely to last through their 90s, they say: “Well, that’s O.K. I don’t plan to live past 85, anyway.” I have a standard answer in these cases. I say: “Yes, you expect to die at 85, but what if you’re unlucky? What if you live to 95?” At that point, I tell them about my mother. Then we get down to work.

Occasionally, people tell me that their end dates are guaranteed. They are saving pills that will put them out of their misery, or they have made “arrangements” with friends. For all their planning, my clients do not realize that when the time comes, they may be too sick or demented to carry out their do-it-yourself strategies. And so we come back to the life panel. Who is on it? Certainly, a doctor would be involved. After all, we laymen might feel guilty about making decisions that would hasten the end of a life, but under current law in most states, doctors would be guilty — of murder. On a life panel, a doctor would be held blameless. And I would have no problem adding a medical ethicist and a therapist.

Most important, I think the individual should be allowed to nominate panelists who are likely to understand the person’s wishes: family members, close friends, a person with whom they share religious beliefs.

This may seem like a reach, but in fact we already come quite close to this now. As any financial planner will tell you, everyone needs a living will. This is a legal document that instructs a surrogate or a medical center on the level of life-prolonging or palliative care you want if you become unable to make medical decisions.

But legal documents go only so far. Doctors I have asked about this issue know firsthand the uncertainties of deciding when a person has lost medical decision-making capacity. Nor is it possible to write out instructions for every possible medical eventuality.

A life panel might not be the perfect solution, but neither is draining a family’s resources to support a joyless existence in a hospital bed.

by Bob Goldman, NY Times |  Read more:
Image: Federica Bordoni

Wednesday, November 20, 2013

All The Selves We Have Been

It is when we are young that we are most obviously busy with the project of trying to construct a self we hope the world will appreciate, monitoring and rearranging the impressions we make upon others. Yet as we age, most of us are still trying to hold on to some sense of who and what we are, however hard this may become for those who start to feel increasingly invisible. Everywhere I look nowadays I see older people busily engaged with the world and eager, just as I am, to relate to others, while also struggling to shore up favored ways of seeing ourselves. However, the world in general is rarely sympathetic to these attempts, as though the time had come, or were long overdue, for the elderly to withdraw altogether from worrying about how they appear to others. In my view, such a time never comes, which means finding much better ways of affirming old age than those currently available. (...)

Aging encompasses so much, and yet most people’s thoughts about it embrace so little. Against the dominant fixation, for instance, I write not primarily about aging bodies, with their rising demands, frequent embarrassments, and endless diversities—except that of course our bodies are there, in every move we make, or sometimes fail to complete. I have little to say, either, about the corrosions of dementia. It is telling nowadays how often those who address the topic of aging alight on dementia—often, paradoxically, in criticism of others who simply equate aging with decline, while doing just this themselves. For the faint-hearted, I need to point out that although the incidence of dementia will indeed accelerate in the age group now headed towards their nineties, even amongst the very oldest it will not predominate—though this information hardly eliminates our fear of such indisputable decline.

Conversely, I do not make, or not in quite the usual way, an exploration of those many narratives of resilience, which suggest that with care of the self, diligent monitoring, and attention to spiritual concerns we can postpone aging itself, at least until those final moments of very old age. On this view, we can stay healthy, fit and “young”—or youngish—performing our yoga, practicing Pilates, eating our greens, avoiding hazards and spurning envy and resentment. It is true, we may indeed remain healthy, but we will not stay young. “You are only as old as you feel,” though routinely offered as a jolly form of reassurance, carries its own disavowal of old age.

Aging faces, aging bodies, as we should know, are endlessly diverse. Many of them are beautifully expressive, once we choose to look—those eyes rarely lose their luster, when engrossed. However, I am primarily concerned with the possibilities for and impediments to staying alive to life itself, whatever our age. This takes me first of all to the temporal paradoxes of aging, and to enduring ways of remaining open and attached to the world.

As we age, changing year on year, we also retain, in one manifestation or another, traces of all the selves we have been, creating a type of temporal vertigo and rendering us psychically, in one sense, all ages and no age. “All ages and no age” is an expression once used by the psychoanalyst Donald Winnicott to describe the wayward temporality of psychic life, writing of his sense of the multiple ages he could detect in those patients once arriving to lie on the couch at his clinic in Hampstead in London. Thus the older we are the more we encounter the world through complex layerings of identity, attempting to negotiate the shifting present while grappling with the disconcerting images of the old thrust so intrusively upon us. “Live in the layers, / not on the litter,” the North American poet, Stanley Kunitz, wrote in one of his beautiful poems penned in his seventies. (...)

“I don’t feel old,” elderly informants repeatedly told the oral historian Paul Thompson. Their voices echo the words he’d read in his forays into published autobiography and archived interviews. Similarly, in the oral histories collected by the writer Ronald Blythe, an eighty-four-year-old ex-schoolmaster reflects: “I tend to look upon other old men as old men—and not include myself… My boyhood stays imperishable and is such a great part of me now. I feel it very strongly—more than ever before.”

“How can a 17-year-old, like me, suddenly be 81?” the exactingly scientific developmental biologist Lewis Wolpert asks in the opening sentences of his book on the surprising nature of old age, wryly entitled You’re Looking Very Well. Once again, this keen attachment to youth tells us a great deal about the stigma attending old age: “you’re looking old” would never be said, except to insult. On the one hand there can be a sense of continuous fluidity, as we travel through time; on the other, it is hard to ignore those distinct positions we find ourselves in as we age, whatever the temptation. I have been finding, however, that it becomes easier to face up to my own anxieties about aging after surveying the radical ambiguities in the speech or writing of others thinking about the topic, especially when they do so neither to lament nor to celebrate old age, but simply to affirm it as a significant part of life. This is the trigger for the words that follow, as I assemble different witnesses to help guide me through the thoughts that once kept me awake at night, pondering all the things that have mattered to me and wondering what difference aging makes to my continuing ties to them.

by Lynne Segal, Guernica |  Read more:
Image: from Flickr via Abode of Chaos

Paul Gauguin, The Meal. Musée d'Orsay, Paris

Most Lives Are Lived by Default

Jamie lives in a large city in the midwest. He’s a copywriter for an advertising firm, and he’s good at it.

He’s also good at thinking of reasons why he ought to be happy with his life. He has health insurance, and now savings. A lot of his friends have neither. His girlfriend is pretty. They never fight. His boss has a sense of humor, doesn’t micromanage, and lets him go early most Fridays.

On most of those Fridays, including this one, instead of taking the train back to his suburban side-by-side, he walks to a downtown pub to meet his friends. He will have four beers. His friends always stay longer.

Jamie’s girlfriend Linda typically arrives on his third beer. She greets them all with polite hugs, Jamie with a kiss. He orders his final beer when she orders her only one. They take a taxi home, make dinner together, and watch a movie on Netflix. When it’s over they start a second one and don’t finish it. They have sex, then she goes to wash her face and brush her teeth. When she returns, he goes.

There was never a day Jamie sat down and decided to be a copywriter living in the midwest. A pair of lawyers at his ex-girlfriend’s firm took him out one night when he was freshly laid-off from writing for a tech magazine, bought him a hundred dollars worth of drinks and gave him the business card of his current boss. It was a great night. That was nine years ago.

His friends are from his old job. White collar, artsy and smart. If one of the five of them is missing at the pub on Friday, they’ll have lunch during the week.

Jamie isn’t unhappy. He’s bored, but doesn’t quite realize it. As he gets older his boredom is turning to fear. He has no health problems but he thinks about them all the time. Cancer. Arthritis. Alzheimer’s. He’s thirty-eight, fit, has no plans for children, and when he really thinks about the course of his life he doesn’t quite know what to do with himself, except on Fridays.

In two months he and Linda are going to Cuba for ten days. He’s looking forward to that right now.

***

A few weeks ago I asked everyone reading to share their biggest problem in life in the comment section. I’ve done this before — ask about what’s going on with you — and every time I do I notice two things.

The first thing is that everyone has considerable problems. Not simply occasional tough spots, but the type of issue that persists for years or decades. The kind that becomes a theme in life, that feels like part of your identity. By the sounds of it, it’s typical among human beings to feel like something huge is missing.

The other thing is that they tend to be versions of the same few problems: lack of human connection, lack of personal freedom (due to money or family situations), lack of confidence or self-esteem, or lack of self-control.

The day-to-day feel and quality of each of our lives sits on a few major structures: where we live, what we do for a living, what we do with ourselves when we’re not at work, and which people we spend most of our time with.

by David Cain, Raptitude |  Read more:
Image: uncredited

Craft Transit

At the 2013 Walking Summit early this month in Washington, DC, I spent a lot of time looking at other people’s shoes.

My interest in footwear-as-fashion borders on nil, but I was curious about locomotion. I saw a lot of sensible, flat-heeled shoes on women, and some efficient Tevas and Hi-Tecs on men. But also quite a few painful and pointy dress shoes on both sexes, all inappropriate for walking more than to the nearest Starbucks. I tried not to judge, but, well, what can I say?

I spent two days at the summit listening, learning, and chatting with advocates for walking. It brought together a diverse crowd of nearly 400 people: urban planners, doctors, transit advocates, public health professionals, recreational trail directors, and people who blog and write about getting around. They talked about how much we walk, why we don’t do more of it, where we walk, how to get people walking more.

As at conferences everywhere, these discussions were decked out with splashy statistics. Many came from a newly released survey about American attitudes toward walking, which had been commissioned by health care provider Kaiser Permanente (the muscle behind the summit). Seventy-nine percent of Americans, for instance, agree that they “should probably walk more.” And 66 percent believe that distracted drivers were a problem in their neighborhoods.

But one statistic really caught my attention: 72 percent of respondents think walking “is cool.”

Seriously? I suspect a finger on the scale. Because walking has long been the antithesis of cool. Walking is what the elderly do in malls. Walking is what the poor do because they can’t afford righteous wheels, or even bus fare. Walking is what a baseball player does, with a limp, when he’s hit by a ball — it’s the opposite of a home run. And race walkers? They may have set back walking by several generations with their alarmingly wobbly, hip-gimballing walk. The Facebook page “Walking is Cool?” It has a total of seven “likes.”

Walking as a cool activity is hobbled by a number of obstructions. For instance, those who crusade for walking often scare the common people with exclamation points. “Fun you say? Yes, fun!” enthuses a web site advocating walking, posted under a heading reading “Why Not Walk?!” Many walking advocates appear to use keyboards lacking the basic period. You could lose an eye on all their punctuation. True believers scare people.

This is compounded by a persistent belief — at least among many I’ve spoken with — that walking is quite possibly the most boring activity anyone can engage in. Washing dishes by hand is preferable. It’s no coincidence that a synonym for “boring” is “pedestrian.” One young woman — who has evidently been so traumatized by exclamation points that she can no longer employ any punctuation whatsoever — recently groused on an online forum: “I try and try but I can't stand it its too boring I tried listening to songs on my iPod and even walking with a friend but its no use I just don't like walking… but the thing is I want to walk but can’t.”

In my experience, many others share her view that walking may be good, but leads to a slow death by boredom. The only cure? Take two automobiles and call me in the morning.

Running isn’t saddled with this baggage. This is in part because when you run briskly down a city street, all rustly in your nylon, it conveys that you’re a can-do person with a busy life, although not too busy to take care of The Big Dog. In contrast, when someone walks past, they’re invisible, or if they’re walking a bit faster than normal, one may note them only to assume they’ve missed their bus. Also, running has cool accessories that convey social status and tech savviness. Last summer, for instance, Adidas introduced Springblade, “the first running shoe with individually tuned blades engineered to help propel runners forward with one of the most effective energy returns in the industry.” I assume they couldn’t call it “Bladerunner” because of trademark issues, which is too bad. I don’t even run and I want a pair.

Same with biking — cool and expensive equipment is abundant, including jerseys in colors garish enough to be seen from the orbiting space station. Of course, the dork-helmet remains one of our generation’s unresolved problems, but great minds are at work on this.

How to overcome walking’s dull reputation?

by Wayne Curtis, The Smart Set |  Read more:
Image: Wayne Curtis

Tuesday, November 19, 2013

Joe Walsh


Robert Carrithers, Wedding Reception
via:

U.S. helicopters land in Haiti.
via:

[ed. Sistine Living Room]
via:

The 40-Year Slump


[ed. See also: Paul Krugman's A Permanent Slump.]

The steady stream of Watergate revelations, President Richard Nixon’s twists and turns to fend off disclosures, the impeachment hearings, and finally an unprecedented resignation—all these riveted the nation’s attention in 1974. Hardly anyone paid attention to a story that seemed no more than a statistical oddity: That year, for the first time since the end of World War II, Americans’ wages declined.

Since 1947, Americans at all points on the economic spectrum had become a little better off with each passing year. The economy’s rising tide, as President John F. Kennedy had famously said, was lifting all boats. Productivity had risen by 97 percent in the preceding quarter-century, and median wages had risen by 95 percent. As economist John Kenneth Galbraith noted in The Affluent Society, this newly middle-class nation had become more egalitarian. The poorest fifth had seen their incomes increase by 42 percent since the end of the war, while the wealthiest fifth had seen their incomes rise by just 8 percent. Economists have dubbed the period the “Great Compression.”

This egalitarianism, of course, was severely circumscribed. African Americans had only recently won civil equality, and economic equality remained a distant dream. Women entered the workforce in record numbers during the early 1970s to find a profoundly discriminatory labor market. A new generation of workers rebelled at the regimentation of factory life, staging strikes across the Midwest to slow down and humanize the assembly line. But no one could deny that Americans in 1974 lived lives of greater comfort and security than they had a quarter-century earlier. During that time, median family income more than doubled.

Then, it all stopped. In 1974, wages fell by 2.1 percent and median household income shrank by $1,500. To be sure, it was a year of mild recession, but the nation had experienced five previous downturns during its 25-year run of prosperity without seeing wages come down.

What no one grasped at the time was that this wasn’t a one-year anomaly, that 1974 would mark a fundamental breakpoint in American economic history. In the years since, the tide has continued to rise, but a growing number of boats have been chained to the bottom. Productivity has increased by 80 percent, but median compensation (that’s wages plus benefits) has risen by just 11 percent during that time. The middle-income jobs of the nation’s postwar boom years have disproportionately vanished. Low-wage jobs have disproportionately burgeoned. Employment has become less secure. Benefits have been cut. The dictionary definition of “layoff” has changed, from denoting a temporary severance from one’s job to denoting a permanent severance.

As their incomes flat-lined, Americans struggled to maintain their standard of living. In most families, both adults entered the workforce. They worked longer hours. When paychecks stopped increasing, they tried to keep up by incurring an enormous amount of debt. The combination of skyrocketing debt and stagnating income proved predictably calamitous (though few predicted it). Since the crash of 2008, that debt has been called in.

All the factors that had slowly been eroding Americans’ economic lives over the preceding three decades—globalization, deunionization, financialization, Wal-Martization, robotization, the whole megillah of nefarious –izations—have now descended en masse on the American people. Since 2000, even as the economy has grown by 18 percent, the median income of households headed by people under 65 has declined by 12.4 percent. Since 2001, employment in low-wage occupations has increased by 8.7 percent while employment in middle-wage occupations has decreased by 7.3 percent. Since 2003, the median wage has not grown at all.

The middle has fallen out of the American economy—precipitously since 2008, but it’s been falling out slowly and cumulatively for the past 40 years. Far from a statistical oddity, 1974 marked an epochal turn. The age of economic security ended. The age of anxiety began.

by Harold Meyerson, American Prospect |  Read more:
Image: Jason Schneider

The Wow! Signal


[ed. I don't think I'd use celebrity videos and Twitter feeds if I were searching for intelligent life.]

The Wow! signal was a strong narrowband radio signal detected by Jerry R. Ehman on August 15, 1977, while he was working on a SETI project at the Big Ear radio telescope of The Ohio State University, then located at Ohio Wesleyan University's Perkins Observatory in Delaware, Ohio. The signal bore the expected hallmarks of non-terrestrial and non-Solar System origin. It lasted for the full 72-second window that Big Ear was able to observe it, but has not been detected again. The signal has been the subject of significant media attention.

Amazed at how closely the signal matched the expected signature of an interstellar signal in the antenna used, Ehman circled the signal on the computer printout and wrote the comment "Wow!" on its side. This comment became the name of the signal.

In 2012, on the 35th anniversary of the Wow! signal, Arecibo Observatory beamed a response from humanity, containing 10,000 Twitter messages, in the direction from which the signal originated. In the response, Arecibo scientists attempted to increase the chances of intelligent life receiving and decoding the celebrity videos and crowd-sourced Tweets by attaching a repeating-sequence header to each message that would let the recipient know that the messages are intentional and from another intelligent life form.
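[ed. The repeating-sequence header is essentially framing: prefix every message with the same easily spotted pattern so a receiver can tell the batch is intentional. The actual encoding Arecibo used is not described here, so the header pattern and bit-packing below are invented purely for illustration.]

```python
# Toy framing sketch: a shared repeating header marks messages as intentional.
HEADER = "10" * 8  # hypothetical repeating sequence, not the real Arecibo header

def frame_message(text: str) -> str:
    bits = "".join(f"{byte:08b}" for byte in text.encode("utf-8"))
    return HEADER + bits

def looks_intentional(frame: str) -> bool:
    # A receiver that knows nothing else can still notice that every
    # transmission in the batch starts with the same repeated prefix.
    return frame.startswith(HEADER)

frames = [frame_message(m) for m in ["Hello from Earth", "Reply to the Wow! signal"]]
print(all(looks_intentional(f) for f in frames))  # True
```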

by Wikipedia |  Read more:
Image: J. Ehman