Tuesday, February 12, 2013

A Match Made in the Code

Unlike many other Web dating services, eHarmony doesn’t let customers search for partners on their own. They pay up to $60 per month to be offered matches based on their answers to a long questionnaire, which currently has about 200 items. The company has gathered answers from 44 million people, and says that its matches have led to more than half a million marriages since 2005.

Dr. Gonzaga, a social psychologist who previously worked at a marriage-research lab at the University of California, Los Angeles, said eHarmony wouldn’t let him disclose its formulas, but he did offer some revelations. He said its newest algorithm matches couples by focusing on six factors:

¶ Level of agreeableness — or, put another way, how quarrelsome a person is.

¶ Preference for closeness with a partner — how much emotional intimacy each wants and how much time each likes to spend with a partner.

¶ Degree of sexual and romantic passion.

¶ Level of extroversion and openness to new experience.

¶ How important spirituality is.

¶ How optimistic and happy each one is.

The more similarly two people score on these factors, the better their chances, Dr. Gonzaga said, presenting evidence, not yet published, from several studies at eHarmony Labs. One study, which tracked more than 400 married couples matched by eHarmony, found that scores from their initial questionnaires correlated with a couple’s satisfaction with their relationship four years later.

“It is possible,” Dr. Gonzaga concluded, “to empirically derive a matchmaking algorithm that predicts the relationship of a couple before they ever meet.”

Not so fast, replied the critics in the hall. They didn’t doubt that factors like agreeableness could predict a good marriage. But that didn’t mean eHarmony had found the secret to matchmaking, said Harry T. Reis of the University of Rochester, one of the authors of last year’s critique.

“That agreeable person that you happen to be matching up with me would, in fact, get along famously with anyone in this room,” Dr. Reis told Dr. Gonzaga.

He and his co-authors argued that eHarmony’s results could merely reflect the well-known “person effect”: an agreeable, non-neurotic, optimistic person will tend to fare better in any relationship. But the research demonstrating this effect also showed that it’s hard to make predictions based on what’s called a dyadic effect — how similar the partners are to each other.

“In the existing literature, similarity components are notoriously weak at accounting for relationship satisfaction,” said Paul W. Eastwick of the University of Texas, Austin. “For example, what really matters for my relationship satisfaction is whether I myself am neurotic and, to a slightly lesser extent, whether my partner is neurotic. Our similarity on neuroticism is irrelevant.”

by John Tierney, NY Times |  Read more:
Illustration: Viktor Koen

Resigning Pope No Longer Has Strength To Lead Church Backward


VATICAN CITY—Citing his advancing age and deteriorating health, Pope Benedict XVI announced his resignation from the papacy Monday, saying he no longer possessed the strength and energy required to lead the Catholic Church backward.

According to the 85-year-old pontiff, after considerable prayer and reflection on his physical stamina and mental acuity, he concluded that his declining faculties left him unable to helm the Church’s ambitious regressive agenda and guide the faith’s one billion global followers on their steady march away from modernity and cultural advancement.

“It is with sadness, but steadfast conviction, that I announce I am no longer capable of impeding social progress with the energy and endurance that is required of the highest ministry in the Roman Catholic Church,” Benedict reportedly said in Latin to the Vatican’s highest cardinals. “While I’m proud of the strides the Church has made over the past eight years, from thwarting AIDS-prevention efforts in Africa to failing to punish or even admit to decades of sexual abuse of children at the hands of clergy, it has become evident to me that, in this rapidly evolving world, I now lack the capacity to continue guiding this faith back centuries.”

“Thus, I must step down from the papacy,” he added. “But let me assure every member of the Church that the Vatican’s commitment to narrow-mindedness and social obstruction will long live on after my departure.”

by The Onion |  Read more: 
Photo: uncredited

Bear Counting in Canada


[ed. What it's like to weigh and radio tag a denning Momma black bear and her three young cubs in the wilds of northern Ontario, Canada. Great video, and hilarious.]

Monday, February 11, 2013


Armin Carl Hansen (American, 1886 - 1957) Fishermen

The Fresh Wars


One afternoon last fall in New York, walking toward the subway at Union Square, I decided to lose the tortilla.

For months, Taco Bell had been urging America to drop by for a taste of its Cantina Bell menu. TV commercials featured celebrity spokeschef Lorena Garcia touring dewy fields of cilantro and welcoming viewers into a kitchen where she and her associates lovingly ladled black beans over rice. She laid oblong medallions of grilled chicken atop beds of romaine lettuce, roasted corn, and vermilion slivers of pepper and tomato. "Making a burrito bowl, I think, made Taco Bell a little bit nervous—mostly because they're used to wrapping everything in a tortilla," Garcia explains in one spot. "But I said, 'Guys, lose the tortilla and share these beautiful ingredients with the world.’ "

"Beautiful ingredients" aren't the first things one associates with Taco Bell, a chain that last year enjoyed record-shattering success with its Doritos Locos Taco, a $1.29 fistful of garbage dusted in neon-orange sodium that tasted vaguely like cheese and synergy. They certainly weren't the first things I saw when I sat down with my Cantina Bowl, from which pale green guacamole and a lumpy tuft of grated Monterey Jack stared back at me from a valley of rice, romaine, and meat as if to croak, "Ta-daaaahhh."

Yet despite its unremarkable appearance, the Cantina Bowl was remarkable for what it signified. It was a shot across the bow to competitors like Chipotle, a company that had based nearly two decades of rapid growth on wholesome, sustainably raised ingredients prepared in-store before Taco Bell ever enlisted Chef Garcia for an offensive of its own.

In essence, it was the latest salvo in the Fresh Wars.

The Fresh Wars have advertisers, marketers, and chefs embroiled in a battle for the title of freshest American fast food—and for the business of an increasingly sophisticated and conscientious populace of eaters. Taco Bell vs. Chipotle is just the start. The tagline "Eat Fresh" has helped Subway eclipse McDonald's as the world's largest fast-food chain. But Arby's crusade to "Slice Up the Truth About Freshness" aims to sow doubts about Subway's food sourcing while wooing customers with meat sliced on-premises. Meanwhile, Domino's Pizza has spent more than three years and untold millions reinventing its pizza and its image as models of quality and transparency, a gambit that has at least two high-profile competitors following suit.

The skirmishes emphasize the extraordinary value of one abstract concept for an industry desperate to capitalize on health and sourcing trends without actually having to invest in high-quality ingredients. Fresh doesn't have to be low-calorie or even especially nutritious—a burrito with ingredients prepared on-site at Chipotle may pack three times the calories of a burger. Nor does fresh require pathologically locavorian supply-chain standards: As Arby's has revealed, a sandwich from Subway might contain cold-cuts processed, packaged, and shipped from a centralized facility in Iowa. Better yet for retailers like Taco Bell, Domino's, and Arby's, the mere implications of freshness can be sold at a premium to new customers who otherwise might have avoided those chains' wares altogether. The only unabashedly pure thing about the concept of fresh is its subjectivity.

"I think it's meaningless, almost, now," says Mark Crumpacker, the chief marketing officer with Chipotle. "You could claim that something very heavily processed was fresh, I guess. I don't think there are any rules around 'fresh.' You can just say it with impunity. And I think lots of people do."

So maybe "Is it fresh?" isn't the question we should be asking ourselves as we lose the tortilla, slice up freshness, and muddle through the trenches of fast-food trends. Instead, amid the varying strategies, we have a much more basic and far more crucial determination to make: What does fresh even mean?

by S.T. VanAirsdale, Slate |  Read more:
Photos: BrokenSphere/Wikimedia Commons; iStockphoto/Thinkstock

Francis Picabia Udnie, Jeune fille américaine, Danse, 1913
via:

Matthias Kanter. Weg, 2011. Acrylic on canvas, 220 x 200 cm.

The Boeing Debacle

Brake problems. A fuel leak. A cracked windshield. One electrical fire. Then another. An emergency landing in Japan. A safety investigation imposed by the FAA. Then two premier customers—Japan’s two main airlines, ANA and JAL—ground their fleets of Boeing [BA] 787s. Then the FAA grounds all 787s used by the only American carrier. Now other regulators around the world follow suit, grounding all 50 of the 787s delivered so far. The regulatory grounding of an entire fleet is unusual—the first since 1979—and relates to a key to the plane’s claimed energy efficiency: the novel use of lithium ion batteries, which have shown a propensity to overheat and lead to fires—fires that generate oxygen and hence are difficult to put out.

And keep in mind: Boeing’s 787 project is already billions of dollars over budget. The delivery schedule has been pushed back at least seven times. The first planes were delivered over three years late. In fact, out of a total of 848 planes sold, only 6 percent have been delivered.

Yet grave as these issues seem, they are merely symptoms of a deeper disease that has been gnawing at the US economy for decades: flawed offshoring decisions by the C-suite. Offshoring is not some menial matter to be left to accountants in the backroom or high-priced consultants armed with spreadsheets, promising quick profits. It raises mission-critical issues potentially affecting the survival of entire firms, whole industries and ultimately the economy.

Not just Boeing: an economy-wide problem

Thus Boeing is hardly alone in making flawed offshoring decisions. Boeing is just the latest and most spectacular example of an economy-wide problem.

“Many companies that offshored manufacturing didn’t really do the math,” Harry Moser, an MIT-trained engineer and founder of the Reshoring Initiative, told me. As many as 60 percent of the decisions were based on miscalculations.

As noted by Gary Pisano and Willy Shih in their classic article, “Restoring American Competitiveness” (Harvard Business Review, July-August 2009), offshoring has been devastating whole US industries, stunting innovation, and crippling capacity to compete long-term.

Pisano and Shih write: “The decline of manufacturing in a region sets off a chain reaction. Once manufacturing is outsourced, process-engineering expertise can’t be maintained, since it depends on daily interactions with manufacturing. Without process-engineering capabilities, companies find it increasingly difficult to conduct advanced research on next-generation process technologies. Without the ability to develop such new processes, they find they can no longer develop new products. In the long term, then, an economy that lacks an infrastructure for advanced process engineering and manufacturing will lose its ability to innovate.”

Pisano and Shih have a frighteningly long list of industries that are “already lost” to the USA, including: compact fluorescent lighting; LCDs for monitors, TVs and handheld devices like mobile phones; electrophoretic displays; lithium ion, lithium polymer and NiMH batteries; advanced rechargeable batteries for hybrid vehicles; crystalline and polycrystalline silicon solar cells, inverters and power semiconductors for solar panels; desktop, notebook and netbook PCs; low-end servers; hard-disk drives; consumer networking gear such as routers, access points, and home set-top boxes; advanced composites used in sporting goods and other consumer gear; advanced ceramics and integrated circuit packaging.

The list of industries “at risk” is even longer and more worrisome.

by Steve Denning, Forbes |  Read more:
Image: uncredited

Before Greed

Speaking in New Haven in 1860, Abraham Lincoln told an audience, “I am not ashamed to confess that 25 years ago I was a hired laborer, mauling rails, at work on a flat-boat—just what might happen to any poor man’s son.” After his death, Lincoln’s personal trajectory from log cabin to White House emerged as the ideal American symbol. Anything was possible for those who strived.

But the goal of this striving was not great wealth. Perhaps the most revealing memorial to Lincoln and his world is found in one of the most mundane of American documents: the census. There he is in the Springfield, Illinois, listing of 1860: Abraham Lincoln, 51 years old, lawyer, owner of a home worth $5,000, with $12,000 in personal property. His neighbor Lotus Niles, a 40-year-old secretary—equivalent to a manager today—had accumulated $7,000 in real estate and $2,500 in personal property. Nearby was Edward Brigg, a 48-year-old teamster from England, with $4,000 in real estate and $300 in personal property. Down the block lived Richard Ives, a bricklayer with $4,000 in real estate and $4,500 in personal property. The highest net worth in the neighborhood belonged to a 50-year-old livery stable owner, Henry Corrigan, with $30,000 in real estate but only $300 in personal property. This was a town and a country where bricklayers, lawyers, stable owners, and managers lived in the same areas and were not much separated by wealth. Lincoln was one of the richer men in Springfield, but he was not very rich.

Not only was great wealth an aberration in Lincoln’s time, but even the idea that the accumulation of great riches was the point of a working life seemed foreign. Whereas today the most well-off frequently argue that riches are the reward of hard work, in the Civil War era, the reward was a “competency,” what the late historian Alan Dawley described as the ability to support a family and have enough in reserve to sustain it through hard times at an accustomed level of prosperity. When, through effort or luck, a person amassed not only a competency but enough to support himself and his family for his lifetime, he very often retired. Philip Scranton, an industrial historian, writes of one representative case: Charles Schofield, a successful textile manufacturer in Philadelphia who, in 1863, sold his interest in his firm for $40,000 and “retired with a competency.” Schofield, who was all of 29 years old, considered himself “opulent enough.” The idea of having enough frequently trumped the ambition for endless accumulation.

As the men and women of Lincoln’s and Schofield’s generations aged, they retained the ideal of progress from poverty to competency. Later in the century, midwestern publishers created county histories that featured images of their subscribers’ homesteading progress, from “first home in the woods” to comfortable farm. The “mug books”—so called because they included images not only of cabins and farms but also of their owners—captured the trajectory of these American lives and the achievement of their economic ambitions: the creation of prosperous homes. They built them, but they could build them because they were citizens of a democratic republic. The opportunity to build secure homes was part of the purpose of the economy.

For a moment at the end of the Civil War, it seemed the liberal ideal of a republican citizenry, in which autonomous individuals build a society based on contracts, would reach fruition in a United States where extremes of wealth and poverty were largely nonexistent. Instead, by 1900, extremes of the sort that hadn’t been seen since the abolition of slavery were de rigueur. In 1860 there was only one Cornelius Vanderbilt, but 40 years later, the concentration of wealth in the corporate form ensured an enlarged class of the super-rich.

by Richard White, Boston Review |  Read more:
Photo: Library of Congress

Never on a Saturday

Earlier this week, the United States Post Office announced that come August, it would be suspending regular home delivery of the mail on Saturdays, except for package service. The USPS is in financial straits, and the budget-cutting move will save about $2 billion in its first year, putting a dent in the $16 billion it lost in 2012 alone.

The Post Office has come under financial pressure from a number of sources over the past decade. Of course the internet has usurped traffic. And it has also lost market share to private carriers like Federal Express and United Parcel Service, which have cut into the lucrative package and overnight delivery markets, while leaving the USPS with an unenviable monopoly in the money-losing but vitally important national letter-and-stamp service. Despite regularly increasing rates over the last decade, the United States still offers one of the cheapest such services in the world, with a flat fee of 46 cents to send a 1 oz. envelope first class anywhere in the United States.

For less than half a dollar, you can send a birthday card from Maine to Hawai’i, and be confident that it will arrive in 2-3 days. Pretty impressive. Especially when compared to other nations, almost all of which charge more for an ounce of domestic mail, even though most of them are quite a bit smaller in size. The chart below compares rates from 2011.

Another financial constraint comes from the fact that, other than some small subsidies for overseas U.S. electoral ballots, the USPS is a government agency that pays its own way, operating without any taxpayer dollars for about thirty years now.

However, the biggest factor in its recent financial free fall is undoubtedly the Postal Accountability and Enhancement Act of 2006 (PAEA), which Republicans pushed through Congress and President George W. Bush signed into law. The PAEA required the Post Office to fully fund its retiree healthcare costs through the year 2081.

Yes, you read that right. 2081. And it was given only 10 years to find the money to fund 75 years’ worth of retirement healthcare benefits. To clarify just how odious this regulation is, think about it like this: in the next three years, the Post Office must finish finding the money to fully fund benefits not only for all of its current retirees and current workers, but also for decades’ worth of future workers it hasn’t hired yet. Indeed, some of the future retired workers in question weren’t even born when PAEA was signed into law.

Needless to say, no other federal, state, or local government agency, much less any private company, has such a mandate, and the USPS is now bleeding money down the drain like it was shivved in a prison shower stall; which, metaphorically speaking, it was. Cloaked in the mantle of fiscal responsibility, the real impetus for the PAEA was an attack on the postal workers’ union, and a nod to the USPS’s private competitors.

by Akim Reinhardt, 3 Quarks Daily | Read more:
Image: Charles Schulz

Friday, February 8, 2013

The Idealist


At the beginning of every year, Aaron Swartz would post an annotated list of everything he’d read in the last 12 months. His list for 2011 included 70 books, 12 of which he identified as “so great my heart leaps at the chance to tell you about them even now.” One of these was Franz Kafka’s The Trial, about a man caught in the cogs of a vast bureaucracy, facing charges and a system that defy logical explanation. “I read it and found it was precisely accurate—every single detail perfectly mirrored my own experience,” Swartz wrote. “This isn’t fiction, but documentary.”

At the time of his death, the 26-year-old Swartz had been pursued by the Department of Justice for two years. He was charged in July 2011 with accessing MIT’s computer network without authorization and using it to download 4.8 million documents from the online database JSTOR. His actions, the government alleged, violated Title 18 of the U.S. Code, and carried a maximum penalty of up to 50 years in jail and $1 million in fines.

The case had sapped Swartz’s finances, his time, and his mental energy and had fostered a sense of extreme isolation. Though his lawyers were working hard to strike a deal, the government’s position was clear: Any plea bargain would have to include at least a few months of jail time.

A prolonged indictment, a hard-line prosecutor, a dead body—these are the facts of the case. They are outnumbered by the questions that Swartz’s family, friends, and supporters are asking a month after his suicide. Why was MIT so adamant about pressing charges? Why was the DOJ so strict? Why did Swartz hang himself with a belt, choosing to end his own life rather than continue to fight?

When you kill yourself, you forfeit the right to control your own story. At rallies, on message boards, and in media coverage, you will hear that Swartz was felled by depression, or that he got caught in a political battle, or that he was a victim of a vindictive state. A memorial in Washington, D.C., this week turned into a battle over Swartz’s legacy, with mourners shouting in disagreement over what policy changes should be enacted to honor his memory.

Aaron Swartz is a difficult puzzle. He was a programmer who resisted the description, a dot-com millionaire who lived in a rented one-room studio. He could be a troublesome collaborator but an effective troubleshooter. He had a talent for making powerful friends, and for driving them away. He had scores of interests, and he indulged them all. In August 2007, he noted on his blog that he’d “signed up to build a comprehensive catalog of every book, write three books of my own (since largely abandoned), consult on a not-for-profit project, help build an encyclopedia of jobs, get a new weblog off the ground, found a startup, mentor two ambitious Google Summer of Code projects (stay tuned), build a Gmail clone, write a new online bookreader, start a career in journalism, appear in a documentary, and research and co-author a paper.” Also, his productivity had been hampered because he’d fallen in love, which “takes a shockingly huge amount of time!”

He was fascinated by large systems, and how an organization’s culture and values could foster innovation or corruption, collaboration or paranoia. Why does one group accept a 14-year-old as an equal partner among professors and professionals while another spends two years pursuing a court case that’s divorced from any sense of proportionality to the alleged crime? How can one sort of organization develop a young man like Aaron Swartz, and how can another destroy him?

Swartz believed in collaborating to make software and organizations and government work better, and his early experiences online showed him that such things were possible. But he was better at starting things than he was at finishing them. He saw obstacles as clearly as he saw opportunity, and those obstacles often defeated him. Now, in death, his refusal to compromise has taken on a new cast. He was an idealist, and his many projects—finished and unfinished—are a testament to the barriers he broke down and the ones he pushed against. This is Aaron Swartz’s legacy: When he thought something was broken, he tried to fix it. When he failed, he tried to fix something else.

Eight or nine months before he died, Swartz became fixated on Infinite Jest, David Foster Wallace’s massive, byzantine novel. Swartz believed he could unwind the book’s threads and assemble them into a coherent, easily parsed whole. This was a hard problem, but he thought it could be solved. As his friend Seth Schoen wrote after his death, Swartz believed it was possible to “fix the world mainly by carefully explaining it to people.”

It wasn’t that Swartz was smarter than everyone else, says Taren Stinebrickner-Kauffman—he just asked better questions. In project after project, he would probe and tinker until he’d teased out the answers he was looking for. But in the end, he was faced with a problem he couldn’t solve, a system that didn’t make sense.

by Justin Peters, Slate |  Read more:
Photo by Sage Ross/Flickr/Wikimedia Commons

Beat By Dre: The Exclusive Inside Story of How Monster Lost the World


There's never been anything like Beats By Dre. The bulky rainbow headphones are a gaudy staple of malls, planes, clubs, and sidewalks everywhere: as mammoth, beloved, and expensive as their namesake. But Dr. Dre didn't just hatch the flashy lineup from his freight train chest: The venture began as an unlikely partnership between a record-industry powerhouse and a boutique audio company best known for making overpriced HDMI cables.

You might know this; you might own a pair of Beats that still has Monster's tiny, subjugated logo printed on them. But what you don't know is how, in inking the deal, Monster screwed itself out of a fortune. It's the classic David vs. Goliath story—with one minor edit: David gets his ass kicked and is laughed out of the arena. This is the inside story of one of the all-time worst deals in tech.

The route to a rapper-gadget sensation doesn't start in the VIP section of a club over a bottle of Cristal. The idea wasn't hatched in the back of a Maybach or in a boardroom whose walls are decked out in platinum records and shark tanks. Before Dre got paid, and red 'B' logos clamped onto millions of young heads across the globe, the son of Chinese immigrants started toying with audio equipment in California.

Beats begins with Monster, Inc., and Monster begins with Noel Lee. He's a friendly, incredibly smart man with a comic-book hairstyle and a disability that adds to his supervillain stature: Lee is unable to walk. Instead, he glides around on a chrome-plated Segway. Lee has been making things for your ears since 1979, after he took an engineering education and spun it into a components business with one lucrative premise: your music doesn't sound as good as it could.

In true Silicon Valley fashion, Lee started out in his family's basement: taste-testing different varieties of copper wire until he found a type that he thought enhanced audio quality. Then, also in Silicon Valley fashion, he marketed the shit out of it and jacked up its price: Monster Cable. Before it was ever mentioned in the same gasp as Dre, Monster was trying to get music lovers to buy into a superior sound that existed mostly in imaginations and marketing brochures. "We came up with a reinvention of what a speaker cable could be," Noel Lee boasts. His son, Kevin, describes it differently: "a cure for no disease."

Monster expanded into pricey HDMI cables, surge protectors, and... five different kinds of screen-cleaner. Unnecessary, overpriced items like these have earned Monster a reputation over the years as ripoff artists, but that belies the company's ability to make audio products that are actually pretty great. The truth is, audio cable is a lot like expensive basketball shoes: There are a couple hundred people in the world who really need the best, and the rest of us probably can't tell the difference. Doesn't matter: Through a combination of slick persuasion and status-pushing, Noel Lee carved out a small empire.

But you can only sell so many $200 cables. The next step was speakers, but the company started in on speakers too late; the hi-fi era was over. Plenty of people were content with the sound their TVs made, or at most, a soundbar. Monster took a bath.

But speakers for your head? This was the absolute, legit next big thing.

by Sam Biddle, Gizmodo |  Read more:
Image: uncredited

Thursday, February 7, 2013


“Courage is doing what is right; tranquility is courage in repose.” - Inazo Nitobe
via:

Yodamanu Reflets I, Strasbourg 2011.

Caring on Stolen Time: A Nursing Home Diary

I work in a place of death. People come here to die, and my co-workers and I care for them as they make their journeys. Sometimes these transitions take years or months. Other times, they take weeks or some short days. I count the time in shifts, in scheduled state visits, in the sham monthly meetings I never attend, in the announcements of the “Employee of the Month” (code word for best ass-kisser of the month), in the yearly pay increment of 20 cents per hour, and in the number of times I get called into the Human Resources office.

The nursing home residents also have their own rhythms. Their time is tracked by scheduled hospital visits; by the times when loved ones drop by to share a meal, to announce the arrival of a new grandchild, or to wait anxiously at their bedsides for heart-wrenching moments to pass. Their time is measured by transitions from processed food to pureed food, textures that match their increasing susceptibility to dysphagia. Their transitions are also measured by the changes from underwear to pull-ups and then to diapers. Even more than the loss of mobility, the use of diapers is often the most dreaded adaptation. For many people, lack of control over urinary functions and timing is the definitive mark of the loss of independence.

Many of the elderly I have worked with are, at least initially, aware of the transitions and respond with a myriad of emotions from shame and anger to depression, anxiety, and fear. Theirs was the generation that survived the Great Depression and fought the last “good war.” Aging was an anti-climactic twist to the purported grandeur and tumultuousness of their mid-twentieth-century youth.

“I am afraid to die. I don’t know where I will go,” a resident named Lara says to me, fear dilating her eyes.

“Lara, you will go to heaven. You will be happy,” I reply, holding the spoonful of pureed spinach to her lips. “Tell me about your son, Tobias.”

And so Lara begins, the same story of Tobias, of his obedience and intelligence, which I have heard over and over again for the past year. The son whom she loves, whose teenage portrait stands by her bedside. The son who has never visited, but whose name and memory calm Lara.

Lara is always on the lookout, especially for Alba and Mary, the two women with severe dementia who sit on both sides of her in the dining room. To find out if Alba is enjoying her meal, she will look to my co-worker Saskia to ask, “Is she eating? If she doesn’t want to, don’t force her to eat. She will eat when she is hungry.” Alba, always cheerful, smiles. Does she understand? Or is she in her usual upbeat mood? “Lara, Alba’s fine. With you watching out for her, of course she’s OK!” We giggle. These are small moments to be cherished.

In the nursing home, such moments are precious because they are accidental moments.

The residents run on stolen time. Alind, like me, a certified nursing assistant (CNA), comments, “Some of these residents are already dead before they come here.”

By “dead,” he is not referring to the degenerative effects of dementia and Alzheimer’s disease but to the sense of hopelessness and loneliness that many of the residents feel, not just because of physical pain, not just because of old age, but as a result of the isolation, the abandonment by loved ones, the anger of being caged within the walls of this institution. This banishment is hardly the ending they toiled for during their industrious youth.

by Jomo, Dissent |  Read more:
Photo via:

The Marvelous Marie Curie

Marie Curie (1867–1934) is not only the most important woman scientist ever; she is arguably the most important scientist all told since Darwin. Einstein? In theoretical brilliance he outshone her — but her breakthroughs, by Einstein’s own account, made his possible. She took part in the discovery of radioactivity, a term she coined; she identified it as an atomic property of certain elements. When scoffers challenged these discoveries, she meticulously determined the atomic weight of the radioactive element she had revealed to the world, radium, and thereby placed her work beyond serious doubt. Yet many male scientists of her day belittled her achievement, even denied her competence. Her husband, Pierre Curie, did the real work, they insisted, while she just went along for the wifely ride. Chauvinist condescension of this order would seem to qualify Marie Curie as belle idéale of women’s studies, icon for the perennially aggrieved. But such distinction better suits an Aphra Behn or Artemisia Gentileschi than it does a Jane Austen or Marie Curie. Genuine greatness deserves only the most gracious estate, not an academic ghetto, however fashionable and well-appointed.

Yet the fact remains: much of the interest in Madame Curie stems from her having been a woman in the man’s world of physics and chemistry. The interest naturally increases as women claim their place in that world; with this interest comes anger, sometimes righteous, sometimes self-righteous, that difficulties should still stand in the way. A president of Harvard can get it in the neck for suggesting that women don’t have the almost maniacal resolve it takes to become first-rate scientific researchers — that they are prone to distraction by such career-killers as motherhood. So Marie Curie’s singularity cannot but be enveloped in the sociology of science, which is to say these days, feminist politics.

The sociology is important, as long as one remembers the singularity. For Marie Curie did have the almost maniacal resolve to do great scientific work. The work mattered as much to her as it does to most any outstanding scientist; yet can one really say it was everything? She passionately loved her husband and, after his premature death, loved another scientist of immense talent, perhaps of genius; she had the highest patriotic feeling for her native Poland and her adopted France, and risked her life in wartime; she raised two daughters, one, Irène, a Nobel Prize laureate in chemistry, the other, Ève, an accomplished writer, most notably as her mother’s biographer.

Madame Curie’s life reads almost like a comic-book adventure version of feminine heroism: the honest-to-goodness exploits of the original Wonder Woman; the one and only real deal; accept no imitations. Of course, imitation is precisely what such a life tends to inspire in the most zealous and worthy admirers. Madame Curie, however, explicitly warned such aspirants to scientific immortality that the way was unspeakably lonesome and hard, as her daughter Ève Curie records her saying in the 1937 biography Madame Curie. “Madame Curie avoided even that element of vanity that might most easily have been forgiven her: to let herself be cited as an example to other women. ‘It isn’t necessary to lead such an anti-natural existence as mine,’ she sometimes said to calm her overmilitant admirers. ‘I have given a great deal of time to science because I wanted to, because I loved research.... What I want for women and young girls is a simple family life and some work that will interest them.’” Better for gifted women to find some smaller work they enjoy doing and fit it into a life of traditional completeness. But hadn’t Madame Curie herself done it all, and on the titanic scale that launched so many dreamers toward the most earnest fantasies, and in many cases the most heartening achievements? How could she warn others off the path she had traveled? Despite her professions that she had taken the course right for her, did she really regret having traveled it?

One can only say that her intensity was preternatural. She could not have lived otherwise than she did: like a demon’s pitchfork or an angel’s whisper, the need to know, and to be known for knowing — though only among those who mattered, the serious ones like her, for she despised celebrity — drove her on relentlessly. Hardship and ill fortune accompanied her all her days. There seemed to be no ordeal she could not power her way through. Her indomitable will served her voracious intelligence. But for every accomplishment, for every distinction, for every rare joy, she paid and paid. Interludes of happiness brightened the prevailing emotional murk, but the murk did prevail. Episodes of major depression began in childhood and became a fixture. At various times in her life she thought seriously of suicide.

Love could be lost, and forever; children failed to fill the void; only work provided reliable solace and meaning. So she worked. She worked doggedly, devotedly, brilliantly. Scientific work was not simply diversion from the pains of living; it was a way of life, like Socratic philosophy, from which Madame Curie appeared to have acquired the guiding principle: “Nothing in life is to be feared. It is only to be understood.” Whether the unforeseen consequences of her work still sustain that sublime credo is a question as yet unresolved.

by Algis Valiunas, The New Atlantis | Read more:
Photo via:

How the Gun-Control Movement Got Smart

Here is how advocates of gun control used to talk about their cause: They openly disputed that the Second Amendment conferred the right to own a gun. Their major policy goals were to make handguns illegal and enroll all U.S. gun owners in a federal database. The group now known as the Brady Campaign to Prevent Gun Violence was once known as Handgun Control Inc.; a 2001 book by the executive director of the Violence Policy Center was entitled Every Handgun Is Aimed at You: The Case for Banning Handguns.

Contrast that with what you see today: Gun-control groups don't even use the term "gun control," with its big-government implications, favoring "preventing gun violence" instead. Democratic politicians preface every appeal for reform with a paean to the rights enshrined in the Second Amendment and bend over backwards to assure "law-abiding gun owners" they mean them no ill will. Even the president, a Chicago liberal who once derided rural voters' tendency to "cling to guns or religion," seeks to assure gun enthusiasts he's one of them by citing a heretofore-unknown enthusiasm for skeet shooting, adding, "I have a profound respect for the traditions of hunting that trace back in this country for generations. And I think those who dismiss that out of hand make a big mistake."

A frequent question in the current battle over gun control is why anyone should expect reform to succeed now when it's failed repeatedly for the last 20 years. Maybe this is why: Between then and now, advocates of gun control got smarter. They've radically changed their message into one that's more appealing to Middle America and moderate voters.

In the late '90s, "Democrats and gun-control groups had approached the debate consistently in a way that deeply, almost automatically alienated a lot of gun owners," said Jon Cowan, former president of a now-defunct group called Americans for Gun Safety.

The story of the way the gun debate changed is largely the story of AGS. Formed in 2000 by Andrew McKelvey, the CEO of Monster.com, the group sought to reset the terms of the debate and steer the gun-control movement away from its inward-looking, perpetually squabbling, far-left orientation. The various advocacy groups were often more concerned with fighting with each other than with taking the fight to their opponents, and a vocal contingent valued ideological purity over pragmatism. (...)

"There was as much fighting between the groups as with the opposition," David Hantman, a former aide to the bill's sponsor, Senator Dianne Feinstein, recalled. "Some of them insisted that we couldn't just renew [the ban], we had to strengthen it." With Republicans controlling the White House and both houses of Congress, that wasn't politically feasible, and the ban was allowed to lapse. Around the same time, legislation to close the "gun-show loophole" by requiring background checks for non-dealer gun sales was defeated, and Congress passed a bill according gun manufacturers blanket immunity from product-liability lawsuits.

McKelvey, a Yellow Pages ad marketer-turned-tech billionaire, came to the gun issue after being shocked by Columbine. Described by friends as an apolitical businessman who enjoyed hunting (he died of cancer in 2008), McKelvey was frustrated by the tone-deaf approach he saw the gun-control movement taking. He joined the board of Handgun Control Inc. and immediately began pressuring the group to change its name, promising substantial financial support in exchange for such a move; when the group resisted, he quit the board and set out to form his own group -- AGS.

If the NRA today seems fixated on the notion that the left is out to undercut the Second Amendment, confiscate law-abiding Americans' legally acquired firearms, and instigate federal-government monitoring of all gun owners, that's because 15 years ago, gun-control advocates wanted to do all of those things.

by Molly Ball, The Atlantic |  Read more:
Photo: Pete Souza/White House

Wednesday, February 6, 2013

Jennifer Laura Palmer, Number 78 from The Drawing Project, Ink on Paper 9 x 6 in., 2012.
via:

01-04-13 by Lee Kaloidis on Flickr.
via: