Sunday, July 31, 2011

Minority Rules

Scientists at Rensselaer Polytechnic Institute have found that when just 10 percent of the population holds an unshakable belief, their belief will always be adopted by the majority of the society. The scientists, who are members of the Social Cognitive Networks Academic Research Center (SCNARC) at Rensselaer, used computational and analytical methods to discover the tipping point where a minority belief becomes the majority opinion. The finding has implications for the study and influence of societal interactions ranging from the spread of innovations to the movement of political ideals.

“When the number of committed opinion holders is below 10 percent, there is no visible progress in the spread of ideas. It would literally take the amount of time comparable to the age of the universe for this size group to reach the majority,” said SCNARC Director Boleslaw Szymanski, the Claire and Roland Schmitt Distinguished Professor at Rensselaer. “Once that number grows above 10 percent, the idea spreads like flame.”

As an example, the ongoing events in Tunisia and Egypt appear to exhibit a similar process, according to Szymanski. “In those countries, dictators who were in power for decades were suddenly overthrown in just a few weeks.”

The findings were published in the July 22, 2011, early online edition of the journal Physical Review E in an article titled “Social consensus through the influence of committed minorities.”

An important aspect of the finding is that the percent of committed opinion holders required to shift majority opinion does not change significantly regardless of the type of network in which the opinion holders are working. In other words, the percentage of committed opinion holders required to influence a society remains at approximately 10 percent, regardless of how or where that opinion starts and spreads in the society.

To reach their conclusion, the scientists developed computer models of various types of social networks. One of the networks had each person connect to every other person in the network. The second model included certain individuals who were connected to a large number of people, making them opinion hubs or leaders. The final model gave every person in the model roughly the same number of connections. The initial state of each of the models was a sea of traditional-view holders. Each of these individuals held a view but was also, importantly, open-minded to other views.

Once the networks were built, the scientists “sprinkled” in some true believers throughout each of the networks. These people were completely set in their views and unwavering in those beliefs. As those true believers began to converse with those who held the traditional belief system, the tide gradually, and then very abruptly, began to shift.
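
For readers who want to watch the tipping point emerge themselves, here is a minimal sketch of this kind of simulation in Python. It assumes the two-word "binary agreement" naming-game dynamics commonly used in this line of research (each agent holds opinion A, B, or both) on a complete graph, with committed agents who always hold A; it is an illustration under those assumptions, not the SCNARC team's actual code.

```python
import random

def simulate(n=1000, committed_frac=0.10, steps=1_000_000, seed=1):
    """Binary agreement model: opinions are 'A', 'B', or 'AB' (undecided).

    The first n_committed agents hold 'A' and never change their minds;
    everyone else starts at 'B'. Returns the final fraction holding 'A'.
    """
    random.seed(seed)
    n_committed = int(n * committed_frac)
    opinion = ['A'] * n_committed + ['B'] * (n - n_committed)

    for _ in range(steps):
        s, h = random.sample(range(n), 2)   # speaker, hearer (complete graph)
        word = random.choice(opinion[s])    # speaker utters one of its words
        if word in opinion[h]:
            # successful communication: both collapse to the spoken word
            if s >= n_committed:
                opinion[s] = word
            if h >= n_committed:
                opinion[h] = word
        elif h >= n_committed:
            # failure: the uncommitted hearer adds the word, becoming undecided
            opinion[h] = 'AB'

    return sum(o == 'A' for o in opinion) / n

# Well above ~10% committed, 'A' sweeps the population; well below it,
# the majority stays stuck at 'B' for astronomically long times.
print(simulate(committed_frac=0.12))
print(simulate(committed_frac=0.04))
```

Varying committed_frac around 0.10 reproduces the sharp transition described above: below it almost nothing moves, above it consensus arrives quickly.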

Read more:

Obama: His Words and His Deeds

by David Bromwich


In early June, a constitutional crisis faced Barack Obama over his defiance of the War Powers Act of 1973. The law requires the President to seek approval by Congress within sixty days of committing American forces to an armed conflict anywhere in the world. Two resolutions emerged and were debated in Congress to force compliance from Obama. One, drafted by the Speaker of the House, John Boehner, called for the President to give a justification of US actions in Libya. On June 3, the Boehner resolution passed by a vote of 268–145. An alternative resolution, drafted by Dennis Kucinich, the best-known anti-interventionist among Democrats, would have called for US withdrawal from Libya within fifteen days. The Kucinich resolution was defeated 148–265.

The debate and the two votes were the first major signs of congressional resistance to the aggrandizement of executive power begun by George W. Bush in Afghanistan and Iraq and continued by Obama in Afghanistan and Libya. The reasons the President had cited in a letter to Congress for his circumvention of congressional approval of his actions in Libya betrayed a curious mixture of arrogance and disregard for the War Powers Act. The US military role in Libya, Obama said, was subordinate, and, since NATO was now in command, the Libya war hardly qualified as a war. Congress was free to discuss the matter if it liked, and he would welcome its approval, but in his view he acted within his legal powers in giving the orders without approval.

Few members of Congress as yet hold a fully articulated objection to America’s wars in Asia and North Africa. But other causes in play may trouble the President’s determination to show his sympathy with the Arab Spring by military action in Libya. Obama has an unfortunate propensity to be specific when it would serve him well to avoid particulars, and to become vague at times when dates, names, numbers, or “a line in the sand” is what is needed to clarify a policy. On Libya, he was specific. He said the American commitment would last “days, not weeks.” It has now lasted a little under three months. Reliable reporters such as Anthony Shadid of The New York Times and Patrick Cockburn of The Independent have suggested that an end to the conflict is nowhere in sight.

The narrow aim of enforcing a “no-fly zone” to protect civilians, asserted by Susan Rice and Hillary Clinton as the limit of American aims, turns out to have been a wedge for an air war against Qaddafi, a war, in fact, as thorough as is compatible with avoidance of harm to civilians. The surest thing one can say about the end of this engagement is that the US—along with France, Great Britain, and perhaps also Italy, which arranged the intervention—will at some point install a client state and fit out a friendly government with a democratic constitution. Nothing about the war affords any insight into the intermediate calculations of Obama and his collaborators, Nicolas Sarkozy and David Cameron.

Obama was in Brasília on March 19 when he announced his authorization of “limited military action” in Libya. For that matter, he has been away from Washington for a large part of his two and a half years as president. This fact may be dwelt on excessively by his detractors, especially at Fox News, but its importance is scarcely acknowledged by his allies. (According to figures compiled at the end of 2010 by the CBS reporter Mark Knoller, Obama’s first twenty-three months in office saw seventy days on foreign trips and fifty-eight days on vacation trips.) He has gambled that it pays to present himself as a statesman above the scramble of something disagreeable called Washington.

Here he follows a path trodden by almost all his predecessors. Carter, Reagan, Clinton, and George W. Bush all affected the stance of outsider; only Bush Senior scorned to adopt the tactic (and could not have gotten away with it if he tried). Nor does taking such a position confer an automatic advantage. It worked well for Reagan until the Iran-contra scandal in 1986. Clinton was helped and hurt in about equal parts by the outsider pretense. For Carter and the younger Bush, it seems to have added to the impression of incompetence or disengagement. People came to think that there were things these men could have learned from Washington.

The anti-Washington tactic, and the extensive travel it licenses, have not worked well for Obama. He retains the wish to be seen as a man above party; and a more general distaste for politics is also involved. But what is Barack Obama if not a politician? By his tones of voice and selection of venues he has implied several possibilities: organizer, pastor, school principal, counselor on duties and values. Most prominently, over the past six months he seems to have improvised the role (from materials left behind by Reagan) of a kind of national host or “moderator” of the concerns of Americans. From mid-2009 through most of 2010, Obama embarked on solo missions to shape public opinion at town hall meetings and talk show bookings, but the preferred format now appears to be the craftily timed and planned and much-heralded ecumenical address. Obama’s televised speech on January 12 at the memorial service after the Tucson shooting was his first major venture on those lines. His speech on May 19 at the State Department was the second; and its announced subject was even more ambitious: the entire domain of US policy in the Middle East.

Being president of the world has sometimes seemed a job more agreeable to Barack Obama than being president of the United States. This goes with another predilection. Obama has always preferred the symbolic authority of the grand utterance to the actual authority of a directed policy: a policy fought for in particulars, carefully sustained, and traceable to his own intentions. The danger of the built-up speech venues—the Nobel Prize speech of December 2009 was another example—is that they cast Obama as the most famous holder-forth in the world, and yet it is never clear what follows for him from the fact that the world is listening. These settings make a president who is now more famous than popular seem not popular but galactic.

Read more:

Cowboys and Aliens

by Dennis Hartley

Ah, summer. The high season of high concept films, pitched to the Hollywood higher-ups by people who are really, really, high. Hey now! Consider Cowboys and Aliens, the newest film from Iron Man director Jon “Vegas, baby, Vegas” Favreau. The title is the pitch. That’s probably all it took: “Cowboys. Aliens. Daniel Craig. Harrison Ford.” And, BAM! Green-lighted. Done deal. It’s almost eloquent, in its masterful conceptual brevity. OK, there have been precedents, vis-à-vis the mash-up of the Old West with sci-fi. The Valley of Gwangi is one film that immediately springs to mind, a guilty pleasure from 1969 that featured cowpokes wranglin’ a purple stop-motion T. Rex (Barney with teeth!) for a Mexican circus. Gene Autry’s Phantom Empire movie serial, which has the Singing Cowboy mixing it up with robots and denizens hailing from the underground city of ‘Murania’ (Queen Tika!), dates all the way back to the 1930s. Back to the Future, Part III would fit in that theme park. Westworld and The Adventures of Buckaroo Banzai Across the 8th Dimension sort of count. And then there’s…well, others. It would be cheating to include TV, so I won’t mention The Wild, Wild West, the odd Twilight Zone or Star Trek episode, or “Gunmen of the Apocalypse” (Best.Red.Dwarf.Episode.Ever.).

The film opens, appropriately enough, with a Mystery. Actually, it opens kind of like Hangover 3. A rangy 1870s gunslinger (Daniel Craig) wakes up in the middle of the Arizona desert with a cauterized wound, an empty holster, a non-removable, anachronistic hi-tech device affixed to his wrist…and amnesia. An absence of empty tequila bottles in the immediate vicinity would appear to indicate that there could be an interesting story behind all this. He isn’t given too much time to ponder, as he (Jake, we’ll call him) is soon set upon by some gamey ruffians with human scalps hanging from their saddles. Sizing up his wound and assuming his unusual bracelet is a kind of shackle, the boys figure Jake might be worth reward money (not only do these fellers spout authentic Western gibberish, but they ain’t none too bright). Imagine their surprise (and Jake’s) when he instinctively springs into action and expertly takes ‘em all out, Jason Bourne style. So we (and Jake) have discovered one thing right off the bat: he’s a badass.


Source:  Misplaced

Ronnie Earl and the Broadcasters


[Great acoustic performance.  Turn it up.]

The Sewers I Swim In


I've seen lots of arguments about why reducing the deficit right now would bring crisis to the economy. Most of them are very textbook Keynesian arguments arguing that at times of excess capacity, reducing deficit spending would just add headwinds to an already struggling economy. The other argument is that the US should take advantage of exceptionally low borrowing rates to invest in rapidly aging infrastructure and put Americans back to work using a sort of New Deal 2.0 scheme.

The first argument is a bird's-eye solution to a ground-level problem. Yes, government spending would goose GDP, but is that spending creating wealth? Where is that "stimulus" going? Our goal, after all, is not to maximize GDP, but to maximize wealth. GDP is just a poor objective measure for a deeply subjective phenomenon, and gaming our own framework won't help anyone, regardless of what numbers the BLS, BEA and FRB release over the upcoming months. And let's not forget that Washington has a very poor track record as an allocator of capital. I'm simply not comfortable leaving these decisions up to the people who decided to try to reflate the bubble by pulling forward demand, subsidizing toy arrows and foreign liquor, and building useless airports. Just sayin'.

But does this mean we should address the crisis with full-throttle austerity? Not quite. As was eloquently pointed out last summer on interfluidity, austerity is stupid and deficits are dangerous. We can't make generalizations about debt, deficits or balanced budgets. Deficits and debt are neither good nor bad on their own. Leveraging up for wealth-creating projects is good; borrowing to throw money away shoveling sand from one pile to the other, not so much. Washington is focusing on abstract goals like "putting real Americans to work." And one can't blame them, because that's what people want: jobs. But "jobs" isn't something you can simply create from thin air; you can't just throw money at this problem and expect to fix it. "Jobs bills" and "improving America" are nebulous ideas, subject to interpretation without any objective way to measure success or failure, which is probably what Washington wants.

"Well, fine, but what do you suggest then?" you may be asking yourself. I just want to say one word to you. Just one word. Sewage. We've spent the better part of the last 10,000 years trying to secure sources of clean water and get rid of waste. Humanity has developed modern plumbing and sanitary sewers. We survived the Great Stink of 1858. We've battled epidemics of water-borne disease, droughts and floods.  I feel comfortable in making the broad statement that clean water is good and shitty water is bad. Therefore, one could expect that making something good out of something bad would be a positive thing, an improvement, a wealth-creating action. If you disagree, feel free to stop reading now.

Read more:

BP 'Stranglehold' Over Iraq

by Terry Macalister


BP has been accused of taking a "stranglehold" on the Iraqi economy after the Baghdad government agreed to pay the British firm even when oil is not being produced by the Rumaila field, confidential documents reveal.

The original deal for operating Iraq's largest field – half as big as the entire North Sea – has been rewritten so that BP will be immediately compensated for civil disruption or government decisions to cut production.

This potentially could influence the policy decisions made by Iraq in relation to the Opec oil cartel, and is a major step away from the original terms of an auction deal signed in the summer of 2009, critics claim.

"Iraq's oil auctions were portrayed as a model of transparency and a negotiating victory for the Iraqi government," said Greg Muttitt, author of Fuel on the Fire: Oil and Politics in Occupied Iraq. "Now we see the reality was the opposite: a backroom deal that gave BP a stranglehold on the Iraqi economy, and even influence over the decisions of Opec."

Read more:

Know Thyself: Easier Said Than Done

by Nicholas Humphrey

A few days before a review of my latest book appeared in these pages, I wrote to my editor, saying I had seen an advance copy and how much I liked the color illustration of the yellow moon. He replied that I must be mistaken, since the Book Review doesn’t use color. The next weekend he wrote to say he couldn’t think what had come over him — he reads the Book Review every week, and had somehow not noticed the color. Odd. And yet these lapses can happen to the best of us. Ask yourself what the Roman numeral four on the face of the church clock looks like. Most people will answer it looks like IV, but almost certainly the truth is it looks like IIII.

Why are we so bad at knowing — in this case remembering — what passes through our own minds? The philosopher Eric Schwitzgebel, in “Perplexities of Consciousness,” contends that our minds, rather than being open-access, are largely hidden territory. Despite what we believe about our powers of introspection, the reality is that we know awfully little about what our conscious experience amounts to. Even when reporting current experience, we make divergent, confused and even contradictory claims about what it’s like to be on the inside.

...He begins with the curious case of color in dreams. When people today are asked whether they regularly dream in color, most say they do. But it was not always so. Back in the 1950s most said they dreamed in black and white. Presumably it can hardly be true that our grandparents had different brains that systematically left out the color we put in today. So this must be a matter of interpretation. Yet why such freedom about assigning color? Well, try this for an answer. Suppose that, not knowing quite what dreams are like, we tend to assume they must be like photographs or movies — pictures in the head. Then, when asked whether we dream in color we reach for the most readily available pictorial analogy. Understandably, 60 years ago this might have been black-and-white movies, while for most of us today it is the color version. But, here’s the thing: Neither analogy is necessarily the “right” one. Dreams don’t have to be pictures of any kind at all. They could be simply thoughts — and thoughts, even thoughts about color, are neither colored nor non-colored in themselves.

Read more:

Black Box

by  Jerry Adler

They were guarded by silent corpses, the passengers and crew of an Airbus A330 that plummeted to the bottom of the Atlantic in June 2009. For nearly two years, the boxes -- not black, actually, but bright orange -- had lain amid some of the most rugged undersea terrain in the world, 3,500-metre-high mountains rising from the ocean floor, covered with landslides and steep scarps.

Until May when an advanced robotic submersible, the Remora 6000, brought the two black boxes from Air France flight 447 to the surface, they were among the world's most sought-after artefacts, the keys to understanding why a state-of-the-art wide-body jet fell out of the sky on a routine flight from Rio de Janeiro to Paris, killing all 228 aboard. Since no one knew the exact coordinates of the crash, the searchers had to extrapolate their grid from the plane's last known location. It took a team led by the king of undersea searchers, Dave Gallo of the Woods Hole Oceanographic Institution in Massachusetts, to find the wreckage; Phoenix International, a deepwater recovery company, finally brought the recorders home. Why did it take so long? "You can find a needle in a haystack," Gallo says, "but you have to find the haystack first."

French accident investigators removed the memory cards, carefully dried them, plugged in the right cables, and soon announced that the boxes had preserved nearly all the data they had captured -- two hours of audio recorded from the cockpit and a complete record of thousands of measurements taken between takeoff and the moment the Airbus crashed. It was regarded, rightly, as a technological triumph. Although voice and data recorders are built to withstand the most extreme conditions of shock, fire and pressure -- they get fired from an air cannon as part of the testing regimen -- they are not designed to preserve data for so long at such depths. The black boxes, built by Honeywell, had greatly exceeded their specifications.

But this elaborate and expensive undersea search could have been avoided; the technology has long existed that could make the recorders obsolete. As the Bureau d'Enquêtes et d'Analyses (BEA), the French agency that investigates air accidents, struggled to explain the crash in two inconclusive interim reports in 2009, the question was already being asked: if real-time stock quotes can be transmitted to anyone with a smartphone, why does the vital work of investigating an aeroplane crash still depend on reading physical memory chips that must be rescued from the wreckage?

Read more:
Joan Miro, Still Life with Old Shoe.
via:

Timeshare Wars

by Marilyn W. Thompson

Hollywood couldn’t create a more perfect movie setting than Sedona, Ariz., with its craggy red rocks and all those junipers. So while vacationing with my film-obsessed son, I thought it only natural to stroll into the free-admission Sedona Motion Picture Museum.

Housed in a storefront on the city’s main shopping strip, the museum seemed a bit “lame,” as my son bluntly put it. Its collection consisted of framed photos of bygone westerns, with stars such as Jimmy Stewart and John Wayne, filmed against the area’s stunning scenery. Within 15 minutes, we had seen everything and were ready to head to the nearby Cowboy Club to sample cactus fries.

But a museum attendant stopped us and soon revealed the true purpose of this pseudo-attraction. The museum was partly a marketing device to entice tourists to timeshare pitches at a 14-year-old resort development affiliated with RCI, one of America’s largest vacation ownership exchange companies. In the timeshare trade, the attendant is known as a “tour generation representative,” earning commissions for making “off-premises contacts” with potential buyers. Like slow-witted sheep, my son and I had walked clear-eyed into a booby trap.

What evolved over the next few days was a revealing look at the hard-core salesmanship of timeshare developers and, by extension, of the companies they contract to provide exchange services for buyers — RCI and its principal timeshare rival, the publicly traded leisure company Interval International. Anyone who has ever owned a timeshare has experienced the relentless push during precious relaxation time to persuade you to invest in more weeks, or more “points,” at more resorts in more locations. It’s an oft-repeated ritual whenever you check in for a timeshare swap: groggy travelers presented with “invitations” for timeshare previews within minutes of getting the keys to their rooms.

Read more: 

image credit:

Read My Lips

[ed.  I'm no fan of taxes, but I understand their necessity.  It's what government does with those dollars (and who's exempt from paying them) that I find most frustrating.  Mulish stubbornness promoting simple black-and-white solutions to complex problems will never get my vote.  Put it this way:  if you had two applicants for a job, which would you hire, the one who believes in the company, or the one who wants to starve it to its core?  Developing efficient policies that benefit the entire country (not just corporations and the wealthy) should be the issue, not fundamental revenue generation.]

by  Steven Mufson

A scorching summer. A struggling economy. A stalemate in budget talks. A Republican leader reluctant to break his anti-tax pledge. Democrats balking at spending cuts. A proposal for a balanced budget amendment.

It was 1990, the year Congress passed one of the biggest deficit-reduction packages in American history. But before it was cemented into law, the country endured months of bickering and brinksmanship. Sound familiar?

By some measures, the 1990 budget deal was a success: It helped shrink the deficit, then at 5 percent of gross domestic product, by $492 billion — $850 billion in today’s dollars — over just five years. And it passed with support from both parties.  But in other ways, the 1990 budget deal set the stage for today’s fiscal deadlock. At the center of it all was the Dirty Harry-style pledge that President George H.W. Bush had issued during his 1988 presidential campaign — “Read my lips: No new taxes.” Although an agreement was eventually reached that raised taxes and cut spending, many Republican lawmakers thought the deal and its aftermath proved the folly of compromise.

“The 1990 budget agreement was real bloodshed. It was a civil war within the party,” says John Feehery, who worked for Republican former congressmen Tom DeLay (Tex.), J. Dennis Hastert (Ill.) and Robert H. Michel (Ill.), who was at the center of the 1990 dealmaking. “We’re still living in the world of that agreement. That’s when it became really radioactive to vote for tax increases.”

Read more:

----------------------

by  Frank Bruni

WHAT does the face of antitax absolutism look like?

It has a tentative beard, more shadow than shag, like an awkward way station on the road from callow to professorial. It wears blunt glasses over narrowed eyes that glint mischievously, and its mouth is rarely still, because there’s no end to the jeremiads pouring forth: about the peril of Obama, the profligacy of Democrats and the paramount importance of opposing all tax increases, even ones that close the loopiest of loopholes.

It belongs to Grover Norquist, and if you hadn’t seen it before, you probably spotted it last week, as he pinged from CNN to MSNBC to Fox, reveling in the solidarity Republicans had shown against any new revenue. The country was lurching toward a possible default, but Norquist was riding high. In between television appointments on Thursday, he met me for breakfast near Times Square.

As he walked in and sat down he was sermonizing. As he got up and left an hour later he was still going strong. He seems to live his whole life in midsentence and takes few detectable breaths, his zeal boundless and his catechism changeless: Washington is an indiscriminate glutton, and extra taxes are like excess calories, sure to bloat the Beast.

...It’s the group Norquist runs, Americans for Tax Reform, that has been pressing politicians for decades to sign a pledge not to vote for any net tax increase under any circumstances. All but 6 of the 240 Republicans in the House, along with two Democrats, have done so.

...But vanity is too commonplace inside the Beltway to be troubling. What’s alarming about Norquist and the pledge mentality, which has spread to other causes and other points of the political spectrum, is their promotion of the idea that political rigidity is to be prized above all else. That purity is king. Such a theology precludes nimbleness and compromise, which are not only the hallmarks of maturity but also the essence of sane government.

Read more:

Ani DiFranco




Saturday, July 30, 2011

Scientific Tuesdays


[ed.  Cool project to try with your kids or grandchildren this weekend.]

The Roach in Repose

[ed.  The story about Searching (a couple posts down) got me thinking about something that's always been a mystery to me.  In Hawaii there are dead cockroaches everywhere.  In fact, it's an odd day when you don't find at least a couple, somewhere.  But the thing is, a lot of them aren't really that old.  You see all age classes, and most of the deceased look relatively serviceable (in terms of their condition) except for one small detail.  So what's up with that?  Parasites, territorial disputes, starvation (hard to believe), incidental poisonings, premature old age, cockroach AIDS?  I have no idea, and, surprisingly, can't seem to find much clear information anywhere.  But now I do know why cockroaches die on their backs.]

Dear Cecil:

While working part-time in the food service at USC, I had the opportunity to see thousands of dead cockroaches. One thing about these roaches intrigues me: why did they all die on their backs? Is it programmed into their tiny little genes, or do they do it just to bug us?


— Leslie, University of Southern California, Los Angeles

Dear Leslie:

Frankly, if I saw thousands of dead cockroaches at the food service where I went to school, I'd have other things on my mind than why they all died on their backs. Besides, they don't always die that way--basically it depends on how the little scumbags happen to meet their Maker. I've been discussing the subject with the crack bug scientists at some of the nation's leading institutions of higher learning, and we've formulated the following Roach Mortality Scenarios, which represent a major step forward in our understanding of roach postmortem positioning:

(1) Roach has heart attack while crawling on the wall. OK, so maybe roaches don't have heart attacks. Just suppose the roach croaks somehow and tumbles earthward. The aerodynamics of the roach corpse (smooth on the back, or wing side; irregular on the front, or leg side) are such that the critter will tend to land on its back. Or so goes the theory. Admittedly the study of bug airfoil characteristics is not as advanced as it might be.

(2) Roach desiccates, i.e., dries out, after the manner of Gloria Vanderbilt. This is what happens when you use Cecil's Guaranteed Roach Assassination Technique, described elsewhere in this archive. The roach saunters carelessly through the lethal borax crystals, causing him to lose precious bodily fluids and eventually die. Since this process is gradual, it may happen that the roach simply conks out and dies on its belly.

(3) Roach dies after ingesting potent neurotoxins, e.g., Diet Coke, some traditional bug poison like pyrethrum, or the food served at USC cafeterias. Neurotoxins cause the roach to twitch itself to death, in the course of which it will frequently kick over on its back, there to flail helplessly until the end comes. No doubt this accounts for the supine position of the deceased cockroaches you observed.

One unresolved issue. Having seen thousands of dead roaches, did it occur to you to avail yourself of, say, a broom?

— Cecil Adams

via:
image:

Medium Chill

by David Roberts
 
Mother Jones has an interesting package up called "Speedup: Working More, Making Less." In the lead story, editors Monika Bauerlein and Clara Jeffery chronicle the harrowing pace of modern work life: the long hours and hairy commutes, multitasking and endless accessibility, the sense of always being busy, always falling behind, always doing a crappy job on both sides of the work/life ledger.

Bauerlein and Jeffery discuss the phenomenon mainly in terms of external forces acting on workers -- a system of laws and regulations comprehensively biased in favor of employers. And that's where the main focus should be; policy changes are the stuff of organizing and politics. But it reminded me I've been meaning to write something about the other side, the internal forces impelling us to work harder and harder. We are being driven, but we are also driving ourselves. Finding saner, happier, more sustainable lives will involve addressing both sides of the equation.

About a year ago, I was visiting with an old friend of mine who lives in Portland now. He's helping to run a tech startup, working 80-hour weeks, half that on the road, with barely enough time at home to maintain a relationship with his dog, much less a romance. The goal, he said, is to grow like crazy, get bought out by Google, and retire at 40. "It's the big chill, man!" (No, Boomers, not the movie.)

I shook my head and laughed. "I'll take the medium chill!"

Ever since then I've been mulling that concept over. By way of approaching it, I'm going to talk a little about personal experience, so if that kind of thing bugs you, skip on down, there's some social science geekery below.

Personal chill

"Medium chill" has become something of a slogan for my wife and me. (We might make t-shirts.) We're coming up on 10 years married now, but we recognized our mutual love of medium chill within weeks of meeting, about the time we found ourselves on her couch watching scratchy bootleg VHS tapes of The Sopranos I ordered off eBay, drinking Two Buck Chuck, and loving life. We just never knew what to call it.

We now have a smallish house in a nondescript working-class Seattle neighborhood with no sidewalks. We have one car, a battered old minivan with a large dent on one side where you have to bang it with your hip to make the door shut. Our boys go to public schools. Our jobs pay enough to support our lifestyle, mostly anyway. If we wanted, we could both do the "next thing" on our respective career paths. She could move to a bigger company. I could freelance more, angle to write for bigger publications, write a book, hire a publicist, whatever. We could try to make more money. Then we could fix the water pressure in our shower, redo the back patio, get a second car, or hell, buy a bigger house closer in to town. Maybe get the kids in private schools. All that stuff people with more money than us do.

But ... meh. It's not that we don't think about those things. The water pressure thing drives me batty. Fact is, we just don't want to work that hard! We already work harder than we feel like working. We enjoy having time to lie around in the living room with the kids, reading. We like to watch a little TV after the kids are in bed. We like going to the park and visits with friends and low-key vacations and generally relaxing. Going further down our respective career paths would likely mean more work, greater responsibilities, higher stress, and less time to lie around the living room with the kids.

So why do it? There will always be a More and Better just beyond our reach, no matter how high we climb. We could always have a little more money and a few more choices. But as we see it, we don't need to work harder to get more money and more choices, because we already made our choice. We chose our family and our friends and our place. Like any life, ours comes with trade-offs, but on balance it's a good life, we've already got it, and we're damn well going to enjoy it.

Read more:

image credit:

Searching

by Nishant Batsha


Type “why am I” into a Google search and autocomplete will suggest “why am I here?” Type “why did” and you’ll find “why did I get married?” These questions seem so hackneyed, the kind of generic lamentations you might hear in a bad movie. And yet, Google’s autocomplete algorithm insists that searches relating to marital strife and existence are, in fact, incredibly common. This has led me to wonder again and again: has Google become one of our expressions of existential moaning?

Outside of the confines of autocomplete, we generally know very little about each other’s online searches (although blog metrics can provide surprising—and sometimes bizarre—insights). But back in 2006, AOL’s research division released a text file containing 20 million searches conducted by 650,000 users over a three-month period. While the furor surrounding the privacy implications of the release ultimately led AOL to remove the file from its website and issue an apology, the document remains easily downloadable.

If Google's autocomplete gives us a broad picture of how people use search engines in times of crisis, these AOL logs provide much more detailed case histories. Take this series from User #71845: “Why do men have online affairs,” “sin to feel pleasure when other hurt,” “why do women accept infedelty." From User #2413067: “why can’t I save money”. User #2446971: “why didn’t mom want me to get married,” “my ex husband is dying and I would like to speak with him,” “why are you so cold to me on mother’s day,” “why are the men in my life including my son emotionally beating me.” User #3898228: “in speculation the worst feeling in the world is the dawning of realization. when you wake up and realize that nothing is as it should be and everything is wrong. when you wake up and put your situations into words and want to cry. nothing is right.” (Surprisingly, this isn't a song lyric or a poem, but instead seems to mark a sort of meta-realization: the realization about the dawning of realizations). User #4553622: “why am I not lucky,” followed up with “who are the lucky people." All of these were turned up from a brief search of one of ten logs, using “why” as my ctrl+f.
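
Reproducing that ctrl+f programmatically is straightforward. Below is a small sketch, assuming the tab-separated layout of the released logs (AnonID, Query, QueryTime, ItemRank, ClickURL); the file name is illustrative, not a guaranteed path.

```python
from collections import defaultdict
import csv

# Scan one of the ten log files for queries beginning with "why",
# grouped by user. Column names and the file name follow the layout
# assumed above; adjust them if your copy of the release differs.
why_queries = defaultdict(list)
with open('user-ct-test-collection-01.txt', newline='') as f:
    for row in csv.DictReader(f, delimiter='\t'):
        query = (row['Query'] or '').strip().lower()
        if query.startswith('why'):
            why_queries[row['AnonID']].append(query)

# Print a few case histories in the style quoted above.
for user, queries in list(why_queries.items())[:5]:
    print(f'User #{user}:', ' | '.join(queries))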

It would be easy to dismiss this as the wackiness of AOL users, but I’ve found that in certain moments, either when it’s too late, or I’m too tired, or I can’t quite muster the nerve to click anyone’s name on my Gchat list, I end up typing into a search engine the particular crisis that confronts me. Questions that are self-consciously academic (why is it that emptiness tends to co-exist with our late capitalism?) or simply existential (why is this all so meaningless?) appear in the search box. I’ve confessed this habit to friends, who at first tend to label it as just another idiosyncrasy. After a few drinks, the confession surfaces: they too find themselves seeking solace in Google from time to time. To borrow a turn of phrase from Søren Kierkegaard, we all seem to be suffering the Sickness Unto Search. We are Existential Googlers.

But why? There’s no denying the fact that the Internet is a Big Place. At the 2010 Techonomy Conference, Eric Schmidt, the CEO of Google, repeated what had been kicking around Google’s press releases for a few years: the size of the Internet was about five million terabytes (5.12 billion gigabytes), with Google indexing about 205,000 gigabytes of that information (a measurement from the Dawn of Civilization until 2003—it’s unclear whether Google just stopped keeping track after that). It should come as no surprise then that, with this amount of data sitting on our desktops, nestled away in our pockets, we channel our existential angst through the search box, believing that somewhere in that tangle of information must be stored some crucial piece of advice.

Or maybe we're not even looking for advice. Maybe we're looking for company. Writing this, I looked up to find the carcass of a dead yellowjacket on my windowsill (bear with me for a moment; I promise I’m not suffering from Diminished Attention Span, another Web Disorder). The stark un-digital life and death of the insect caught my attention. This dead bug was unfortunately not alone. If you were to look out my window and glance towards the uppermost outer ledge, you would find a parade of yellowjackets flying to and fro; they have a small nest in a crevice in the bricks above. Sometimes one somehow slips through the small crack of an open space and makes its way inside. Here, it becomes frantic, desperate. In its longing to escape from its newfound prison, it begins to ram its head, over and over again, into the glass, as if hoping to make the window give way by sheer force of will. If I'm out of the room when this happens, I will find its still body later. This one's analog death brought me back towards humanity. What kind of loneliness does one feel in that kind of isolation?

Most of us don’t need to think about yellowjackets in order to arrive at this question; most of us experience isolation on a regular basis. Blame it on the cruelties of modernity, the banalities of existence, or the Xanax prescription running out, this kind of capital-L “what am I doing in this world” loneliness is pretty common. And we all respond to it in different ways: exercise, chain-smoking, incessant status-updates on Facebook, alcohol, blogging. Despite all the advances in personal technology that seek to link us with each other in a myriad of ways, we are continually confronted with chasms of emptiness. And yet, we return to technology, seeking to find answers in the network that lay behind the search box.

It might be useful if I were able to use these searches to develop some sort of grand unified theory of our collective post-modern psyche. But instead I can only grapple with why I keep entering my questions into the engine. I know that when I do, I'm never quite looking for an answer. Instead, I tend to hope for a brief moment of catharsis. I’ll scan a few search results, sure; maybe I’ll even browse a link or two. But nothing is more satisfying than that moment of Existential Googling and pressing enter. Emotion, search, relief.

I can only speak for myself here; I don't know the reasons for other people's searches, just as I don't know why User #2446971 spent, according to her search timestamps, a sizeable portion of her Mother’s Day asking a data-mining algorithm why her son has abandoned her. Perhaps this is simply another iteration of calling out into the dark, whispering prayers on bended knee, or lying under the stars. The singing of psalms, the singing of qawwali. Augustine and his pears, Sartre and his nausea. A teleology of adaptation, a continual movement of our despair.

I keep thinking of the yellowjackets on my windowsill. They too exist in networks of kith, kin and information. And I can’t help but wonder if, in its last moments, the dying creature secreted forlorn pheromones into the unusually still air of my room. I can only imagine its search query: how did I get into this prison? Why am I here?

via:

A Daunting Path to Prosperity

by Liz Alderman


PISA, Italy — Six years ago, the Swedish retail giant Ikea planned a 60-million-euro megastore just a few miles from where the Tower of Pisa leans into the earth. Backers said the huge construction project, new roads and wave of shoppers would bring hundreds of sorely needed jobs to this bucolic corner of Tuscany.

But things got tangled — as they often do in Italy, where bureaucracy and politics can easily overwhelm economics.

Each application that Ikea filed seemed to require yet another. Each mandatory impact study begat the next. By May, when a local mayor had still not decided whether the company could get a building permit, Ikea put out word it would abandon the plan.

As Italy teeters on the edge of the European debt crisis, it can ill afford more debacles like that one. Otherwise, despite having the world’s seventh-largest economy, Italy may have little hope of outgrowing the staggering debt load that could threaten its financial future — and that of the euro monetary union.

Already, investors seem skeptical about whether Italy and other debt-saddled European countries can right themselves, despite the financial rescue plan for Greece that Europe’s leaders agreed to last week.

On Thursday, Italy’s borrowing costs jumped almost a full percentage point at an auction of 10-year bonds, compared with just one month ago. At 5.77 percent, the interest rate was more than twice what financially buoyant Germany must pay on bonds of the same maturity. As higher interest rates make it even harder for Italy to reduce its debt, the main recourse would seem to be faster growth.

“This is the only major issue for Italy now — to resume growth,” said Francesco Giavazzi, an economics professor at Bocconi University and a research fellow at the Center for Economic Policy Research in London.

Italy must not only encourage big corporate investments like the Ikea project, experts say, but it must also remove impediments that stifle growth in the thousands of small and medium-size companies that make up the backbone of its economy.

One small-business man, Mauro Pelatti, says he has given up on expanding his business in Florence, an hour east of here. “Bureaucracy is so strong, and taxes are so high, that it’s virtually impossible,” said Mr. Pelatti, whose privately held company, Omap, makes parts for steel-stamping machines used on products like Vespa scooters.

Italy’s economy experienced paltry growth starting in the late 1990s, when the country’s manufacturing was overtaken by competitors in Asia. Then came the global financial crisis in 2007, which shrank Italy’s economy by more than 6 percent.

Growth has resumed, but the International Monetary Fund predicts “another decade of stagnation,” with Italy’s gross domestic product expanding by only about 1.4 percent annually in the next few years. (The German economy, Europe’s growth leader, grew 3.5 percent in 2010 and 1.5 percent in the first quarter compared with the same period a year ago.)

Hindering growth is Italy’s heavy government debt, which at 119 percent of gross domestic product is second only to Greece’s among euro zone members. Although it has run a budget surplus, minus debt costs, for several years and recently passed a 48 billion euro deficit-reduction plan, the Italian government now spends 16 percent of its budget on interest payments — a bill that will rise if investors and creditors continue to fear that Italy cannot escape Europe’s debt crisis.

Currently, the amount of Italy’s debt held by foreigners — nearly 800 billion euros — is more than that of Greece, Ireland and Portugal combined. Should Italy stumble, the aftershocks would be more disruptive than anything the euro zone has felt so far in the crisis.

Read more:

Steve Martin

Flying Lanterns of Poznan



On the first day of summer, June 21, flying lanterns floated over Poznań: 11,439 paper lanterns were released skyward to beat the Polish record.

The first day of summer is Midsummer Night, commonly called St. John's Night, the shortest night of the entire calendar year. It is also a festival of fire, water, sun and moon, harvest, fertility, joy, and above all love!

via:

The Cults



A Drug for Down Syndrome

by Dan Hurley

Early in the evening of June 25, 1995, hours after the birth of his first and only child, the course of Dr. Alberto Costa’s life and work took an abrupt turn. Still recovering from a traumatic delivery that required an emergency Caesarean section, Costa’s wife, Daisy, lay in bed, groggy from sedation. Into their dimly lighted room at Methodist Hospital in Houston walked the clinical geneticist. He took Costa aside to deliver some unfortunate news. The baby girl, he said, appeared to have Down syndrome, the most common genetic cause of cognitive disabilities, or what used to be called “mental retardation.”

Costa, himself a physician and neuroscientist, had only a basic knowledge of Down syndrome. Yet there in the hospital room, he debated the diagnosis with the geneticist. The baby’s heart did not have any of the defects often associated with Down syndrome, he argued, and her head circumference was normal. She just didn’t look like a typical Down syndrome baby. And after all, it would take a couple weeks before a definitive examination would show whether she had been born with three copies of all or most of the genes on the 21st chromosome, instead of the usual two.

Costa had dreamed that a child of his might grow up to become a mathematician. He had even prevailed upon Daisy to name their daughter Tyche, after the Greek goddess of fortune or chance, and in honor of the Renaissance astronomer Tycho Brahe. Now he asked the geneticist what the chances were that Tyche (pronounced Tishy) really had Down syndrome.

“In my experience,” he said, “close to a hundred percent.”

Costa and his wife had been trying to have a baby for a couple of years. Daisy’s first pregnancy ended in a miscarriage, which they knew can occur because of a genetic disorder in the fetus. When Daisy became pregnant a second time, Costa insisted they get a chorionic villus sampling, an invasive prenatal genetic test. But the procedure caused a miscarriage. (The test showed that the fetus was genetically normal.) Costa vowed that if there was a third pregnancy — this one — they would conduct no prenatal tests.

Now, with Tyche bundled peacefully in a bassinet at the foot of Daisy’s bed, and Daisy asleep, Costa sat up through most of the night crying. He had gone into the research side of medicine in part to avoid scenes like this — parents devastated by a diagnosis. But by morning, he found himself doing what any father of a newborn might: hovering by the crib, holding his daughter’s hand and marveling at her beauty.

“From that day, we bonded immediately,” he told me during one of our many talks over the last year. “All I could think is, She’s my baby, she’s a lovely girl and what can I do to help her? Obviously I was a physician and a neuroscientist who studies the brain. Here was this new life in front of me and holding my finger and looking straight in my eyes. How could I not think in terms of helping that kid?”

With no experience in the study of Down syndrome, Costa took a short walk the next day to a library affiliated with Baylor College of Medicine, where he worked as a research associate in neuroscience. Reading the latest studies, he learned that the prognosis was not nearly as dire as it was once considered. Life expectancies had grown, education reforms had produced marked gains in functioning and — of particular interest to Costa — a mouse model of the disorder had recently been developed, opening the door to experimentation. He soon made a decision: he would devote himself to the study of Down syndrome.

In 2006, using mice with the equivalent of Down syndrome, Costa published one of the first studies ever to show that a drug could normalize the growth and survival of new brain cells in the hippocampus, a structure deep within the brain that is essential for memory and spatial navigation. In people with Down syndrome, the slower pace of neuron growth in the hippocampus is suspected to play a key role in cognitive deficits. Follow-up studies by other researchers reached conflicting results as to whether the drug Costa had tested, the antidepressant Prozac, could produce practical gains on learning tests to match its ability to boost brain-cell growth. Undeterred, Costa moved on to another treatment strategy. In 2007 he published a study that showed that giving mice with Down syndrome the Alzheimer’s drug memantine could improve their memory.

Now Costa has taken the next step: he is completing the first randomized clinical trial ever to take a drug that worked in mice with Down and apply it to humans with the disease, a milestone in the history of Down-syndrome research.

“This was a disorder for which it was believed there was no hope, no treatment, and people thought, Why waste your time?” says Craig C. Garner, a professor of psychiatry and behavioral sciences and co-director of the Center for Research and Treatment of Down Syndrome at Stanford University. “The last 10 years have seen a revolution in neuroscience, so that we now realize that the brain is amazingly plastic, very flexible, and systems can be repaired.”

But the effects of that revolution on Down research may yet be cut short. A competing set of scientists are on the cusp of achieving an entirely different kind of medical response to Down syndrome: rather than treat it, they promise to prevent it. They have developed noninvasive, prenatal blood tests which would allow for routine testing for Down syndrome in the first trimester of a pregnancy, raising the specter that many more parents would terminate an affected pregnancy. Some predict that one of the new tests could be available to the public within the year.

Costa, like others working on drug treatments, fears that the imminent approval of those tests might undercut support for treatment research, and even raises the possibility that children like Tyche will be among the last of a generation to be born with Down syndrome.

“It’s like we’re in a race against the people who are promoting those early screening methods,” Costa, who is 48, told me. “These tests are going to be quite accessible. At that point, one would expect a precipitous drop in the rate of birth of children with Down syndrome. If we’re not quick enough to offer alternatives, this field might collapse.”

Read more:

Genki Rockets


Sensory Trip to a Paradise on Screen

by Seth Schiesel


Spectral whales undulate through ethereal mist, their hides speckled with winking gems that pulsate to a throbbing bass line. Gleaming motes of color spin and coalesce into flowers, layer upon layer of shimmering petals. A girl named Lumi, born in outer space and transported to cyberspace, beckons you to rescue her digital consciousness.

Sounds pretty trippy, huh?

This is Child of Eden, the luminous new game from the Japanese auteur Tetsuya Mizuguchi and one of the most inspirational exhibits of artistry to be found in interactive entertainment today. The game — available now for the Xbox 360 and scheduled to arrive for the PlayStation 3 in September — was developed by Q Entertainment (of which Mr. Mizuguchi was a founder) and published by Ubisoft.

In the quest for commercial success, so many games (like so much of any medium) end up similar. The formulas are known, the structures accepted. For many top games the question becomes how well they fulfill and execute the basic template of their genre.

Child of Eden is an example of what can happen when creativity is liberated from the bounds of convention. It hews to only the most basic form of an arcade-style shooting game (stuff is whirling around on a screen; shoot it), perhaps in the way that even rebellious painters hew to the convention of stretching canvas across a wooden frame. From there Mr. Mizuguchi goes wild, integrating music, sound and the player’s own physical movement into a full-body experience.

And that is because Child of Eden makes the best use yet of the new Kinect system for the Xbox 360. Kinect, introduced by Microsoft last fall, does away with the video game controller altogether. Using advanced technology and software, the Kinect sensor, which sits under your television, can see your body in three dimensions and recognize your voice. So in all sorts of games, you just lean if you want your character to lean. If you want it to jump, you jump, and so on.

Read more:

Useless Studies, Real Harm

by Carl Elliott

Last month, the Archives of Internal Medicine published a scathing reassessment of a 12-year-old research study of Neurontin, a seizure drug made by Pfizer. The study, which had included more than 2,700 subjects and was carried out by Parke-Davis (now part of Pfizer), was notable for how poorly it was conducted. The investigators were inexperienced and untrained, and the design of the study was so flawed it generated few if any useful conclusions. Even more alarming, 11 patients in the study died and 73 more experienced “serious adverse events.” Yet there have been few headlines, no demands for sanctions or apologies, no national bioethics commissions pledging to investigate. Why not?

One reason is that the study was not quite what it seemed. It looked like a clinical trial, but as litigation documents have shown, it was actually a marketing device known as a “seeding trial.” The purpose of seeding trials is not to advance research but to make doctors familiar with a new drug.

In a typical seeding trial, a pharmaceutical company will identify several hundred doctors and invite them to take part in a research study. Often the doctors are paid for each subject they recruit. As the trial proceeds, the doctors gradually get to know the drug, making them more likely to prescribe it later.

In an age of for-profit clinical research, this is the new face of scandal. Pharmaceutical companies promote their drugs with pseudo-studies that have little if any scientific merit, and patients naïvely sign up, unaware of the ways in which they are being used. Nobody really knows how often companies conduct such trials, but they appear with alarming regularity in pharmaceutical marketing documents. In the marketing plan for the antidepressant Lexapro for the 2004 fiscal year, Forest Laboratories described 102 Phase IV trials — the classification under which seeding trials fall — in a section labeled “Marketing Tactics.”

Oversight bodies like the Food and Drug Administration generally don’t view seeding trials as research scandals: seeding trials are not illegal, and the drugs in question have already received F.D.A. approval. But even after particularly egregious seeding trials have been exposed, the F.D.A. has not issued sanctions. Take the notorious Advantage study, a seeding trial of the pain reliever Vioxx conducted by Merck. According to a 2008 report in the Annals of Internal Medicine, litigation documents show that the Advantage study was conceived and managed by Merck’s marketing department. Three subjects died in the Advantage trial; five more subjects experienced heart attacks. Oversight bodies should treat the Advantage study as a violation of research ethics.

Read more:

Friday, July 29, 2011

Depeche Mode


How Google Dominates Us

by James Gleick

Tweets Alain de Botton, philosopher, author, and now online aphorist:
The logical conclusion of our relationship to computers: expectantly to type “what is the meaning of my life” into Google.
You can do this, of course. Type “what is th” and faster than you can find the e, Google is sending choices back at you: what is the cloud? what is the mean? what is the american dream? what is the illuminati? Google is trying to read your mind. Only it’s not your mind. It’s the World Brain. And whatever that is, we know that a twelve-year-old company based in Mountain View, California, is wired into it like no one else.
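
Under the hood, those instant choices are at heart a prefix lookup over a vast store of past queries, with popularity ranking and personalization layered on top. A toy sketch of just the lookup step, with an invented query list standing in for Google's logs:

```python
import bisect

# A sorted list plus binary search is the simplest prefix index.
# The entries below are invented for illustration.
QUERIES = sorted([
    'what is the american dream',
    'what is the cloud',
    'what is the illuminati',
    'what is the mean',
    'what is the meaning of my life',
])

def suggest(prefix, k=4):
    """Return up to k stored queries that start with prefix."""
    i = bisect.bisect_left(QUERIES, prefix)
    matches = []
    while i < len(QUERIES) and QUERIES[i].startswith(prefix):
        matches.append(QUERIES[i])
        i += 1
        if len(matches) == k:
            break
    return matches

print(suggest('what is th'))  # -> four "what is the ..." completions
```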

Google is where we go for answers. People used to go elsewhere or, more likely, stagger along not knowing. Nowadays you can’t have a long dinner-table argument about who won the Oscar for that Neil Simon movie where she plays an actress who doesn’t win an Oscar; at any moment someone will pull out a pocket device and Google it. If you need the art-history meaning of “picturesque,” you could find it in The Book of Answers, compiled two decades ago by the New York Public Library’s reference desk, but you won’t. Part of Google’s mission is to make the books of answers redundant (and the reference librarians, too). “A hamadryad is a wood-nymph, also a poisonous snake in India, and an Abyssinian baboon,” says the narrator of John Banville’s 2009 novel, The Infinities. “It takes a god to know a thing like that.” Not anymore.

The business of finding facts has been an important gear in the workings of human knowledge, and the technology has just been upgraded from rubber band to nuclear reactor. No wonder there’s some confusion about Google’s exact role in that—along with increasing fear about its power and its intentions.

Most of the time Google does not actually have the answers. When people say, “I looked it up on Google,” they are committing a solecism. When they try to erase their embarrassing personal histories “on Google,” they are barking up the wrong tree. It is seldom right to say that anything is true “according to Google.” Google is the oracle of redirection. Go there for “hamadryad,” and it points you to Wikipedia. Or the Free Online Dictionary. Or the Official Hamadryad Web Site (it’s a rock band, too, wouldn’t you know). Google defines its mission as “to organize the world’s information,” not to possess it or accumulate it. Then again, a substantial portion of the world’s printed books have now been copied onto the company’s servers, where they share space with millions of hours of video and detailed multilevel imagery of the entire globe, from satellites and from its squadrons of roving street-level cameras. Not to mention the great and growing trove of information Google possesses regarding the interests and behavior of, approximately, everyone.

When I say Google “possesses” all this information, that’s not the same as owning it. What it means to own information is very much in flux.

In barely a decade Google has made itself a global brand bigger than Coca-Cola or GE; it has created more wealth faster than any company in history; it dominates the information economy. How did that happen? It happened more or less in plain sight. Google has many secrets but the main ingredients of its success have not been secret at all, and the business story has already provided grist for dozens of books. Steven Levy’s new account, In the Plex, is the most authoritative to date and in many ways the most entertaining. Levy has covered personal computing for almost thirty years, for Newsweek and Wired and in six previous books, and has visited Google’s headquarters periodically since 1999, talking with its founders, Larry Page and Sergey Brin, and, as much as has been possible for a journalist, observing the company from the inside. He has been able to record some provocative, if slightly self-conscious, conversations like this one in 2004 about their hopes for Google:
“It will be included in people’s brains,” said Page. “When you think about something and don’t really know much about it, you will automatically get information.”
“That’s true,” said Brin. “Ultimately I view Google as a way to augment your brain with the knowledge of the world. Right now you go into your computer and type a phrase, but you can imagine that it could be easier in the future, that you can have just devices you talk into, or you can have computers that pay attention to what’s going on around them….”
…Page said, “Eventually you’ll have the implant, where if you think about a fact, it will just tell you the answer.”
Read more:

Western Spaghetti


Winner, 2009 Sundance Film Festival; voted #2 Viral Video of the Year by TIME Magazine; Audience Award, 2009 Annecy Animation Festival.

Friday Book Club - Clapton

by Stephen King

[ed. Clapton is like the Zelig of Rock and Roll culture and history. He's everywhere, and has some amazing stories to tell.]

Most A.A. meetings begin with the chairman offering his qualification at the head table next to the coffee maker. This qualification is more commonly known in the program as the drunkalogue. It’s a good word, with its suggestions of inebriated travel, and it certainly fits Eric Clapton’s account of his life. “Clapton” is nothing so literary as a memoir, but its dry, flat-stare honesty makes it a welcome antidote to the macho fantasies of recovery served up by James Frey in “A Million Little Pieces.”

A drunkalogue consists of three parts: what it was like, what happened and what it’s like now. Following a format that Clapton, now 20 years sober, could probably recite in his sleep, the world’s most famous rock-and-blues guitarist duly — and sometimes dutifully — covers the bases. He is rarely able to communicate clearly what his music means to him (“It’s difficult to talk about these songs in depth,” he says at one point; “that’s why they’re songs”), but his writing is adequate to the main task, which is describing how he became the rock ’n’ roll version of Harry Potter: Clapton is, after all, the Boy Who Lived. And this drunkalogue has other things to recommend it; to my knowledge, no other addict-alcoholics can claim to have filched George Harrison’s wife or escaped — barely — dying in a helicopter crash with Stevie Ray Vaughan. Both Clapton’s and Vaughan’s choppers took off into heavy fog after a show in Wisconsin. Vaughan’s turned the wrong way and crashed into an artificial ski slope.

I’ve heard it suggested at recovery meetings that the true alcoholic is almost always an overachiever with a bad self-image, and Clapton fits this profile as well as any. After millions of records sold, thousands of S.R.O. concert dates and decades of conspicuous consumerism (Visvim shoes, Patek Philippe watches, a yacht), he can still call himself “a toe-rag from Ripley.”

That’s the small town in Surrey where Clapton grew up. He discovered, as a child of 6 or 7, that the couple he believed to be his parents were really his grandmother and step-grandfather. His mother was actually the daughter of Rose Clapp and her first husband, Rex Clapton. His father was a married Canadian airman named Edward Fryer: “The truth dawned on me, that when Uncle Adrian jokingly called me a little bastard, he was telling the truth.”

Clapton’s first guitar (he seems to remember them all) was a Hoyer too big for him, and painful to play; his first addiction, Horlicks and Ovaltine tablets stolen from the local sweet shop; his first encounter with the sexual embarrassment that would haunt him for years came with a school caning (“six of the best”) after asking a schoolmate, with no idea what the query meant, if she might “fancy a shag.” He got drunk for the first time at 16 and woke alone in the woods, with fouled trousers, vomit on his shirt and no money. Then he adds the perfect drunkalogue kicker: “I couldn’t wait to do it all again.”