Tuesday, August 23, 2011
Lovelorn in a Facebook Age
I woke up one day last week to an anguished email from a friend whose girlfriend had just broken up with him. He had an urgent question: How could he take his mind off her so that he wouldn't call or text her?
I was momentarily stumped. What advice did I have for coping with one of life's worst experiences—losing a romantic partner? What would help him channel his energy into positive, productive activities?
It's no secret that when we lose a lover, we tend to lose our willpower. Suddenly, despite our best intentions, we fall prey to obsessive thoughts ("What did I do wrong?"), feelings ("I'll be alone forever") and actions (calling, emailing, texting).
I reflected on the advice I got after a major breakup almost two years ago. "Literature, my dear, literature… " began one email from a good friend. My mother reminded me to listen to music because "it soothes the soul." Others suggested exercise, volunteer work, travel. All excellent advice—and difficult to follow when you are in pain.
"It's not a heartbroken thing, it's a brain-broken thing," says Marianne Legato, a cardiologist and founder of the Partnership for Gender-Specific Medicine at Columbia University.
Therapists say the emotional stages after a breakup parallel the well-known stages of grief: denial, anger, bargaining, depression, acceptance, rebuilding. In general, the more meaningful the relationship, the longer it will take to move through the stages after a breakup. Figure a couple months for a short relationship, six months to a year for one that lasted a few years, and two to three years to recover after a long-term marriage, says Tina B. Tessina, a marriage and family therapist in Long Beach, Calif.
In the age of smartphones and iPads, though, it's easy to try to hang on, simply by peeking at your ex's Facebook page or Twitter feed. Did your former flame call? Pretend you're just checking the time on your phone. Is he still ignoring you? Send a quick text. What we're looking for when we engage in obsessive behavior like this is the dopamine fix that comes when we hear back from the object of our obsession. "It's like we have a cocaine craving," says Dr. Legato.
Read more:
Case History Of A Wikipedia Page: Nabokov’s 'Lolita'
by Emily Morris
Wikipedia has an article on almost every subject—including, it turns out, one on how to write "the perfect Wikipedia article." The guidelines run through a list of the attributes such an article would have—e.g., "[i]s precise and explicit," "[i]s well-documented," "[i]s engaging"—before ending on a cautionary note: The perfect Wikipedia article is, by virtue of the collaborative editing process that creates it, "not attainable": "Editing may bring an article closer to perfection, but ultimately, perfection means different things to different editors." And as editors pursue perfection, they also must keep in mind another essential quality of a good Wikipedia entry: neutrality. That is, no matter how controversial a topic, an article must present competing views on controversies "logically and fairly," pointing out all sides "without favoring particular viewpoints."
As a member of the Arbitration Committee, Ira Matetsky settles the kinds of editorial disputes that controversial articles tend to incite. In a series of thoughtful guest posts on The Volokh Conspiracy, Matetsky explained some of the mechanics behind the editorial process. He noted that, generally, “articles on non-contentious topics are usually accurate; articles on highly contentious articles are usually accurate on basic facts, but can be subject to bias and dispute with respect to the matters in controversy.” As a way of investigating Matetsky's point (and with Wikipedia editathons making news), we thought we'd chart the history of a single Wiki entry by using that nifty "View History" button. And what's a page that's constantly being edited, has as its subject a work of art with an, ahem, unconventional sense of morality, and is therefore constantly subjected to the editing whims of people with strong opinions, moral or otherwise? She goes by many names, but on my greasy MacBook Pro screen, she is always "Lolita."
Since 2001, the Wikipedia entry on Vladimir Nabokov's Lolita has been edited 2,303 times. It's a popular entry, too: of approximately 750,000 Wiki articles out there, it ranks at 2,075 in traffic.
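Anyone who wants to repeat this kind of archaeology doesn't have to click through "View History" by hand: the same revision data is exposed by Wikipedia's public MediaWiki API. Here's a minimal sketch in Python (assuming the third-party requests package; the editor tally at the end is just our own illustration) that pulls the most recent revisions of the Lolita entry:

```python
# Sketch: pull revision history for the "Lolita" entry from the public
# MediaWiki API instead of clicking through "View History" by hand.
# Assumes the third-party `requests` package is installed.
import requests
from collections import Counter

API = "https://en.wikipedia.org/w/api.php"
params = {
    "action": "query",
    "prop": "revisions",
    "titles": "Lolita",
    "rvprop": "timestamp|user",
    "rvlimit": 500,  # API maximum per request for ordinary clients
    "format": "json",
}

resp = requests.get(API, params=params, timeout=30)
resp.raise_for_status()
page = next(iter(resp.json()["query"]["pages"].values()))
revisions = page["revisions"]

print(f"{len(revisions)} most recent revisions fetched")
for user, n in Counter(r["user"] for r in revisions).most_common(5):
    print(f"{n:4d} edits by {user}")
```

Fetching all 2,303 edits would mean following the continuation token the API hands back; one page of results is enough to see who the busiest recent editors are.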
Read more:
In Defense of Distraction
[ed. Interesting article on the benefits of focusing, and unfocusing.]
by Sam Anderson
I. The Poverty of Attention

I’m going to pause here, right at the beginning of my riveting article about attention, and ask you to please get all of your precious 21st-century distractions out of your system now. Check the score of the Mets game; text your sister that pun you just thought of about her roommate’s new pet lizard (“iguana hold yr hand LOL get it like Beatles”); refresh your work e-mail, your home e-mail, your school e-mail; upload pictures of yourself reading this paragraph to your “me reading magazine articles” Flickr photostream; and alert the fellow citizens of whatever Twittertopia you happen to frequent that you will be suspending your digital presence for the next twenty minutes or so (I know that seems drastic: Tell them you’re having an appendectomy or something and are about to lose consciousness). Good. Now: Count your breaths. Close your eyes. Do whatever it takes to get all of your neurons lined up in one direction. Above all, resist the urge to fixate on the picture, right over there, of that weird scrambled guy typing. Do not speculate on his ethnicity (German-Venezuelan?) or his backstory (Witness Protection Program?) or the size of his monitor. Go ahead and cover him with your hand if you need to. There. Doesn’t that feel better? Now it’s just you and me, tucked like fourteenth-century Zen masters into this sweet little nook of pure mental focus. (Seriously, stop looking at him. I’m over here.)
Over the last several years, the problem of attention has migrated right into the center of our cultural attention. We hunt it in neurology labs, lament its decline on op-ed pages, fetishize it in grassroots quality-of-life movements, diagnose its absence in more and more of our children every year, cultivate it in yoga class twice a week, harness it as the engine of self-help empires, and pump it up to superhuman levels with drugs originally intended to treat Alzheimer’s and narcolepsy. Everyone still pays some form of attention all the time, of course—it’s basically impossible for humans not to—but the currency in which we pay it, and the goods we get in exchange, have changed dramatically.
Back in 1971, when the web was still twenty years off and the smallest computers were the size of delivery vans, before the founders of Google had even managed to get themselves born, the polymath economist Herbert A. Simon wrote maybe the most concise possible description of our modern struggle: “What information consumes is rather obvious: It consumes the attention of its recipients. Hence a wealth of information creates a poverty of attention, and a need to allocate that attention efficiently among the overabundance of information sources that might consume it.” As beneficiaries of the greatest information boom in the history of the world, we are suffering, by Simon’s logic, a correspondingly serious poverty of attention...
This is troubling news, obviously, for a culture of BlackBerrys and news crawls and Firefox tabs—tools that, critics argue, force us all into a kind of elective ADHD. The tech theorist Linda Stone famously coined the phrase “continuous partial attention” to describe our newly frazzled state of mind. American office workers don’t stick with any single task for more than a few minutes at a time; if left uninterrupted, they will most likely interrupt themselves. Since every interruption costs around 25 minutes of productivity, we spend nearly a third of our day recovering from them. We keep an average of eight windows open on our computer screens at one time and skip between them every twenty seconds. When we read online, we hardly even read at all—our eyes run down the page in an F pattern, scanning for keywords. When you add up all the leaks from these constant little switches, soon you’re hemorrhaging a dangerous amount of mental power. People who frequently check their e-mail have tested as less intelligent than people who are actually high on marijuana. David Meyer, a cognitive scientist at the University of Michigan, guesses that the damage will take decades to understand, let alone fix. If Einstein were alive today, he says, he’d probably be forced to multitask so relentlessly in the Swiss patent office that he’d never get a chance to work out the theory of relativity.
I’m not ready to blame my restless attention entirely on a faulty willpower. Some of it is pure impersonal behaviorism. The Internet is basically a Skinner box engineered to tap right into our deepest mechanisms of addiction. As B. F. Skinner’s army of lever-pressing rats and pigeons taught us, the most irresistible reward schedule is not, counterintuitively, the one in which we’re rewarded constantly but something called “variable ratio schedule,” in which the rewards arrive at random. And that randomness is practically the Internet’s defining feature: It dispenses its never-ending little shots of positivity—a life-changing e-mail here, a funny YouTube video there—in gloriously unpredictable cycles. It seems unrealistic to expect people to spend all day clicking reward bars—searching the web, scanning the relevant blogs, checking e-mail to see if a co-worker has updated a project—and then just leave those distractions behind, as soon as they’re not strictly required, to engage in “healthy” things like books and ab crunches and undistracted deep conversations with neighbors. It would be like requiring employees to take a few hits of opium throughout the day, then being surprised when it becomes a problem. Last year, an editorial in the American Journal of Psychiatry raised the prospect of adding “Internet addiction” to the DSM, which would make it a disorder to be taken as seriously as schizophrenia.
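Skinner's distinction is easy to see in a toy simulation. The sketch below is our own illustration, not anything from Anderson's article: two schedules pay out at the same average rate, but one pays on every fifth press and the other pays at random, which is the pattern Skinner found hardest to extinguish.

```python
# Toy illustration (not from the article): fixed- vs. variable-ratio
# reward schedules with the same average payout rate.
import random

random.seed(42)

def fixed_ratio(presses, n=5):
    """Reward on every nth press -- perfectly predictable."""
    return [(i + 1) % n == 0 for i in range(presses)]

def variable_ratio(presses, p=0.2):
    """Reward at random with probability p per press -- Skinner's addictive case."""
    return [random.random() < p for _ in range(presses)]

def longest_drought(rewards):
    """Longest run of unrewarded presses: the 'maybe next time' gap."""
    longest = run = 0
    for rewarded in rewards:
        run = 0 if rewarded else run + 1
        longest = max(longest, run)
    return longest

for name, schedule in (("fixed", fixed_ratio(1000)), ("variable", variable_ratio(1000))):
    print(f"{name:8s} rewards={sum(schedule):3d}  longest drought={longest_drought(schedule)}")
```

Both schedules pay out roughly 200 times in 1,000 presses, but only the variable one produces long, unpredictable dry spells, and it is exactly that unpredictability that keeps the lever (or the refresh button) getting pressed.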
Read more:
The Billionaire King of Techtopia
Peter Thiel rose to fame by launching PayPal and funding a little upstart called Facebook. You'll find his fingerprints on—and his seed money in—everything from DNA manipulation to Hollywood movies, along with any Silicon Valley enterprise worth knowing about. Now the 43-year-old gay libertarian is embarking on his most ambitious venture: a start-up country on the open ocean that will be governed by his Ayn Rand-inspired ideology. Will it be Thiel's crowning achievement or the biggest bust since Waterworld?
by Jonathan Miles
When Peter Thiel ventures outside for a run, typically in the early-early morning, when the fog drifts low and slow into the San Francisco Bay, he's often drawn to what the poet Lawrence Ferlinghetti called "the end of land and land of beginning." That means the San Francisco waterfront—especially the one-and-a-half-mile stretch of pathway hugging the marshy shoreline from Crissy Field to the base of the Golden Gate Bridge. Aesthetically, the appeal is obvious—a postcard view of the bridge and the bay, the lapping tidal rhythm, that sort of thing—but for Thiel, a 43-year-old investor and entrepreneur whose knack for anticipating the next big thing has yielded him a $1.5 billion fortune and an iconic, even delphic status in Silicon Valley, there's a symbolic angle as well. This waterline is precisely where the Western frontier ended, where unlimited opportunity finally hit its limit. It's also where, if Thiel is betting correctly, the next—and most audacious—frontier begins.
Thiel spends a lot of time thinking about frontiers. "Way more than is healthy," he admits. Not just financial frontiers, though that's his day job: He cofounded PayPal, the online money-transfer service, and, most famously, was the angel investor whose half-million-dollar loan catapulted Facebook out of Harvard's dormitories and into the lives of its 750 million users. (In The Social Network, Thiel was portrayed as the crisp venture capitalist whose investment, and dark questioning, widen the rift between Facebook's cofounders.) He manages a hedge fund, Clarium Capital, and is a founding partner in a venture-capital firm called the Founders Fund, both of them housed in an airy brick building on the campuslike grounds of the Presidio, not far from Thiel's jogging path. Yet his frontier obsession extends much further than spreadsheets, further than even technology. Political frontiers, social frontiers, scientific frontiers: All these and more crowd Thiel's head as he navigates the shoreline. "We're at this pretty important point in society," he says during a brisk walk toward the Golden Gate Bridge, "where we can either find a way to rediscover a frontier, or we're going to be forced to change in a way that's really tough." Thiel is a medium-size man with a compact and blocky frame, close-trimmed reddish-brown hair, and eyes the limpid-blue color of Windex; he has a small, nasal voice and tends to exert himself as he speaks, frequently circling back to amend or reconfigure or soften what he's saying. Discussing the concept of frontiers, however, animates him to an almost uninterruptible degree; concepts, more than anything else, seem to do that.
"One of the things that's endlessly dazzling and mesmerizing is this question about the future—what the world is going to be like in 20 years, and what can or should we do to make it better than the default track that it's on," he says, gesturing with his hands while maintaining a fixed stare on the pathway. "But it's a question you can never quite master. I played a lot of chess when I was growing up, and it's similar to some elements of chess, where you can see some moves but you can't see to the end of the game. Even a computer the size of the universe couldn't actually analyze it. There's, like, 10 to the 117th power possible games and something like 10 to the 80th atoms in the observable universe, so it's off by something like 37 orders of magnitude. And chess is something much simpler than reality—it's 32 pieces on an eight-by-eight board. Figuring out the complete future of a chess game is a problem more complicated than anything that can be solved in our universe, so figuring out this planet or just our society in the next 10 or 15 years is just not a solvable problem."
Thiel (center) with his Founders Fund partners Ken Howery (left) and Sean Parker. Photograph by Tom Schierlitz
Despite the innovations of the past quarter century, some of which have made him very, very wealthy, Thiel is unimpressed by how far we've come—technologically, politically, socially, financially, the works. The last successful American car company, he likes to note, was Jeep, founded in 1941. "And our cars aren't moving any faster," he says. The space-age future, as giddily envisioned in the fifties and sixties, has yet to arrive. Perhaps on the micro level—as in microprocessors—but not in the macro realm of big, audacious, and outlandish ideas where Thiel prefers to operate. He gets less satisfaction out of conventional investments in "cloud music" (Spotify) and Hollywood films (Thank You for Smoking) than he does in pursuing big ideas, which is why Thiel—along with an all-star cast of venture capitalists, including former PayPal cohorts Ken Howery and Luke Nosek, and Sean Parker, the Napster cofounder and onetime Facebook president—established the Founders Fund. Among its quixotic but potentially highly profitable investments are SpaceX, a space-transport company, and Halcyon Molecular, which aspires to use DNA sequencing to extend human life. Privately, however, Thiel is the primary backer for an idea that takes big, audacious, and outlandish to a whole other level. Two hundred miles west of the Golden Gate Bridge, past that hazy-blue horizon where the Pacific meets the sky, is where Thiel foresees his boldest venture of all. Forget start-up companies. The next frontier is start-up countries.
Read more:
"Massive Decline" in Use of Facebook
by John Aravosis
Greg Pouy summarizes a new GlobalWebIndex study of Internet usage worldwide. Greg's post is in French, but the study is in English. Here are some of the key points:
1. The data suggests a "massive decline" in the use of Facebook, particularly in English-speaking countries.
2. For 16 to 24 year olds, the Web, and especially social media, is their primary information source.
3. Instant messenger use is declining (I think this means instant-messaging software such as AOL, MSN, iChat, etc.).
4. eCommerce remains weak in Italy and Spain.
5. Strong development of eCommerce and social media in Turkey, China and Brazil.
6. The use of mobile Internet (I think they mean Internet access via cell phones, but possibly also tablets) is strong in both Asia and Latin America, while the usage itself tends to take place at home.
7. Many consumers are willing to pay for online content, but there are big differences between countries and age groups.
8. Microblogging (retweeting news via Twitter, for example) is growing significantly in Brazil, Russia, India and China.
9. People are still watching lots of TV, even people who are very active online.
California or Bust
by Amy Wallace
Let me tell you what happened with my breasts today. First, I spilled a latte all over them at the Coffee Bean and Tea Leaf. The lid on my cup wasn’t tight, so when I went to take a sip, milk foam poured and then puddled on my sweater. Stooping to wipe up what I presumed would be a mess on the floor, I found that little coffee had gotten past me. For the first time ever, my breasts were too grande for my latte.
Later, I took my breasts out to lunch at the 3rd Street Promenade in Santa Monica, where they promptly attracted the attention of, well, everybody. Outside the Broadway Deli, two men approached. They were well dressed, respectable-looking, and as they veered toward me, the one in the black designer suit leaned in, his eyes fixed like spotlights. “We love them,” he announced, smiling wickedly.

I’ve had breasts for years. But now I have the biggest, firmest breasts in sight–a plump, jiggling set that obscure my downward vision and get in the way when I drive. My new breasts are D cup. They weigh 23.2 ounces–about the same as a couple of average grapefruits. They sit high on my chest in a bra that makes the most of my cleavage.
I’ve spent my whole life pretending breasts don’t matter. Part of me still wants to believe it’s true. I can make all the arguments, which basically come down to this: Women should be valued for their selves, not their shelves. Still, I have to admit, at the moment the breasts I’m toting feel like more than mere flesh. They feel like the source of all power.
The perfectly rounded breast is to L.A. what big hair is to Dallas. More than palm trees or surfboards or stars on Hollywood Boulevard, the breast–especially the surgically augmented breast–has become this city’s icon. That it taps into an American obsession only makes the symbol more potent. Saline or silicone, globelike or teardrop, ta-tas put the la, la in Los Angeles.
Read more:
Monday, August 22, 2011
Obama Administration Takes Tough Stance on Banks
[ed. Finally, right? Read on.]
by Glenn Greenwald
In mid-May, I wrote about the commendable -- one might say heroic -- efforts of New York Attorney General Eric Schneiderman to single-handedly impose meaningful accountability on Wall Street banks for their role in the 2008 financial crisis and the mortgage fraud/foreclosure schemes. Not only was Schneiderman launching probing investigations at a time when the Obama DOJ was steadfastly failing to do so, but -- more importantly -- he was refusing to sign onto a global settlement agreement being pushed by the DOJ that would have insulated the mortgage banks (including Bank of America, Citigroup, JPMorgan Chase and Wells Fargo) from all criminal investigations in exchange for some relatively modest civil fines. In response, many commenters wondered whether Schneiderman, if he persisted, would be targeted by the banks with some type of campaign of destruction of the kind that brought down Eliot Spitzer, but fortunately for the banks, they can dispatch their owned servants in Washington to apply the pressure for them:
Eric T. Schneiderman, the attorney general of New York, has come under increasing pressure from the Obama administration to drop his opposition to a wide-ranging state settlement with banks over dubious foreclosure practices, according to people briefed on discussions about the deal.
In recent weeks, Shaun Donovan, the secretary of Housing and Urban Development, and high-level Justice Department officials have been waging an intensifying campaign to try to persuade the attorney general to support the settlement, said the people briefed on the talks.
Mr. Schneiderman and top prosecutors in some other states have objected to the proposed settlement with major banks, saying it would restrict their ability to investigate and prosecute wrongdoing in a variety of areas, including the bundling of loans in mortgage securities.
But Mr. Donovan and others in the administration have been contacting not only Mr. Schneiderman but his allies, including consumer groups and advocates for borrowers, seeking help to secure the attorney general's participation in the deal, these people said. One recipient described the calls from Mr. Donovan, but asked not to be identified for fear of retaliation.
Not surprisingly, the large banks, which are eager to reach a settlement, have grown increasingly frustrated with Mr. Schneiderman. Bank officials recently discussed asking Mr. Donovan for help in changing the attorney general’s mind, according to a person briefed on those talks.

In response to this story, the DOJ claims that the settlement is necessary to help people whose homes are in foreclosure, an absurd rationalization which Marcy Wheeler simply destroys. Meanwhile, Yves Smith, whose coverage of banking and mortgage fraud (and the administration's protection of it) has long been indispensable, writes today:
It is high time to describe the Obama Administration by its proper name: corrupt.
Admittedly, corruption among our elites generally and in Washington in particular has become so widespread and blatant as to fall into the "dog bites man" category. But the nauseating gap between the Administration's propaganda and the many and varied ways it sells out average Americans on behalf of its favored backers, in this case the too big to fail banks, has become so noisome that it has become impossible to ignore the fetid smell.
The Administration has now taken to pressuring parties that are not part of the machinery reporting to the President to fall in and do his bidding. We’ve gotten so used to the US attorney general being conveniently missing in action that we have forgotten that regulators and the AG are supposed to be independent.

Her entire analysis should be read. The President -- who kicked off his campaign vowing to put an end to "the era of Scooter Libby justice" -- will stand before the electorate in 2012 having done everything in his power to shield top Bush officials from all accountability for their crimes and will have done the same for Wall Street banks, all while continuing to preside over the planet's largest Prison State . . . for ordinary Americans convicted even of trivial offenses, particularly (though not only) from the War on Drugs he continues steadfastly to defend. And as Sam Seder noted this morning, none of this has anything to do with Congress and cannot be blamed on the Weak Presidency, the need to compromise, or the "crazy" GOP.
Read more:
Profile: Tim Berners-Lee
Tim Berners-Lee could have been a billionaire if he had sold his invention, the World Wide Web. Instead, he is campaigning for internet access to be a fundamental right of everyone on earth.
by John Naish
Twenty years ago, Tim Berners-Lee launched the World Wide Web among a small circle of fellow computer enthusiasts. Today, the 56-year-old Briton remains one of the internet's most vigorous advocates. Its vast success, however, has had a downside: it has exposed him to a bombardment of requests from visionaries, obsessives and rubberneckers, as well as hordes of children demanding help with school projects. All expect him to exist as some kind of open-source human being.
Berners-Lee has never been an enthusiastic self-publicist. Nowadays, he shelters behind carapaces of email gateways and protective staff. He seldom gives interviews. If you're not persistent and pertinent, you may not even earn a rebuff. "I'm quite busy," he explains - a huge understatement - when eventually we talk on the phone.

"I have built a moat around myself, along with ways over that moat so that people can ask questions. What I do has to be a function of what I can do, not a function of what people ask me to do." (He tends to use techy terms such as "function" quite a lot. He doesn't mind "geek", either.)
That the creator of the web - a father of two children, separated from his wife and based in Cambridge, Massachusetts, where he pursues his research at the Massachusetts Institute of Technology - has to live like an electronic Howard Hughes is just one of the many paradoxes that his invention has thrown up over the past two decades.
Nevertheless, Berners-Lee is campaigning for ever more openness, pushing for the internet to exist as a free-for-all, unfettered by creeping government interference or commercial intrigue. He believes that access to the internet should be a human right.
Born to mathematician parents in west London in June 1955, Berners-Lee studied at Oxford University, graduating with a First in physics in 1976. In 1980, he joined the European Organisation for Nuclear Research in Geneva, better known as Cern, as a consultant but left a year later to become director of the tech firm Image Computer Systems.
Returning to Cern in 1984, he started working on hypertext to help researchers share information. His new project could easily have been dismissed as another case of a back-room enthusiast tinkering with clunky, electronic networks: even though he had the ability to construct a computer using a soldering iron and an old television set, Berners-Lee was just one of many sandal-wearing scientists.
“I wanted to build a creative space, something like a sandpit where everyone could play together," he says now. "Life was very simple. I was too busy to think about the bigger questions. I was writing specs for the web, writing the code. My priority was getting more people to use it, looking for communities who might adopt it. I just wanted the thing to take off."
Berners-Lee formally introduced his hobby-built system to the world on 6 August 1991 by posting a message on an internet bulletin board for fellow hypertext program developers. That day, he put the world's first proper website online. It explained what a website was and gave details of how to create one. Neither initiative caused any immediate interest.
Read more:
Crisis of Confidence
How Washington Lost Faith in America's Courts
by Karen J. Greenberg
As the 10th anniversary of 9/11 approaches, the unexpected extent of the damage Americans have done to themselves and their institutions is coming into better focus. The event that “changed everything” did turn out to change Washington in ways more startling than most people realize. On terrorism and national security, to take an obvious (if seldom commented upon) example, the confidence of the U.S. government seems to have been severely, perhaps irreparably, shaken when it comes to that basic and essential American institution: the courts.
If, in fact, we are a “nation of laws,” you wouldn’t know it from Washington’s actions over the past few years. Nothing spoke more strikingly to that loss of faith, to our country’s increasing incapacity for meeting violence with the law, than the widely hailed decision to kill rather than capture Osama bin Laden.
Clearly, a key factor in that decision was a growing belief, widely shared within the national-security establishment, that none of our traditional or even newly created tribunals, civilian or military, could have handled a bin Laden trial. Washington’s faith went solely to Navy SEALs zooming into another country’s sovereign airspace on a moonless night on a mission to assassinate bin Laden, whether he offered the slightest resistance or not. It evidently seemed so much easier to the top officials overseeing the operation -- and so much less messy -- than bringing a confessed mass murderer into a courtroom in, or even anywhere near, the United States.
The decision to kill bin Laden on sight rather than capture him and bring him to trial followed hard on the heels of an ignominious Obama administration climb-down on its plan to try the “mastermind” of the 9/11 attacks, Khalid Sheikh Mohammed, or KSM, in a federal court in New York City. Captured in Pakistan in March 2003 and transferred to Guantanamo in 2006, KSM saw his proposed trial returned, under political pressure, to a military venue earlier this year.
Given the extraordinary record of underperformance by the military commissions system -- only six convictions in 10 years -- it’s hard to escape the conclusion that the United States has little faith in its ability to put on trial a man assumedly responsible for murdering thousands.
And don’t assume that these high-level examples of avoiding the court system are just knotty exceptions that prove the rule. There is evidence that the administration’s skepticism and faint-heartedness when it comes to using the judicial system risks becoming pervasive.
Pushing Guilt Before Trial
Needless to say, this backing away from courts of law as institutions appropriate for handling terrorism suspects began in the Bush-Cheney years. Top officials in the Bush administration believed civilian courts to be far too weak for the Global War on Terror they had declared. This, as they saw it, was largely because those courts would supposedly gift foreign terrorist suspects with a slew of American legal rights that might act as so many get-out-of-jail-free cards.
As a result, despite a shining record of terrorism convictions in civilian courts in the 1990s -- including the prosecutions of those responsible for the 1993 attempt to take down a tower of the World Trade Center -- President Bush issued a military order on November 13, 2001, that established the court-less contours of public debate to come. It mandated that non-American terrorists captured abroad would be put under the jurisdiction of the Pentagon, not the federal court system. This was “war,” after all, and the enemy had to be confronted by fighting men, not those sticklers for due process, civilian judges and juries.
Sunday, August 21, 2011
Let Them Eat Cake
GOP may OK tax increase that Obama hopes to block
by Charles Babington
News flash: Congressional Republicans want to raise your taxes. Impossible, right? GOP lawmakers are so virulently anti-tax, surely they will fight to prevent a payroll tax increase on virtually every wage-earner starting Jan. 1, right?
Apparently not.
Many of the same Republicans who fought hammer-and-tong to keep the George W. Bush-era income tax cuts from expiring on schedule are now saying a different "temporary" tax cut should end as planned. By their own definition, that amounts to a tax increase.
The tax break extension they oppose is sought by President Barack Obama. Unlike proposed changes in the income tax, this policy helps the 46 percent of all Americans who owe no federal income taxes but who pay a "payroll tax" on practically every dime they earn.
At issue is a tax that the vast majority of workers pay, but many don't recognize because they don't read, or don't understand, their pay stubs. Workers normally pay 6.2 percent of their wages toward a tax designated for Social Security. Their employer pays an equal amount, for a total of 12.4 percent per worker.
As part of a bipartisan spending deal last December, Congress approved Obama's request to reduce the workers' share to 4.2 percent for one year; employers' rate did not change. Obama wants Congress to extend the reduction for an additional year. If not, the rate will return to 6.2 percent on Jan. 1.
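To make the stakes of that two-percentage-point swing concrete, here is the arithmetic as a small sketch (the 6.2 and 4.2 percent rates come from the article; the $50,000 wage is a hypothetical example):

```python
# Worked example of the payroll-tax math in the article.
# Rates are from the article; the $50,000 wage is hypothetical,
# and the Social Security wage-base cap is ignored (it doesn't bind here).
NORMAL_RATE = 0.062   # worker's usual 6.2% share
REDUCED_RATE = 0.042  # 4.2% under the December deal

wages = 50_000

normal = wages * NORMAL_RATE
reduced = wages * REDUCED_RATE
print(f"at 6.2%: ${normal:,.0f}/yr   at 4.2%: ${reduced:,.0f}/yr   "
      f"expiration costs: ${normal - reduced:,.0f}/yr")
```

For a $50,000 earner, letting the cut expire means roughly $1,000 a year in extra payroll tax, which is the "tax increase" the article is describing.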
But Republican lawmakers haven't always worried about tax cuts increasing the deficit. They led the fight to extend the life of a much bigger tax break: the major 2001 income tax reduction enacted under Bush. It was scheduled to expire at the start of this year. Obama campaigned on a pledge to end the tax break only for the richest Americans, but solid GOP opposition forced him to back down.
Many Republicans are adamant about not raising taxes but largely silent on what it would mean to let the payroll tax break expire.
Read more:
Why Software Is Eating The World
[ed. Contrasting perspective to the Silicon Valley article posted below.]
by Marc Andreessen
This week, Hewlett-Packard (where I am on the board) announced that it is exploring jettisoning its struggling PC business in favor of investing more heavily in software, where it sees better potential for growth. Meanwhile, Google plans to buy up the cellphone handset maker Motorola Mobility. Both moves surprised the tech world. But both moves are also in line with a trend I've observed, one that makes me optimistic about the future growth of the American and world economies, despite the recent turmoil in the stock market.
In short, software is eating the world.
More than 10 years after the peak of the 1990s dot-com bubble, a dozen or so new Internet companies like Facebook and Twitter are sparking controversy in Silicon Valley, due to their rapidly growing private market valuations, and even the occasional successful IPO. With scars from the heyday of Webvan and Pets.com still fresh in the investor psyche, people are asking, "Isn't this just a dangerous new bubble?"
I, along with others, have been arguing the other side of the case. (I am co-founder and general partner of venture capital firm Andreessen Horowitz, which has invested in Facebook, Groupon, Skype, Twitter, Zynga, and Foursquare, among others. I am also personally an investor in LinkedIn.) We believe that many of the prominent new Internet companies are building real, high-growth, high-margin, highly defensible businesses.
Today's stock market actually hates technology, as shown by all-time low price/earnings ratios for major public technology companies. Apple, for example, has a P/E ratio of around 15.2—about the same as the broader stock market, despite Apple's immense profitability and dominant market position (Apple in the last couple weeks became the biggest company in America, judged by market capitalization, surpassing Exxon Mobil). And, perhaps most telling, you can't have a bubble when people are constantly screaming "Bubble!"
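For readers who don't follow valuation shorthand: a price/earnings ratio is market capitalization divided by annual earnings (equivalently, share price over earnings per share), so a low P/E means the market is paying little for each dollar of profit. A minimal sketch, with hypothetical round numbers chosen only to land near the 15.2 figure quoted above:

```python
# P/E ratio is just division: what the market pays per dollar of earnings.
# The figures below are hypothetical round numbers chosen to land near
# the ~15.2 ratio Andreessen quotes for Apple.
market_cap = 350e9        # dollars (hypothetical)
annual_earnings = 23e9    # dollars (hypothetical)

pe = market_cap / annual_earnings
print(f"P/E = {pe:.1f}")  # ~15.2
```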
But too much of the debate is still around financial valuation, as opposed to the underlying intrinsic value of the best of Silicon Valley's new companies. My own theory is that we are in the middle of a dramatic and broad technological and economic shift in which software companies are poised to take over large swathes of the economy.
More and more major businesses and industries are being run on software and delivered as online services—from movies to agriculture to national defense. Many of the winners are Silicon Valley-style entrepreneurial technology companies that are invading and overturning established industry structures. Over the next 10 years, I expect many more industries to be disrupted by software, with new world-beating Silicon Valley companies doing the disruption in more cases than not.
Read more: