Thursday, June 13, 2013
Secret War
Inside Fort Meade, Maryland, a top-secret city bustles. Tens of thousands of people move through more than 50 buildings—the city has its own post office, fire department, and police force. But as if designed by Kafka, it sits among a forest of trees, surrounded by electrified fences and heavily armed guards, protected by antitank barriers, monitored by sensitive motion detectors, and watched by rotating cameras. To block any telltale electromagnetic signals from escaping, the inner walls of the buildings are wrapped in protective copper shielding and the one-way windows are embedded with a fine copper mesh.
This is the undisputed domain of General Keith Alexander, a man few even in Washington would likely recognize. Never before has anyone in America’s intelligence sphere come close to his degree of power, the number of people under his command, the expanse of his rule, the length of his reign, or the depth of his secrecy. A four-star Army general, his authority extends across three domains: He is director of the world’s largest intelligence service, the National Security Agency; chief of the Central Security Service; and commander of the US Cyber Command. As such, he has his own secret military, presiding over the Navy’s 10th Fleet, the 24th Air Force, and the Second Army.
Alexander runs the nation’s cyberwar efforts, an empire he has built over the past eight years by insisting that the US’s inherent vulnerability to digital attacks requires him to amass more and more authority over the data zipping around the globe. In his telling, the threat is so mind-bogglingly huge that the nation has little option but to eventually put the entire civilian Internet under his protection, requiring tweets and emails to pass through his filters, and putting the kill switch under the government’s forefinger. “What we see is an increasing level of activity on the networks,” he said at a recent security conference in Canada. “I am concerned that this is going to break a threshold where the private sector can no longer handle it and the government is going to have to step in.”
In its tightly controlled public relations, the NSA has focused attention on the threat of cyberattack against the US—the vulnerability of critical infrastructure like power plants and water systems, the susceptibility of the military’s command and control structure, the dependence of the economy on the Internet’s smooth functioning. Defense against these threats was the paramount mission trumpeted by NSA brass at congressional hearings and hashed over at security conferences.
But there is a flip side to this equation that is rarely mentioned: The military has for years been developing offensive capabilities, giving it the power not just to defend the US but to assail its foes. Using so-called cyber-kinetic attacks, Alexander and his forces now have the capability to physically destroy an adversary’s equipment and infrastructure, and potentially even to kill. Alexander—who declined to be interviewed for this article—has concluded that such cyberweapons are as crucial to 21st-century warfare as nuclear arms were in the 20th.
And he and his cyberwarriors have already launched their first attack. The cyberweapon that came to be known as Stuxnet was created and built by the NSA in partnership with the CIA and Israeli intelligence in the mid-2000s. The first known piece of malware designed to destroy physical equipment, Stuxnet was aimed at Iran’s nuclear facility in Natanz. By surreptitiously taking control of an industrial control system known as a Scada (Supervisory Control and Data Acquisition) system, the sophisticated worm was able to damage about a thousand centrifuges used to enrich nuclear material.
The success of this sabotage came to light only in June 2010, when the malware spread to outside computers. It was spotted by independent security researchers, who identified telltale signs that the worm was the work of thousands of hours of professional development. Despite headlines around the globe, officials in Washington have never openly acknowledged that the US was behind the attack. It wasn’t until 2012 that anonymous sources within the Obama administration took credit for it in interviews with The New York Times.
But Stuxnet is only the beginning. Alexander’s agency has recruited thousands of computer experts, hackers, and engineering PhDs to expand US offensive capabilities in the digital realm. The Pentagon has requested $4.7 billion for “cyberspace operations,” even as the budget of the CIA and other intelligence agencies could fall by $4.4 billion. It is pouring millions into cyberdefense contractors. And more attacks may be planned.
Inside the government, the general is regarded with a mixture of respect and fear, not unlike J. Edgar Hoover, another security figure whose tenure spanned multiple presidencies. “We jokingly referred to him as Emperor Alexander—with good cause, because whatever Keith wants, Keith gets,” says one former senior CIA official who agreed to speak on condition of anonymity. “We would sit back literally in awe of what he was able to get from Congress, from the White House, and at the expense of everybody else.”
Now 61, Alexander has said he plans to retire in 2014; when he does step down he will leave behind an enduring legacy—a position of far-reaching authority and potentially Strangelovian powers at a time when the distinction between cyberwarfare and conventional warfare is beginning to blur. A recent Pentagon report made that point in dramatic terms. It recommended possible deterrents to a cyberattack on the US. Among the options: launching nuclear weapons.
by James Bamford, Wired | Read more:
Illustration by Mark Weaver
The Tragic Fall of the White Race in America
Let’s not mince words. It’s hard being a white person in America. I hadn’t noticed this but that’s what everyone seems to say. So I was struck when a group of stories came together today in what seemed like a serendipitous way.
It starts with the news that among people 5 years old and younger in America, whites are now a minority. It’s worth noting that that’s a pretty small part of the population. And all it really means is that white people are no longer a bigger slice of the population than everyone else combined. But it does put the trajectory of American demographics and life in sharp relief. If you figure it was a big, big deal that the white share of the vote was 72% in 2012, down from 88% in 1980, then the fact that in 20 years that share, among young voters, will be 50/50 is actually a very big deal.
Then there was this story of Rep. Steve King (R-IA), well-known crazy person and tireless driver of TPM audience numbers, going on Twitter to complain that because of President Obama’s climate of lawlessness a group of “illegal aliens have just invaded my DC office.” TPM’s Perry Stein caught up with the horde of illegals/aka ‘Dreamers’ to get their side of the story here.
And then just after noon, we heard the story of 11-year-old Sebastien De La Cruz, who sang the national anthem at Tuesday night’s NBA Finals game and then got deluged by racist tweets telling him to go back to Mexico, that he’d probably just snuck into the country hours before, and other good stuff. De La Cruz was born in San Antonio.
Good times, as they say.
But it’s brought home some of my own thoughts about the changing racial makeup of the country and the persistent or perhaps growing climate of white racial panic as white people face a future as merely the biggest single group in the country, albeit with most of the wealth and power. Race has, obviously, always been a central part of American politics. Always. But we don’t have to go back to 1619 or 1863 or any other ancient date. Let’s just talk about the 1990s, or really any other time up to the last few years. It’s not that any of this stuff is new. It’s that until pretty recently we had this stuff and on balance it was successful. That’s the key. And now, though it’s a very close-run thing, it tends not to be successful. And by successful I mean in a purely electoral sense: Does it get you more votes than it loses you? And at a certain level that’s all that matters.
Republicans invested heavily in voter suppression for the 2012 cycle. And while it is very important to note that a big reason why it didn’t ‘work’ was that courts struck down a lot of the most egregious laws (and huge kudos to the myriad civil rights and voting rights lawyers who made that possible), it also didn’t work because the attempt itself massively energized the growing non-white electorate. So every time a little Mexican-American kid dares to sing the national anthem at a basketball game wearing a mariachi suit and freaks start telling him on Twitter to go back to Mexico, it’s gross and it’s a bummer, but you also realize that it’s probably marginalizing the white racist freakshow vote more than it’s empowering it. And when conservative backbenchers in the House say ‘pathway to citizenship’ over my dead body or despair of American culture, well, sure bring it to the next election and let’s see what happens. And the one after that.
It’s worth remembering that the intensity of this kind of thinking will almost certainly grow as its political effectiveness wanes. But the simple fact is that the calculus has changed. There are now enough non-white people in America and, just as critically, enough whites who are at least comfortable with or even welcome being in a multiracial party and country, that the electoral calculus has changed. And that’s a really good thing.
by Josh Marshall, TPM
Image: uncredited
Three Beards
In my life I have grown three beards, covering many of my adult faces. My present hairiness is monumental, and I intend to carry it into the grave. (I must avoid chemotherapy.) A woman has instigated each beard, the original bush requested by my first wife, Kirby. Why did she want it? Maybe she was tired of the same old face. Or maybe she thought a beard would be raffish; I did. In the fifties, no one wore beards. In Eisenhower’s day, as in the time of the Founding Fathers, all chins were smooth, while during the Civil War beards were as common as sepsis. Both my New Hampshire great-grandfathers wore facial hair, the Copperhead who fought in the war and the sheep farmer too old for combat. By the time I was sentient, in the nineteen-thirties, only my eccentric cousin Freeman was bearded, and even he shaved once a summer. Every September he endured a fortnight of scratchiness. Many men, after trying a beard for five or six days, have wanted to claw off their skin. They have picked up their Gillettes.
Despite the itch, I persisted until I looked something like a Brady photograph, or at least not like a professor of English literature at the University of Michigan. The elderly chairman of the department was intelligent and crafty. When he spoke in well-constructed paragraphs, with inviolate syntax, he sounded like a Member of Parliament—except for his Midwestern accent. He always addressed me as “Hall,” and used last names for all his staff. The summer of the beard, I dropped in at the department to pick up my mail. I wore plastic flip-flops, sagging striped shorts, a Detroit Tigers T-shirt, and a grubby stubble like male models in Vanity Fair. My chairman greeted me, noting my rank: “Good morning, Professor Hall.”
Dinner parties and cocktail parties dominated every Ann Arbor weekend. Women wore girdles; the jacket pockets of men’s gray suits showed the fangs of handkerchiefs. Among the smooth-faced crowds of Chesterfield smokers, I enjoyed cigars, which added to the singularity of my beard and rendered living rooms uninhabitable. When I lectured to students I walked up and down with my cigar, dropping ashes in a tin wastebasket. The girls in the front row smoked cigarettes pulled from soft, blue leather pouches stamped with golden fleurs-de-lis. As the sixties began, if I was sluggish beginning my lecture—maybe I had stayed up all night with a visiting poet—I paused by the front row and asked if anyone had some of those diet things. Immediately, female hands held forth little ceramic boxes full of spansules or round, pink pills. After I ingested Dexedrine, my lecture speeded up and rose in pitch until only dogs could hear it.
When I was bearded and my mother visited me, she stared at the floor, addressing me without making eye contact. Why did she hate beards so intensely? She adored her hairy grandfathers and her cousin Freeman. Her father, Wesley, of the next generation, shaved once or twice a week. On Saturday night before Sunday’s church, Wesley perched on a set tub. Looking into the mirror of a clock, he scraped his chin with a straight razor.
In 1967, my marriage, which had faltered for years, splintered and fell apart. As Vietnam conquered American campuses, I hung out with students who weaned me from cigars to joints. “Make love not war” brought chicks and dudes together, raising everyone’s political consciousness. Middle-class boys from Bloomfield Hills proved they belonged to the movement by begging on the streets for spare change. A professor of physics told a well-dressed panhandler, “Get it from your mother.” When the student said, “She won’t give it to me,” the physicist answered, “That’s funny, she gave it to me this morning.”
I signed the last divorce papers while anesthetized for a biopsy of my left testicle. It was benign, but divorces aren’t. I shaved because the world had altered. Although my mother fretted about the divorce, she looked at my face again. My sudden singleness and my naked skin confused my friends. I was still invited to dinner parties, and therefore gave dinner parties back. I invited eight people for dinner. When I noticed that I had no placemats, I substituted used but laundered diapers, which I had bought for drying dishes. For dinner I served two entrĂ©es, Turkey Salad Amaryllis and Miracle Beans. I bought three turkey rolls, cooked them and chopped them up with onions and celery, then added basil and two jars of Hellmann’s Real. It was delicious, and so were Miracle Beans. Warm ten cans of B&Ms, add basil again, add dry mustard, stir, and serve. My friends enjoyed my dinner parties. I served eight bottles of chilled Louis Latour Chassagne-Montrachet Cailleret.
by Donald Hall, New Yorker | Read more:
Illustration by Victor Kerlow
Fixie: Love and Hate
The first time I rode a fixie, in 2006, it nearly killed me. My legs locked in motion with the wheels, I built some speed to crest a rise.
On top, I gazed ahead down the hill, and started to descend. In an old habit I stopped pedaling and attempted to coast. Bad move. My cranks bucked sharply and the bike swerved, the pedals forcing my feet in circles as the frame cut air on the steep downhill.
The machine was alive! This horse wanted to run, and I wasn’t about to stop it. I felt a rush, the intoxication of riding on the back of something wild, a little dangerous and, most of all, just plain fast and fun.
I haven't quit since.
The fixed-gear experience is like nothing else on two wheels. It's a special feeling, an "almost mystical connection," as bicycle mechanic/muse Sheldon Brown puts it in his well-read "Fixed Gear Bicycles for the Road."
Brown, who died in 2008, was no hipster. He was an old guy with a beard who rode regular and fixed-gear bikes, the latter of which he noted feel "like an extension of your body to a greater extent than does a freewheel-equipped machine."
by Stephen Regenold, Outside | Read more:
Why Fixies Belong in the Garbage
I’ll admit it: Fixies do have a certain appeal. They’re simple, aesthetically pleasing, and—in a very particular setting, like on the velodrome or in the trash—even functional. But 99 percent of the time, there’s a better tool for the job.
Hating on fixed-gear bikes is almost too easy. At their finest, bikes are efficient, safe, and eminently enjoyable means of transportation. However, strip away a couple key components—namely the brakes and freewheel—and they become dangerous and impractical.
Anyone who’s ridden a bike knows that drivers can be unpredictable. Even the calmest of on-road commutes invariably involves a fair bit of swerving and emergency braking. Cyclists absolutely need to be able to stop as quickly as possible, and the stopping distance of a fixie is reportedly twice that of a front-brake-equipped bike—in the best of cases.
Fixed-gear nuts will tell you that an inexperienced rider is more likely to flip over his bars emergency braking on a road bike than on a fixie. As someone who’s raced on both the track and the road, I find it far more intuitive to stop safely using two brakes than by backpedalling. You’re also less likely to burn through costly rubber trying to skid to a stop.
True, some riders add front brakes to their fixies, which makes them a little more practical (and, depending on where you live, legal). But if brakes add a level of sanity, they also adulterate the machine. Taking a bike that is essentially a style statement—a direct insult to conformity and functionality—and trying to make it practical seems self-defeating, almost like purchasing a hybrid Hummer. Sure, it’s better than riding without brakes, but is it really the best option?
by Scott Rosenfield, Outside | Read more:
Image: Stephen Regenold
Wednesday, June 12, 2013
Will the Bombers Obliterate Merion? Let's Hope So
They most certainly could: soft fairways help competitors keep their tee shots out of the penal rough, damp greens play slower and easier than dry ones, and players might be permitted to “lift, clean, and place” their muddy balls. The result could be a slew of birdies. And if players do dismantle the course, the United States Golf Association, which runs the Open and likes to think of it as the ultimate test in golf, will be very upset. But I won’t be. The sight of someone shooting twenty under par, or even lower, to win the title would be quite a spectacle, and it would also force the U.S.G.A. to tackle an issue that it’s been avoiding for years: the incredible distances that the modern golf ball flies. This, rather than trifling issues such as the popularity of the belly putter, is what has really changed the nature of professional golf.
Modelled on famous courses in Scotland and England, Merion, where, in 1930, Bobby Jones completed what was then golf’s version of the Grand Slam—British Amateur, British Open, U.S. Amateur, U.S. Open—has always been a test of skill rather than brawn. Even lengthened by almost five hundred yards for this year’s Open, it is pretty short by modern standards, with five par fours that measure less than three hundred and seventy-five yards. For players like Tiger Woods, Rory McIlroy, and Adam Scott, who will play together on the first two days, that makes some of the holes potentially drivable. Alternatively, they are a long iron and a flick.
Even the longer holes at Merion—and one of them has been extended to more than six hundred yards—won’t necessarily present much of a challenge. Top players hit the ball so far these days that distance isn’t really a factor. En route to victory in the 1950 Open, Ben Hogan, playing the eighteenth in the final round, hit a 1-iron—the most difficult club to play—onto the green from about two hundred and twenty yards, a feat which was for years considered remarkable. (A plaque marks the spot Hogan hit from.) Today, such a shot would be a routine 4- or 5-iron. If it were downwind, some players would hit a 6-iron, or even a 7.
Isn’t that just progress, which should be saluted rather than bemoaned? Tough question. For a sports fan, progress—bigger, better, faster, stronger—isn’t always a virtue. Sometimes, it threatens to obliterate the heroic performances, and performers, of the past, making them seem pedestrian. How would Chris Evert, with her wooden racket, fare against Serena Williams? Julius Erving against LeBron James? Valeriy Borzov against Usain Bolt? Hogan against Tiger Woods? The honest answer is not very well. (In the case of Hogan, some golf experts would disagree. But at five feet eight with a slight build, he probably wouldn’t have had the strength or swing speed—even with modern equipment—to compete with today’s players.)
by John Cassidy, New Yorker | Read more:
Photograph by Ross Kinnaird/Getty.
Peeping Thomism
[ed. Yes. My generation will soon be gone, and I expect the next will have a much more forgiving attitude about youthful exuberance (which I would've thought we'd have had hard-wired long ago, but I guess not).]
At some point in your youth, someone warned you that “this, young man, is going to go on your permanent record.” In my case, it was a high school vice principal. I’ve forgotten the infraction, but I remember the warning. The vice principal wasn’t a bad man, but he was a bit of a martinet. That’s probably a part of the job description. I knew plenty of teachers and principals who disciplined out of impatience or because of a poorly hidden streak of petty sadism, but Mr. R. wasn’t one of them; I think he held an abiding belief that structure and direction were good—not just practically good, but universally and categorically so. Most disciplinarians just believe that children, that people, are rotten. Mr. R. believed that we were basically good, just stupid. The diagnosis was correct if the prescription was wrong, and in any event he was able to moderate his meanness, especially for the hard-luck kids. That, I think, was the real mark of his moral character. He was never vindictive, and while I disagree with his code to this day, he applied it justly, which is to say, unequally, and contingent on the circumstances. American society often views harsh punishment as a virtue, and when we complain about the unequal application of the rules, we usually mean that rich guys get off too easy, but Mr. R. knew that the real problem is poor guys get it too hard. Man, did we hate that SOB, but we also thought he was kind of okay. Kids are sophisticated like that, more so than adults.
Anyway, the permanent record was one of those semi-mythical creatures that you publicly dismissed while privately fearing when you were camping in the woods and the fire had burned down. I was a rich kid in that poor town, in public school mostly because of politics related to my father’s job, and most high school discipline rolled right off me. It was a given that I’d graduate at the top of my class and decamp for some fancy college, which, indeed, I did. But I do remember the permanent record thing making me ever so slightly nervous, and if I laughed about it to my friends, then I still privately fretted that some ambitious admissions officer would haul up my file and mark me off with a red X for some past minor infraction. Now, of course, kids really do get a permanent record because schools have followed the general trend of American social hysteria and started calling the cops for the slightest infraction; detention is now a misdemeanor, and so on. That’s a shame, because the permanent record ought to be as laughable now as it ever was. Do you remember yourself when you were sixteen? Many descriptors come to mind, but fully formed isn’t one of them.
As if that weren’t bad enough, that idea that one ought to be branded with one’s own youth like a poorly considered neck tattoo, we now find not only kids, but adults (especially new adults) getting constantly dinged with the dire warning that Social Media Lasts Forever. I think this is probably untrue in a purely physical sense; it strikes me as probable that fifty years from now, the whole electronic record of our era will be largely lost in a sea of forgotten passwords, proprietary systems, faulty hardware, and compatibility issues. But it should also be untrue in, dare I say it, the moral sense. Educators and employers are constantly yelling that you young people have an affirmative responsibility not to post anything where a teacher or principal or, worst of all, boss or potential boss might find it, which gets the ethics of the situation precisely backwards. It isn’t your sister’s obligation to hide her diary; it’s yours not to read it. Your boyfriend shouldn’t have to close all his browser windows and hide his cell phone; you ought to refrain from checking his history and reading his texts. But, says the Director of Human Resources and the Career Counselor, social media is public; you’re putting it out there. Yes, well, then I’m sure you won’t mind if I join you guys at happy hour with this flip-cam and a stenographer. Privacy isn’t the responsibility of individuals to squirrel away secrets; it’s the decency of individuals to leave others’ lives alone.
At some point, employers will have to face up to the unavoidability of hiring people whose first Google image is a shirtless selfie. Demographics will demand it. They’ll have to get used to it just as surely as they’ll have to get used to nose rings and, god help us, neck tattoos. It’s a shame, though, that it’ll be compulsory and reluctant. We should no more have to censor our electronic conversations than whisper in a restaurant. I suspect that as my own generation and the one after it finally manage to boot the Boomers from their tenacious hold on the steering wheel of this civilization that they’ve piloted ineluctably and inexorably toward the shoals, all the while whining about the lazy passengers, we will better understand this, and be better, and more understanding. And I hope that the kids today will refuse to heed the warnings and insist on making a world in which what is actually unacceptable is to make one’s public life little more than a series of polite and carefully maintained lies.
So Many Handbags, So Little Time
[ed. See also: Lee Radziwill and Sofia Coppola in Conversation (NY Times)]
Alexis Neiers was 17 in 2008. Her mother, Andrea, was a former Playboy model and her father a director of photography on Friends. Thousand Oaks isn’t super-rich but it’s the sort of place where people care a lot about money. Alexis and her friend Tess, who lived with her, behaved as if shopping (and having things) was the only way not to be a nobody. Alexis never forgot there was gold in them there hills and she spent her late teens trying to establish contacts that would lift her into the Hollywood scene. The family did pole-dancing in the living room and Andrea gave the girls – including Alexis’s younger sister, Gabby – the amphetamine Adderall every morning. (She said they had ADHD.) The girls knew about first class. They knew about VIP areas and fast cars, but they’d never seen a dictionary. Many of the kids in the southern valley think you’re odd if you don’t have a card for medical marijuana.
In the autumn of 2008, and for a full year after that, Alexis began travelling up the freeway at night in the company of some of the kids she knew from Calabasas. Like her, they wanted to be famous, but not in the old style: the stars they liked best were the ones who didn’t really do anything. The goddess for them was Paris Hilton. They didn’t think about talent and they didn’t care about class: they loved the kinds of star who were just like them, only fully arrived. In their world of Facebook and Twitter, Instagram and TMZ, where everybody was a star of their own social universe, as well as being their own paparazzi, the suburban teenagers idolised the people they were close to being themselves. Perhaps it’s a new kind of narcissism, where you only get to feel fully realised, successful and self-loving when you look at your reflection in the pool and see your idol. And having your idol’s shoes and handbag is one of the ways to achieve that.
Fame today is a matryoshka doll: inside each celebrity is a series of smaller, hollow simulacra, and, at the very core, there is a hard little being who feels buried alive. In Alexis’s gang there were four girls and three boys: the main culprit, Nick Prugo, was a gay kid working his way out of the closet. When he was eventually arrested by the police he was wearing a striped top he’d stolen from the house of the actor Orlando Bloom. And that’s what they did: after days of shopping or doing pilates, hanging out on MySpace, texting or oh-my-god-ing on their iPhones, studying Google Maps or celebrity websites to find addresses, they would travel in their big, gas-guzzling cars to the houses of their heroes in the Hollywood Hills, and rob them. At Paris Hilton’s house, they tried on her perfume and her shoes, they took money and handbags. It was almost as if Paris had been waiting for them: there was a key under her doormat, and her dressing-room, the inner sanctum, was filled with cushions bearing her self-adoring image. The burglars stepped gingerly over the little dogs called Marilyn Monroe and Prada.
The relationship between modern celebrities and their greatest fans is rather like the relationship that once existed between cops and robbers in the movies. (And in life, if you believe the Mafia lore.) Classic cops and robbers have the same DNA: they understand each other, because, at some basic level, they are the same people. The Bling Ring (as the Los Angeles Times called them) already possessed many of the items they were stealing, but what they craved was proximity and identification. Anyone can have a Marc Jacobs handbag if they can raise the money, but it isn’t just anyone who can have the one belonging to Paris Hilton. Only Paris has that – unless someone goes over to her house and takes it. Soon, the kids were showing up on Facebook and at clubs wearing their new clobber. It took the victims a while to notice, of course – so many handbags, so little time – but eventually it became clear that the Bling Ring had stolen $3 million worth of stuff.
There’s nothing new in stealing from the rich. What is new is the idea that the purloined items aren’t the main thing that’s been taken. Alexis wanted to be Paris, or a version of Paris which meant being more like herself-as-celebrity. She’d noticed – how could she not – that the celebrities she admired weren’t a million miles away from her and that was the thrill – being close to her ‘true’ status. Not by achieving it or even by getting to know her heroes personally, but by stealing their shoes and wearing them as if she had the right. The group described their nights in the hills as ‘shopping’.
by Andrew O'Hagan, LRB | Read more:
Image Credit: Merrick Morton via:
Tuesday, June 11, 2013
R.E.M. Wolves, Lower
[ed. Repost taken down some time ago on YouTube but back up again. Check it out while you can (who knows how long it'll be available). An amazing performance early in R.E.M.'s career.]
Yo! Sushi Drone Delivery
Chunks of prawn cracker are flying through the air, next comes an ominous crash. It's not the best start, but this, I am told, is the latest innovation in food delivery – service by drone.
With their reputation as controversial killing machines, it's not the most obvious fit, but food outlets seem intent on proving such warlike technology has an altogether more civilised use – if only they could get them to behave. Earlier this month, Domino's Pizza released a video of what it called its "DomiCopter", a pizza-delivering lightweight aircraft – and now YO! Sushi is developing its "first flying serving tray", to be rolled out next year. It has a lightweight carbon fibre frame with four propellers, two fixed cameras and its own Wi-Fi connection, and is controlled using an iPad app. But when I visit the chain's Soho restaurant to have a look at how the trial is going, there are a few teething problems.
Not yet capable of carrying the weight of the burger it was designed to serve, it is being tested outside with polystyrene food and prawn crackers. But the inexperienced pilot is finding it hard to get it off the ground, and almost impossible to control once it's hovering.
Instead of flying serenely in front of him and landing gently on the table, the machine drunkenly lurches around at knee height, crashing into camera tripods and chairs or just the ground, as the pilot mutters darkly about the wind factor and low batteries. Its rotor blades are said to be powerful enough to speed it along at 20mph, at a range of 50m, but they also mean that when the tray tilts and the prawn crackers fall out they are chopped and sprayed through the area.
by Homa Khaleeli, The Guardian | Read more:
Photograph: Graeme Robertson for the Guardian
America's 50 Worst Charities Rake in Nearly $1 Billion for Corporate Fundraisers
The worst charity in America operates from a metal warehouse behind a gas station in Holiday, Florida.
Every year, Kids Wish Network raises millions of dollars in donations in the name of dying children and their families.
Every year, it spends less than 3 cents on the dollar helping kids.
Most of the rest gets diverted to enrich the charity's operators and the for-profit companies Kids Wish hires to drum up donations.
In the past decade alone, Kids Wish has channeled nearly $110 million donated for sick children to its corporate solicitors. An additional $4.8 million has gone to pay the charity's founder and his own consulting firms.
No charity in the nation has siphoned more money away from the needy over a longer period of time.
But Kids Wish is not an isolated case, a yearlong investigation by the Tampa Bay Times and The Center for Investigative Reporting has found.
Using state and federal records, the Times and CIR identified nearly 6,000 charities that have chosen to pay for-profit companies to raise their donations.
Then reporters took an unprecedented look back to zero in on the 50 worst — based on the money they diverted to boiler room operators and other solicitors over a decade.
These nonprofits adopt popular causes or mimic well-known charity names that fool donors. Then they rake in cash, year after year.
The nation's 50 worst charities have paid their solicitors nearly $1 billion over the past 10 years that could have gone to charitable works.
Until today, no one had tallied the cost of this parasitic segment of the nonprofit industry or traced the long history of its worst offenders.
Among the findings:
• The 50 worst charities in America devote less than 4 percent of donations raised to direct cash aid. Some charities give even less. Over a decade, one diabetes charity raised nearly $14 million and gave about $10,000 to patients. Six spent nothing at all on direct cash aid.
• Even as they plead for financial support, operators at many of the 50 worst charities have lied to donors about where their money goes, taken multiple salaries, secretly paid themselves consulting fees or arranged fundraising contracts with friends. One cancer charity paid a company owned by the president's son nearly $18 million over eight years to solicit funds. A medical charity paid its biggest research grant to its president's own for-profit company.
• Some nonprofits are little more than fronts for fundraising companies, which bankroll their startup costs, lock them into exclusive contracts at exorbitant rates and even drive the charities into debt. Florida-based Project Cure has raised more than $65 million since 1998, but every year has wound up owing its fundraiser more than what was raised. According to its latest financial filing, the nonprofit is $3 million in debt.
• To disguise the meager amount of money that reaches those in need, charities use accounting tricks and inflate the value of donated dollar-store cast-offs — snack cakes and air fresheners — that they give to dying cancer patients and homeless veterans.
Over the past six months, the Times and CIR called or mailed certified letters to the leaders of Kids Wish Network and the 49 other charities that have paid the most to solicitors.
Nearly half declined to answer questions about their programs or would speak only through an attorney.
Approached in person, one charity manager threatened to call the police; another refused to open the door. A third charity's president took off in his truck at the sight of a reporter with a camera. (...)
To identify America's 50 worst charities, the Times and CIR pieced together tens of thousands of pages of public records collected by the federal government and 36 states. Reporters started in California, Florida and New York, where regulators require charities to report results of individual fundraising campaigns.
The Times and CIR used those records to flag a specific kind of charity: those that pay for-profit corporations to raise the vast majority of their donations year in and year out.
The effort identified hundreds of charities that run donation drives across the country and regularly give their solicitors at least two-thirds of the take. Experts say good charities should spend about half that much — no more than 35 cents to raise a dollar.
For the worst charities, writing big checks to telemarketers isn't an anomaly. It's a way of life.
Photo: AP
See the rest of the series:
Part 2: The Failure of Regulation
Part 3: The Reynolds Family Empire
'Jurassic Park' Is The Perfect Movie And Explains Everything About The Amazing World Of Science
On June 11th, 1993, I had my one and only "religious experience." It began, as is tradition, by staring into the cold hard eye of a raptor. It lasted for 127 minutes, in which I was in a complete state of raptor—sorry, rapture (these words are synonyms to me). I emerged from the movie theatre a changed person. I was like Saint Paul after he fell off his horse and realized, "Holy crap, Jesus is a god-man-thing!" Only my revelation was about dinosaurs, and so is obviously superior.
I had borne witness to the birth of Jurassic Park. I had seen it bite through the fence of public anticipation and burst into the public sphere. And oh, how it bellowed.
I was 8, but I remember how completely earth-shattering this film was right away. My friends and family laughed off my obsession, and chalked it up to childhood dinophilia. Exactly 20 years later, I have signed portraits of the characters framed all over my room, two sets of Jurassic Park toys splayed across my work desk, and all of the dialogue of the movie memorized (and that includes the dinosaurs' "lines"). So who's laughing now? Answer: Ian Malcolm, like this.
To say that Jurassic Park is my favorite movie would be like saying Earth is my favorite planet. These are prejudices over which I have no control. I love the movie's subtle foreshadowing, such as the helicopter landing scene in which Alan Grant—that's Sam Neill as the paleontologist—has two female seatbelt buckles that won't connect. But Dr. Grant finds a way (just like the dinosaurs' lil gametes). I love how Jeff Goldblum makes a tyrannosaur bite look really sexy. I love that when Jurassic Park owner John Hammond is forced to cut the tour short, he whines, "Why didn't I build in Orlando?" This throwaway line summons magical visions of raptors and Rexes marauding around the Magic Kingdom eating Mickey mascots off of Porta Potties. People on the Jurassic Park theme park ride wouldn't know what the Fukuiraptor was going on! Makers of Jurassic Park 4, take note: this alternate universe is where you should set your movie.
The minor characters of JP are also beyond phenomenal. For example, Robert Muldoon, the game warden, who has spent months embroiled in crazy staring contests with raptors, and it completely shows. By the time we meet him, he's too far-gone into this weird rivalry with the "big one" in the pride, like he's slowly losing his soul to her or something. Indeed, one of the great insights of the movie is that Grant learned more about raptors by studying them as wild animals than Muldoon learned by observing them in captivity. If only Muldoon had overheard Grant's take-down of that bratty kid at the beginning, he might have understood the most important thing about raptors—they attack from the side. Clever girls.
I'm even an apologist for the movie's many mistakes. I mean, the Rex's footsteps produce these monstrous impact tremors, but when she arrives to save the day at the end, she literally materializes out of nowhere. It definitely makes you wonder if she learned to tiptoe. Also, could Dennis Nedry, the park's computer programmer, have made a more suspicious exit speech? Would that even be possible? Try to sweat and stutter a little more there, guy. And did you notice that the embryo vials for Tyrannosaurus rex and Stegosaurus were both spelled wrong? I think that screw-up might actually be intentional, a subtle endorsement of Malcolm's criticism of how Hammond slaps stuff on lunchboxes before he even knows what he has. Indeed, if you watch the movie closely, you can see that Hammond's charismatic hypocrisy is a running gag. My favorite example is that he claims to be present for every raptor birth as it helps them to, no joke, "trust" him. Cut to: the highest security paddock in the park. Because trust.
But I digress, and on this subject, I always will. So let's get down to the meat, which is what the dinosaurs would want. What I really love about Jurassic Park is that it is about everything. Or at least, it's about the everything of science, and that is the most interesting kind of everything.
by Becky Ferreira, The Awl | Read more:
"Electric Rex" is by Kyle McCoy