Tuesday, June 23, 2015
Gmail Formally Adds ‘Undo Send’ Option
An email meant for your husband goes to your boss. A message meant for your bridesmaids goes to your mother-in-law. Or the nuclear option: an awkward workplace reply all.
Just reading about it brings a familiar feeling of dread, the one that sets in about a millisecond after an email is sent too soon.
If you are a Gmail user, you will be relieved to know that Google will now assist you in snatching a premature message back from the ether.
After years of experimenting with it as a Labs feature, Google announced that it was formally adding an “undo send” option for web-based Gmail users. (If you are a repeat offender on mobile, the Inbox app also has an undo feature.) The new tool allows users to choose a delay time from 5 to 30 seconds in case of a change of heart.
by Katie Rogers, NY Times | Read more:
Image: uncredited
Of Weapons Programs in Iran and Israel
A country in the Middle East has a clandestine nuclear development program, involving facilities hidden in the desert. After several years, the country is on the verge of acquiring nuclear weapons, even though the United States has been using all its resources to prevent that from happening. Frantic communications fly behind the scenes, between Washington and Tel Aviv.
And where is the nuclear program located? Israel.
Although Iran’s nuclear program dominates the headlines now (and did apparently have a military dimension at one time), that program has yet to produce a nuclear weapon, judging from the available public evidence. Meanwhile, the country pushing most aggressively for complete elimination of any prospect of an Iranian bomb—Israel—has an unacknowledged nuclear arsenal of its own. Although others project higher numbers, nuclear arsenal experts Hans M. Kristensen and Robert S. Norris estimate that Israel has roughly 80 warheads, built in secret.
It is noteworthy that while negotiations over limiting Iran’s enrichment program have taken center stage in news coverage—and will likely dominate the headlines as a final agreement is or is not reached at the end of this month—the history of Israel’s covert nuclear program draws relatively little media attention. Israel has long maintained a policy of nuclear ambiguity, neither confirming nor directly denying that it has a nuclear deterrent, and the United States government has officially taken the same stance, prohibiting its officials from stating that Israel is a nuclear weapons country.
But as shown in the Bulletin’s coverage over the years, the Israeli government does indeed have a robust nuclear program that began decades ago; it continues to operate outside the international nuclear nonproliferation regime to this day. This program has a convoluted history.
In a July 2013 article, nuclear proliferation scholar Leonard Weiss outlined the Lavon Affair, a failed 1954 Israeli covert operation against Egypt, undertaken in hopes it would destabilize the regime of Egypt’s leader, Gamal Abdel Nasser. In a complicated way, the bungled effort eventually deepened the Franco-Israeli military cooperation that helped Israel create its nuclear arsenal.
The details of the Lavon Affair are complex, but essentially Israeli Military Intelligence (often known by its Hebrew abbreviation AMAN) activated a sleeper cell tasked with setting off a series of bombs in Egypt, targeted against Western and Egyptian institutions, in hopes that the attacks could be blamed on Egyptian members of the Muslim Brotherhood or the Communist Party. AMAN apparently figured that the ensuing chaos would persuade Western governments that Nasser’s relatively new regime was unstable and, therefore, unworthy of financial aid and other support.
But the best-laid schemes often go astray, and the entire Israeli operation was exposed; its members were eventually tried and convicted by an Egyptian court. This caused Israel to conduct a retaliatory military raid into Gaza that killed 39 Egyptians, upsetting Egypt still further. The Egyptians, in turn, moved closer to the sphere of the old Soviet Union, concluding an arms deal that angered American and British leaders. This led to the West’s withdrawal from previously pledged support for the building of Egypt’s Aswan Dam; Nasser retaliated by nationalizing the Suez Canal; and Israel, France, and Britain subsequently tried (and failed) to invade Egypt and topple Nasser. In the wake of the failed invasion, France expanded and accelerated its ongoing nuclear cooperation with Israel, which eventually helped enable the Jewish state to build nuclear weapons.
It is easy to see why the average news editor might blanch at diving into these complicated waters to give a full, warts-and-all explication of the Israeli nuclear weapons program from its earliest days. But that is no reason to fail to report about the weapons program as a fait accompli; Israel’s program is as much a legitimate subject for media debate as the Iranian program—especially when Israel criticizes the proposed Iranian nuclear agreement.
Also given relatively short shrift in mainstream news coverage of Middle Eastern nuclear matters is the NUMEC affair, in which Israel apparently stole 100 kilograms of US bomb-grade uranium in the 1960s from a Pennsylvania nuclear fuel-processing plant. The theft was not discovered until years later, and President-elect Jimmy Carter was apparently not briefed about it until December 1976. The unexplained loss of large amounts of bomb-grade fissile material is a matter of concern, no matter what the context, but in this case it also involved a close ally—and Israel’s bomb-making program could have derailed the Carter administration’s Middle East peace efforts.
by Dan Drollette Jr, Bulletin of the Atomic Scientists | Read more:
Image: uncredited
Monday, June 22, 2015
All My Cats Are Dead
My cat died last month. He had a good life—fifteen long, treat- and cuddle-filled years during which he loved parties, burly men, sleeping with his head on mine, eating cardboard—and a good death.
Scientists who study such things say that we should all aim for “compression of mortality”—a long and healthy life, and then you die real fast. You don’t linger, you don’t make any tough decisions; you just live and then you die. My cat did compression of mortality like a champ. He started acting odd, was quickly diagnosed with a serious brain tumor, and went a couple of days later.
It hasn’t been that hard to accept that he’s dead; it’s been hard to accept living without him. I’ve been crying a lot. For the first time in more than twenty years, I don’t have a cat. There is no one excited to see me when I get home, there is no one who will watch BBC period pieces with me and think they are having the best time ever, and there is no one whose delight in a piece of string can take my mind off of things. (I’m single!)
Of course, everyone keeps telling me to get a new cat. Or just assumes I will. But no fucking way. No fucking way am I getting another lovable, adorable, cuddly, affectionate, loyal little creature who is in fact a ticking time bomb set to explode my heart into a thousand pieces at some unknown point in the future. (I’m single.)
My first cat, Monster, was unplanned. I was going through a breakup—in the nineties, I was always going through a breakup—and had scuttled out of my apartment for errands before going back inside to lie on the couch, watch the OJ Simpson trial, and mull over whether life was worth living. At the hardware store on Santa Monica Boulevard, they had just found a little teeny black and white kitten, who had been pretty horribly abused. She had cuts and what seemed to be burns. She looked like the world had let her down and no one could be trusted; she looked like how I felt. I left my groceries at the hardware store and took her home.
The kitten, who turned out to be a year or two old—she was just tiny for her age—immediately went under the bed. She stayed there for about six months, with the only proof of life being a pair of glowing green eyes staring back whenever I put my head down to check on her or to introduce her to someone. My friend Ron asked if I was sure she wasn’t an owl. I got her out to go to the vet and get a clean bill of health, but otherwise my main contact with Monster was when I lay in bed at night, motionless, until she thought I was asleep: I’d listen to her scurry out to eat her food and use the litterbox before scurrying back under the bed as quickly as possible.
One night, as I was lying there, I felt a little beat of warm breath on the right side of my neck, and a faint purr. She had snuggled her tiny self on my shoulder, trusting and trembling at the same time. I held my breath and didn’t move, and she lasted about two minutes before diving back to her hideaway. (This, of course, is why I named her Monster—what else lives under the bed?)
by Mikki Halpin, The Hairpin | Read more:
Image: Mikki Halpin
Daddy Issues
Until recently, I’d never been on the website AskMen.com, I suppose largely because I never had the occasion to ask a man anything. The site’s tagline touts that it is a place where men can become better men, though on my first visit I’m already suspicious that any of my questions will be answered or that I will become a better man. (...)
“Are her daddy issues to blame?” asks the post I land on. In it, the author describes the symptoms of diagnosable daddy issues, which your girlfriend or hookup partner may be suffering from, adding that he plans to advise you on how best to “handle” them if you are faced with the daunting, unfortunate task of reversing years of neglect and mistreatment from a woman’s father.
Sexual aggressiveness is listed as the first symptom of daddy issues, excessive flirting the second, and clinginess the last, all of these comprising the holy triumvirate of characteristics you do not want to see yourself dealing with in a girlfriend. If you end up with a woman who exhibits any one of these behaviors, you do your best to curb them, as with a dog:
Every woman wants care and assurance from her partner and, of course, girlfriends want to spend quality time with their boyfriends. But a girl with daddy issues wants those things in excess. She may throw a fit whenever you make plans without her. She might beg and bargain whenever you try to leave her apartment. It’s important to keep her daddy issues in check by establishing strict boundaries. Stick to your guns and maintain a separate social life. If you give in to a bout of clinginess once, you’re sunk forever.
Sunk forever, broham, is not where you’d like to be.
As I’d expected from even my first seconds on AskMen.com, this was grade-F male-advice “locker room” pandering, the kind that seems almost too perfect to be true or available for the casual reader of the web. Because of its home, there was no reason for me to be taking any of this seriously or thinking of it as a representative of what most rational people would conjure up when the term “daddy issues” arose. (...)
The term “daddy issues” has been so ingrained as to become commonplace, almost forgotten—one of those colloquialisms that no longer seems significant or relevant. It can be brushed aside and dismissed almost as a joke, a Lana Del Rey song so obvious that it’s surprising. But the connotation is still singular. Unlike a man who’s a “mama’s boy,” a woman with “daddy issues” has nothing soft or pleasant circling the problem. If you have daddy issues, you are certainly, without question, fucked up. Don’t ask me—ask men:
If her dad failed to show her love and affection, she might grow up expecting the worst from men. If you find her blowing up over minor screw-ups, it might be because your mistake reminds her of her father’s poor parenting.
The term “daddy issues” originates from Carl Jung’s theory of the Electra complex, a counteracting theory to the Oedipus complex that suggests women want to compete with their mothers in possession of their fathers. It’s cropped up again and again in pop culture, most notably in Sylvia Plath’s poem “Daddy,” where the author claims to be through with her issues surrounding her father after killing them at the conclusion of the poem.
“Daddy issues” may not be the hottest term in psychobabble right now, as women are encouraged to Lean In and take responsibility for themselves despite what their fathers have wrought, but there is something troubling about how normalized the term has become. When it appears that we’ve let this concept slide relatively unnoticed through our cultural dialect, is there ever a way to correct and reverse that harmful language—or is it like this forever? When “she might grow up expecting the worst from men” is written down as a symptom of a problem women suffer, who exactly is to blame?
by Dayna Evans, Jezebel | Read more:
Image: Mad Men
The Suffering of Dustin Johnson
[ed. See also: Dustin Johnson’s Dream Turns Real on No. 18, Then Nightmare Sets In.]
Congratulations to Rory McIlroy, who just earned a prize that eluded Tiger Woods for his entire career: A true rival.
That's how I planned to end this piece, back when I sat on the bank overlooking the 14th hole on Sunday, dead sure that Jordan Spieth would win the U.S. Open. I had just left Dustin Johnson behind after two bogeys and a three-putt par on 12 that might as well have been a bogey. I was supposed to stay with him all day, but when the energy starts to gather around a player like Spieth, you'd be an idiot to stay away. The prospect of walking up the 13th hole with DJ while the real action was taking place by the water was too daunting to consider, and so I abandoned him.
Johnson's playing partner, Jason Day, wasn't much better. He couldn't hit a short putt all day, and was visibly sagging after his bout with vertigo that led to an on-course collapse Friday. The heroism of Saturday's 68 was long past, and now he looked impossibly feeble. At times, the stiffness of his gait, the pained expressions, and the way he used his club as a cane all took on the appearance of melodrama -- he couldn't bend down to pick up his tee on the 11th, but he scooped it with ease on the 12th -- and it never felt quite as compelling as it had a day earlier. The geniuses at Fox didn't help matters by dedicating a camera to watching him walk between holes, even using a pointless split screen to follow his movements when actual golf was being played elsewhere. How, I wondered, is it possible to make even vertigo tacky? All they were missing was a sensational slogan: "When he collapses, we'll be there!" A few Internet wits on my Twitter feed theorized that if Day didn't oblige them by crumpling into a heap at some point, a Fox executive would appear on site with an elephant gun loaded with tranquilizers...or maybe they'd take the coward's way out and just fly a drone into his head.
In reality, the drama never transpired. Instead, Day played like he usually plays in these moments: Lots of missed putts. On nine and 11 and 12, his tee-to-green game looked fine, but his short putts slid by the hole -- USGA czar Mike Davis, looking on, gave a "wow!" after the miss on 10, possibly inspired by the fear that a mutant stalk of poa annua had shot up at the last moment to stop the ball in its tracks -- and then he lost his chance for good with a double bogey on 13.
Dustin's fade was slower, and somewhat less agonizing, but it followed a similar formula: Opportunity after opportunity wasted, sometimes against all logic. He bogeyed again on 13 after I had cut across the fescue to the 14th fairway, so I wrote him off and came up with that cute line about Rory and Tiger.
I felt I had learned something about Dustin anyway -- something debilitating and a little bit sad -- stemming from the fact that he rarely spoke with his brother and caddie Austin. It presented a stark contrast with Spieth, who kept up a neurotic monologue with Michael Greller all day, constantly seeking and receiving reassurance about the wind, the terrain, the distance, the break, and god knows what else. He uses Greller as his own personal security blanket, and Greller knows exactly how to play the role. Even in the moments of tension, the caddie is careful not to break character. On 15, for instance, Spieth had to make a short but tricky par putt after a tee shot that, despite his exhortations, rolled down a false front after flirting with the flag. A recovery putt set up the par chance, and when the ball went in the hole, Greller turned away from Spieth and just stared into the distance, his face taut as the skin of a drum. You could feel his desire to scream in relief, to let the tension emanate like Doppler waves and knock us all over, but that's not his role -- he's the rock in Spieth's never-ending storm of emotions, and even a simple "oh thank God" isn't in the cards. So he just stared out over our heads for a nonverbal moment, and then he turned back to Spieth with an encouraging word. Greller's mask doesn't slip, and that's what it means to be a pro.
With Dustin, though, there's a sense of anarchy that doesn't go very well with the tension of a major championship. Austin is not the caddie with the exhaustive plan, or the supportive word. On Sunday, he didn't even serve to loosen his brother up at critical moments -- it was all silence and a few awkward laughs. I've heard a theory going around the media center that -- let's just put it bluntly -- Dustin is too dumb to be affected by nerves. But nerves are like water seeping through the cracks in a rock, and they will always find a way. The idea that a lack of intelligence makes someone immune is nonsense, and Chambers Bay proved it for the third time in Johnson's career. What he needed instead was a comprehensive plan.
With Spieth, there was always the sense that a meticulous, all-encompassing strategy was being deployed, with plan Bs and Cs where A wouldn't fly. This is what a golfer covets -- it's why they all use the royal "we" when talking about themselves in press conferences. One person strikes the ball, but a whole team can take part in the preparation and at least give the helpful illusion of collaboration. In some kind of metaphysical way, I believe this kind of group forethought somehow makes a golfer luckier, as though he can convince the universe to be on his side.
But where were Dustin's collaborators? Where was his brother when I saw him shaking his head vigorously after a poor approach on 10, as if trying to rid himself of a bad thought? By the time he struck his tee shot on the 13th hole after the run of bogeys, I felt a surge of pure pity toward him. I realize how strange that sounds, since he has the body of a god and the money of a king, but in that moment I saw him laid bare in a state of pure solitude. He had nobody to help curb the terrible loneliness inherent in golf, and he had to stand up to the relentless pressure all by himself. It was like watching a hurricane make landfall, and while Team Spieth had a fortified underground bunker ready, Dustin didn't even have the sense to strap himself to a tree.
by Shane Ryan, Golf Digest | Read more:
Image: Getty
Wednesday, June 17, 2015
Artless
The fine arts don’t matter any more to most educated people. This is not a statement of opinion; it is a statement of fact.
As recently as the late 20th century, well-educated people were expected to be able to bluff their way through a dinner party with at least some knowledge of “the fine arts” — defined, since the late 18th century, as painting, sculpture, orchestral or symphonic music, as distinct from popular music, and dance/ballet. (“Starchitects” notwithstanding, architecture has never really been one of the fine arts — it is too utilitarian, too collaborative and too public).
A few decades ago, in American gentry circles, it would have been a terrible faux pas not to have heard of Martha Graham. You were expected to know the difference between a French impressionist and an abstract expressionist. Being taken to the symphony and ballet as a child was a rite of initiation into what Germans call the Bildungsbürgertum (the cultivated bourgeoisie). (...)
There is still an art world, to be sure, in New York and London and Paris and elsewhere. But it is as insular and marginal as the fashion world, with a similar constituency of rich buyers interacting with producers seeking to sell their wares and establish their brands. Members of the twenty-first century educated elite, even members of the professoriate, will not embarrass themselves if they have never heard of the Venice Biennale.
Many of the Arts Formerly Known as Fine seem to have lost even a small paying constituency among rich people, and live a grant-to-mouth existence. In the old days, bohemian painters lived in garrets and tried to interest gallery owners in their work. Their modern heirs — at least the ones fortunate to have university jobs — can teach classes and apply for grants from benevolent foundations, while creating works of art that nobody may want to buy. Born in bohemia, many aging arts have turned universities into their nursing homes.
What happened? How is it that, in only a generation or two, educated Americans went from at least pretending to know and care about the fine arts to paying no attention at all?
The late Hilton Kramer, editor of The New Criterion, blamed the downfall of the fine arts on purveyors of Pop Art like Andy Warhol and Jeff Koons, who replaced Arnoldian “high seriousness” and the worship of capital-c Culture with iconoclasm, mockery, and irony. A Great Tradition of two millennia that could be felled by Andy Warhol must have been pretty feeble! But the whole idea of a Phidias-to-Pollock tradition of Great Western Art was unhistorical. The truth is that the evolution (or if you like the degeneration) from Cezanne to Warhol was inevitable from the moment that royal, aristocratic and ecclesiastical patronage was replaced by the market.
Having lost their royal and aristocratic patrons, and finding little in the way of public patronage in modern states, artists from the 19th century to the 21st have sought new patrons among the wealthy people and institutions who have formed the tiny art market. It was not the mockery of Pop artists but the capitalist art market itself which, in its ceaseless quest for novelty, trivialized and marginalized the arts.
The dynamic is clearest in the case of painting and allied visual arts. Markets tend to prize fashionable novelty over continuity. The shocking and sensational get more attention than subtle variations on traditional conventions and themes. Capitalism, applied to the fine arts, created the arms race that led to increasingly drastic departures from premodern artistic tradition, until finally, by the late 20th century, “art” could be everything and therefore nothing.
by Michael Lind, The Smart Set | Read more:
Image:
The Geek’s Chihuahua
Rather than thinking of the iPhone as a smartphone, like a Treo or a BlackBerry or, eventually, the Android devices that would mimic it, one would do better to think of the iPhone as a pet. It is the toy dog of mobile devices, a creature one holds gently and pets carefully, never sure whether it might nuzzle or bite. Like a Chihuahua, it rides along with you, in arm or in purse or in pocket, peering out to assert both your status as its owner and its mastery over you as empress. And like a toy dog, it reserves the right never to do the same thing a second time, even given the same triggers. Its foibles and eccentricities demand far greater effort than its more stoic smartphone cousins, but in so doing, it challenges you to make sense of it. (...)
At the start of 2015, fewer than eight short years since the first launch of the iPhone, Apple was worth more than seven hundred billion dollars—more than the gross national product of Switzerland. Despite its origins as a computer company, this is a fortune built from smartphones more than laptops. Before 2007, smartphones were a curiosity, mostly an affectation of would-be executives carting BlackBerries and Treos in unfashionable belt holsters. Not even a decade ago, they were wild and feral. Today, smartphones are fully domesticated. Tigers made kittens, which we now pet ceaselessly. More than two-thirds of Americans own them, and they have become the primary form of computing.
But along with that domestication comes the inescapability of docility. Have you not accepted your smartphone’s reign over you rather than lamenting it? Stroking our glass screens, Chihuahua-like, is just what we do now, even if it also feels sinful. The hope and promise of new computer technology have given way to the malaise of living with it. (...)
Technology moves fast, but its speed now slows us down. A torpor has descended, the weariness of having lived this change before—or one similar enough, anyway—and all too recently. The future isn’t even here yet, and it’s already exhausted us in advance.
It’s a far cry from “future shock,” Alvin Toffler’s 1970 term for the postindustrial sensation that too much change happens in too short a time. Where once the loss of familiar institutions and practices produced a shock, now it produces something more tepid and routine. The planned obsolescence that coaxes us to replace our iPhone 5 with an iPhone 6 is no longer disquieting but just expected. I have to have one has become Of course I’ll get one. The idea that we might willingly reinvent social practice around wristwatch computers less than a decade after reforming it for smartphones is no longer surprising but predictable. We’ve heard this story before; we know how it ends.
Future shock is over. Apple Watch reveals that we suffer a new affliction: future ennui. The excitement of a novel technology (or anything, really) has been replaced—or at least dampened—by the anguish of knowing its future burden. This listlessness might yet prove even worse than blind boosterism or cynical naysaying. Where the trauma of future shock could at least light a fire under its sufferers, future ennui exudes the viscous languor of indifferent acceptance. It doesn’t really matter that the Apple Watch doesn’t seem necessary, no more than the iPhone once didn’t too. Increasingly, change is not revolutionary, to use a word Apple has made banal, but presaged.
Our lassitude will probably be great for the companies like Apple that have worn us down with the constancy of their pestering. The poet Charles Baudelaire called ennui the worst sin, the one that could “swallow the world in a yawn.” As Apple Watch leads the suppuration of a new era of wearables, who has energy left to object? Who has the leisure for revolution, as we keep up with our social media timelines and emails and home thermostats and heart monitors?
by Ian Bogost, Longreads | Read more:
Image: LWYang
Definition of Race Becoming Fluid
[ed. See also: Kareem Abdul-Jabbar: Let Rachel Dolezal be as black as she wants to be.]
Rachel Dolezal, born to white parents, self-identifies as black - a decision that illustrates how fluid identity can be in a diversifying America, as the rigid racial structures that have defined most of this country's history seem, for some, to be softening.
Dolezal resigned as the leader of the NAACP's Spokane, Washington, branch after questions surfaced about her racial identity. When asked directly on NBC's "Today" show Tuesday whether she is "an African-American woman," Dolezal replied, "I identify as black."
Her parents identified her as white with a trace of Native American heritage, and her mother, Ruthanne Dolezal, has said Rachel began to "disguise herself" as black after her parents adopted four black children more than a decade ago.
Dolezal isn't the first person to make this type of change. Millions of Americans changed racial or ethnic identities between the 2000 and 2010 censuses, even though their choices may have contradicted what their skin color appeared to be, or who their parents said they are.
"It forces us to really question whether or not this biological basis for identity is a smart path to continue down in the future," said Camille Gear Rich, a University of Southern California law and sociology professor who writes about elective race choices.
Americans have become comfortable with people self-identifying their race, Rich said, "but often that invocation of identity based on a biological claim isn't backed up by anything else after the claim is made."
In the United States, there is an expectation that people would have a biological connection to a racial or an ethnic identity they are claiming, said Nikki Khanna, a University of Vermont sociology professor. She co-authored a 2010 study that found increasing numbers of biracial adults were choosing to self-identify as multiracial or black instead of white.
"There really is no biological basis to race, but what I'm saying is that in our society the everyday person tends to think race must have some link to ancestry," Khanna said. "So we expect that when people self-identify with a particular group they must have some ancestral link to that group."
In the past, race was determined mostly by what other people thought a person was. For example, the Census Bureau's enumerators would determine on their own what a person's race was, and classify them as such. By the 1960s and 1970s, census officials were allowing people to self-identify.
Currently, the Census Bureau allows people to choose a racial category, or even multiple categories, to which they think they belong. The census identifies races as white; black or African-American; American Indian or Alaska Native; Asian; Native Hawaiian or Other Pacific Islander; and "some other race" for those claiming more than one race. There is also a Hispanic ethnic category.
People have been using that freedom since the early 2000s to move back and forth. They switched between races, moved from multiple races to a single race or back, or decided to add or drop Hispanic ethnicity from their identifiers on census forms.
Last year, a study showed that 1 in 16 people - or approximately 9.8 million of 162 million - who responded to both the 2000 and 2010 censuses gave different answers when it came to race and ethnicity. In addition, in the 2010 census, more than 21.7 million - at least 1 in 14 - went beyond the standard labels and wrote in such terms as "Arab," "Haitian," "Mexican" and "multiracial."
by Jesse J. Holland, AP | Read more:
Image: Dan Pelle/AP
Tuesday, June 16, 2015
Chambers Bay
Chambers Bay is unknown to most, unproven to many, and undeniably a strange concoction. Why is it positioned to set so many U.S. Open records? The players have yet to tee off, but the 2015 U.S. Open, the first in the Pacific Northwest, is already making history. A decade ago the course, as improbable and unconventional as they come, didn't exist. Now it's hosting the U.S. Open? Inconceivable.
If it hasn't happened before in an Open, it's probably happening June 18-21 at Chambers Bay.
It's the first U.S. Open to be contested in a sand box. Chambers Bay lies in an old sand and gravel pit on the western edge of the Tacoma, Wash., suburb of University Place. It's a tilted bowl, open on the west, with railroad tracks and gorgeous Puget Sound beyond. To the east is a high, long cliff. Atop its rim is Grandview Drive, where rubberneckers can stand with binoculars and scout for Rory, Phil & Co. some 80 feet below. (...)
This will not be the first U.S. Open played on sandy soil. Shinnecock Hills on Long Island has been an Open site as far back as 1896, as recently as 2004 and will host again in 2018. But Shinnecock consists of holes that were staked along tree-dotted sand hills, following lines of least resistance. Chambers Bay was dotted with piles of mining spoils, free to be sifted, shifted and molded to creative whims. A sand box, in which 1.5 million cubic yards were pushed around. (...)
It's the first all-fescue U.S. Open Course. The fescue turf, ideal in a maritime climate, is common on the links of Scotland, Ireland and the English coastline, but not on courses in America.
Today, everything could be mowed at greens height if desired. For the Open, the highly contoured greens will be mowed at .18 inches, which will translate to a Stimpmeter reading of 12, and there will be noticeable grain. There will be a belt of fescue rough at about three to four inches (narrowing some of the widest fairways to 40 or 50 yards), then taller stuff farther out.
"The beauty of fine fescue, besides needing less water and less fertilization than other grasses, is that it's the least tacky grass I know of," Davis says. "You get a wonderful bounce on it." (...)
It's the first U.S. Open course to have holes that will alternate par. For the Open, Davis will convert the fourth, normally a par 5, into a par 4, so the course will play as a par 70. "It's a much more interesting drive zone when you move the tee up," he says.
Total yardage will vary every day. The maximum length is 7,940 yards, but for the Open, yardage will range from 7,200 to 7,700, depending on weather, wind conditions and tee and hole locations.
For a time, Davis toyed with the idea of playing the course as a par 71 on certain days and par 70 for other rounds because he was undecided on whether to play the first and 18th holes as long par 5s each day (par 71), or one of them as a par 4 (par 70). Then it occurred to him, because the two holes are parallel in opposite directions, he could alternate the par each day and still retain the overall par of 70.
"When we play the first hole as a par 4, 18 will be a par 5, and vice versa," Davis says. "Both holes are so neat architecturally, both as par 4s and par 5s. It speaks volumes for the incredible flexibility of the design."
The straightaway par-4 first hole becomes a dogleg-left par 5 from a new back tee, with different fairway slopes in separate landing areas. The 604-yard, par-5 18th has completely different fairway bunkering when played as a 525-yard par 4.
"I don't know which rounds we'll switch them," Davis says. "Would I rather have 18 as a par 4 or a par 5 on the final day? I don't know. If it's a par 5, there's a possibility of making history, making eagle or birdie to win the Open. That's never happened.
"But part of me says, Hey, this is the U.S. Open. It ought to be set up so a hard-earned par 4 wins it all. I've chewed on that over and over. I suspect we'll look at what the wind conditions will be the last couple of days, and decide then."
In recent years, Davis added a last-minute bunker on the 17th at Olympic in 2012 and turned the famed par-4 fifth at Pinehurst No. 2 into a par 5 last year. For Chambers Bay, he directed that a deep bunker be installed in the middle of the fairway about 120 yards short of the 18th green.
"When playing 18 as a par 5, there needed to be something in the lay-up area," he says. "The fairway was 85 yards wide in that second landing area. A guy could be blindfolded and couldn't miss the fairway. I didn't want to bastardize the hole by bringing in rough. So we suggested sticking something in the middle, so they'd have to play around, or short, or left, or right."
A crew dug the six-foot-deep diagonal bunker where Davis wanted it, but he wanted something so deep even a great player couldn't reach the green. So the crew dug some more. The bunker is now 10 feet deep. Curiously, though Davis defends its placement and depth, he doesn't think it will see any action. "If there is one, single, solitary player in the U.S. Open in that bunker, I'll be amazed," he says. "But they're going to have to think about it. And that's the whole idea." Alfred Hitchcock called that sort of device a MacGuffin. Local caddies call it the Chambers Basement. [ed. I've also heard it called the "Whine Cellar".]
by Ron Whitten, Golf Digest | Read more:
Images: Chambers Bay and Golf Club Atlas
Not All Equal When It Comes To Water
Drought or no drought, Steve Yuhas resents the idea that it is somehow shameful to be a water hog. If you can pay for it, he argues, you should get your water.
People “should not be forced to live on property with brown lawns, golf on brown courses or apologize for wanting their gardens to be beautiful,” Yuhas fumed recently on social media. “We pay significant property taxes based on where we live,” he added in an interview. “And, no, we’re not all equal when it comes to water.”
Yuhas lives in the ultra-wealthy enclave of Rancho Santa Fe, a bucolic Southern California hamlet of ranches, gated communities and country clubs that guzzles five times more water per capita than the statewide average. In April, after Gov. Jerry Brown (D) called for a 25 percent reduction in water use, consumption in Rancho Santa Fe went up by 9 percent.
But a moment of truth is at hand for Yuhas and his neighbors, and all of California will be watching: On July 1, for the first time in its 92-year history, Rancho Santa Fe will be subject to water rationing.
“It’s no longer a ‘You can only water on these days’ ” situation, said Jessica Parks, spokeswoman for the Santa Fe Irrigation District, which provides water service to Rancho Santa Fe and other parts of San Diego County. “It’s now more of a ‘This is the amount of water you get within this billing period. And if you go over that, there will be high penalties.’ ”
So far, the community’s 3,100 residents have not felt the wrath of the water police. Authorities have issued only three citations for violations of a first round of rather mild water restrictions announced last fall. In a place where the median income is $189,000, where PGA legend Phil Mickelson once requested a separate water meter for his chipping greens, where financier Ralph Whitworth last month paid the Rolling Stones $2 million to play at a local bar, the fine, at $100, was less than intimidating.
All that is about to change, however. Under the new rules, each household will be assigned an essential allotment for basic indoor needs. Any additional usage — sprinklers, fountains, swimming pools — must be slashed by nearly half for the district to meet state-mandated targets.
Residents who exceed their allotment could see their already sky-high water bills triple. And for ultra-wealthy customers undeterred by financial penalties, the district reserves the right to install flow restrictors — quarter-size disks that make it difficult to, say, shower and do a load of laundry at the same time. (...)
“I call it the war on suburbia,” said Brett Barbre, who lives in the Orange County community of Yorba Linda, another exceptionally wealthy Zip code.
Barbre sits on the 37-member board of directors of the Metropolitan Water District of Southern California, a huge water wholesaler serving 17 million customers. He is fond of referring to his watering hose with Charlton Heston’s famous quote about guns: “They’ll have to pry it from my cold, dead hands.”
“California used to be the land of opportunity and freedom,” Barbre said. “It’s slowly becoming the land of one group telling everybody else how they think everybody should live their lives.”
Jurgen Gramckow, a sod farmer north of Los Angeles in Ventura County, agrees. He likens the freedom to buy water to the freedom to buy gasoline.
“Some people have a Prius; others have a Suburban,” Gramckow said. “Once the water goes through the meter, it’s yours.”
by Rob Kuznia, Washington Post | Read more:
Image: Sandy Huffaker
Run Rabbit Run
[ed. Probably the coolest thing I've seen this morning (which, on most mornings, isn't saying much).]
Virtual Reality Headsets Raise Very Real Concerns
[ed. If you think cell phones are obnoxious and isolating (and they are), just wait. VR is going to have a massive impact on society.]
Every Friday, a dozen or so people strap on virtual reality headsets, log on to the Internet and do something that would normally require driving to a local multiplex: watch a movie with a bunch of strangers.
Their avatars all sit in the seats of a virtual movie theater, staring at a screen playing a movie from Netflix. The sound from the theater is so accurate that if participants munch potato chips into their microphones, it sounds as though it is emanating from their avatars.
“When all of a sudden 10 avatars turn around and look at you, you know you should be quiet,” said Eric Romo, the chief executive of AltspaceVR, a Silicon Valley start-up that organizes the virtual movie gatherings and other virtual reality events.
The ability of virtual reality to transport people to locales both exotic and ordinary is well known. Yet how the medium will fit into people’s online and offline lives is a new frontier. (...)
That makes the thousands of developers and early adopters, who already have prototype virtual reality headsets, effectively lab rats for these devices. They’re the ones figuring out how to navigate their real-life surroundings when their vision of the real world is shut out.
They’re learning which virtual reality experiences are fun, which are creepy and which might make people nauseated from motion sickness.
Etiquette around social forms of virtual reality is already taking shape since this technology has the potential to turn some of the more noxious forms of online behavior into something far more menacing.
by Nick Wingfield, NY Times | Read more:
Image: Ramin Talaie