Saturday, March 24, 2012

Native Hawaiians Provide Lessons In Fisheries Management

Roughly three-quarters of the Earth’s surface is covered with water. As I stand on a beach in Hawaii and look out over the vast, blue expanse in front of me, I am overwhelmed by the immensity of the Pacific Ocean. My brain wrestles with numbers far beyond its capacity to visualize. In that moment, it is incomprehensible that even seven billion humans could deplete such a seemingly boundless resource. Yet I know that we are. We are emptying the oceans of their fish, one species at a time.

Today, 85 percent of the world’s fisheries are fully exploited, overexploited, or already collapsed. Combined, the world’s fishermen catch 2.5 times the sustainable level of fish every year. Scientists predict that if current trends continue, world food fisheries may collapse entirely by 2050. “We are in the situation where 40 years down the line we, effectively, are out of fish,” explains Pavan Sukhdev, special advisor to the UN Environment Programme.

What we need are better management strategies. Now, researchers from the Center for Ocean Solutions at Stanford University are turning to the past for advice. Loren McClenachan and Jack Kittinger used historical records to reconstruct fish catches over the past seven hundred years, to see whether earlier civilizations did a better job than we do at managing their fisheries. The authors characterized historical catch rates in the Florida Keys and Hawaii by reviewing a variety of sources, including species-specific catch records from the 1800s and archaeological reconstructions of population densities and per-capita fish consumption.

“Seven hundred years of history clearly demonstrate that management matters,” said Loren McClenachan, co-author of the study and assistant professor of environmental studies at Colby College. In Florida, fisheries were characterized by years of boom and bust, with high-value species collapsing one after another; many of those species are endangered or extinct today. The Keys fisheries were set up for failure: unlike other historical island communities, the Keys were highly connected to outside markets, which increased demand on their fisheries, and they historically lacked a centralized management system. But while fisheries in the Florida Keys have always been poorly supervised, fisheries in Hawaii were once managed far better than they are today.

“Before European contact, Native Hawaiians were catching fish at rates that far exceed what reefs currently provide society,” said Kittinger, co-author and early career fellow at the Center for Ocean Solutions. Native Hawaiians pulled in over 15,000 metric tons of fish per year, and these high yields were sustained over several hundred years, despite a dense human population. “These results show us that fisheries can be both highly productive and sustainable, if they’re managed effectively.”

by Christie Wilcox, Scientific American |  Read more:

What Comes After the Hipster? We Ask the Experts


With Lana Del Rey’s meteoric, blog hype-fueled rise and rapid, SNL-catalyzed descent, the mere existence of MTV’s I Just Want My Pants Back and the trendy intellectual publication n+1 already taking a wishful backward glance at the subculture, hipsterdom appears to be on the wane. Have we reached a tipping point? If so, what’s next for American youth-based movements? While aware that the ability to predict the future is a rare trait, we asked several intrepid thinkers, writers, and academic types to hazard a guess. Specifically, we asked: 1. Keeping in mind the crude progression of subcultures from Beatnik to Hippie to Punk to Grunge to Hipster, what kind of prominent group will emerge next? 2. Or is the Hipster some form of the last widespread, cohesive subculture in this post-war lineage, since the Internet and other changes to American life are making this a nation of fragmented cultural tribes? Here’s what they said…

Robert Sloane, Instructor of American Culture Studies at Bowling Green State University (with Alex Champlin):

It’s difficult to talk about these groups as a “lineage,” because besides being groups that were associated with young Americans, they all had different levels of cohesion, formed in response to different social conditions, and produced different results. It seems to me that the beatniks and hippies were reacting more to society-level characteristics (conformity, political and cultural conservatism), whereas I associate the punks and “grunge” folks (slackers? Generation X?) with a cultural rebellion, reacting against a certain ossification in corporate culture (and especially music, although not exclusively). Interestingly, hip hop is missing from this list, and it seems to be doing both and neither at once, creating something new out of very limited opportunities. Hipsters seem to be a more general taste culture, embodying a number of different critiques of modern society in a more holistic, but I think less defined, way.

Is the Internet “making this a nation of fragmented cultural tribes”? Yes and no. The Internet is definitely the most elaborate and far-reaching site using the niche and target marketing techniques that have attacked the mass-media “mainstream” forged in the middle of the 20th century. However, the US has always been a nation of “fragmented cultural tribes,” and even when there appeared to be unity, it mostly papered over, ignored, or erased differences among smaller groups. But I don’t think the Internet means the end of subcultures, because I don’t see hipsters as particularly cohesive, in a national sense. In each of these subcultural examples, people have experiences primarily at the local level, and then they are joined together in a network, to a greater or lesser extent, that connects these localities across the nation.

For example, after the first flurry of punk rose up in the mid-’70s, and then seemingly “died” with the Sex Pistols’ tour of the US, like-minded individuals in cities all over the country began to play in bands, make their own records, etc. Through touring, exchanging records and zines, college radio, and other interpersonal experiences (all done pre-Internet), a national network was created that could truly be called an “American underground.” (This is the topic of Michael Azerrad’s book Our Band Could Be Your Life.) Thus, when Nirvana broke in 1991, it was somewhat less surprising to those who knew about the fan base that had grown over the 1980s; the emergence of “grunge,” and “alternative” music more generally, was just the coming to fruition of the original punk movement that had been nurtured underground for over a decade.

by Paul Hiebert, Flavorwire |  Read more:

The Song Machine

On a mild Monday afternoon in mid-January, Ester Dean, a songwriter and vocalist, arrived at Roc the Mic Studios, on West Twenty-seventh Street in Manhattan, for the first of five days of songwriting sessions. Her engineer, Aubry Delaine, whom she calls Big Juice, accompanied her. Dean picked up an iced coffee at a Starbucks on Seventh Avenue, took the elevator up to Roc the Mic, and passed through a lounge that had a pool table covered in taupe-colored felt. Two sets of soundproofed doors led to the control room, a windowless cockpit that might have been the flight deck of a spaceship.  (...)

Most of the songs played on Top Forty radio are collaborations between producers like Stargate and “top line” writers like Ester Dean. The producers compose the chord progressions, program the beats, and arrange the “synths,” or computer-made instrumental sounds; the top-liners come up with primary melodies, lyrics, and the all-important hooks, the ear-friendly musical phrases that lock you into the song. “It’s not enough to have one hook anymore,” Jay Brown, the president of Roc Nation, and Dean’s manager, told me recently. “You’ve got to have a hook in the intro, a hook in the pre-chorus, a hook in the chorus, and a hook in the bridge.” The reason, he explained, is that “people on average give a song seven seconds on the radio before they change the channel, and you got to hook them.”

The top-liner is usually a singer, too, and often provides the vocal for the demo, a working draft of the song. If the song is for a particular artist, the top-liner may sing the demo in that artist’s style. Sometimes producers send out tracks to more than one top-line writer, which can cause problems. In 2009, both Beyoncé and Kelly Clarkson had hits (Beyoncé’s “Halo,” which charted in April, and Clarkson’s “Already Gone,” which charted in August) that were created from the same track, by Ryan Tedder. Clarkson wrote her own top line, while Beyoncé shared a credit with Evan Bogart. Tedder had neglected to tell the artists that he was double-dipping, and when Clarkson heard “Halo” and realized what had happened, she tried to stop “Already Gone” from being released as a single, because she feared the public would think she had copied Beyoncé’s hit. But nobody cared, or perhaps even noticed; “Already Gone” became just as big a hit. (...)

Dean’s preferred method of working is to delay listening to a producer’s track until she is in the studio, in front of the mike. “I go into the booth and I scream and I sing and I yell, and sometimes it’s words but most time it’s not,” she told me. “And I just see when I get this little chill, here”—she touched her upper arm, just below the shoulder—“and then I’m, like, ‘Yeah, that’s the hook.’ ” If she doesn’t feel that chill after five minutes, she moves on to the next track, and tries again.

In advance of Dean’s arrival at Roc the Mic, Stargate had prepared several dozen tracks. They created most of them by jamming together on keyboards until they came up with an “idea”—generally, a central chord progression or a riff—around which they quickly built up a track, using the vast array of preprogrammed sounds and beats at their disposal. Hermansen likens their tracks to new flavors awaiting the right soft-drink or potato-chip maker to come along and incorporate them into a product.

Their plan with Dean was to finish one or two songs at each session. Given their record of success, they dared hope that one of these would be a smash. The others would be relegated to the “good but not good enough” file. Around Roc the Mic, writing songs for any reason other than making hits is a waste of time.

by John Seabrook, The New Yorker |  Read more:
Illustration: Michael Gillette

Inside the Matrix


The spring air in the small, sand-dusted town has a soft haze to it, and clumps of green-gray sagebrush rustle in the breeze. Bluffdale sits in a bowl-shaped valley in the shadow of Utah’s Wasatch Range to the east and the Oquirrh Mountains to the west. It’s the heart of Mormon country, where religious pioneers first arrived more than 160 years ago. They came to escape the rest of the world, to understand the mysterious words sent down from their god as revealed on buried golden plates, and to practice what has become known as “the principle,” marriage to multiple wives.

Today Bluffdale is home to one of the nation’s largest sects of polygamists, the Apostolic United Brethren, with upwards of 9,000 members. The brethren’s complex includes a chapel, a school, a sports field, and an archive. Membership has doubled since 1978—and the number of plural marriages has tripled—so the sect has recently been looking for ways to purchase more land and expand throughout the town.

But new pioneers have quietly begun moving into the area, secretive outsiders who say little and keep to themselves. Like the pious polygamists, they are focused on deciphering cryptic messages that only they have the power to understand. Just off Beef Hollow Road, less than a mile from brethren headquarters, thousands of hard-hatted construction workers in sweat-soaked T-shirts are laying the groundwork for the newcomers’ own temple and archive, a massive complex so large that it necessitated expanding the town’s boundaries. Once built, it will be more than five times the size of the US Capitol.

Rather than Bibles, prophets, and worshippers, this temple will be filled with servers, computer intelligence experts, and armed guards. And instead of listening for words flowing down from heaven, these newcomers will be secretly capturing, storing, and analyzing vast quantities of words and images hurtling through the world’s telecommunications networks. In the little town of Bluffdale, Big Love and Big Brother have become uneasy neighbors.

Under construction by contractors with top-secret clearances, the blandly named Utah Data Center is being built for the National Security Agency. A project of immense secrecy, it is the final piece in a complex puzzle assembled over the past decade. Its purpose: to intercept, decipher, analyze, and store vast swaths of the world’s communications as they zap down from satellites and zip through the underground and undersea cables of international, foreign, and domestic networks. The heavily fortified $2 billion center should be up and running in September 2013. Flowing through its servers and routers and stored in near-bottomless databases will be all forms of communication, including the complete contents of private emails, cell phone calls, and Google searches, as well as all sorts of personal data trails—parking receipts, travel itineraries, bookstore purchases, and other digital “pocket litter.” It is, in some measure, the realization of the “total information awareness” program created during the first term of the Bush administration—an effort that was killed by Congress in 2003 after it caused an outcry over its potential for invading Americans’ privacy.

But “this is more than just a data center,” says one senior intelligence official who until recently was involved with the program. The mammoth Bluffdale center will have another important and far more secret role that until now has gone unrevealed. It is also critical, he says, for breaking codes. And code-breaking is crucial, because much of the data that the center will handle—financial information, stock transactions, business deals, foreign military and diplomatic secrets, legal documents, confidential personal communications—will be heavily encrypted. According to another top official also involved with the program, the NSA made an enormous breakthrough several years ago in its ability to cryptanalyze, or break, unfathomably complex encryption systems employed by not only governments around the world but also many average computer users in the US. The upshot, according to this official: “Everybody’s a target; everybody with communication is a target.”

For the NSA, overflowing with tens of billions of dollars in post-9/11 budget awards, the cryptanalysis breakthrough came at a time of explosive growth, in size as well as in power. Established as an arm of the Department of Defense following Pearl Harbor, with the primary purpose of preventing another surprise assault, the NSA suffered a series of humiliations in the post-Cold War years. When the agency was caught off guard by an escalating series of terrorist attacks—the first World Trade Center bombing, the blowing up of US embassies in East Africa, the attack on the USS Cole in Yemen, and finally the devastation of 9/11—some began questioning its very reason for being. In response, the NSA has quietly been reborn. And while there is little indication that its actual effectiveness has improved—after all, despite numerous pieces of evidence and intelligence-gathering opportunities, it missed the near-disastrous attempted attacks by the underwear bomber on a flight to Detroit in 2009 and by the car bomber in Times Square in 2010—there is no doubt that it has transformed itself into the largest, most covert, and potentially most intrusive intelligence agency ever created.

In the process—and for the first time since Watergate and the other scandals of the Nixon administration—the NSA has turned its surveillance apparatus on the US and its citizens. It has established listening posts throughout the nation to collect and sift through billions of email messages and phone calls, whether they originate within the country or overseas. It has created a supercomputer of almost unimaginable speed to look for patterns and unscramble codes. Finally, the agency has begun building a place to store all the trillions of words and thoughts and whispers captured in its electronic net. And, of course, it’s all being done in secret. To those on the inside, the old adage that NSA stands for Never Say Anything applies more than ever.

by James Bamford, Wired |  Read more:
Photo: Name Withheld; Digital Manipulation: Jesse Lenz

Friday, March 23, 2012


Anchorage, Alaska
via:

A Gentleman’s Guide to Staying Cool in the 21st Century

The world is a hard place to survive. Falling in love. Broken hearts. Standing up for what you believe. Arachnids and an entire week dedicated to sharks. Earthquakes and economic meltdowns. Nine-dollar beer nights at your favorite pub, and fashion statements that should have died years ago returning from the grave. It’s not easy to be alive through all this, let alone survive it as a gentleman. We must remain intact and stand out amongst the rest, no matter how difficult it gets out there. And the best part is, you can still look cool while doing it.

Now that you have the essential tools for surviving the 21st Century as a gentleman, it’s time to be cool.

Call her the next day. The “wait three days rule” will always be cool to break.

Rules that are not cool to break: Never date your friend’s ex, never date your ex’s friend, never date your ex’s roommate, no cutting in line, no cutting a “line” in the bathroom, keep a napkin on your lap while dining, and it’s still not okay to go in a woman’s purse.

LOL is not only NOT okay, it’s never cool.

Your socks should be as colorful as your wardrobe. Don’t be afraid of plaid, argyle, or stripes beneath your jeans or black slacks.

A gentleman never publicly worries, complains, or comments about money. A gentleman who is cool never hesitates to pick up a tab or lend to a friend in need. Remember, if you’re making more than 2 dollars a day, you’re among the richest 3% of people on the planet. Would you like some perspective with your coffee this morning?

At a business lunch, give your credit card to the waiter before you’re seated. This ends the debate when the meal is finished before it even starts.

Make eye contact with a woman. Then buy her a drink before going on your way, and leave it at that. The transaction is over, and she’ll be left thinking about you the rest of the night.

Buy a cup of coffee for the person in line behind you.

by Max Andrew Dubinsky, Make It Mad |  Read more:

Gravity at the End of the World


We spend our nights cruising up and down Highway DD, a road that is more dirt and gravel than pavement—ten or twenty cars, full of people drinking and blasting trash rock and yelling dirty things to one another. We call the road Knockers. Most of the time, I’m with my boyfriend, Craig, his older brother Cliff and Cliff’s girlfriend Tammy. We’re all in our late twenties. Anywhere else, we would have stopped cruising Knockers a long time ago, but at the end of the world there is little else to do but go around and around and around.

Cliff drives an old Chevy pickup that is slowly disintegrating. Craig and I lie in the truck bed so worn out I can taste flakes of rust on his breath when we kiss. On clear nights, Craig and I shout faster faster faster into the wind. The faster Cliff drives, the more the sky above us stills.

Tammy has a reputation because she wears short skirts and high boots and lets her bra straps show. She’s a nice girl though. She loves real hard. Tammy sits so close to Cliff while he’s driving she’s practically wedged between his legs and the steering wheel. She knows he’s the type of man you have to hold real tight. Cliff and Craig are loggers for a small operation that clears land for people who need the money or want to build a house or whatever it is that people who own land do. Cliff says someday he’s going to start his own company and his little brother is going to work with him. They’re going to be rich like the men who live in the grand houses overlooking our Upper Michigan town. Cliff is long on ideas but short on everything else.

Growing up, we always looked up and wondered what it would be like to live in the grand homes looking down. At Christmas, our parents drove us along the overlook. Our mothers cooed at the beautiful decorations even though on every other day they cleaned those houses and took care of the children living in them. Our fathers, who worked for the grand homeowners, grunted. They said it wasn’t anything special. They swallowed the bitterness of their envy and chased it with a nip of whiskey. The grand houses had huge windows. We could see the perfectly decorated Christmas trees and the beautiful dining room tables and the illuminated chandeliers in the foyer. We never saw any people. In houses that big, there are lots of places to hide.

by Roxane Gay, Knee-Jerk |  Read more:
Photo: Michael Wriston  via
h/t: GS

Antonio Carlos Jobim



Francis Picabia, Udnie, Jeune fille américaine, Danse, 1913, oil on canvas, 290 x 300 cm, Centre Pompidou, Paris.
via:

Scent of a Woman
via:

Does It Matter Whether God Exists?

Discussions of religion are typically about God. Atheists reject religion because they don’t believe in God; Jews, Christians and Muslims take belief in God as fundamental to their religious commitment. The philosopher John Gray, however, has recently been arguing that belief in God should have little or nothing to do with religion. He points out that in many cases — for instance, “polytheism, Hinduism and Buddhism, Daoism and Shinto, many strands of Judaism and some Christian and Muslim traditions” — belief is of little or no importance. Rather, “practice — ritual, meditation, a way of life — is what counts.” He goes on to say that “it’s only religious fundamentalists and ignorant rationalists who think the myths we live by are literal truths” and that “what we believe doesn’t in the end matter very much. What matters is how we live.”

Even if God is powerful enough to save the souls of the devout, and loving enough to want to, he still might not.

The obvious response to Gray is that it all depends on what you hope to find in a religion. If your hope is simply for guidance and assistance in leading a fulfilling life here on earth, a “way of living” without firm beliefs in any supernatural being may well be all you need. But many religions, including mainline versions of Christianity and Islam, promise much more. They promise ultimate salvation. If we are faithful to their teachings, they say, we will be safe from final annihilation when we die and will be happy eternally in our life after death.

If our hope is for salvation in this sense—and for many that is the main point of religion—then this hope depends on certain religious beliefs’ being true. In particular, for the main theistic religions, it depends on there being a God who is good enough to desire our salvation and powerful enough to achieve it.

But here we come to a point that is generally overlooked in debates about theism, which center on whether there is reason to believe in God, understood as all-good and all-powerful. Suppose that the existence of such a God could be decisively established. Suppose, for example, we were to be entirely convinced that a version of the ontological argument, which claims to show that the very idea of an all-perfect being requires that such a being exist, is sound. We would then be entirely certain that there is a being of supreme power and goodness. But what would this imply about our chances for eternal salvation?

On reflection, very little. Granted, we would know that our salvation was possible: an all-powerful being could bring it about. But would we have any reason to think that God would in fact do this? Well, how could an all-good being not desire our salvation? The problem is that an all-good being needs to take account of the entire universe, not just us.

Here, discussions of the problem of evil become crucial. An all-good being, even with maximal power, may have to allow considerable local evils for the sake of the overall good of the universe; some evils may be necessary for the sake of avoiding even worse evils. We have no way of knowing whether we humans might be the victims of this necessity.

by Gary Gutting, NY Times |  Read more:
Photo: Dr. Paul Wolff and Alfred Tritschler, Autumn Mood in Frankfurt, 1930 via:

Greek Strategy for Saving the World


The Case Against Google


For the last two months, you've seen some version of the same story all over the Internet: Delete your search history before Google's new privacy settings take effect. A straightforward piece outlining a rudimentary technique, but also evidence that the search titan has a serious trust problem on its hands.

Our story on nuking your history was read nearly 200,000 times on this site alone—and it was a reprint of a piece originally put out by the EFF. Many other outlets republished the same piece. The Reddit page linking to the original had more than 1,000 comments. And the topic itself was debated on decidedly non-techie forums like NPR.

It's not surprising that the tracking debate had people up in arms. A Pew Internet study, conducted just before Google combined its privacy policies (and after it rolled out personalized search results in Search Plus Your World), found that three quarters of people don't want their search results tracked, and two thirds don't even want them personalized based on prior history.

The bottom line: People don't trust Google with their data. And that's new.

Google is a fundamentally different company than it has been in the past. Its culture and direction have changed radically in the past 18 months. It is trying to maneuver into position to operate in a post-PC, post-Web world, reacting to what it perceives as threats, and moving to where it thinks the puck will be.

At some point in the recent past, the Mountain View brass realized that owning the Web is not enough to survive. It makes sense—people are increasingly using non-Web-based avenues to access the Internet, and Google would be remiss not to make a play for that business. The problem is that in branching out, Google has also abandoned its core principles and values.

Many of us have entered into a contract with the ur-search company because its claims to be a good actor inspired our trust. Google has always claimed to put the interests of the user first. It's worth questioning whether or not that's still the case. Has Google reached a point where it must be evil?

by Mat Honan, Gizmodo |  Read more:

John Stuart Mill and the Right to Die

A British man, Tony Nicklinson, wants to die. In 2005, Mr Nicklinson suffered a stroke that left him with “locked-in syndrome”. This syndrome is, according to the National Institute of Neurological Disorders and Stroke, “a rare neurological disorder characterized by complete paralysis of voluntary muscles in all parts of the body except for those that control eye movement.” Mr Nicklinson is able to communicate only through a Perspex board, by which his blinking is interpreted. He now wishes to end his life “lawfully”, because he considers it “dull, miserable, demeaning, undignified and intolerable”. He is, therefore, seeking legal protection for any doctor who aids him in suicide. At the moment, the case is proceeding after a ruling from a High Court judge.

Killing, whether of oneself or of others, is obviously a difficult topic. We cannot so easily dismiss it as merely a private affair of the individual, nor place it within the domain of government to restrict people from doing so. What we can be certain of is that each case demands its own engagement, looking at the facts, the evidence and the arguments. The imposition of outrage, premised on vague notions like dignity or sanctity, is at best unhelpful and at worst harmful.

What Mr Nicklinson’s case demonstrates, though, is the inconsistency of state interventions in individuals’ activities. Furthermore, Mr Nicklinson’s reasons – the banality of his life and his incapacity as a functioning adult – confirm findings in euthanasia research that these are the most common reasons for wanting euthanasia (or, in Mr Nicklinson’s case, doctor-assisted suicide, though I’ll use “euthanasia” in this post); it is not, as many people think, merely physical pain or the inevitability of death.

Destroy your lungs but don’t kill yourself

We’ve noted previously that John Stuart Mill’s Harm Principle seems to be tacitly in place in Western societies, since we allow others to harm themselves through personally chosen activities, from smoking to rock-climbing. As Mill noted: “the only purpose for which power can be rightfully exercised over any member of a civilised community, against his will, is to prevent harm to others. His own good, either physical or moral, is not a sufficient warrant.” When it comes to Mr Nicklinson’s case, we have a strange inconsistency: the principle that allows smokers to destroy their lungs disappears at the idea of ending the very life we’ve otherwise allowed to be harmed. We allow a person to slowly or quickly destroy his life, but we don’t allow him to end it – even when the choice is determined by that same person.

Who but Mr Nicklinson should decide how he should live or, indeed, whether he should live at all, when he is capable of communicating and contemplating this choice? It is true that we ought to do all we can to provide him with reasons to live, since this amounts to giving him more information with which to make a more informed decision: the more information one has, the better the decision will be. This is not coercion but making available more evidence so that Mr Nicklinson is able to exercise his autonomy.

As we noted, we have no good reason to stop him from performing a self-harming act, unless it unnecessarily and excessively harms the lives of others.

by Tauriq Moosa, Big Think |  Read more:

Tempest by Edward Gordon
via: