Thursday, April 30, 2015

The Future of College?

On a Friday morning in April, I strapped on a headset, leaned into a microphone, and experienced what had been described to me as a type of time travel to the future of higher education. I was on the ninth floor of a building in downtown San Francisco, in a neighborhood whose streets are heavily populated with winos and vagrants, and whose buildings host hip new businesses, many of them tech start-ups. In a small room, I was flanked by a publicist and a tech manager from an educational venture called the Minerva Project, whose founder and CEO, the 39-year-old entrepreneur Ben Nelson, aims to replace (or, when he is feeling less aggressive, “reform”) the modern liberal-arts college.

Minerva is an accredited university with administrative offices and a dorm in San Francisco, and it plans to open locations in at least six other major world cities. But the key to Minerva, what sets it apart most jarringly from traditional universities, is a proprietary online platform developed to apply pedagogical practices that have been studied and vetted by one of the world’s foremost psychologists, a former Harvard dean named Stephen M. Kosslyn, who joined Minerva in 2012.

Nelson and Kosslyn had invited me to sit in on a test run of the platform, and at first it reminded me of the opening credits of The Brady Bunch: a grid of images of the professor and eight “students” (the others were all Minerva employees) appeared on the screen before me, and we introduced ourselves. For a college seminar, it felt impersonal, and though we were all sitting on the same floor of Minerva’s offices, my fellow students seemed oddly distant, as if piped in from the International Space Station. I half expected a packet of astronaut ice cream to float by someone’s face.

Within a few minutes, though, the experience got more intense. The subject of the class—one in a series during which the instructor, a French physicist named Eric Bonabeau, was trying out his course material—was inductive reasoning. Bonabeau began by polling us on our understanding of the reading, a Nature article about the sudden depletion of North Atlantic cod in the early 1990s. He asked us which of four possible interpretations of the article was the most accurate. In an ordinary undergraduate seminar, this might have been an occasion for timid silence, until the class’s biggest loudmouth or most caffeinated student ventured a guess. But the Minerva class extended no refuge for the timid, nor privilege for the garrulous. Within seconds, every student had to provide an answer, and Bonabeau displayed our choices so that we could be called upon to defend them.

Bonabeau led the class like a benevolent dictator, subjecting us to pop quizzes, cold calls, and pedagogical tactics that during an in-the-flesh seminar would have taken precious minutes of class time to arrange. He split us into groups to defend opposite propositions—that the cod had disappeared because of overfishing, or that other factors were to blame. No one needed to shuffle seats; Bonabeau just pushed a button, and the students in the other group vanished from my screen, leaving my three fellow debaters and me to plan, using a shared bulletin board on which we could record our ideas. Bonabeau bounced between the two groups to offer advice as we worked. After a representative from each group gave a brief presentation, Bonabeau ended by showing a short video about the evils of overfishing. (“Propaganda,” he snorted, adding that we’d talk about logical fallacies in the next session.) The computer screen blinked off after 45 minutes of class.

The system had bugs—it crashed once, and some of the video lagged—but overall it worked well, and felt decidedly unlike a normal classroom. For one thing, it was exhausting: a continuous period of forced engagement, with no relief in the form of time when my attention could flag or I could doodle in a notebook undetected. Instead, my focus was directed relentlessly by the platform, and because it looked like my professor and fellow edu-nauts were staring at me, I was reluctant to ever let my gaze stray from the screen. Even in moments when I wanted to think about aspects of the material that weren’t currently under discussion—to me these seemed like moments of creative space, but perhaps they were just daydreams—I felt my attention snapped back to the narrow issue at hand, because I had to answer a quiz question or articulate a position. I was forced, in effect, to learn. If this was the education of the future, it seemed vaguely fascistic. Good, but fascistic. (...)

Nelson’s long-term goal for Minerva is to radically remake one of the most sclerotic sectors of the U.S. economy, one so shielded from the need for improvement that its biggest innovation in the past 30 years has been to double its costs and hire more administrators at higher salaries.

The paradox of undergraduate education in the United States is that it is the envy of the world, but also tremendously beleaguered. In that way it resembles the U.S. health-care sector. Both carry price tags that shock the conscience of citizens of other developed countries. They’re both tied up inextricably with government, through student loans and federal research funding or through Medicare. But if you can afford the Mayo Clinic, the United States is the best place in the world to get sick. And if you get a scholarship to Stanford, you should take it, and turn down offers from even the best universities in Europe, Australia, or Japan. (Most likely, though, you won’t get that scholarship. The average U.S. college graduate in 2014 carried $33,000 of debt.)

Some claim education is an art and a science. Nelson has disputed this: “It’s a science and a science.”

Financial dysfunction is only the most obvious way in which higher education is troubled. In the past half millennium, the technology of learning has hardly budged. The easiest way to picture what a university looked like 500 years ago is to go to any large university today, walk into a lecture hall, and imagine the professor speaking Latin and wearing a monk’s cowl. The most common class format is still a professor standing in front of a group of students and talking. And even though we’ve subjected students to lectures for hundreds of years, we have no evidence that they are a good way to teach. (One educational psychologist, Ludy Benjamin, likens lectures to Velveeta cheese—something lots of people consume but no one considers either delicious or nourishing.) (...)

The Minerva boast is that it will strip the university experience down to the aspects that are shown to contribute directly to student learning. Lectures, gone. Tenure, gone. Gothic architecture, football, ivy crawling up the walls—gone, gone, gone. What’s left will be leaner and cheaper. (Minerva has already attracted $25 million in capital from investors who think it can undercut the incumbents.) And Minerva officials claim that their methods will be tested against scientifically determined best practices, unlike the methods used at other universities and assumed to be sound just because the schools themselves are old and expensive. Yet because classes have only just begun, we have little clue as to whether the process of stripping down the university removes something essential to what has made America’s best colleges the greatest in the world.

Minerva will, after all, look very little like a university—and not merely because it won’t be accessorized in useless and expensive ways. The teaching methods may well be optimized, but universities, as currently constituted, are only partly about classroom time. Can a school that has no faculty offices, research labs, community spaces for students, or professors paid to do scholarly work still be called a university?

If Minerva fails, it will lay off its staff and sell its office furniture and never be heard from again. If it succeeds, it could inspire a legion of entrepreneurs, and a whole category of legacy institutions might have to liquidate. One imagines tumbleweeds rolling through abandoned quads and wrecking balls smashing through the windows of classrooms left empty by students who have plugged into new online platforms.

by Graeme Wood, Atlantic |  Read more:
Image: Adam Vorhees

Who Gets to Wear Shredded Jeans?

Recently I scanned the statement of authenticity on a brand-new pair of good old bluejeans. Printed on the inside of the left pocket, beneath an equine insignia, an 1873 patent date and a boast of its status as “an American tradition, symbolizing the vitality of the West,” was Levi Strauss & Company’s ancient invitation to inspect the dry goods: “We shall thank you to carefully examine the sewing, finish and fit.” The fit was slim, the sewing sound, the finish glamorously traumatized, as if intending homage to clothes Steve McQueen might have worn home from a bike crash.

A ragged extravagance of fraying squiggled from each knee, where an irregular network of holes was patched from behind by a white-cotton rectangle stretchier than sterile gauze. Knotted to a belt loop was a paper tag headed “Destruction,” explaining that these Levi’s, shredded to resemble “the piece you just can’t part with,” merited gentle treatment: “Be sure to take extra care when wearing and washing.” The process of proving the denim tough had endowed it with the value of lace.

These jeans sent a dual message — of armor, of swaddling — in the accepted doublespeak of distressed denim. Pre-washed bluejeans are now sold already on their last legs: ripped, blasted, trashed, wrecked, abused, destroyed, sabotaged, devastated and, in what may be a borrowing of aerospace jargon for drones obliterated by remote control, destructed. Below this disaster-headline language, the fine print babbles smoothly about the soft comfort of deep familiarity, as the textile historian Beverly Gordon observed in a paper titled “American Denim.” These are clothes that suit the Friday-evening needs of Forever 21-year-olds buttressing their unformed selves with ceremonial battle scars, and they also meet the Saturday-morning wants of grown-ups who, arrayed as if to hint at having been out all night, enliven the running of errands by wearing trousers that look and feel like an opiated hangover.

The mass clique of distressed denim exists in polar opposition to another school of bluejean enthusiasm: the dye-stained cult of raw denim. The denim purists — looking professional in unsullied indigo fresh off the shuttle loom, in their natural habitat of bare brick walls and old gnarled wood and other textures invested with magical thinking — are likely to meet the approval of strict good taste. As opposed to people who buy their jeans prefaded and abraded, with a thumb-wide key punch in the watch pocket and the sham phantom of a wallet’s edge in back. But sometimes good taste goes on holiday, to a music festival, for example, turned out in acid-streaked, bleach-stained, chaotically nasty cutoffs. This is the order of things. One point of beat-up bluejeans is to bother good taste, which is a muscular aesthetic stance, a canny market footing and an ambiguous moral position.

Some distressed denim is beauty-marked with subtle scuffs amounting to off-duty signs. Some is lavishly slashed into canvases for abstract craft work, with a fleeciness of bare threads asymmetrically outlined by stubby blue tufts, a kind of plumage for people treating a humble fiber as a vehicle for expressing splendor. There are bluejeans serially slit up the front, space striped as if by the shadows of window blinds in a film noir, and sometimes they are sold by shop assistants wearing jeans sliced to bare hamstrings, as if everyone’s bored of the old ways of constraining the sight and shape of the body. There is a place in Paris that gathers old bluejeans as raw material for reassembled jeans that will cost $1,450 a pair. Which would be a bargain if you believed the piece worthy of framing as a collage deconstructing aperture and entropy and the tensions of a labor-class fabric reworked as universal playwear. (...)

“Everything that was directly lived has receded into a representation,” the Situationist theorist Guy Debord wrote in “Society of the Spectacle.” He was describing a phenomenon now exemplified by new denim marketed as having been “aged to mimic look and feel of 11-year-old denim.” The product lets its buyers slip into the approximation of a lived-in skin and, by proximity, enhance their own personal histories.

The insolence of indecent denim has evolved into a prefab mannerism, a marker of “punk chic” or “grunge cool.” The holes can still reify a generation gap, I think, having heard a 35-year-old banker say that she cannot put on such jeans without imagining her parents’ disapproval: “You should have worn those dungarees all day long until you wore them out yourself.” But that purist’s objection misses the point. The patent insincerity of distressed denim is integral to its appeal. What to make, glancing around the waiting room, of the precision-shredded knees of a pair of plainly expensive maternity jeans promoted for their “rock ’n’ roll appeal”? No one supposes that a woman wearing an elasticized waistband to accommodate the fullness of her third trimester wiped out on her skateboard. The lie is not a lie but a statement of participation in a widespread fantasy. Contentedly pretending to be a dangerous bohemian, she is simply exercising the right to be her own Joey Ramone. We put on jeans with ruined threading in a self-adoring performance of annihilation.

by Troy Patterson, NY Times |  Read more:
Image: Mauricio Alejo

Wednesday, April 29, 2015

The Embattled First Amendment

Can free speech wreck the American experiment? The question at first seems crazy: Free speech is almost universally regarded as the heartbeat of democracy. (...)

However sacred the idea of free speech remains for us today, we should recognize that its most fervent champions are not standing up for mistrusted outliers, such as Holmes had in mind, or for the dispossessed and powerless. Today’s advocates do the bidding of insiders—the super-rich and the ultra-powerful, the airline, drug, petroleum, and tobacco industries, all the winners in America’s winner-take-all society. In a country where the gap between the haves and have-nots has grown so extreme that both political parties now pay lip service to populism, the haves have seized free speech as their cause—and their shield.

The landmark Citizens United v. Federal Election Commission case in 2010, in which the Supreme Court ruled that the government may not ban so-called independent spending by corporations in elections, is often described as being about campaign finance law, since it dealt with a statute intended to boost confidence in the political system by reducing the role of big money in elections. But to the justices in the majority (Roberts, Scalia, Anthony M. Kennedy, Clarence Thomas, and Samuel A. Alito Jr.), the case was about free speech. The principle, Kennedy wrote, is that “the Government lacks the power to restrict political speech based on the speaker’s corporate identity.” To mark the fifth anniversary of the Citizens United ruling, public-interest organizations issued reports showing that, as a result of it, corporations, unions, and individuals have spent more than a billion dollars on political campaigns, with the Center for Responsive Politics estimating that contributions from business dwarf those from labor by about 15 to 1.

Citizens United was about political speech, but it was built on principles established for commercial speech—the kind of solicitation that a business makes to potential customers. The Supreme Court initially treated commercial speech as having less importance than political speech. But the protection of commercial speech is now a formidable tool for American enterprise—and Citizens United shows how far the Court has taken the concept.

In 1976, in a case about whether pharmacists had the right to advertise prices for prescription drugs, the Supreme Court ruled for the first time that the First Amendment covers commercial speech. Unless an ad for a drug is false or misleading or promotes something illegal, the Court held, government must let a business make its pitch and trust that consumers will make good use of the information.

The new ruling gave commercial speech enough importance to come under the First Amendment’s coverage, but Justice Harry Blackmun noted “common sense differences” between commercial and political speech and said that it was “less necessary to tolerate inaccurate statements” in commercial speech because of its lower political and social value. The ruling seemed to strike a balance between the interest of a business in touting its lower prices and the interest of the government in ensuring that commercial information flows, as Blackmun put it, “cleanly as well as freely.”

For the past decade, however, corporations have used the idea of commercial speech as a basis for sweeping claims about what the First Amendment entitles them to. With it, they have persuaded courts to strike down a broad range of well-founded regulations, from health warnings on cigarette packs to bans against pharmacies selling data about prescriptions for marketing. Spirit Airlines, joined by other carriers, argued that the government violated its First Amendment rights by requiring it to prominently list the total price of a ticket, to avoid confusing customers with separate lists of the base fare, taxes, and other charges. Reflecting the new libertarian outlook of businesses about free speech, Spirit insisted it had a right to tell its customers about “the huge tax burden that the federal government imposes on air travel.” The federal appeals court in Washington, D.C., ruled against the airlines, by two to one, but the dissent embraced their libertarian argument: “if discourse regarding these charges results in the government lessening the financial burden it imposes, airfares would become more affordable and people would fly more often.”

The decision in Citizens United was even more aggressive. It took a central concept of the Court’s rulings in commercial speech cases and twisted it drastically, viewing the matter not from the viewpoint of the consumer—its original intention—but from the viewpoint of the corporation.

In his 2014 book Citizens Divided, Yale Law School’s dean, Robert C. Post, who specializes in the First Amendment, explained what the Court got wrong: “the speech of an ordinary commercial corporation possesses constitutional value only because it provides information to auditors”—that is, it provides consumers with truthful information by removing government restrictions that kept them from getting it. Or so the Court said four decades ago, when it extended First Amendment coverage to commercial speech. (...)

Nothing in the text or history of the amendment says exactly what the freedom of speech means—or abridging, for that matter. The Supreme Court has explicitly identified five categories of speech that the First Amendment doesn’t cover: lewd, obscene, profane, and libelous expressions, plus face-to-face insults that trigger a violent response, known as “fighting words.” Ronald K. L. Collins of the University of Washington has counted “at least 43 other additional types of unprotected expression,” ranging from blackmail and bribery to perjury and harassment in the workplace; from plagiarism and child pornography to some kinds of panhandling; from telemarketing to lying to government officials. And free speech in public schools, courtrooms, prisons, the military, and other public institutions may be limited—from the government’s viewpoint, to help them function effectively.

Rather than developing a unified theory about free speech, the Supreme Court has taken an issue-by-issue approach, explains Geoffrey Stone. The Court has been mindful of three recurring problems that the law must guard against: the chilling effect, the pretext effect, and the crisis effect.

by Lincoln Caplan, American Scholar |  Read more:
Image: David Herbic

How Have We Got Education So Disastrously Wrong?

As we read of the open letter signed by more than 1,200 teachers complaining that stress is destroying the profession, it might be worth pausing to ask what has happened to make teaching, once the most rewarding and satisfying of jobs, so deeply frustrating and unfulfilling?

How have we allowed so many initiatives done in the name of ‘improving standards’ to wreak havoc on our schools? How, in the interests of trying to improve the quality of the education, have we got it so disastrously wrong?

When it comes to compiling a charge list, where to begin? Perhaps with the extension of schools into an expanded role as providers of wrap-around care and the extra pressures that has placed upon teachers?

Perhaps with the amount of time required to be given over at inset days and staff meetings to topics as diverse as child protection, safeguarding, e-safety, inspections, changes in legislation, health and safety updates, risk assessments and compliance, all valid in themselves, but leaving no room to discuss the education of children?

Perhaps in encouraging parents to act as champions for their children without any account of their own responsibilities in raising and disciplining them? Or in society’s expectations that schools are where all social problems should be dealt with?

Perhaps with the quite unreasonable demands placed on teachers to constantly record evidence, work to targets and be subject to endless monitoring, appraisal and inspections?

Perhaps with the ever changing regulations for inspections and compliance designed to keep us on our toes?

However well-intended, each initiative, each change, has exacted a cost, and the cumulative effect on the profession has made it almost untenable.

by Peter Tait, Telegraph |  Read more:
Image: uncredited

Tokyo

via:

Tuesday, April 28, 2015

The New New Museum

... museums have changed — a lot. Slowly over the past quarter-century, then quickly in the past decade. These changes have been complicated, piecemeal, and sometimes contradictory, with different museums embracing them in different ways. But the transformation is visible everywhere. Put simply, it is this: The museum used to be a storehouse for the art of the past, the display of supposed masterpieces, the insightful exploration of the present in the context of the long or compressed histories that preceded it. Now — especially as embodied by the Tate Modern, Guggenheim Bilbao, and our beloved MoMA — the museum is a revved-up showcase of the new, the now, the next, an always-activated market of events and experiences, many of which lack any reason to exist other than to occupy the museum industry — an industry that critic Matthew Collings has called “bloated and foolish, corporatist, ghastly and death-ridden.”

The list of fun-house attractions is long. At MoMA, we’ve had overhyped, badly done shows of Björk and Tim Burton, the Rain Room selfie trap, and the daylong spectacle of Tilda Swinton sleeping in a glass case. This summer in London you can ride Carsten Höller’s building-high slides at the Hayward Gallery — there, the fun house is literal. Elsewhere, it is a little more “adult”: In 2011, L.A.’s MoCA staged Marina Abramovic’s Survival MoCA Dinner, a piece of megakitsch that included naked women with skeletons atop them on dinner tables where attendees ate. In 2012, the Los Angeles County Museum of Art paid $70,000 for a 21-foot-tall, 340-ton boulder by artist Michael Heizer and installed it over a cement trench in front of the museum, paying $10 million for what is essentially a photo op. Last year, the Museum of Contemporary Art in Chicago mounted a tepid David Bowie show, which nevertheless broke records for attendance and sales of catalogues, “limited-edition prints,” and T-shirts. Among the many unfocused recent spectacles at the Guggenheim were Cai Guo-Qiang’s nine cars suspended in the rotunda with lights shooting out of them. The irony of these massively expensive endeavors is that the works and shows are supposedly “radical” and “interdisciplinary,” but the experiences they generate are closer, really, to a visit to Graceland — “Shut up, take a selfie, keep moving.”

In this way, an old museum model has been replaced by another one. Museums that were roughly bookish, slow, a bit hoity-toity, not risk-averse but careful, oddly other, and devoted to reflection, connoisseurship, cultivation, and preservation (mostly of the past but also of new great works) — these museums have transformed into institutions that feel faster, indifferent to existing collections, and at all times intensely in pursuit of new work, new crowds, and new money. We used to look at these places as something like embodiments and explorations of the canon — or canons, since some (MoMA’s and Guggenheim’s modernism collections) were narrower and more specialized than others (the Met’s, the Louvre’s). But whatever long-view curating and collecting museums do now — and many of them still do it well — the institutions that are sucking up the most energy are the ones that have made themselves into platforms for spectacle, as though the party-driven global-art-fair feeding frenzy had taken up residence in one place, and one building, permanently. Plus, accessibility has become everything. More museums are making collections available online — sad to say, art is sometimes better viewed there than in the flesh, thanks to so much bad museum architecture and so little actual space to display permanent collections. Acoustiguides have become more and more common, and while there’s much good they can do, it often seems their most important function is crowd control — moving visitors through quickly to make room for the next million.

The museums of New York can already feel alien with this new model taking over. And we’re really at the beginning rather than the end of the transformation. All four of Manhattan’s big museums — the Met, MoMA, the Whitney, and the Guggenheim — have undertaken or are involved in massive expansion, renovation, and rebuilding. These are more than just infrastructure updates: We are witnessing a four-way competition for supremacy in the new art-museum universe...

What makes this all so startling is that these museums have never been all-out competitors before. Until now, they had distinct missions, collections, and curatorial identities: The Met specialized in 5,000 years of art; the Whitney was about American art; MoMA was modernism’s Francophile Garden of Eden; and the Guggenheim — well, the Guggenheim has always been a bit confused, mostly distinguished by its incredible building. But now, all of a sudden and for the first time, it is not unusual for curators to speak of being unable to do shows because “that artist is already taken.”

Each of these museums still preserves, collects, and exhibits the art of the past. But with the action and big money centered on contemporary art, galleries, auctions, art fairs, and biennials, each is more committed than ever before to the art of the now and the cult of the new. I love the new. I am a member of that cult, in part because the art world has become my surrogate family of gypsies and dreamers (yes, I’m a mush). But that cult, and the ascendance of spectacle, may be the end of museums as we know them and has been the subject of countless conversations I’ve had over the past year with curators, artists, gallerists, and collectors, all of whom acknowledge a major shift under way. “The problem is museums trying to be as up-to-date with contemporary art as galleries are,” says painter and critic Peter Plagens. “The cultural distance between what a museum preserves (Cézanne, Joan Mitchell, etc.) and how it spotlights the present (Björk, interactive art, etc.) is greater than ever.” As former Venice- and Whitney-biennial curator Francesco Bonami puts it, “They’re like those in the fashion world who only follow the last collection and are content to have their shows look like those of other museums.” Plagens says that a few years ago, ex–L.A. MoCA director and impresario Jeffrey Deitch told him that “museums needed young audiences and that what young audiences wanted to see is events, whether the events are fashion shows, rock concerts, or exhibition openings.” And now? “I mean, fucking James Franco is everywhere,” Plagens says. “Miley Cyrus is on art-world tongues, curators are courtiers, museums are the runway.” Of course, he acknowledges, “museums will survive. But in what form?”

by Jerry Saltz, NY Magazine |  Read more:
Image: Nic Lehoux

What to Say When Police Tell You to Stop Filming Them

First of all, they shouldn’t ask.

“As a basic principle, we can’t tell you to stop recording,” says Delroy Burton, chairman of D.C.’s metropolitan police union and a 21-year veteran on the force. “If you’re standing across the street videotaping, and I’m in a public place, carrying out my public functions, [then] I’m subject to recording, and there’s nothing legally the police officer can do to stop you from recording.”

“What you don’t have a right to do is interfere,” he says. “Record from a distance, stay out of the scene, and the officer doesn’t have the right to come over and take your camera, confiscate it.”

Officers do have a right to tell you to stop interfering with their work, Burton told me, but they still aren’t allowed to destroy film.

Yet still some officers do. Last week, an amateur video appeared to show a U.S. Marshal confiscating and destroying a woman’s camera as she filmed him.

“Photography is a form of power, and people are loath to give up power, including police officers. It’s a power struggle where the citizen is protected by the law but, because it is a power struggle, sometimes that’s not enough,” says Jay Stanley, a senior policy analyst at the American Civil Liberties Union (ACLU).

Stanley wrote the ACLU’s “Know Your Rights” guide for photographers, which lays out in plain language the legal protections that are assured people filming in public. Among these: Photographers can take pictures of anything in plain view from public space—including public officials—but private land owners may set rules for photography on their property. Cops also can’t “confiscate or demand to view” audio or video without a warrant, and they can’t ever delete images.

The ACLU’s guide does caution that “police officers may legitimately order citizens to cease activities that are truly interfering with legitimate law enforcement operations.”

What if that happens, and you disagree with the officer?

by Robinson Meyer, The Atlantic |  Read more:
Image: Carlo Allegri / Reuters

Washington State Turns to Neurotoxins to Save Its Oysters

What could go wrong?

Six decades ago, when Dick Sheldon first got into the oyster business, the tide flats of Washington state’s Willapa Bay were almost free of blight. There were no crabs (well, almost none) and Sheldon used to hike onto the mud at low tide, with a bucket for oysters. “We’d walk a mile or more, even when it was freezing outside, with the wind blowing at 50 miles an hour,” he remembers wistfully, “and then we’d stoop to the mud and start gathering them and throwing them into the buckets.”

In Sheldon’s memory, the oysters were bountiful and the mud floor was firm and pleasant to walk on. Then the shrimp arrived, and everything changed. Burrowing shrimp dig holes in the mud and live there. They pock the tide flats with a zillion holes, and today Sheldon, the 80-year-old eminence grise at his family’s small Northern Oyster Co., considers Willapa Bay a vanished world floored by “quicksand. If you’re not careful out there,” he says, “you’re up to your waist in that shit.”

The shrimp began proliferating—mysteriously, like a plague of locusts—in the early 1960s. They dominated the bay floor where oysters lived, but back then there was a simple solution: The oystermen just bombed the shrimp with carbaryl, a DDT-era neurotoxin. The shrimp, which few humans would want to eat, died. Oyster harvests were good, and along the bay, in towns like Nahcotta and South Bend and Oysterville, eight now-defunct oyster canneries flourished.

By the early 2000s, though, carbaryl had become a legal liability. Numerous researchers have linked the chemical to cancer, and in 2002 environmentalists strong-armed the Willapa/Grays Harbor Oyster Growers Association (WGHOGA) into beginning a 10-year phaseout.

A way of life and a small corner of the world were suddenly in jeopardy. Willapa Bay produces more cultured oysters than any other bay in the U.S., yielding about $35 million a year in product. The oyster industry forms the economic backbone of Pacific County, pop. 20,000, where road signs celebrate “The Oyster Capital of the World.” But in 2002, oystermen were scrambling to figure out a new way to kill off shrimp—and they found just the scientist to help them.

Kim Patten, an agricultural extension specialist for Washington State University (WSU), tried blasting the shrimp to bits with dynamite. He did not succeed. He turned his sights next on hitting the shrimp with “a very thin layer of sprayable concrete,” he says, “just an eighth of an inch, so they’d suffocate.” No luck. “By the time the concrete firmed up,” Patten says, recounting a test run, “the shrimp had already poked holes through it.”

Patten, who was trained as a horticulturalist, tried electrocution, super-spicy habanero peppers, and mustard gas. The shrimp held their ground. At last he turned to a newer neurotoxin, imidacloprid, which temporarily paralyzes the shrimp, so that they stop digging and within two days suffocate in the mud.

Imidacloprid is the world’s most popular pesticide, and highly controversial. It belongs to a family of neurotoxins, neonicotinoids, that is increasingly being blamed for colony collapse disorder—the sharp die-off of honeybees that has plagued North America since 2006. The U.S. Fish and Wildlife Service, the National Marine Fisheries Service, the National Audubon Society, and the Xerces Society, which advocates for invertebrates, have all opposed the chemical’s use on Willapa Bay. But their protests are now moot. On April 16, the Washington Department of Ecology approved the spraying of imidacloprid on 1,500 acres of Willapa Bay and 500 acres of nearby Grays Harbor. In about a month, crop-dusting helicopters will begin dousing both estuaries with the chemical.

Unlike carbaryl, imidacloprid dissolves in water, meaning that fish will swim through trace quantities of the chemical and oysters will grow in an imidacloprid-laced bay. It will be a first: Imidacloprid has never been applied on water before in the U.S.

by Bill Donahue, Bloomberg Business | Read more:
Image: Cameron Karsten Photography

Monday, April 27, 2015


Nigel Van Wieck
via:

Rooms We Have Been by Jo Seaquist
via:

Murder Your Darling Hypotheses But Don't Bury Them


"Whenever you feel an impulse to perpetrate a piece of exceptionally fine writing, obey it—whole-heartedly—and delete it before sending your manuscript to press. Murder your darlings."

                                               Sir Arthur Quiller-Couch (1863–1944). On the Art of Writing. 1916

Murder your darlings. The British writer Sir Arthur Quiller-Couch shared this piece of writerly wisdom when he gave his inaugural lecture series at Cambridge, asking writers to consider deleting words, phrases or even paragraphs that are especially dear to them. The minute writers fall in love with what they write, they are bound to lose their objectivity and may not be able to judge how their choice of words will be perceived by the reader. But writers aren't the only ones who can fall prey to the Pygmalion syndrome. Scientists often find themselves in a similar situation when they develop "pet" or "darling" hypotheses.

How do scientists decide when it is time to murder their darling hypotheses? The simple answer is that scientists ought to give up scientific hypotheses once the experimental data no longer supports them, no matter how "darling" they are. However, the problem with scientific hypotheses is that they aren't just generated based on subjective whims. A scientific hypothesis is usually put forward after analyzing substantial amounts of experimental data. The better a hypothesis is at explaining the existing data, the more "darling" it becomes. Therefore, scientists are reluctant to discard a hypothesis because of just one piece of experimental data that contradicts it.

Beyond experimental data, a number of other factors can also play a major role in determining whether scientists will discard or uphold their darling scientific hypotheses. Some scientific careers are built on specific scientific hypotheses which set certain scientists apart from rival groups. Research grants, which are essential to the survival of a scientific laboratory by providing salary funds for the senior researchers as well as the junior trainees and research staff, are written in a hypothesis-focused manner, outlining experiments that will lead to the acceptance or rejection of selected scientific hypotheses. Well-written research grants always consider the possibility that the core hypothesis may be rejected based on future experimental data. But if the hypothesis has to be rejected, then the scientist has to explain the discrepancies between the preferred hypothesis that is now falling into disrepute and all the preliminary data that had led her to formulate the initial hypothesis. Such discrepancies could endanger the renewal of the grant funding and the future of the laboratory. Last but not least, it is very difficult to publish a scholarly paper describing a rejected scientific hypothesis without providing an in-depth mechanistic explanation for why the hypothesis was wrong and proposing alternate hypotheses.

by Jalees Rehman, 3 Quarks Daily | Read more:
Image: Wikipedia

No-Knock Warrants

[ed. See also: I thought it was a home invasion]

No-knock warrants have become the strategy of first choice for many police departments. Most of these target those suspected of drug possession or sales, rather than the truly dangerous situations they should be reserved for. The rise in no-knock warrants has resulted in an increased number of deadly altercations. Cops have been shot in self-defense by residents who thought their homes were being invaded by criminals. Innocent parties have been wounded or killed because the element of surprise police feel is so essential in preventing the destruction of evidence puts cops -- often duded up in military gear -- into a mindset that demands violent reaction to any perceived threat. In these situations, the noise and confusion turn everything into a possible threat, even the motions of frightened people who don't have time to grasp the reality -- and severity -- of the situation.

No-knock warrants are basically SWATting, with cops -- rather than 13-year-old gamers -- instigating the response. Judges should be holding any no-knock warrant request to a higher standard and demand more evidentiary justification for the extreme measure -- especially considering the heightened probability of a violent outcome. But they don't.

A Massachusetts court decision posted by the extremely essential FourthAmendment.com shows just how little it takes to obtain a no-knock warrant. The probable cause provided to obtain the no-knock warrant was ridiculous, but it wasn't challenged by the magistrate who signed off on the request. What's detailed here should raise concerns in every citizen.
The affidavit supporting the warrant contained the following representations: 1) the extensive training and experience in drug investigations, controlled purchases and arrests of the officer who made the affidavit, 2) the confidential informant's report that the apartment for which a warrant was sought was "small, confined and private," 3) the confidential informant's report that the defendant "keeps his door locked and admits only people whom he knows," 4) the fact that the defendant sold drugs to the informant only after arrangements were made by telephone, and 5) the officer's assessment that, given the retail nature of the defendant's operation and the fragile nature of the illegal drugs involved, "it would not be difficult for [the defendant] to destroy the narcotics if given the forewarning."
In other words, if you have a "private" home with working toilets and locks and you don't routinely allow complete strangers to wander around your home, you, too, could be subjected to a no-knock warrant. This description fits pretty much every person who lives in a residence anywhere. All it takes is an officer's "upon information and belief" statement and a few assertions from a confidential informant, whose otherwise unreliable narration (if, say, he/she was facing charges in court) is routinely treated as infallible by cops and courts alike.

by Tim Cushing, Tech Dirt |  Read more:
Image: Roman Genn via:

The Rehypothecation of Gold, and Why It Matters

Rehypothecation occurs when your broker, to whom you have hypothecated -- or pledged -- securities as collateral for a margin loan, pledges those same securities to a bank or other lender to secure a loan to cover the firm's exposure to potential margin account losses.

When you open a margin account, you typically sign a general account agreement with your broker, in which you authorize your broker to rehypothecate.

Now, let’s put this into easy-to-understand language. Let’s say that you have ten dollars. You take it to the bank to let them “borrow” it, while paying you interest. What you have done, in reality, is given them your money to use as they see fit, while giving you a small percentage of the gains that they will earn. A bank would loan the money to a home buyer or perhaps a small business. At the very least, they can lend all the money in excess of their requirement to hold some cash as reserves--say 10% for ease of math.

They now have nine dollars to invest. Their last resort is to offer it to another bank for that bank to “hold”, because that bank doesn’t have enough money to meet its required reserves. Seems simple enough, right?
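The arithmetic behind this chain of deposits and loans can be sketched in a few lines. This is a toy model only -- the 10% reserve ratio and the dollar amounts are just the article's illustration, not how any real bank books its reserves:

```python
# Toy model of the fractional-reserve example above: a bank keeps a
# fraction of each deposit in reserve and may lend out the rest.

def lendable(deposit, reserve_ratio=0.10):
    """Amount the bank may lend from a deposit after holding reserves."""
    return deposit * (1 - reserve_ratio)

def total_credit(deposit, reserve_ratio=0.10, rounds=50):
    """Sum the chain: each loan is re-deposited somewhere and re-lent,
    so a single deposit ends up supporting far more credit than its face value."""
    total, d = 0.0, float(deposit)
    for _ in range(rounds):
        loan = lendable(d, reserve_ratio)
        total += loan
        d = loan
    return total

print(lendable(10))      # your ten dollars leave nine to invest
print(total_credit(10))  # the whole chain approaches $90 of credit on a $10 deposit
```

With a 10% reserve the chain converges toward deposit × (1 − r)/r, i.e. roughly $90 of total lending built on the original ten dollars.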

Welcome to the games bankers play to make money. Now that this simple format is in place, let’s move to where the serious dangers lie.

Precious Metals:

During World War II, many foreign countries feared that their gold reserves, which at the time backed their paper money, might be taken by an enemy and in 1939, the good old USA was a very neutral country, like Switzerland, only there was a much better deterrent than the Alps-- the Atlantic Ocean. So, many countries--England, France, and others--sent us their gold bars to be stored alongside ours in Fort Knox. Later, after the war was over, we convinced them that it was fine to leave it there and in fact, with the Cold War starting other countries joined in, including Germany.

Now, what good is a pile of gold sitting in a fort going to do? It costs a lot to protect it, and the US was paying a small sum in interest, while getting a smaller sum back in “protection fees”. So, the Federal Reserve had a wonderful idea, at least in their minds.

Since we have this gold, let’s issue paper on that gold as though it was ours -- after all, it is sitting in Fort Knox -- and earn a bit of money on the side. So long as the Cold War lasted, the gold certainly wasn’t going anywhere. Here is where the trouble began. It was pretty small potatoes for a good while, until we went off the gold standard in 1971 during the Nixon Administration. What good is having a precious metal to back fiat currency, when a promise is just as good? Enter the danger zone.

Now, the gold in Fort Knox isn’t doing anything. So, what to do? Well, each bar of gold has a unique mark on it to say who owns it. The Cold War is still raging, so no one is going to ask for it back anytime soon. Let’s melt down some of that gold, just a small percentage of it, and sell it off as bullion. Gold is high and the foreign countries won’t ask for it all, so let’s skim a bit here and there. No one will know, and we can make money.

Then debts started to accrue, so they got brazen and started melting down bars and reselling them as their own gold, because they don’t want to use their own gold when German gold is just the same, except for that little mark. Erase the mark and put your own on it and sell it as yours, using your gold as the “backer” in case Germany asks for some of it back.

Well, it wasn’t long until greed set in. Those gold bars that were sold to, say, China or Japan, were resold to Austria or Iraq. Much like the bundling and reselling of home loans in the 1990s, soon the German melted gold was in seven different countries with seven different marks, but no German mark upon them remained. This still wasn’t the breaking point, though; after all, there is still plenty of gold in Fort Knox to cover what is owed to them.

I don’t know whose idea it was, but it was a bad idea. They decided that they could sell paper promises of gold being held in the vaults. The last number I saw was 140%, which means that if they have 100 pounds of gold, they can sell paper as though they have 140 pounds of gold. Now, they can also sell that gold outright as well. So, it's possible that they could sell 140 pounds of paper gold and sell a portion of the physical gold, too.
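The 140% figure reduces to simple coverage arithmetic. A hypothetical sketch (the quantities below are invented for illustration):

```python
# Hypothetical coverage arithmetic for the paper-gold claim above.

def coverage_ratio(physical, paper_sold, physical_sold=0.0):
    """Outstanding paper claims per unit of gold actually left in the vault."""
    remaining = physical - physical_sold
    return paper_sold / remaining

# 100 pounds held, 140 pounds of paper sold: each pound backs 1.4 pounds of claims.
print(coverage_ratio(100, 140))      # 1.4
# Sell 30 of the physical pounds too, and the same paper rests on only 70 pounds.
print(coverage_ratio(100, 140, 30))  # 2.0
```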

Confused yet? Here is where we stand today. No one knows how much gold is really in Fort Knox. We only know what they say is in Fort Knox. The same is likely true for the Federal Reserve and possibly the major banks; after all, if the Fed starts demanding to know what’s in those banks, they might have to show theirs too. So, let’s say that the economy starts to really go south around the world. As you know from the news, Germany asked to see their gold at Fort Knox and was denied, so they asked for their gold back. Smart move on Germany’s part in my mind. Answer from the Fed: we will get it to you sometime in the near future. This wasn’t challenged by Germany.

Why? Rehypothecation. Germany knows that they have been doing the same thing with gold that we have. It’s been sold to multiple people at the same time, under the theory that not everyone will want it at the same time, so we can just move it around as needed.

by Scott A. Batten, Zero Hedge |  Read more:
Image: via:

Sunday, April 26, 2015


[ed. See also: New AG Lynch is no friend.]

Hit Job: "Feds rule, states drool, and marijuana isn’t medicine."
Image: Kyle Jaeger.

Saturday, April 25, 2015


“We all have the potential to fall in love a thousand times in our lifetime. It's easy. The first girl I ever loved was someone I knew in sixth grade. Her name was Missy; we talked about horses. The last girl I love will be someone I haven't even met yet, probably. They all count. But there are certain people you love who do something else; they define how you classify what love is supposed to feel like. These are the most important people in your life, and you’ll meet maybe four or five of these people over the span of 80 years. But there’s still one more tier to all this; there is always one person you love who becomes that definition. It usually happens retrospectively, but it happens eventually. This is the person who unknowingly sets the template for what you will always love about other people, even if some of these loveable qualities are self-destructive and unreasonable. The person who defines your understanding of love is not inherently different than anyone else, and they’re often just the person you happen to meet the first time you really, really, want to love someone. But that person still wins. They win, and you lose. Because for the rest of your life, they will control how you feel about everyone else.”

The Light of God Behind Him

We like Emmanuel Pacquiao because he is small. We admire him because he will tuck his head and duck inside the dangerous space made by a much larger man, where he will punch upward, like a deranged songbird pecking away at a cat. We like Manny because this situation reminds us of his childhood, wherein a backwoods Filipino boy, so poor he sometimes survived on a single meal a day, stole away for the city, where he punched other children for pennies. We like him because now, when he goes back home, he receives long lines of his hungry countrymen like a generous king. He pays their bills. He builds them hospitals.

Japanese fans can tell you about the time the boxer’s lout of a father ate Manny’s dog. Mexican admirers can tell you about the time when, too slim for a fight, Manny made weight by putting rocks in his pockets. On the streets of Manila, fans recognize not just the boxer but the men necessary to the boxer, such as his fast-talking, Parkinson’s-afflicted trainer, Freddie Roach, who can draw a thousand people to a mall. In a documentary about Pacquiao released earlier this year, we watch the boxer slam his fist into various faces in slow motion, while a deadly serious Liam Neeson explains that he is doing it all for us. Adds the British journalist Gareth Davies: “It’s almost as if you feel the light of God behind him.”

Pacquiao is the most famous resident of an entire Pacific nation, which, in the midst of his fights, experiences a drop in the crime rate and an unofficial truce in the war-torn south. When tropical storm Sendong hit Mindanao in 2011, attention turned to “the single biggest one-man charity institution in the country,” in the words of the Philippine Daily Inquirer: How much would he give? He has been elected twice to his country’s congress, is widely expected to run for president when he retires, and when he competes on May 2, he will be watched by 107 million Filipinos.

Boxing is a sport that tends toward Manichaean clarity (think Evander Holyfield versus the man who would bite off his ear), and the upcoming fight provides a suitably menacing double: Floyd Mayweather, a hermetic megalomaniac nicknamed “Money” and fond of selfies involving stacks of it, a man who spent two months in jail for punching his ex-girlfriend in the head, who demanded that his Filipino opposition “make me a sushi roll.” Money Mayweather is also the undefeated fighter slightly favored by oddsmakers, which means he fails to elicit even the sympathy that redounds to an underdog.

The Man With the Light of God Behind Him is not known for sound financial management — he is being investigated by the Filipino and American tax authorities for many millions in alleged unpaid taxes — but this fight will pay for a lot of bad decisions. The pot, an estimated $200 million, will make next month’s fight at the MGM Grand Garden Arena in Las Vegas the “richest ever single occasion in sport,” as the Guardian put it, with the cheapest seats selling for $1,500.

It’s a price set, in part, by years of anticipation and the dysfunction of the sport itself. For half a decade, Pacquiao and Mayweather, the best boxers in the world, have failed to face one another because they could not come to an agreement about the terms, or perhaps because neither ever really wanted to come to one. Days before the fight was finally announced, sportswriters were still claiming that only credulous fools could believe it would happen. It was late 2009 when HBO Sports president Ross Greenburg pointed out that a Pacquiao-Mayweather matchup would showcase “the two best pound-for-pound fighters in the world, both in their prime, in the same weight class.” It was 2010 when Snoop Dogg publicly begged Pacquiao to “get in the motherfucking ring.” It was 2011 when Nelson Mandela’s daughter tried to arrange the bout for her father’s 93rd birthday. In the interim, Pacquiao won his second congressional race and lost two boxing matches that seemed to signal the end of his best years. Repeatedly, a bout with Mayweather seemed imminent, only to collapse for reasons that were themselves the subject of dispute. Today, Pacquiao is 36 and Mayweather is 38. Both men have slowed, faded, but no one seems to care. It is simply the biggest fight in decades.

by Kerry Howley, NY Magazine |  Read more:
Image: Ben Lowry

Spoofing a Flash Crash

He operated from a modest suburban London home he shared with his parents, far from the city's glamorous financial center. He used off-the-shelf software anyone can buy.

Yet, if U.S. authorities are correct, Navinder Singh Sarao, 36, managed to send a jolt of fear through the world's markets by helping to set off the 2010 "flash crash," in which the Dow Jones average plunged 600 points in less than seven minutes.

Just how big a role he played has been hotly debated since the federal complaint was unsealed earlier this week, but the idea that a little-known investor had even a small part is deeply troubling, say traders and market experts.

"If this guy can do it," asks finance professor James Angel of Georgetown University, "who else is doing it?"

In an age of rapidly advancing computer power, the fear is that it's not just big banks and hedge funds that can create chaos on exchanges and wipe out the savings of millions of ordinary investors. Someone working from home might be able to do it, too.

"The risks are coming from the small guys who are under the radar," says Irene Aldridge, managing partner of research firm ABLE Alpha Trading and an expert in the kind of high-speed computerized trading that Sarao did. "The regulators don't have the real-time tools to monitor them."

Sarao allegedly employed a ruse called spoofing, a bluffing technique in which traders try to manipulate the price of stocks or other assets by making fake trades to create the impression they want to sell when they really want to buy, or vice versa.

Eric Scott Hunsader, founder of Nanex, a provider of financial data that has documented what it claims are cases of blatant spoofing, says the practice is widespread - in stocks and bonds, oil and gold, cotton and coffee. He says the bluffing is turning markets into a lawless Wild West, despite efforts by trading firms to fight back with software that can sniff out the false trades. (...)

A key to spoofing is placing large orders to sell or buy without ever executing them. Since other traders can see your orders, a large one to sell might convince them prices are likely to head down. One to buy might make them think prices are likely to rise. So they will often mimic your order, which moves prices up or down, as if you had sold or bought yourself.

Next, you cancel your order, and do the opposite - buying at the new, artificially lower price or selling at the new higher one.
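The sequence can be sketched as a toy simulation. This is illustrative only: a single "price" number that drifts with visible order pressure, not a real order book or the E-Mini market, and the impact parameter is invented:

```python
# Toy sketch of the spoofing sequence described above.

def spoof_round(price, spoof_size, impact_per_lot=0.01):
    """One round of the bluff: fake sell pressure, buy the dip, let it rebound."""
    # 1) Post a large sell order with no intention of executing it.
    #    Traders who see it sell too, and the price drifts down.
    depressed_price = price - spoof_size * impact_per_lot
    # 2) Cancel the fake order and buy at the artificially low price.
    buy_price = depressed_price
    # 3) With the fake pressure gone, the price rebounds; sell there.
    sell_price = price
    return sell_price - buy_price  # profit per unit bought

print(spoof_round(price=100.0, spoof_size=500))  # 5.0 per unit bought
```

The same skeleton, run in rapid cycles with orders replaced and re-canceled thousands of times, is essentially the "dynamic layering" described in the complaint.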

The advent of high-frequency trading firms has added a level of sophistication and speed to this bluffing technique.

Using computers to sift through news articles, social media feeds and other data in split seconds, these firms are able to snatch tiny, fleeting profits that mere mortals can't spot. The firms can also bluff fast, sending a series of sell orders, for instance, then canceling them as the price moves down and replacing them with new orders - all within thousandths of a second.

The complaint against Sarao says it was just this sort of lightning-fast spoofing, called dynamic layering, that allowed him to make nearly $880,000 on May 6, 2010, the day of the flash crash.

His computer sent a series of orders to sell E-Mini futures. Then, as their prices moved, his computer changed or replaced those orders in rapid succession - a stunning 19,000 times in less than 2 1/2 hours before it canceled all of them, according to the complaint. Sarao's offers to sell were so numerous that at one point they represented at least one-fifth of all orders to sell E-Mini futures from around the world.

"This is the equivalent of an elephant coming to a tea party," says Nanex's Hunsader. "It's hard not to spot."

Stocks lost $1 trillion in value during the flash crash. The market bounced back by the close of trading, but the breadth and speed of the drop rattled investors and regulators alike.

by Bernard Condon, AP |  Read more:
Image: Ray Abrams

Friday, April 24, 2015

Swamps Élysées: The Queen of Gator

Christy Plott Redd says she likes to take the fancy out of fashion, but on a recent afternoon in Manhattan—her auburn hair falling in carefully curled waves beneath a mink hat, her eyelashes pressed into thick half-moons over shadowed lids, her teeth flashing white in an outline of Smashbox fuchsia lipstick—the fancy was very much on display. She wheeled behind her a suitcase the size of a small car. Inside were dozens of alligator skins, samples she was toting around to sell to big-name fashion designers—Ralph Lauren, Oscar de la Renta—for their next collections.

Redd is 36 and the creative director, head of global sales, and co-owner of American Tanning and Leather, a family tannery based in Griffin, Georgia, 40 miles south of Atlanta. But she introduces herself by a far more flamboyant title: the Queen of Gator. It’s her handle on Twitter and Instagram. “La Reina” is engraved on her silver ID bracelet; “Queen of Alligator” is embroidered inside her mink coat. The tongue of her right pink-and-green Nike running shoe says “Gator”; the left says “Queen.” A pink-trimmed calling card is letterpressed with her title and a crown (she’ll tell you it’s her personal card, not her business card, and gleefully dole out one of each).

The title is self-appointed. Several years ago, Redd heard about an alligator buyer from Italy working in Florida and calling himself the King. This was annoying. For one thing, there are no alligators in Italy. Live ones, anyway. More importantly, royalty is demonstrated by blood line, and nobody in the world can lay claim to one more established than Redd’s, whose great-grandfather founded the family business almost a century ago in Blairsville, whose grandfather served time in prison for illegally selling alligator skins in the 1970s, and whose father did too, for that matter. American Tanning is the oldest and largest alligator tannery in the country—and one of the only major ones in the world. Alligator mississippiensis, the American alligator, has been establishing its foothold in what is now the southern United States—its sole habitat—for 180 million years. The Plotts’ regional lineage may stretch back a mere 200 or so, but in any case, what family’s fortune has been entwined with the alligator’s for longer than theirs? Certainly no Italian arriviste’s.

In New York, Redd had scheduled back-to-back appointments: One hour, she had a meeting at a flashy Madison Avenue headquarters; the next, she was in a streamlined downtown studio with a young designer. Redd has been making calls like this (in New York, Milan, Paris) for nearly 15 years, and it’s not uncommon to find her in settings that couldn’t be farther—geographically, culturally, metaphysically—from the rotting reeds of the swamps where her raw material is captured. But she refuses to be intimidated, a gator out of water.

“I don’t like mean girls,” Redd explained. “People say I am down to earth. I’m from Griffin, Georgia, and I treat these people the same I would anyone else.” In other words, she’s got fishermen and farmers on speed dial, but she’s wearing diamonds—preferably her long marquise-shaped earrings by designer and customer Opal Stone, wife of actor Ron Perlman.

On this day, the prospective buyers were coming to Redd’s hotel room, where dozens of hides—in a shape evoking their previous owners, legs and tail dangling—were spread across the desk and slung over the backs of chairs. Redd used one hide to tie back the drapes to let in more sun. The general scaliness, compounded by colors not found in nature, could prompt a shudder in the uninitiated.

One potential customer was an Upper East Side handbag maker who had never bought alligator before. She had an idea: a clutch that would take advantage of the button-sized hole at the top of the tail of every hide, as a way to toggle it shut. Redd didn’t hesitate. “Yeah, that’s the butthole,” she said. “It would be a little butthole clutch.” Then she gently suggested a magnet closure instead. (...)

Christy Plott was born in 1979, the same year that the U.S. Fish and Wildlife Service, encouraged by a rebound in the alligator population, agreed to resume the legal trade of alligator on a strictly regulated basis. The idea was that alligators were such a valuable commodity, landowners would be more inclined to protect the animals’ habitat. Commerce could benefit conservation.

That year, at an auction in Florida, Chris Plott scooped up the first legal alligator skins available in more than a decade. But because the alligator trade had been closed, there wasn’t anyone around to tan them. With 5,000 skins in hand, he decided to build his own tannery. In 1980, American Tanning and Leather was born. Q.C. died of cancer that same year, at 56, leaving Chris to handle business on his own. American Tanning’s first hides were displayed at a show in St. Louis, but they weren’t up to par. Chris went to Europe and brought back a French tanner to advise him. Every year, he said, he lost hundreds of thousands of dollars on alligator, subsidized by his fur business.

When the Fish and Wildlife Service declared the alligator population fully recovered in 1987, things got easier for him, and in the early 1990s, Chris started to make a profit on alligators. Lucky for him, because around the same time, fur was on its way out—thanks in no small part to high-profile, celebrity-spiked animal rights protests, like PETA’s 1994 ad campaign featuring five supermodels claiming they’d “rather go naked than wear fur.” Before long, the wild fur business was nearly defunct, and the Plotts went all in with alligator. (...)

The 30,000 skins the factory processes each year arrive in salted, rolled, refrigerated bundles, courtesy of year-round harvests from alligator farms; periodic catches by nuisance trappers; and, in late summer, hunters and fishermen. About half of AmTan’s skins come from farms, the other half from wild harvest. By far, the most wild gators come from Louisiana, where the Plotts bought a processing center in 2008. During hunting season, the Plotts stake out along the Atchafalaya Basin in St. Martinville, the birthplace of Cajun culture, to purchase whole alligators, typically from commercial fishermen who buy from the docks. At their facility, the meat is separated for sale to seafood dealers; the skins are salt-cured and shipped to Griffin. There, the skins are converted to leather in a series of steps that includes preserving, stretching, drying, chrome dyeing, and polishing. Midway through the process, the skins are said to be “in crust.” The Plotts stock crust year-round, awaiting orders for specific hues and finishes, after which they emerge in a Skittles spectrum of colors, with names like Tahiti, viola, suntan, and pretty-in-pink. Like diamonds, they are graded for quality on a five-point scale—one being the highest tier and accounting for, usually, just 10 percent of the wild skins. Redd’s older brother Damon and his team first grade the skins in raw, then Christy grades them in both crust and in their finished state, looking for any scar or scratch that could render them less valuable. In this relationship-based business, the tanner determines the grade, and if a customer isn’t happy, they ship them back. The risk is that they could switch to a competitor, so the tanner aims to please.

In January, with last season’s skins in crust but many not yet sold, shelves in the crust room were stacked high with the grayish leather, and yellow shopping carts spilled over with sorted bundles. Redd said they represented about $5 million worth.

She put her red manicured fingertips together as if preparing for a dive and said, laughing, “I’m like Scrooge McDuck diving into his money pit—only mine’s filled with alligator skins.” She drew the last word out into two syllables.

by Mary Logan Bikoff, Atlanta Magazine | Read more:
Images: Alex Martinez and Zach Wolfe

One Nation Under God: How Corporate America Invented Christian America

When he ran for the White House, Texas governor George W. Bush took a similarly soft approach, though one that came from the right. A born-again Christian, he shared Bill Clinton’s ability to discuss his faith openly. When Republican primary candidates were asked to name their favorite philosopher in a 1999 debate, for instance, Bush immediately named Christ, “because He changed my heart.” Despite the centrality of faith in his own life, Bush assured voters that he would not implement the rigid agenda of the religious right. Borrowing a phrase from author Marvin Olasky, Bush called himself a “compassionate conservative” and said he would take a lighter approach to social issues including abortion and gay rights than culture warriors such as Pat Buchanan. But many on the right took issue with the phrase. For some, the “compassionate” qualifier implicitly condemned mainstream conservatism as heartless; for others, the phrase seemed an empty marketing gimmick. (As Republican speechwriter David Frum put it, “Love conservatism but hate arguing about abortion? Try our new compassionate conservatism—great ideological taste, now with less controversy.”) But the candidate backed his words with deeds, distancing himself from the ideologues in his party. In a single week in October 1999, for instance, Bush criticized House Republicans for “balancing the budget on the backs of the poor” and lamented that all too often “my party has painted an image of America slouching toward Gomorrah.”

In concrete terms, Bush's "compassionate conservatism" constituted a promise to empower private religious and community organizations and thereby expand their role in the provision of social services. This "faith-based initiative" became the centerpiece of his campaign. In his address to the 2000 Republican National Convention, Bush heralded the work of Christian charities and called upon the nation to do what it could to support them. After his inauguration, Bush moved swiftly to make the proposal a reality. Indeed, the longest section of his 2001 inaugural address was an expansive reflection on the idea. "America, at its best, is compassionate," he observed. "Church and charity, synagogue and mosque lend our communities their humanity, and they will have an honored place in our plans and in our laws." Bush promoted the initiative at his first National Prayer Breakfast as well. But it was ill-fated. Hamstrung by a lack of clear direction during the administration's first months, it was quickly overshadowed by a new emphasis on national security after the terrorist attacks of 9/11.

Bush continued to advance his vision of a godly nation. Soon after 9/11, he made a special trip to the Islamic Center of Washington, the very same mosque that had opened its doors to celebrate the Eisenhower inauguration a half century earlier. No sitting president had ever visited an Islamic house of worship, but Bush made clear by his words and deeds there that he considered Muslims part of the nation's diverse religious community. He denounced recent acts of violence against Muslims and Arab Americans in no uncertain terms. "Those who feel like they can intimidate our fellow citizens to take out their anger don't represent the best of America," he said; "they represent the worst of humankind and they should be ashamed." Referring to Islam as a "religion of peace" and citing the Koran, he closed his address with the same words of inclusion he would have used before any audience, religious or otherwise: "God bless us all." The president was not alone in enlisting religious patriotism to demonstrate national unity after the attacks. On September 12, 2001, congressional representatives from both parties joined together on the Capitol steps to sing "God Bless America." Meanwhile, several states that did not already require recitations of the Pledge of Allegiance in their schools introduced bills to do just that.

But the efforts to use the pledge as a source of unity were soon thrown into disarray. In June 2002, a federal court ruled that the phrase "one nation under God" violated the First Amendment prohibition against the establishment of a state religion. The case Newdow v. Elk Grove Unified School District had been filed in 2000 by Michael Newdow, an emergency room doctor who complained that his daughter's rights were infringed because she was forced to "watch and listen as her state-employed teacher in her state-run school leads her classmates in a ritual proclaiming that there is a God, and that ours is 'one nation under God.'" In a 2-to-1 decision, the court agreed. It held that the phrase was just as objectionable as a statement that "we are a nation 'under Jesus,' a nation 'under Vishnu,' a nation 'under Zeus,' or a nation 'under no god,' because none of these professions can be neutral with respect to religion." The reaction from political leaders was as swift as it was predictable. The Senate suspended debate on a pending military spending bill to draft a resolution condemning the ruling, while dozens of House members took to the Capitol steps to recite the pledge and sing "God Bless America" one more time. White House spokesman Ari Fleischer announced that the president thought the decision was "ridiculous"; Democratic senator Tom Daschle called it "nuts." The reaction was so pronounced, in fact, that the appeals court delayed implementation of its ruling until an appeal could be heard.

As the case made its way through the courts, the nation had to reckon anew with the meaning of “one nation under God.” According to Newdow, an atheist, the language of the amended pledge clearly took “one side in the quintessential religious question ‘Does God exist?’” The Bush administration, defending the pledge, asserted that reciting it was no more a religious act than using a coin with “In God We Trust” inscribed on it; both merely acknowledged the nation’s heritage. A separate brief filed by conservative religious organizations, however, argued that the pledge was “both theological and political.” Reviving claims of the Christian libertarians, it asserted that the words “under God” were added to underscore the concept of limited government. They were meant as a reminder that “government is not the highest authority in human affairs” because, as the Declaration of Independence claimed, “inalienable rights come from God.” In June 2004, the Supreme Court ruled that Newdow technically lacked standing to bring the suit and thus dismissed the lower court’s ruling, dodging the issue for the time being.

Having survived that challenge in the courts, the concept of "one nation under God" thrived on the campaign trail. Seeking to rally religious voters for the 2004 election, Republican strategist Karl Rove advocated a "play-to-the-base" plan to exploit the concerns of the religious right for electoral gain. The president passed two major pieces of pro-life legislation and then joined the campaign for a Federal Marriage Amendment to ban homosexual unions. Many on the right saw the coming campaign as the kind of "religious war" that Pat Buchanan heralded a decade before. The Bush campaign worked to capitalize on "the God gap" in the electorate, mobilizing religious conservatives in record numbers. In Allentown, Pennsylvania, one backer erected a billboard that summed up the unofficial strategy of the Republicans: "Bush Cheney '04: One Nation Under God." The Democrats, meanwhile, gave the politics of religion comparatively little attention. John Kerry's presidential campaign relegated much of its national religious outreach to a twenty-eight-year-old newcomer who had virtually no institutional support, not even an old database of contacts. "The matchup between the two parties in pursuit of religious voters wasn't just David versus Goliath," the journalist Amy Sullivan wrote. "It was David versus Goliath and the Philistines and the Assyrians and the Egyptians, with a few plagues thrown in for good measure."

by Kevin M. Kruse, Salon | Read more:
Image: via: