Tuesday, August 16, 2011

Righthaven Rocked, Owes $34,000 After "Fair Use" Loss

[ed.  If you're not familiar with Righthaven, click on the link below, or the "Interesting Article" link on the sidebar to this blog.]

by Nate Anderson

The wheels appear to be coming off the Righthaven trainwreck-in-progress. The litigation outfit, which generally sues small-time bloggers, forum operators, and the occasional Ars Technica writer, has just been slapped with a $34,000 bill for legal fees.

Righthaven v. Hoehn, filed in Nevada federal court, has been an utterly shambolic piece of litigation. Righthaven sued one Wayne Hoehn, a longtime forum poster on the site Madjack Sports. Buried in Home>>Forums>>Other Stuff>>Politics and Religion, Hoehn made a post under the username "Dogs That Bark" in which he pasted in two op-ed pieces. One came from the Las Vegas Review-Journal, which helped set up the Righthaven operation. Righthaven sued.

This was the salvation of the news business? Targeting forum posters in political subforums of sports handicapping sites? But at least it looked like Righthaven had a point; copying had certainly occurred. Had infringement?

Before it was all over, the judge decided that Righthaven had no standing even to bring the case, since only a copyright holder can file an infringement suit (Righthaven's contract only gave it a bare right to sue… which is no right at all). Then the irritated judge decided that Hoehn's cut-and-paste job was fair use, helping establish a precedent that could undercut the entire Righthaven approach.

Then the defense lawyers wanted to be paid. They asked for $34,000 in fees, arguing that they had won the case. To avoid paying the opposing lawyers, Righthaven recently argued that fees could not be awarded; since Righthaven had no standing to sue in the first place, it argued, the court had no jurisdiction over the case at all, not even to assign legal fees.

Defense attorney Marc J. Randazza was furious. "Righthaven deserves some credit for taking this position, as it requires an amazing amount of chutzpah," he wrote to the judge. "Righthaven seeks a ruling holding that, as long as a plaintiff’s case is completely frivolous, then the court is deprived of the right to make the frivolously sued defendant whole, whereas a partially frivolous case might give rise to fee liability. Righthaven’s view, aside from being bizarre, does not even comport with the law surrounding prudential standing."

The judge agreed. In a terse order today, he decided that Hoehn had won the case (as the "prevailing party") and "the attorney’s fees and costs sought on his behalf are reasonable." Righthaven has until September 14 to cut a check for $34,045.50.

Read more:  here and here

USENIX 2011 Keynote: Network Security in the Medium Term, 2061-2561 AD

[ed.  Fascinating speech about where we might go as a society: technologically, economically, socially and culturally.  Along with great historical insights.]

by Charlie Stross

Good afternoon, and thank you for inviting me to speak at USENIX Security.

Unlike you, I am not a security professional. However, we probably share a common human trait, namely that none of us enjoy looking like a fool in front of a large audience. I therefore chose the title of my talk to minimize the risk of ridicule: if we should meet up in 2061, much less in the 26th century, you’re welcome to rib me about this talk. Because I’ll be happy still to be alive to be ribbed.

So what follows should be seen as a farrago of speculation by a guy who earns his living telling entertaining lies for money.

The question I’m going to spin entertaining lies around is this: what is network security going to be about once we get past the current sigmoid curve of accelerating progress and into a steady state, when Moore’s first law is long since burned out, and networked computing appliances have been around for as long as steam engines?

I’d like to start by making a few basic assumptions about the future, some implicit and some explicit: if only to narrow the field.

For starters, it’s not impossible that we’ll render ourselves extinct through warfare, be wiped out by a gamma ray burster or other cosmological sick joke, or experience the economic equivalent of a kernel panic – an unrecoverable global error in our technosphere. Any of these could happen at some point in the next five and a half centuries: survival is not assured. However, I’m going to spend the next hour assuming that this doesn’t happen – otherwise there’s nothing much for me to talk about.

The idea of an AI singularity has become common currency in SF over the past two decades – that we will create human-equivalent general artificial intelligences, and they will proceed to bootstrap themselves to ever-higher levels of nerdish god-hood, and either keep us as pets or turn us into brightly coloured machine parts. I’m going to palm this card because it’s not immediately obvious that I can say anything useful about a civilization run by beings vastly more intelligent than us. I’d be like an australopithecine trying to visualize daytime cable TV. More to the point, the whole idea of artificial general intelligence strikes me as being as questionable as 19th century fantasies about steam-powered tin men. I do expect us to develop some eerily purposeful software agents over the next decades, tools that can accomplish human-like behavioural patterns better than most humans can, but all that’s going to happen is that those behaviours are going to be reclassified as basically unintelligent, like playing chess or Jeopardy.

In addition to all this Grinch-dom, I’m going to ignore a whole grab-bag of toys from science fiction’s toolbox. It may be fun in fiction, but if you start trying to visualize a coherent future that includes aliens, telepathy, faster than light travel, or time machines, your futurology is going to rapidly run off the road and go crashing around in the blank bits of the map that say HERE BE DRAGONS. This is non-constructive. You can’t look for ways to harden systems against threats that emerge from the existence of Leprechauns or Martians or invisible pink unicorns. So, no Hollywood movie scenarios need apply.

Having said which, I cheerfully predict that at least one barkingly implausible innovation will come along between now and 2061 and turn everything we do upside down, just as the internet has pretty much invalidated any survey of the future of computer security that might have been carried out in 1961.

So what do I expect the world of 2061 to look like?

I am going to explicitly assume that we muddle through our current energy crises, re-tooling for a carbon-neutral economy based on a mixture of power sources. My crystal ball is currently predicting that base load electricity will come from a mix of advanced nuclear fission reactor designs and predictable renewables such as tidal and hydroelectric power. Meanwhile, intermittent renewables such as solar and wind power will be hooked to batteries for load smoothing, used to power up off-grid locations such as much of the (current) developing world, and possibly used on a large scale to produce storable fuels – hydrocarbons via Fischer-Tropsch synthesis, or hydrogen gas via electrolysis.

We are, I think, going to have molecular nanotechnology and atomic scale integrated circuitry. This doesn’t mean magic nanite pixie-dust a la Star Trek; it means, at a minimum, what today we’d consider to be exotic structural materials. It also means engineered solutions that work a bit like biological systems, but much more efficiently and controllably, and under a much wider range of temperatures and pressures.

Chih Han Hsu, Spectacular Leviathan
via:

Bug Nuggets

by Daniel Fromson

The dining-room table was set with roses and silver candlesticks. At one end, near a grandfather clock, sat two plates of mealworm fried rice. “So, a small lunch,” said my host, Marian Peters. “Freshly prepared.” The inch-long larvae, flavored with garlic and soy sauce, reminded me in texture of delicate, nutty seedpods. “Mealworm is one of my favorites at the moment,” Peters told me, speaking of the larvae of the darkling beetle (Tenebrio molitor Linnaeus). When they’re fresh, she added, their exoskeletons don’t get stuck in your teeth.

Based near Amsterdam, Peters’s company, Bugs Originals, has put freeze-dried locusts and mealworms on the shelves at the 24 outlets of Sligro, the Dutch food wholesaler. It has also developed pesto-flavored “bugsnuggets” and chocolate-dipped “bugslibars”—chicken nuggets and muesli bars, respectively, infused with ground-up mealworms. Both, like Peters’s chicken-mealworm meatballs, await approval for sale across the European Union.

The company’s goal is to get consumers to embrace bugs as an eco-friendly alternative to conventional meat. With worldwide demand for meat expected to nearly double by 2050, farm-raised crickets, locusts, and mealworms could provide comparable nutrition while using fewer natural resources than poultry or livestock. Crickets, for example, convert feed to body mass about twice as efficiently as pigs and five times as efficiently as cattle. Insects require less land and water—and measured per kilogram of edible mass, mealworms generate one-tenth to one-hundredth as much greenhouse gas as pigs.

The Netherlands, already one of the world’s top exporters of agricultural products, hopes to lead the world in the production of what environmentalists call “sustainable food,” and the area around the small town of Wageningen, nicknamed “Food Valley,” has one of the world’s highest concentrations of food scientists. It is also home to a tropical entomologist named Arnold van Huis. In the lineup of head shots near the entrance of Wageningen University’s gleaming new entomology department, he’s the guy with a locust jutting from a corner of his lips. Van Huis has been lecturing on the merits of insect-eating, officially known as entomophagy, since 1996. “People have to know that it is safe,” van Huis told me as we sat in his office. “They have to get the idea that it is not wrong.”

Read more:

Virtual and Artificial

by John Markoff

A free online course at Stanford University on artificial intelligence, to be taught this fall by two leading experts from Silicon Valley, has attracted more than 58,000 students around the globe — a class nearly four times the size of Stanford’s entire student body.

The course is one of three being offered experimentally by the Stanford computer science department to extend technology knowledge and skills beyond this elite campus to the entire world, the university is announcing on Tuesday.

The online students will not get Stanford grades or credit, but they will be ranked in comparison to the work of other online students and will receive a “statement of accomplishment.”

For the artificial intelligence course, students may need some higher math, like linear algebra and probability theory, but there are no restrictions to online participation. So far, the age range is from high school to retirees, and the course has attracted interest from more than 175 countries.

The instructors are Sebastian Thrun and Peter Norvig, two of the world’s best-known artificial intelligence experts. In 2005 Dr. Thrun led a team of Stanford students and professors in building a robotic car that won a Pentagon-sponsored challenge by driving 132 miles over unpaved roads in a California desert. More recently he has led a secret Google project to develop autonomous vehicles that have driven more than 100,000 miles on California public roads.

Dr. Norvig is a former NASA scientist who is now Google’s director of research and the author of a leading textbook on artificial intelligence.

The computer scientists said they were uncertain about why the A.I. class had drawn such a large audience. Dr. Thrun said he had tried to advertise the course this summer by distributing notices at an academic conference in Spain, but had gotten only 80 registrants.

Then, several weeks ago he e-mailed an announcement to Carol Hamilton, the executive director of the Association for the Advancement of Artificial Intelligence. She forwarded the e-mail widely, and the announcement spread virally.

The two scientists said they had been inspired by the recent work of Salman Khan, an M.I.T.-educated electrical engineer who in 2006 established a nonprofit organization to provide video tutorials to students around the world on a variety of subjects via YouTube.

“The vision is: change the world by bringing education to places that can’t be reached today,” said Dr. Thrun.

The rapid increase in the availability of high-bandwidth Internet service, coupled with a wide array of interactive software, has touched off a new wave of experimentation in education.

Read more:

Cancer’s Secrets Come Into Sharper Focus

by George Johnson

For the last decade cancer research has been guided by a common vision of how a single cell, outcompeting its neighbors, evolves into a malignant tumor.

Through a series of random mutations, genes that encourage cellular division are pushed into overdrive, while genes that normally send growth-restraining signals are taken offline.

With the accelerator floored and the brake lines cut, the cell and its progeny are free to rapidly multiply. More mutations accumulate, allowing the cancer cells to elude other safeguards and to invade neighboring tissue and metastasize.

These basic principles — laid out 11 years ago in a landmark paper, “The Hallmarks of Cancer,” by Douglas Hanahan and Robert A. Weinberg, and revisited in a follow-up article this year — still serve as the reigning paradigm, a kind of Big Bang theory for the field.

But recent discoveries have been complicating the picture with tangles of new detail. Cancer appears to be even more willful and calculating than previously imagined.

Most DNA, for example, was long considered junk — a netherworld of detritus that had no important role in cancer or anything else. Only about 2 percent of the human genome carries the code for making enzymes and other proteins, the cogs and scaffolding of the machinery that a cancer cell turns to its own devices.

These days “junk” DNA is referred to more respectfully as “noncoding” DNA, and researchers are finding clues that “pseudogenes” lurking within this dark region may play a role in cancer.

“We’ve been obsessively focusing our attention on 2 percent of the genome,” said Dr. Pier Paolo Pandolfi, a professor of medicine and pathology at Harvard Medical School. This spring, at the annual meeting of the American Association for Cancer Research in Orlando, Fla., he described a new “biological dimension” in which signals coming from both regions of the genome participate in the delicate balance between normal cellular behavior and malignancy.

As they look beyond the genome, cancer researchers are also awakening to the fact that some 90 percent of the cells in our body are microbes. We evolved with them in a symbiotic relationship, which raises the question of just who is occupying whom.

“We are massively outnumbered,” said Jeremy K. Nicholson, chairman of biological chemistry and head of the department of surgery and cancer at Imperial College London. Altogether, he said, 99 percent of the functional genes in the body are microbial.

In Orlando, he and other researchers described how genes in this microbiome — exchanging messages with genes inside human cells — may be involved with cancers of the colon, stomach, esophagus and other organs.

These shifts in perspective, occurring throughout cellular biology, can seem as dizzying as what happened in cosmology with the discovery that dark matter and dark energy make up most of the universe: Background suddenly becomes foreground and issues once thought settled are up in the air. In cosmology the Big Bang theory emerged from the confusion in a stronger but more convoluted form. The same may be happening with the science of cancer.

Read more:

Monday, August 15, 2011

David Guetta


The Dream by Henri Rousseau
via:
 
Tamara de Lempicka // Jeune fille en vert
via:

Anyone's Guess

by Nancy A. Youssef

When congressional cost-cutters meet later this year to decide on trimming the federal budget, the wars in Afghanistan and Iraq could represent juicy targets. But how much do the wars actually cost the U.S. taxpayer?

Nobody really knows.

Yes, Congress has allotted $1.3 trillion for war spending through fiscal year 2011 just to the Defense Department. There are long Pentagon spreadsheets that outline how much of that was spent on personnel, transportation, fuel and other costs. In a recent speech, President Barack Obama assigned the wars a $1 trillion price tag.

But all those numbers are incomplete. Besides what Congress appropriated, the Pentagon spent an additional unknown amount from its $5.2 trillion base budget over that same period. According to a recent Brown University study, the wars and their ripple effects have cost the United States $3.7 trillion, or more than $12,000 per American.

Lawmakers remain sharply divided over the wisdom of slashing the military budget, even with the United States winding down two long conflicts, but there's also a more fundamental problem: It's almost impossible to pin down just what the U.S. military spends on war.

To be sure, the costs are staggering.

According to Defense Department figures, by the end of April the wars in Iraq and Afghanistan — including everything from personnel and equipment to training Iraqi and Afghan security forces and deploying intelligence-gathering drones — had cost an average of $9.7 billion a month, with roughly two-thirds going to Afghanistan. That total is roughly the entire annual budget for the Environmental Protection Agency.

By comparison, it would take the State Department — with its annual budget of $27.4 billion — more than four months to spend that amount. NASA could have launched its final shuttle mission of July, which cost $1.5 billion, six times over with a single month of what the Pentagon is allotted to spend in those two wars.

What about Medicare Part D, President George W. Bush's 2003 expansion of prescription drug benefits for seniors, which cost a Congressional Budget Office-estimated $385 billion over 10 years? The Pentagon spends that in Iraq and Afghanistan in about 40 months.
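[ed.  A quick sanity check on the arithmetic above, sketched in Python.  The dollar figures are the ones reported in the article, rounded as given; this is just the reporter's comparisons made explicit, not an independent cost estimate.]

```python
# Sanity-check the article's cost comparisons.
# All figures are as reported above, in billions of dollars.

WAR_COST_PER_MONTH = 9.7      # Iraq + Afghanistan, average monthly cost
STATE_DEPT_ANNUAL = 27.4      # State Department annual budget
SHUTTLE_MISSION = 1.5         # NASA's final shuttle mission, July 2011
MEDICARE_PART_D_10YR = 385.0  # CBO estimate over 10 years

# How long would the State Department need to spend one month of war funding?
months_for_state = WAR_COST_PER_MONTH / (STATE_DEPT_ANNUAL / 12)
print(f"State Dept: {months_for_state:.1f} months")  # ~4.2, i.e. "more than four"

# How many $1.5bn shuttle missions fit in one month of war spending?
missions = WAR_COST_PER_MONTH / SHUTTLE_MISSION
print(f"Shuttle missions: {missions:.1f}")  # ~6.5, i.e. "six times over"

# How many months of war spending equal Medicare Part D's 10-year cost?
months_for_part_d = MEDICARE_PART_D_10YR / WAR_COST_PER_MONTH
print(f"Medicare Part D: {months_for_part_d:.0f} months")  # ~40 months
```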

Because of the complex and often ambiguous Pentagon budgeting process, it's nearly impossible to get an accurate breakdown of every operating cost. Some funding comes out of the base budget; other money comes from supplemental appropriations.

But the estimates can be eye-popping, especially considering the logistical challenges to getting even the most basic equipment and comforts to troops in extremely forbidding terrain.

The XX


Using Superman

[ed.  Sorry for the size.  If you have a hard time reading this, a larger version is available here.]
via:

Free Ride

[ed.  This article is especially interesting for the discussion that occurs in the Comments section at the end of the story.  Check it out; it's well worth reading, particularly the comments attributed to someone named KTech1.  I'm probably biased, but I think sites like Duck Soup actually benefit traditional media by highlighting interesting articles and pointing traffic their way.  We're all inundated with information; crowd-sourcing reader interest is something you'd think they'd support, not suppress.]

Dustin Hoffman and Robert Redford investigate Watergate in All The President's Men. The Washington Post is one of many media companies losing money for their work through the internet. Photograph: Ronald Grant 

by Robert Levine

For most of the 80s and 90s, NBC dominated US television: Miami Vice, The Cosby Show, Cheers, Seinfeld, Friends. The network earned its ratings by pushing boundaries – Miami Vice stylised the police drama, while Hill Street Blues gave it gritty realism. These shows also brought in big money – NBC was once one of the most profitable divisions of General Electric. But when the parent company was acquired by Comcast this year, the deal reportedly gave the network a value of zero.

NBC isn't the only major media business that has fallen on hard times. EMI, home of the Beatles and Pink Floyd, has trimmed its roster and cut thousands of jobs. The Washington Post, which set a high-water mark for US journalism with its Watergate reporting, has reduced its newsroom staff, closed its national bureaux, and declared: "We are not a national news organisation of record." MGM, with its roaring lion logo, was recently acquired for less than half its 2005 value.

All of these companies faced the same problem: they weren't collecting enough of the revenue being generated by their work. The public hasn't lost its appetite for television, journalism or film; shows, articles and movies reach more consumers than ever online. The problem is that, although the internet has expanded the audience for media, it has all but destroyed the market for it.

Over the past decade, much of the value created by music, films, and newspapers has benefited other companies – pirates and respected technology firms alike. The Pirate Bay website made money by illegally offering major-label albums, even as music sales declined to less than half of what they were 10 years ago. YouTube used clips from shows such as NBC's Saturday Night Live to build a business that Google bought for $1.65bn. And the Huffington Post became one of the most popular news sites online largely by rewriting newspaper articles.

This isn't the inevitable result of technology. Traditionally, the companies that invested in music and film also controlled their distribution – EMI, for example, owned recording studios, pressing plants, and the infrastructure that delivered CDs to stores. Piracy was always a nuisance, but never a serious threat. The same was true of other media businesses: the easiest place to get a newspaper story was from a newspaper.

The internet changed all this, not because it enables the fast transmission of digital data but because the regulations that enable technology companies to evade responsibility for their business models have created a broken market. Scores of sites now offer music, while hundreds of others summarise news. Part of the problem is rampant piracy – unauthorised distribution that doesn't benefit creators or the companies that invest in them. It also puts pressure on media companies to accept online distribution deals that don't cover their costs.

But the underlying issue is that creators and distributors now have opposing interests. Companies such as Google and Apple don't care that much about selling media, since they make their money in other ways – on advertising in the first case, and gadgets in the second. Google just wants to help consumers find the song or show they're looking for, whether it's a legal download or not, while Apple has an interest in pushing down the price of music to make its products more useful. And this dynamic doesn't only hurt media conglomerates – it creates problems for independent artists and companies of every size.

Technology companies often promote the idea that "information wants to be free", as technologist Stewart Brand said, because it's so cheap to deliver. Indeed, one of the most exciting aspects of the internet is the way it has all but eliminated distribution costs – a digital movie can be sent from Hollywood to Hong Kong for pennies. Some pundits even suggest the price of media will inevitably fall to that level.

It's hard to imagine how that would happen, simply because the internet hasn't had nearly as much effect on the process of making movies. The same film that costs pennies to send across the world might cost $150m to make. "That tension will not go away," Brand predicted in 1984. "It leads to wrenching debate about price, copyright, 'intellectual property' [and] the moral rightness of casual distribution."

Read more:
Nocturne, Joan Miró, 1940
via:

Record Industry Braces for Artists’ Battles Over Song Rights

by Larry Rohter

Since their release in 1978, hit albums like Bruce Springsteen’s “Darkness on the Edge of Town,” Billy Joel’s “52nd Street,” the Doobie Brothers’ “Minute by Minute,” Kenny Rogers’s “Gambler” and Funkadelic’s “One Nation Under a Groove” have generated tens of millions of dollars for record companies. But thanks to a little-noted provision in United States copyright law, those artists — and thousands more — now have the right to reclaim ownership of their recordings, potentially leaving the labels out in the cold.

When copyright law was revised in the mid-1970s, musicians, like creators of other works of art, were granted “termination rights,” which allow them to regain control of their work after 35 years, so long as they apply at least two years in advance. Recordings from 1978 are the first to fall under the purview of the law, but in a matter of months, hits from 1979, like “The Long Run” by the Eagles and “Bad Girls” by Donna Summer, will be in the same situation — and then, as the calendar advances, every other master recording once it reaches the 35-year mark.
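[ed.  The timing is simple enough to sketch.  A minimal illustration in Python, assuming plain year arithmetic; the statute's actual notice windows and effective dates are more involved than this.]

```python
# Rough timeline for "termination rights" as described above:
# control reverts after 35 years, and artists must apply at least
# two years in advance.  (Simplified; real eligibility turns on
# exact dates and statutory notice windows.)

TERM_YEARS = 35
MIN_NOTICE_YEARS = 2

for release_year in (1978, 1979, 1980):
    reclaim_year = release_year + TERM_YEARS
    notice_due = reclaim_year - MIN_NOTICE_YEARS
    print(f"{release_year} recording: reclaimable in {reclaim_year}, "
          f"notice due by {notice_due}")

# 1978 recording: reclaimable in 2013, notice due by 2011
# 1979 recording: reclaimable in 2014, notice due by 2012
# 1980 recording: reclaimable in 2015, notice due by 2013
```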

“In terms of all those big acts you name, the recording industry has made a gazillion dollars on those masters, more than the artists have,” said Don Henley, a founder both of the Eagles and the Recording Artists Coalition, which seeks to protect performers’ legal rights. “So there’s an issue of parity here, of fairness. This is a bone of contention, and it’s going to get more contentious in the next couple of years.”

With the recording industry already reeling from plummeting sales, termination rights claims could be another serious financial blow. Sales plunged to about $6.3 billion from $14.6 billion over the decade ending in 2009, in large part because of unauthorized downloading of music on the Internet, especially of new releases, which has left record labels disproportionately dependent on sales of older recordings in their catalogs.

“This is a life-threatening change for them, the legal equivalent of Internet technology,” said Kenneth J. Abdo, a lawyer who leads a termination rights working group for the National Academy of Recording Arts and Sciences and has filed claims for some of his clients, who include Kool and the Gang. As a result the four major record companies — Universal, Sony BMG, EMI and Warner — have made it clear that they will not relinquish recordings they consider their property without a fight.

“We believe the termination right doesn’t apply to most sound recordings,” said Steven Marks, general counsel for the Recording Industry Association of America, a lobbying group in Washington that represents the interests of record labels. As the record companies see it, the master recordings belong to them in perpetuity, rather than to the artists who wrote and recorded the songs, because, the labels argue, the records are “works for hire,” compilations created not by independent performers but by musicians who are, in essence, their employees.

Independent copyright experts, however, find that argument unconvincing. Not only have recording artists traditionally paid for the making of their records themselves, with advances from the record companies that are then charged against royalties, but they are also exempted from both the obligations and benefits an employee typically expects.

“This is a situation where you have to use your own common sense,” said June M. Besek, executive director of the Kernochan Center for Law, Media and the Arts at the Columbia University School of Law. “Where do they work? Do you pay Social Security for them? Do you withhold taxes from a paycheck? Under those kinds of definitions it seems pretty clear that your standard kind of recording artist from the ’70s or ’80s is not an employee but an independent contractor.”

Read more:

Sunday, August 14, 2011

High Tech Cowboys of the Deep Seas

by Joshua Davis

Latitude 48° 14′ North. Longitude 174° 26′ West.

Almost midnight on the North Pacific, about 230 miles south of Alaska's Aleutian Islands. A heavy fog blankets the sea. There's nothing but the wind spinning eddies through the mist.

Out of the darkness, a rumble grows. The water begins to vibrate. Suddenly, the prow of a massive ship splits the fog. Its steel hull rises seven stories above the water and stretches two football fields back into the night. A 15,683-horsepower engine roars through the holds, pushing 55,328 tons of steel. Crisp white capital letters — COUGAR ACE — spell the ship's name above the ocean froth. A deep-sea car transport, its 14 decks are packed with 4,703 new Mazdas bound for North America. Estimated cargo value: $103 million.

On the bridge and belowdecks, the captain and crew begin the intricate process of releasing water from the ship's ballast tanks in preparation for entry into US territorial waters. They took on the water in Japan to keep the ship steady, but US rules require that it be dumped here to prevent contaminating American marine environments. It's a tricky procedure. To maintain stability and equilibrium, the ballast tanks need to be drained of foreign water and simultaneously refilled with local water. The bridge gives the go-ahead to commence the operation, and a ship engineer uses a hydraulic-powered system to open the starboard tank valves. Water gushes out one side of the ship and pours into the ocean. It's July 23, 2006.

In the crew's quarters below the bridge, Saw "Lucky" Kyin, the ship's 41-year-old Burmese steward, rinses off in the common shower. The ship rolls underneath his feet. He's been at sea for long stretches of the past six years. In his experience, when a ship rolls to one side, it generally rolls right back the other way.

This time it doesn't. Instead, the tilt increases. For some reason, the starboard ballast tanks have failed to refill properly, and the ship has abruptly lost its balance. At the worst possible moment, a large swell hits the Cougar Ace and rolls the ship even farther to port. Objects begin to slide across the deck. They pick up momentum and crash against the port-side walls as the ship dips farther. Wedged naked in the shower stall, Kyin is confronted by an undeniable fact: The Cougar Ace is capsizing.

He lunges for a towel and staggers into the hallway as the ship's windmill-sized propeller spins out of the water. Throughout the ship, the other 22 crew members begin to lose their footing as the decks rear up. There are shouts and screams. Kyin escapes through a door into the damp night air. He's barefoot and dripping wet, and the deck is now a slick metal ramp. In an instant, he's skidding down the slope toward the Pacific. He slams into the railings and his left leg snaps, bone puncturing skin. He's now draped naked and bleeding on the railing, which has dipped to within feet of the frigid ocean. The deck towers 105 feet above him like a giant wave about to break. Kyin starts to pray.

Read more:

Tragic Heroes - the Forty-Seven Ronin

The account of the 1703 vendetta by a large group of masterless samurai (rônin), who lost their lord due to political infighting, stands as an enduring narrative of Japanese martial loyalty. The story was reenacted countless times on the puppet and kabuki stages, but, because of restrictions against the public discussion of sensitive issues, the names of the characters were always changed. In this 1869 version by Tsukioka Yoshitoshi (1839-1892), the true names and ranks of the rônin heroes are revealed because in 1868 the restrictions had been repealed.




via:

The Elusive Big Idea


Top row from left: Marie Curie, Albert Einstein, George Washington Carver and Betty Friedan. Bottom row from left: Charles R. Drew, Germaine Greer, John Maynard Keynes and Marshall McLuhan.

by Neal Gabler

The July/August issue of The Atlantic trumpets the “14 Biggest Ideas of the Year.” Take a deep breath. The ideas include “The Players Own the Game” (No. 12), “Wall Street: Same as it Ever Was” (No. 6), “Nothing Stays Secret” (No. 2), and the very biggest idea of the year, “The Rise of the Middle Class — Just Not Ours,” which refers to growing economies in Brazil, Russia, India and China.

Now exhale. It may strike you that none of these ideas seem particularly breathtaking. In fact, none of them are ideas. They are more on the order of observations. But one can’t really fault The Atlantic for mistaking commonplaces for intellectual vision. Ideas just aren’t what they used to be. Once upon a time, they could ignite fires of debate, stimulate other thoughts, incite revolutions and fundamentally change the ways we look at and think about the world.

They could penetrate the general culture and make celebrities out of thinkers — notably Albert Einstein, but also Reinhold Niebuhr, Daniel Bell, Betty Friedan, Carl Sagan and Stephen Jay Gould, to name a few. The ideas themselves could even be made famous: for instance, for “the end of ideology,” “the medium is the message,” “the feminine mystique,” “the Big Bang theory,” “the end of history.” A big idea could capture the cover of Time — “Is God Dead?” — and intellectuals like Norman Mailer, William F. Buckley Jr. and Gore Vidal would even occasionally be invited to the couches of late-night talk shows. How long ago that was.

If our ideas seem smaller nowadays, it’s not because we are dumber than our forebears but because we just don’t care as much about ideas as they did. In effect, we are living in an increasingly post-idea world — a world in which big, thought-provoking ideas that can’t instantly be monetized are of so little intrinsic value that fewer people are generating them and fewer outlets are disseminating them, the Internet notwithstanding. Bold ideas are almost passé.

It is no secret, especially here in America, that we live in a post-Enlightenment age in which rationality, science, evidence, logical argument and debate have lost the battle in many sectors, and perhaps even in society generally, to superstition, faith, opinion and orthodoxy. While we continue to make giant technological advances, we may be the first generation to have turned back the epochal clock — to have gone backward intellectually from advanced modes of thinking into old modes of belief. But post-Enlightenment and post-idea, while related, are not exactly the same.

Post-Enlightenment refers to a style of thinking that no longer deploys the techniques of rational thought. Post-idea refers to thinking that is no longer done, regardless of the style.

Read more: