Wednesday, April 23, 2014
How America’s Leading Science Fiction Authors Are Shaping Your Future
Stories set in the future are often judged, as time passes, on whether they come true or not. “Where are our flying cars?” became a plaintive cry of disappointment as the millennium arrived, reflecting the prevailing mood that science and technology had failed to live up to the most fanciful promises of early 20th-century science fiction.
But the task of science fiction is not to predict the future. Rather, it contemplates possible futures. Writers may find the future appealing precisely because it can’t be known, a black box where “anything at all can be said to happen without fear of contradiction from a native,” says the renowned novelist and poet Ursula K. Le Guin. “The future is a safe, sterile laboratory for trying out ideas in,” she tells Smithsonian, “a means of thinking about reality, a method.”
Some authors who enter that laboratory experiment with plausible futures—envisioning where contemporary social trends and recent breakthroughs in science and technology might lead us. William Gibson (who coined the term “cyberspace” and will never be allowed to forget it) is well known for his startling and influential stories, published in the 1980s, depicting visions of a hyper-connected global society where black-hat hackers, cyberwar and violent reality shows are part of daily life. For other authors, the future serves primarily as a metaphor. Le Guin’s award-winning 1969 novel, The Left Hand of Darkness—set on a distant world populated by genetically modified hermaphrodites—is a thought experiment about how society would be different if it were genderless.
Because science fiction spans the spectrum from the plausible to the fanciful, its relationship with science has been both nurturing and contentious. For every author who meticulously examines the latest developments in physics or computing, there are other authors who invent “impossible” technology to serve as a plot device (like Le Guin’s faster-than-light communicator, the ansible) or to enable social commentary, the way H. G. Wells uses his time machine to take the reader to the far future to witness the calamitous destiny of the human race.
Sometimes it’s the seemingly weird ideas that come true—thanks, in part, to science fiction’s capacity to spark an imaginative fire in readers who have the technical knowledge to help realize its visions. Jules Verne proposed the idea of light-propelled spaceships in his 1865 novel, From the Earth to the Moon. Today, technologists all over the world are actively working on solar sails.
Jordin Kare, an astrophysicist at the Seattle-based tech company LaserMotive, who has done important practical and theoretical work on lasers, space elevators and light-sail propulsion, cheerfully acknowledges the effect science fiction has had on his life and career. “I went into astrophysics because I was interested in the large-scale functions of the universe,” he says, “but I went to MIT because the hero of Robert Heinlein’s novel Have Spacesuit, Will Travel went to MIT.” Kare himself is very active in science fiction fandom. “Some of the people who are doing the most exploratory thinking in science have a connection to the science-fiction world.”
Microsoft, Google, Apple and other firms have sponsored lecture series in which science fiction writers give talks to employees and then meet privately with developers and research departments. Perhaps nothing better demonstrates the close tie between science fiction and technology today than what is called “design fiction”—imaginative works commissioned by tech companies to model new ideas. Some corporations hire authors to create what-if stories about potentially marketable products.
“I really like design fiction or prototyping fiction,” says novelist Cory Doctorow, whose clients have included Disney and Tesco. “There is nothing weird about a company doing this—commissioning a story about people using a technology to decide if the technology is worth following through on. It’s like an architect creating a virtual fly-through of a building.” Doctorow, who worked in the software industry, has seen both sides of the development process. “I’ve been in engineering discussions in which the argument turned on what it would be like to use the product, and fiction can be a way of getting at that experience.”
by Eileen Gunn, Smithsonian | Read more:
Image: Mehreen Murtaza
Animal Architecture
"Animal Architecture," by Ingo Arndt and Jürgen Tautz, with a foreword by Jim Brandenburg, is a beautiful new science/photography book exploring the mystery of nature through the "complex and elegant structures that animals create both for shelter and for capturing prey."
Arndt is a world-renowned nature photographer based in Germany, whose work you may have seen in National Geographic, GEO and BBC Wildlife.
Above, a grey bowerbird's bower in Australia's Northern Territory. "The grey bowerbird goes to extreme lengths to build a love nest from interwoven sticks and then covers the floor with decorative objects. The more artful the arbor, the greater the chance a male has of attracting a mate."
by Xeni Jardin, Boing Boing | Read more:
Image: Ingo Arndt
Renewables Aren’t Enough. Clean Coal Is the Future
Proof that good things don’t always come in nice packages can be found by taking the fast train from Beijing to Tianjin and then driving to the coast. Tianjin, China’s third-biggest city, originated as Beijing’s port on the Yellow Sea. But in recent years Tianjin has reclaimed so much of its muddy, unstable shoreline that the city has effectively moved inland and a new, crazily active port has sprung up at the water’s edge. In this hyper-industrialized zone, its highways choked with trucks, stand scores of factories and utility plants, each a mass of pipes, reactors, valves, vents, retorts, crackers, blowers, chimneys, and distillation towers—the sort of facility James Cameron might have lingered over, musing, on his way to film the climax of Terminator 2.
Among these edifices, just as big and almost as anonymous as its neighbors, is a structure called GreenGen, built by China Huaneng Group, a giant state-owned electric utility, in collaboration with half a dozen other firms, various branches of the Chinese government, and, importantly, Peabody Energy, a Missouri firm that is the world’s biggest private coal company.
By Western standards, GreenGen is a secretive place; weeks of repeated requests for interviews and a tour met with no reply. When I visited anyway, guards at the site not only refused admittance but wouldn’t even confirm its name. As I drove away from the entrance, a window blind cracked open; through the slats, an eye surveyed my departure. The silence, in my view, is foolish. GreenGen is a billion-dollar facility that extracts the carbon dioxide from a coal-fired power plant and, ultimately, will channel it into an underground storage area many miles away. Part of a coming wave of such carbon-eating facilities, it may be China’s—and possibly the planet’s—single most consequential effort to fight climate change.
Because most Americans rarely see coal, they tend to picture it as a relic of the 19th century, black stuff piled up in Victorian alleys. In fact, a lump of coal is a thoroughly ubiquitous 21st-century artifact, as much an emblem of our time as the iPhone. Today coal produces more than 40 percent of the world’s electricity, a foundation of modern life. And that percentage is going up: In the past decade, coal added more to the global energy supply than any other source.
Nowhere is the preeminence of coal more apparent than in the planet’s fastest-growing, most populous region: Asia, especially China. In the past few decades, China has lifted several hundred million people out of destitution—arguably history’s biggest, fastest rise in human well-being. That advance couldn’t have happened without industrialization, and that industrialization couldn’t have happened without coal. More than three-quarters of China’s electricity comes from coal, including the power for the giant electronic plants where iPhones are assembled. More coal goes to heating millions of homes, to smelting steel (China produces nearly half the world’s steel), and to baking limestone to make cement (China provides almost half the world’s cement). In its frantic quest to develop, China burns almost as much coal as the rest of the world put together—a fact that makes climatologists shudder. (...)
Which brings me, in a way, back to the unwelcoming facility in Tianjin. GreenGen is one of the world’s most advanced attempts to develop a technology known as carbon capture and storage. Conceptually speaking, CCS is simple: Industries burn just as much coal as before but remove all the pollutants. In addition to scrubbing out ash and soot, now standard practice at many big plants, they separate out the carbon dioxide and pump it underground, where it can be stored for thousands of years.
Many energy and climate researchers believe that CCS is vital to avoiding a climate catastrophe. Because it could allow the globe to keep burning its most abundant fuel source while drastically reducing carbon dioxide and soot, it may be more important—though much less publicized—than any renewable-energy technology for decades to come. No less than Steven Chu, the Nobel-winning physicist who was US secretary of energy until last year, has declared CCS essential. “I don’t see how we go forward without it,” he says. (...)
Coal is MEGO—until you live near it. MEGO is old journalistic slang for “my eyes glaze over”—a worthy story that is too dull to read. In America, where coal is mostly burned far out of sight, readers tend to react to the word coal by hitting Close Tab.
But people in Hebei don’t think coal is MEGO, at least in my experience. Hebei is the province that surrounds Beijing. When the capital city set up for the 2008 Olympics, the government pushed out the coal-powered utilities and factories that were polluting its air. Mostly, these facilities moved to Hebei. The province ended up with many new jobs. But it also ended up with China’s dirtiest air.
Because I was curious, I hired a taxi to drive in and around the Hebei city of Tangshan, southeast of Beijing. Visibility was about a quarter mile—a good day, the driver told me. Haze gave buildings the washed-out look of an old photographic print. Not long ago, Tangshan had been a relatively poor place. Now the edge of town held a murderer’s row of luxury-car dealerships: BMW, Jaguar, Mercedes, Lexus, Porsche. Most of the vehicles were displayed indoors. Those outside were covered with gray crud.
by Charles C. Mann, Wired | Read more:
Image: Dan Winters
American Labor’s Death
This would be a critical blow for these unions, because it would greatly reduce the cash flow into union offices, and therefore hinder their ability to function and serve members. Small locals could go into severe financial trouble. Larger ones might have to stop their campaigns to reach out to workers to ensure that they sign union cards and pay dues. (Disclosure: Readers should know that the author is employed as an editor for a public sector union in New York City.)
As neoliberalism has steadily killed off American manufacturing since the 1970s, the government sector has become the center of labor’s power. The 2008 financial crisis allowed state-level Republicans to exploit the economic pain to downsize government, which of course means weakening public sector workers’ rights. It started most dramatically with Wisconsin stripping workers there of collective bargaining rights. In Detroit, the city cited its bankruptcy as a reason not to fulfill some of its pension obligations. And not a day seems to pass without the right-wing media blaming all of the world’s ills squarely on unionized public school teachers.
It’s easy to cast this as the final phase of the Reagan Revolution, in which the New Right began an attack on federal government services and unions, destroying major aspects of both and pulling the Democrats away from class politics and toward the political center (something similar happened at the same time in the United Kingdom with Margaret Thatcher, the unions and the Labour Party). But there’s an alternative narrative.
To borrow a theory from Daniel Gross, an anarchist trade unionist most famous for leading efforts to organize Starbucks baristas, American labor’s decline goes back much further than the rise of the Gipper, to the 1930s, which is most often thought of as labor’s finest hour, when after widespread labor unrest the government enshrined the right to organize in the National Labor Relations Act.
The alternative view is that this codification meant labor could no longer be an organized opposition force to capitalism, or a vehicle for organizing workers not just for better wages and benefits but for a post-capitalist future. Instead, unions became dependent on employers and the government for their power, creating a tripartite political understanding that would remain until the 1970s. In that time, radicals were purged from unions, and while today there are unions like the Industrial Workers of the World, the International Longshore and Warehouse Union and the United Electrical, Radio and Machine Workers of America that advocate that progress comes from confrontation, rather than collaboration, with employers, their numbers are small and their voices absent from the discourse.
And so this partially explains labor’s exclusion from 1960s radicalism—recall Mario Savio’s famous Berkeley Free Speech Movement speech in which he vowed that students should not be molded by business, government or organized labor, the latter seen as just as big a part of the establishment as the other two. Today, unions find themselves at odds with progressives and radicals on a whole host of issues. The United Mine Workers of America opposes new environmental regulations, and construction unions are fighting environmentalists who want to block the creation of a new oil pipeline because it will create jobs. Unions in upstate New York squirm at criminal justice reform measures that mean fewer inmates, which means fewer prisons and fewer prison jobs.
The fact is that despite the right-wing rhetoric that unions are a left-wing enemy to industrial order, unions are historically tethered to the interests of American capitalism. In purely Marxist terms, in the time of détente, from the 1930s to Reagan, unions helped workers recoup some of the surplus value extracted from them in the form of higher wages and benefits, but still allowed enough surplus value extraction in order for business to profit and eventually grow. For blue-collar workers, it was a pretty good deal; this allowed workers to own homes and cars, send their children to college and participate in the political process.
But as Thomas Piketty’s celebrated new history of capital suggests, wealth has a tendency to concentrate, so this agreement became untenable. Moving production to the Global South solved labor questions in the industrial sector, driving down wages and forcing the working class into the largely non-union service sector. That left the government sector.
What has happened there? It can’t be off-shored, but it can be outsourced.
by Ari Paul, Souciant | Read more:
Tuesday, April 22, 2014
The Secret History of Life-Hacking
We live in the age of life-hacking. The concept, which denotes a kind of upbeat, engineer-like approach to maximizing one’s personal productivity, first entered the mainstream lexicon in the mid-2000s, via tech journalists, the blogosphere, and trendspotting articles with headlines like “Meet the Life Hackers.” Since then the term has become ubiquitous in popular culture—just part of the atmosphere, humming with buzzwords, of the Internet age.
Variations on a blog post called “50 Life Hacks to Simplify Your World” have become endlessly, recursively viral, turning up on Facebook feeds again and again like ghost ships. Lifehacker.com, one of the many horses in Gawker Media’s stable of workplace procrastination sites, furnishes office workers with an endless array of ideas on how to live fitter, happier, and more productively: Track your sleep habits with motion-sensing apps and calculate your perfect personal bed-time; learn how to “supercharge your Gmail filters”; oh, and read novels, because it turns out that “reduces anxiety.” The tribune of life hackers, the author and sometime tech investor Timothy Ferriss, drums up recipes for a life of ease with an indefatigable frenzy, and enumerates the advantages in bestselling books and a reality TV show; outsource your bill payments to a man in India, he advises, and you can enjoy 15 more minutes of “orgasmic meditation.”
Life-hacking wouldn’t be popular if it didn’t tap into something deeply corroded about the way work has, without much resistance, managed to invade every corner of our lives. The idea started out as a somewhat earnest response to the problem of fragmented attention and overwork—an attempt to reclaim some leisure time and autonomy from the demands of boundaryless labor. But it has since become just another hectoring paradigm of self-improvement. The proliferation of apps and gurus promising to help manage even the most basic tasks of simple existence—the “quantified self” movement does life hacking one better, turning the simple act of breathing or sleeping into something to be measured and refined—suggests that merely getting through the day has become, for many white-collar professionals, a set of problems to solve and systems to optimize. Being alive is easier, it turns out, if you treat it like a job. (...)
And yet by comparison, the modern day self-Taylorization of the life hacker has broad appeal. In a way this makes sense: There’s no manager stop-watching you, or forcing you to work in particular ways; you’re ostensibly choosing, of your own will, to make your life better. The way true believers like Ferriss so thoroughly master-plan their lives has a gonzo attractiveness to it. What’s more, “hacking” sounds much better than “management.”
by Nikil Saval, Pacific Standard | Read more:
Image: Philip Gendreau/Bettmann/Corbis
Morgan and Jeff's Divorce Party Invitation
Morgan + Jeff
Kindly Request Your Presence
At a Party to Celebrate
Their Upcoming Divorce
Or, Extreme Makeover: Our Entire Life and All Our Choices Edition
Taking Place at
What is Now Morgan’s Home
On Friday, February 21, 8 pm.
The Party Will Include Dancing, Photos,
Memories, Drinks, and Snacks.
Because Who Needs a Sustained and Loving Relationship
Based on Mutual Admiration and Support
When You Can Have Mini Franks!!
The Party Will Also Include Games Such as:
“Match the Annoying Quality to Morgan or Jeff,”
“Talk About the Early Days and Try to Pinpoint
Precisely When Things Started Going Wrong,”
“Wonder if Marriage is Even a Viable Institution
Or if it is a Construction of the Patriarchy.”
Also: Badminton!
And We Got a Fire Pit.
To ‘Wink’ at the Differences
That Slowly Pulled Morgan + Jeff Apart
There Will Be “Morgan”- and “Jeff”-Themed Areas
To Represent Their Separate Interests.
Morgan’s Theme Celebrates Her Interest in
Reading, Movies, and Learning About Other People.
Jeff’s Celebrates His Interest in
Staring at His Phone 24/7
And Ignoring Morgan’s Basic Human Need
For Connection.
This is Only for
Close Personal Friends And Family
So Please No Plus-Ones.
And No One Invite Tom
Who, as You All Knew Before Jeff Did,
Morgan Has Been Having an Affair With
For Over a Year.
And Please, No Kids!
Though Morgan + Jeff Have Chosen To Separate
They Still Love Each Other Very Much
So Please No Bad-Mouthing
One to the Other
Or Asking Morgan to Detail
All the Weird Sex Stuff Jeff is Into.
by Blythe Roberson, McSweeney's | Read more:
Image: via
Transcending Complacency on Superintelligent Machines
[ed. Not often do you see Stephen Hawking as a co-author of an opinion piece, especially one related to a blockbuster movie.]
Artificial intelligence (AI) research is now progressing rapidly. Recent landmarks such as self-driving cars, a computer winning at Jeopardy!, and the digital personal assistants Siri, Google Now and Cortana are merely symptoms of an IT arms race fueled by unprecedented investments and building on an increasingly mature theoretical foundation. Such achievements will probably pale against what the coming decades will bring.
The potential benefits are huge; everything that civilization has to offer is a product of human intelligence; we cannot predict what we might achieve when this intelligence is magnified by the tools AI may provide, but the eradication of war, disease, and poverty would be high on anyone's list. Success in creating AI would be the biggest event in human history.
Unfortunately, it might also be the last, unless we learn how to avoid the risks. (...)
Looking further ahead, there are no fundamental limits to what can be achieved: there is no physical law precluding particles from being organized in ways that perform even more advanced computations than the arrangements of particles in human brains. An explosive transition is possible, although it may play out differently than in the movie: as Irving Good realized in 1965, machines with superhuman intelligence could repeatedly improve their design even further, triggering what Vernor Vinge called a "singularity" and Johnny Depp's movie character calls "transcendence." One can imagine such technology outsmarting financial markets, out-inventing human researchers, out-manipulating human leaders, and developing weapons we cannot even understand. Whereas the short-term impact of AI depends on who controls it, the long-term impact depends on whether it can be controlled at all.
So, facing possible futures of incalculable benefits and risks, the experts are surely doing everything possible to ensure the best outcome, right? Wrong. If a superior alien civilization sent us a text message saying, "We'll arrive in a few decades," would we just reply, "OK, call us when you get here -- we'll leave the lights on"? Probably not -- but this is more or less what is happening with AI.
by Stephen Hawking, Max Tegmark, Stuart Russell, and Frank Wilczek, Huffington Post | Read more:
Image: AP
Monday, April 21, 2014
Station to Station
On-demand streaming music has been part of the collective imagination for more than a century. It can be traced back to the 1888 publication of Edward Bellamy’s million-selling science fiction novel Looking Backward, in which a man falls asleep in 1887 and wakes up in 2000. Amidst the mind-blowing technological developments he encounters on his journey is a “music room,” in which 24-hour playlists are piped in to subscribers via phone lines. With no shortage of astonishment, the man proclaims that “an arrangement for providing everybody with music in their homes, perfect in quality, unlimited in quantity, suited to every mood, and beginning and ceasing at will” is perhaps the pinnacle of human achievement.
The splashy, celebrity-laden debut of Beats Music earlier this year may not have been accompanied by such gobsmacked wonder, but the smartphone-based music subscription service sponsored by AT&T is the latest iteration of Bellamy's fantastic 19th century notion. Beginning with Pandora's 2005 launch and dramatically ramping up with Spotify's controversial 2011 debut, streaming has become the preeminent technological force driving digital music into the 21st century. Though the idea of streaming music pre-dates recordings, the industry's investments in today's technology are designed in large part to wrench back control via unlimited access after a decade of ceding power to mp3-downloading fans.
So far, it’s working. According to Nielsen SoundScan’s 2013 report, sales of single mp3 downloads declined 6 percent from 2012, while streaming activity increased by 32 percent. The Recording Industry Association of America’s own data reveals that sales of physical media declined 12.3 percent between 2012 and 2013 while paid subscriptions to streaming platforms increased 57 percent. CDs and mp3s won’t simply disappear—they’re still vital parts of digital music's ecology—but faced with streaming, they feel destined to become the digital equivalents of once-dominant analog predecessors like vinyl records and cassettes.
Though streaming platforms are very much a product of the digital-era presumption that all the world's information should be accessible with a single click, their form and function derive from another early music medium. A few decades after Bellamy's book captured the imagination of millions, and at the same time that the business of selling records was taking off, "music rooms" were manifested by broadcast radio. Nationwide, parlors were filled with sound by national radio networks like NBC and CBS, which interspersed music with periodic bursts of news, narrative programs, and advertising. From the 1920s forward, the business of selling and consuming music has been structured by a technological dialogue between programmed music streams and individual recordings.
If the recording industry has its way, music ownership will give way to a model completely based on access, but with an important shift. While radio broadcasts are based on a one-to-many model of transmission, streaming platforms aim to zero in on the tastes of the individual listener. Like many other modern industries, the recording industry is doubling down on big data, giving their catalogs to the coders, and betting on a future of distribution and discovery dictated by quantification. Behind the interfaces of streaming platforms are vast databases of songs coded with pinpoint metadata and matched with freely provided listener taste preferences, an infrastructure designed to execute the recording industry’s century-long mission: suggesting with mathematical detail what a listener wants to hear before they know they want to hear it. Combing through a huge corpus of ever-expanding data for each individual song can be a vastly different undertaking compared to older forms of music marketing and distribution. What used to be a question of persuasion has become a problem of prediction.
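The prediction problem described above can be made concrete with a toy sketch. This is purely illustrative, not any streaming platform's actual code: listeners are reduced to sets of liked song IDs (all names invented here), and a recommendation is simply whatever the most similar listener liked that the target hasn't heard yet.

```python
# Toy sketch of taste-based prediction (illustrative only; the songs,
# listeners, and similarity measure are invented for this example).

def jaccard(a, b):
    """Overlap between two sets of liked songs, from 0.0 to 1.0."""
    return len(a & b) / len(a | b) if a | b else 0.0

def recommend(target, others):
    """Suggest songs liked by the most similar listener but new to target."""
    best = max(others, key=lambda o: jaccard(target, o))
    return sorted(best - target)

alice = {"song_a", "song_b", "song_c"}
others = [{"song_a", "song_b", "song_d"}, {"song_x", "song_y"}]
print(recommend(alice, others))  # → ['song_d']
```

Real systems layer editorial metadata and audio analysis on top of this kind of taste matching, but the underlying bet is the same one the article describes: given enough overlap data, what a listener wants next becomes computable.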
Listeners are well-served by streaming platforms, but for artists, they cast the question of compensation in a stark new light. While the value debates that dominated the mp3 moment pitted fans against artists, the emergent streaming era has so far seen the return of corporate exploitation, with a speculative twist: The rich or soon-to-be-rich build innovative products, convince an ailing recording industry to sign over their catalogs, acquiring the bricks-and-mortar of their operations—digitized recordings—for fractions of a penny on the dollar. These operations are mostly funded by venture capital, periodic rounds of investments, or as cogs in vast empires of information, and they can feel overwhelming for fans and artists alike. (...)
As streaming takes center stage for music commerce, questions with long histories must be reframed. In what ways are the non-stop interactions between databases and algorithms shaping our musical tastes? Do streaming platform business models inherently exploit artists when listener choice scales to infinity? Should speculative capitalism be the driving force for large-scale innovations in music technology, and is there a feasible alternative? Are we living in a technological golden age of creative possibility, cross-cultural communication, and sheer abundance, or a surveillance state controlled by privately-held brands promising endless access at the expense of imperceptible control? Answers to these questions are piloting digital music deep into the 21st century, but critically evaluating current technological developments means keeping an eye on the lessons of the past. (...)
More recently, computer engineers have looked to content-based recommendation as a way to address music-as-music, not simply as a generic commodity. Under this heading falls what’s long been called “machine listening”—epitomized most popularly by the Shazam app—in which songs are scanned for musicological factors and matched against those of other songs in infinite configurations. The Echo Nest uses machine listening, but it’s far from the company’s most important innovation. That would be its unique process of data retrieval and curation, which entails scraping information from social media platforms, Wikipedia entries, album reviews, and blog posts, which employees then shape into metadata, attached to songs and artists. When describing this labor-intensive aspect of the coding and recommendation process, Whitman suggests the Echo Nest is a living creature with an endless appetite: “If there’s a new artist, we’ll ingest it and try to learn about it.” (...)
“We don't just see that you have liked a song, we know about that song," Whitman continues. "To us, a song is not just a database entry, it’s the key, the tempo it’s in, the instruments.”
by Eric Harvey, Pitchfork | Read more:
Image: uncredited
Bizarro World
Two weeks ago, a pair of F.B.I. agents appeared unannounced at the door of a member of the defense team for one of the men accused of plotting the 9/11 terrorist attacks. As a contractor working with the defense team at Guantánamo Bay, Cuba, the man was bound by the same confidentiality rules as a lawyer. But the agents wanted to talk.
They asked questions, lawyers say, about the legal teams for Ramzi bin al-Shibh, Khalid Shaikh Mohammed and other accused terrorists who will eventually stand trial before a military tribunal at Guantánamo. Before they left, the agents asked the contractor to sign an agreement promising not to tell anyone about the conversation.
With that signature, Mr. bin al-Shibh’s lawyers say, the government turned a member of their team into an F.B.I. informant.
Also too, is this ok?
Last year, as a lawyer for Mr. Mohammed was speaking during another hearing, a red light began flashing. Then the videofeed from the courtroom abruptly cut out. The emergency censorship system had been activated. But why? And by whom? The defense lawyer had said nothing classified. And the court officer responsible for protecting state secrets had not triggered the system. Days later, the military judge, Col. James L. Pohl, announced that he had been told that an “original classification authority” — meaning the C.I.A. — was secretly monitoring the proceedings. Unknown to everyone else, the agency had its own button, which the judge swiftly and angrily disconnected.
Last year, the government acknowledged that microphones were hidden inside what looked like smoke detectors in the rooms where detainees met with their lawyers. Those microphones gave officials the ability to eavesdrop on confidential conversations, but the military said it never did so.
There's a term for this:
A kangaroo court is a judicial tribunal or assembly that blatantly disregards recognized standards of law or justice, and often carries little or no official standing in the territory within which it resides. Merriam-Webster defines it as "a mock court in which the principles of law and justice are disregarded or perverted".
A kangaroo court is often held by a group or a community to give the appearance of a fair and just trial, even though the verdict has in reality already been decided before the trial has begun. Such courts typically take place in rural areas where legitimate law enforcement may be limited. The term may also apply to a court held by a legitimate judicial authority who intentionally disregards the court's legal or ethical obligations.
This is why I laugh when people say we need to "trust" the secret intelligence agencies and accept that they are following the rule of law and the constitution. It's probably the most fatuous remark I ever hear from liberals. According to that way of thinking, it's the people who reveal the government's misdeeds, not the misdeeds themselves, who constitute the betrayal of our country. I think that may be just a tiny misunderstanding of the issue.
by Digby, Hullabaloo | Read more:
Image: uncredited