Friday, June 14, 2019
Inside the Cultish Dreamworld of Augusta National
Beneath Augusta National, the world’s most exclusive golf club and most venerated domain of cultivated grass, there is a vast network of pipes and mechanical blowers, which help drain and ventilate the putting greens. The SubAir System was developed in the nineteen-nineties, by the aptly named course superintendent Marsh Benson, in an effort to mitigate the effects of nature on this precious facsimile of it. When the system’s fans blow one way, they provide air to the densely seeded bent grass of the putting surface. This promotes growth. When the fans are reversed, they create a suction effect, and leach water from the greens. This promotes firmness. The professionals who arrive at Augusta every April to compete in the Masters Tournament, the event for which the club is known, expect to be tested by greens that are hard and fast. Amid all the other immodesties and peculiarities of Augusta, the greens, ultimately, are the thing. Herbert Warren Wind, who for decades covered the sport at this magazine and at Sports Illustrated, once asked a colleague, on arriving in Augusta, “Are they firm?” The antecedent was understood. In 1994, Gary McCord, a golf commentator for CBS, the network that has televised the tournament for sixty-three years, said on the air, “They don’t cut the greens here at Augusta, they use bikini wax.” He was banned from the broadcast.
It is by now hardly scandalous to note that Augusta National—called the National by its members and devotees, and Augusta by everyone else—is an environment of extreme artifice, an elaborate television soundstage, a fantasia of the fifties, a Disneyclub in the Georgia pines. Some of the components of the illusion are a matter of speculation, as the club is notoriously stingy with information about itself. It has been accepted as fact that recalcitrant patches of grass are painted green and that the ponds used to be dyed blue. Because the azaleas seem always to bloom right on time, skeptics have propagated the myth that the club’s horticulturists freeze the blossoms, in advance of the tournament, or swap out early bloomers for more coöperative specimens. Pine straw is imported. Pinecones are deported. There is a curious absence of fauna. One hardly ever sees a squirrel or a bird. I’d been told that birdsong—a lot of it, at any rate—is piped in through speakers hidden in the greenery. (In 2000, CBS got caught doing some overdubbing of its own, after a birder noticed that the trills and chirps on a golf broadcast belonged to non-indigenous species.)
You hear about this kind of stuff, before your first visit, just as you get the more commonplace spiel that everything is perfect, that the course is even more majestic in real life than it is on TV, and that, in spite of all the walking, you’ll put on five pounds. Pimento-cheese sandwiches, egg-salad sandwiches, peach-ice-cream sandwiches, MoonPies, underpriced beer. You are urged to adopt the terminology favored by the tournament hosts and embraced by CBS. Spectators are “patrons.” The rough—longer grass that lines the fairways—is the “second cut.” (And it is controversial, because its abundance contravenes the wishes of the patriarchs, who designed the course to have a dearth of rough. Gary McCord may have been onto something.) The traps are bunkers, and what appears to patrons and television viewers to be the whitest sand in golf is technically not sand but waste from feldspar mines in North Carolina.
Augusta National is sometimes likened to Oz. For one thing, it’s a Technicolor fantasyland embedded in an otherwise ordinary tract of American sprawl. Washington Road, the main approach to the club, is a forlorn strip of Waffle Houses, pool-supply stores, and cheap-except-during-the-Masters hotels. In the Hooters parking lot during tournament week, fans line up for selfies with John Daly, the dissolute pro and avatar of mid-round cigarettes and booze. But step through the club’s metal detectors and badge scanners, and you enter a lush, high-rent realm, where you are not allowed to run, talk loudly, or cheer a player’s mistakes. Order is maintained by security guards, who for decades were provided by the Pinkerton detective agency. (Though Pinkerton was acquired by a Swedish company called Securitas, in 1999, many patrons still refer to the guards as Pinkertons.) In 2012, a fan who stole onto a fairway to take a cup of bunker sand was thrown in jail.
I showed up on a Monday afternoon before the tournament, just as a series of storms swept in, and as the spectators, there to witness the first rounds of practice, were being herded off the grounds. Owing to the threat of lightning, play was suspended for the day and the club was closed to visitors. The throngs poured out of the gates into the real world, just as I was leaving it. I took refuge in what the club calls the press building, a recently constructed Taj Mahal of media mollycoddling. This columned, ersatz-antebellum megamansion, in operation just ten days a year, has got to be the fanciest media center in sports. It has state-of-the-art working quarters, radio and television studios, locker rooms, a gratis restaurant with made-to-order omelettes for breakfast and a bountiful hot lunch, as well as a grab-and-go counter with craft beers, artisanal cheeses and jerkies, and a full array of Augusta’s famous sandwiches, each wrapped in green paper.
Such generosity and care, for the journalists, reflects the role that so many of them have played in burnishing the mythology of the Masters; it also suggests an effort to keep them away from the course and the clubhouse. The press is provided with every disincentive to venture out. The gang’s all there. Even the bathrooms are capacious, and staffed with attendants. Each member of the media has a work station with a brass nameplate, a leather swivel chair, a pair of computer monitors, and a surfeit of real-time tournament footage and information—far more data than one would be able to gather out on the golf course, especially because, outside the press building, reporters are not allowed to carry cell phones. (The phone ban, strictly enforced and punishable by immediate removal from the grounds, applies to patrons and members, too. One morning during the tournament this year, a story went around that the club had done a spot inspection of staff headquarters and found that an employee had hidden a cell phone between two slices of bread.) The golfers and the tournament officials appear dutifully for press conferences; why bother heading out to the clubhouse to hound them for quotes? No phones are allowed at the press conferences, either. The club wants control over sounds and pictures—the content. The club can tell who’s who, and who’s where, by RFID chips affixed to each press badge.
The working area faced the practice range, which the players had abandoned, once the rain began hammering down. As dusk approached, the rain briefly let up, and a battalion of men in baggy white coveralls—the official caddie costume at Augusta—fanned out across the range, to retrieve the hundreds of balls that the players had struck there earlier in the day. In the gloaming, these white jumpsuits, moving irregularly amid the deep green of the manicured grounds, brought to mind an avant-garde film about a lunatic asylum: the inmates, in their hospital gowns, out for a constitutional.
The course was still closed the next morning. I caught a ride to the clubhouse on a golf cart with a member, a so-called green jacket, named John Carr, an oil magnate from Ireland, who told me that he was on the media committee.
The members in attendance during the tournament (and at dinner, whenever they visit) are required to wear their green blazers. The club’s founders decreed, in the earliest years of the tournament, that any members present had to make themselves available to patrons who might be in need of assistance. The jackets tell you who the members are. It is an oddity of the place that its members insist on secrecy—there are some three hundred, but there is no public list, and omertà is strictly enforced—and yet here, at the biggest golf tournament of the year, they parade about in uniform, wearing name tags: Roger Goodell, Sam Nunn, Rex Tillerson.
The jackets themselves never leave the grounds; they hang in the members’ lockers. Each winner of the Masters gets a green jacket, too, which is presented immediately after the victory by the club’s chairman and the previous year’s winner, in an awkward ceremony staged for television in the basement of a house called the Butler Cabin, near the eighteenth hole. The solemnity surrounding this perennial observance suggests the initiation ritual of a really square fraternity. Jim Nantz, the longtime host of the CBS broadcast and of the Butler Cabin sacrament, has perfected an air of unctuous self-satisfaction that signals even to the casual viewer that there is something batty about the whole enterprise. The way that Nantz repeats the tag line—“A tradition unlike any other”—assumes a sinister, cultish edge. Everyone associated with the club seems to take all this very seriously. On the official Masters podcast, the host, Marty Smith, said to the celebrity chef David Chang, as though reciting a prayer, “The respect for the grounds and the reverence for the event permeate us as human beings and we thereby disseminate that same respect to our peers.”
“It’s a beautiful thing,” Chang replied. “It almost restores my faith in humanity.” As one long-standing media-badge holder told me, after he’d spent ten minutes singing the club’s praises on the record, “These guys are out of their fucking minds. They think it’s supernatural.”
by Nick Paumgarten, New Yorker | Read more:
Image: Leo Espinosa
[ed. See also: If Brooks Koepka Is the Future of Golf, What Does That Future Look Like? (The Ringer).]
Thursday, June 13, 2019
The Propaganda Multiplier
It is one of the most important aspects of our media system, and yet hardly known to the public: most of the international news coverage in Western media is provided by only three global news agencies based in New York, London and Paris.
The key role played by these agencies means Western media often report on the same topics, even using the same wording. In addition, governments, military and intelligence services use these global news agencies as multipliers to spread their messages around the world.
A study of the Syria war coverage by nine leading European newspapers clearly illustrates these issues: 78% of all articles were based in whole or in part on agency reports, yet 0% on investigative research. Moreover, 82% of all opinion pieces and interviews were in favor of the US and NATO intervention, while propaganda was attributed exclusively to the opposite side.
Introduction: “Something strange”
“How does the newspaper know what it knows?” The answer to this question is likely to surprise some newspaper readers: “The main source of information is stories from news agencies. The almost anonymously operating news agencies are in a way the key to world events. So what are the names of these agencies, how do they work and who finances them? To judge how well one is informed about events in East and West, one should know the answers to these questions.” (Höhne 1977, p. 11)
A Swiss media researcher points out: “The news agencies are the most important suppliers of material to mass media. No daily media outlet can manage without them. So the news agencies influence our image of the world; above all, we get to know what they have selected.” (Blum 1995, p. 9)
In view of their essential importance, it is all the more astonishing that these agencies are hardly known to the public: “A large part of society is unaware that news agencies exist at all … In fact, they play an enormously important role in the media market. But despite this great importance, little attention has been paid to them in the past.” (Schulten-Jaspers 2013, p. 13)
Even the head of a news agency noted: “There is something strange about news agencies. They are little known to the public. Unlike a newspaper, their activity is not so much in the spotlight, yet they can always be found at the source of the story.” (Segbers 2007, p. 9)
“The Invisible Nerve Center of the Media System”
So what are the names of these agencies that are “always at the source of the story”? There are now only three global news agencies left:
- The American Associated Press (AP) with over 4000 employees worldwide. The AP belongs to US media companies and has its main editorial office in New York. AP news is used by around 12,000 international media outlets, reaching more than half of the world’s population every day.
- The quasi-governmental French Agence France-Presse (AFP) based in Paris and with around 4000 employees. The AFP sends over 3000 stories and photos every day to media all over the world.
- The British agency Reuters in London, which is privately owned and employs just over 3000 people. Reuters was acquired in 2008 by Canadian media entrepreneur Thomson – one of the 25 richest people in the world – and merged into Thomson Reuters, headquartered in New York.
Wolfgang Vyslozil, former managing director of the Austrian APA, described the key role of news agencies with these words: “News agencies are rarely in the public eye. Yet they are one of the most influential and at the same time one of the least known media types. They are key institutions of substantial importance to any media system. They are the invisible nerve center that connects all parts of this system.” (Segbers 2007, p. 10)
Small abbreviation, great effect
However, there is a simple reason why the global agencies, despite their importance, are virtually unknown to the general public. To quote a Swiss media professor: “Radio and television usually do not name their sources, and only specialists can decipher references in magazines.” (Blum 1995, p. 9)
The motive for this discretion, however, should be clear: news outlets are not particularly keen to let readers know that they haven’t researched most of their contributions themselves.
by Swiss Propaganda Research | Read more:
Image: SPR
[ed. Thirty-Two Tips For Navigating A Society That Is Full Of Propaganda And Manipulation and Society Is Made Of Narrative. Realizing This Is Awakening From The Matrix (Caitlin Johnstone).]
Seven Ways to Make Windows 10 Work Better
Wednesday, June 12, 2019
I Want to Live in Elizabeth Warren’s America
It’s early, but this much is true: Elizabeth Warren is running the most impressive presidential campaign in ages, certainly the most impressive campaign within my lifetime.
I don’t mean that the Massachusetts senator is a better speaker than anyone who has ever run, nor a more strident revolutionary, nor as charismatic a shaper of her public image. It’s not even that she has better ideas than her opponents, though on a range of issues she certainly does.
I’m impressed instead by something more simple and elemental: Warren actually has ideas. She has grand, detailed and daring ideas, and through these ideas she is single-handedly elevating the already endless slog of the 2020 presidential campaign into something weightier and more interesting than what it might otherwise have been: a frivolous contest about who hates Donald Trump most.
Warren’s approach is ambitious and unconventional. She is betting on depth in a shallow, tweet-driven world. By offering so much honest detail so early, she risks turning off key constituencies, alienating donors and muddying the gauzy visionary branding that is the fuel for so much early horse-race coverage. It’s worth noting that it took Warren months of campaigning and reams of policy proposals to earn her a spot on the cover of Time Magazine. Meanwhile, because they match the culture’s Aaron Sorkinian picture of what a smart progressive looks like, Beto and Buttigieg — whose policy depth can be measured in tossed-off paragraphs — are awarded fawning coverage just for showing up male.
Yet, deliciously, Warren’s substantive approach is yielding results. Her plans are so voluminous that they’ve become their own meme. She’s been rising like a rocket in the polls, and is finally earning the kind of media coverage that was initially bestowed on many less-deserving men in the race. Warren’s policy ideas are now even beginning to create their own political weather. Following her early, bold call to break up big technology companies, the Justice Department and the Federal Trade Commission are dividing up responsibilities on policing tech giants, and lawmakers in the House are planning a sweeping inquiry into tech dominance. Warren’s Democratic opponents are now rushing to respond with their own deep policy ideas; Joe Biden’s staff seems to be pulling all-nighters, cutting and pasting from whatever looks good, to match Warren’s policy shop.
You might think I’m getting too giddy here. You might argue that policy ideas, especially at this stage of the game, don’t really matter — either because the public doesn’t care about substance, or because it’s unlikely that any president can get what she wants through a partisan, rigid Congress, so all these plans are a mere academic exercise. Or you may simply not like what you’ve heard of Warren’s ideas.
Still, do me a favor. Whatever your politics, pull out your phone, pour yourself a cup of tea, and set aside an hour to at least read Warren’s plans. You’ll see that on just about every grave threat facing Americans today, she offers a plausible theory of the problem and a creative and comprehensive vision for how to address it.
This week, she unveiled a $2 trillion plan that combines industrial policy, foreign policy and federal procurement to tackle the existential threat of climate change. She also has a plan for housing affordability, for child care affordability, and for student debt and the crushing costs of college. She knows what she wants to do to stem opioid deaths and to address maternal mortality. She has an entire wing of policy devoted to corporate malfeasance — she wants to jail lawbreaking executives, to undo the corporate influence that shapes military procurement, and to end the scandal of highly profitable corporations paying no federal taxes. And she has a plan to pay for much on this list, which might otherwise seem like a grab-bag of expensive lefty dreams: She’ll tax ultra-millionaires and billionaires — the wealthiest 75,000 American households — yielding $2.75 trillion over 10 years, enough to finance a wholesale reformation of the American dream.
There’s a good chance you’ll disagree with some or all of these ideas. Three months ago, when Warren outlined her plan for cleaving the economic dominance of large technology companies, I spent a few days quizzing her staff on what I considered to be flaws in her approach. I planned to write about them, but I was beaten by a wave of other tech pundits with similar reservations.
But then, in the discussion that followed, I realized what a service Warren had done, even if I disagreed with her precise approach. For months, commentators had been debating the generalities of policing tech. Now a politician had put forward a detailed plan for how to do so, sparking an intense policy discussion that was breaking new analytical ground. For a moment, it almost felt like I was living in a country where adults discuss important issues seriously. Wouldn’t that be a nice country to live in?
by Farhad Manjoo, NY Times | Read more:
Image: Mason Trinca for The New York Times
[ed. I'm a supporter.]
Tuesday, June 11, 2019
Recession Or Not, There Will Be Pain
Coping with corporate bonds.
With memories of 2008-2009 still fresh, some observers have focused on corporate debt as the likely culprit. It’s true that corporate debt has risen rapidly during the expansion, both in absolute terms and in relation to corporate profits. But low interest rates mean that debt service—interest payments on this debt relative to after-tax profit—is about 25 percent, where it usually is during periods of expansion and not a cause for worry. Bank regulators are concerned about the rapid growth of leveraged loans and weaker lender protections. But they appear to be correct in their assessment that leveraged lending, despite a 20 percent growth since last year to almost $1.2 trillion, “isn’t a current threat to the financial system.”
Still, recession or no recession, there will be pain.
A large and growing share of corporate debt is “speculative debt”—either leveraged loans used to acquire target companies and burden them with high debt levels or high risk junk bonds. Many companies with high levels of speculative debt on their books were acquired by private equity in a leveraged buyout, meaning the PE firm used high amounts of debt to buy them. This is debt the target companies, not their private equity owners, are obligated to repay.
Often, these PE-owned companies are required to issue junk bonds and further increase their indebtedness in order to pay dividends to their owners. A 100-day plan imposed on company managers at the time of the buyout lays out the steps that the company will need to take to service this mountain of debt. Reducing labor costs is a big part of these plans, whether by closing less profitable stores and establishments, laying off workers at those it continues to operate, or cutting pay and benefits. After it takes these steps to manage its debt, the company is on a knife-edge.
If all the assumptions made by the private equity firm when it persuaded creditors to lend it boatloads of money hold up, the company will avoid defaulting on its loans and going bankrupt. But if these assumptions are upended—say, by a slowdown in the economy, defaults and bankruptcies will spike. Creditors who have loaned billions of dollars to finance private equity-sponsored leverage buyouts will experience losses. Establishments will be shuttered, some companies will be liquidated, workers will lose their jobs, and communities will lose businesses that have played a key role in the local economy.
In 2013, concerned that loading a company with debt greater than 6 times earnings increased the likelihood of default or bankruptcy, bank regulators—the Office of the Comptroller of the Currency (OCC), the Federal Reserve (Fed) and the Federal Deposit Insurance Corporation (FDIC)—updated lending guidance. Banks were advised to avoid making loans that saddled a company with debt greater than 6 times earnings unless they could show that the company would be able to pay back the loan.
Initially, this put a crimp in private equity’s ability to load up companies with excessive amounts of debt. But private equity firms soon found a way around this limitation. They set up their own lending operations and extended loans to other firms in the industry. Trump administration regulators have chosen to relax enforcement of the guidelines. The result? In the first quarter of 2019, six years after the updated guidance was issued, leverage used in buyouts has risen to an average of 6.96 times earnings, up from 5.80 times in the first quarter of 2013.
We don’t need to look far to understand how this will affect the viability of businesses and the outcome for workers. Bankruptcies of department stores and specialty shop chains are so widespread, they have been dubbed a “retail apocalypse.” Retail is a business that has always faced disruptors—consumer tastes can be fickle, innovations like fast fashion challenge traditional marketing, recessions lead customers to postpone purchases, e-commerce puts pressure on brick and mortar stores. Traditionally, retailers have prepared for this by keeping debt levels low and owning their own real estate—holding costs down so they can weather tough times and make the necessary adaptations in how they do business.
Private equity owners turn this formula for success on its head. The low debt levels of retailers are an invitation to load up the stores they acquired with high amounts of debt. Selling off some of the stores’ real estate in sale-lease back agreements enriches the PE owners who pocket the proceeds of the sale, but leaves the stores to pay rent on facilities they used to own. Stores are stripped of resources they need to modernize and keep up with the competition by owners that put their hands in the till to pay themselves generous dividends. Often the owners collect fees from these companies, even when company profits spiral downward. These measures guarantee that the PE firm will make its bundle. While private equity owners prefer a profitable resale of their companies, that’s really the second bite of the apple. Exiting investments via bankruptcy is increasingly common.
Private equity firms own only a fraction of U.S. retail chains, but they are behind a disproportionate share—financial news service Debtwire calculates 40 percent—of retail bankruptcies: Toys ‘R Us, Payless Shoes, Gymboree, Claire’s Stores, PetSmart, Radio Shack, Staples, Sports Authority, Shopko, The Limited, Charlotte Russe, Rue 21, Nine West, Aeropostale. The list goes on.
by Eileen Appelbaum, Economic Policy Institute | Read more:
The Queen of Eating Shellfish Online
Most of us can probably agree that eating food is more enjoyable than watching someone else eat food. For one, it’s a basic human need. It also tastes good a lot of the time. Not to mention, people can be pretty gross when they eat, especially when they do so in over-the-top, finger-licking fashion.
Still, hundreds of thousands of people tune in each week to watch Bethany Gaskin binge-eat shellfish on YouTube.
Mrs. Gaskin, 44, has capitalized on the popularity of a food-video genre known as mukbang, which involves scarfing down, on camera, more grub than should rightly be consumed in a single sitting.
On her two YouTube channels, Bloveslife and BlovesASMR Eating Her Way, Mrs. Gaskin chats up her audience while eating king crab legs, mussels, lobster tails, hard-boiled eggs and roasted red potatoes. The videos, produced in her Cincinnati home, have made her a millionaire, she said. But getting into the business wasn’t about money; mukbang was more of a calling than a vocation.
“I think of mukbanging as a ministry,” Mrs. Gaskin said. “I didn’t consult with my husband before I quit my job. I knew this was it, and I quit by faith.”
The Spread of Binge Culture
Mukbang seems to have begun as an internet trend more than a decade ago in South Korea. The name is a mash-up of the Korean words for let’s eat (“muk-ja”) and broadcasting (“bang-song”). Korean live-streamers often schedule their mukbang videos to align with dinnertime hours, so their viewers eating alone at home feel like they’re sharing a meal with a friend.
Viewers cite other benefits too. Watching the videos can serve as an appetite-curbing exercise. And for a certain subset, the sounds of a person eating foster an autonomous sensory meridian response, or A.S.M.R.; viewers derive pleasure from the sounds created by extra-loud crunching, slurping and lip smacking. (...)
Gross Profits
Perhaps the noisy and bad-mannered eating is off-putting for most, but the genre has a lot of devotees, if Mrs. Gaskin’s success is any indication. Her primary YouTube channel, Bloveslife, has 1.8 million subscribers, and on Instagram she has a following of nearly 900,000, one of whom is Cardi B.
Through advertising on her videos, Mrs. Gaskin said she has made more than $1 million, providing screenshots of a report from YouTube.
Before becoming a YouTube sensation, Mrs. Gaskin, who has an associate’s degree in early childhood development, owned a day care facility. After five years, she sold the business and used the money to pay off loans and leases. She then got a job making circuit boards for the military for a year.
In 2017, she started making Food Network-style cooking videos in her home kitchen and posting them on YouTube. “I’m a foodie,” Mrs. Gaskin said. “I’ve always liked to cook.”
“Then I did a mukbang, and people just went crazy,” she said. “I was like, ‘People want to see me eat, this is weird,’ and since they were easier to record, I just started doing mukbangs and all of a sudden, it just took off from there.”
by Jasmin Barmore, NY Times | Read more:
Image: Maddie McGarvey for The New York Times
[ed. What a world.]
The U.S. Health Care System is Full of Monopolies
The U.S. health care system is full of monopolies (Axios)
Image: Open Markets Institute; Chart: Axios Visuals
Monday, June 10, 2019
Fashions Fade, But Fleabag Is Forever
This is a love story. A dangerously elegant woman (noble stock) in lips the color of a dying rose (not a lipstick, but a blend of oils, waxes, and pigments based on MAC’s Dare You), hair a roaring bob, a cigarette perched on her Erté fingers, stands pensively against a brick wall (real?), the burnished light (not real?) casting the kind of shadow that fills in the blanks — and the cleavage. This is Fleabag (of the Amazon series of the same name, written by and starring Phoebe Waller-Bridge), taking a breather behind a restaurant during a fraught family dinner, a fourth-wall-demolishing millennial café owner who could pass for a femme fatale in a film noir. A big part of that latter fantasy is the navy blue jumpsuit she’s wearing (Love, $50), or, more accurately, embodying. The keyhole at the front is more like a door ajar, two strips of material like curtains begging to be parted while threatening to close. Her shoulders jut out, her back is exposed — this is as naked as chic is allowed to be. It is a sleeveless, backless, armless, chestless (well, sort of) number that requires legs for days. To wear it the way Fleabag does, you basically need to be Fleabag, which means you basically need to be Waller-Bridge, whose androgyny (she dressed as a boy when she was a kid), sexiness (she dressed what we think of as the opposite of a boy when she discovered them), and sylphlike stature are as impossible to mimic as the rest of her.
When everyone ran out to buy that jumpsuit last week, that is what they wanted: everything it entailed, from the lights illuminating the scene right down to the It Girl inside it. In her ode to the jumpsuit, The Cut’s Kathryn VanArendonk — who bought two sizes just to be sure — wrote not so much about how it looked as what it meant: “It’s revealing in a way that feels like a choice rather than a plea.” A British fan then polled Twitter: “Will buying the Fleabag jumpsuit solve my emotional problems AS WELL as making me look bomb?” The only answers she provided were “Yes” and “Absolutely.”
“I think people don’t always view contemporary costuming as hard, and it’s really hard,” says Emma Fraser, creator of the TV Ate My Wardrobe blog. “It’s not just about throwing together an outfit,” she explains, it’s using clothes as “an extension of who that character is.” The last time a television star’s style migrated en masse into off-screen culture may have been The Rachel in the ’90s: the shaggy hairdon’t of the Friends’ everywoman played by Jennifer Aniston, whose face was normal enough that every woman thought a mere haircut could be a conduit for a New York City life that didn’t suck. Fleabag gives us an updated version of that same generational aspiration — the bold red lip, the navy jumpsuit, the “achievable” look and life. Describing the character’s allure, Fraser inadvertently defines the millennial: “Everything can be a mess, but you can still kind of be put together.” Watching television can be like window-shopping, shallow characters being little more than clothes horses for pricey brands, so seeing a layered antiheroine whose affordable accoutrements are inseparable from who she is feels revolutionary. And who, these days, doesn’t want to be part of a revolution? As Waller-Bridge herself texted Fleabag costume designer, Ray Holman, (referencing Twitter): “The jumpsuit is a movement.” (...)
As much as the first season of Fleabag is about loss, the second is about love. And isn’t it like that messy bitch to fall for the one guy she can’t have sex with. When we first meet the priest (aka “the hot priest,” played by Sherlock’s Andrew Scott), it’s not clear he is one. He’s unknown to Fleabag, just a random sweary guy at the table of her family dinner. He’s not wearing the dog collar (the audience shouldn’t have any preconceived notions, says Holman). Instead, he is rumpled, in a lavender linen shirt designed by Oliver Spencer, master of the relaxed Brit look (as if that isn’t an oxymoron). Father looks good, but not too good. “He’s quite poor,” the costume designer explains. “He’s not a rich Catholic priest so he doesn’t have many clothes and the clothes he has, they’re old.” He’s not the point anyway. This episode belongs to Fleabag. Fleabag and her jumpsuit (and, okay, her priest boner).
by Soraya Roberts, The Cut | Read more:
Image: Steve Schofield, Amazon / Illustration by Homestead
[ed. Here it is: Fleabag: Season 2 (YouTube). See also: The Case for Boring Office Clothes (The Atlantic).]
Her Evangelical Megachurch Was Her World
Her Evangelical Megachurch Was Her World. Then Her Daughter Said She Was Molested by a Minister (NY Times).
Image: Ryan Longnecker
[ed. You have to sign a forced arbitration contract to be a member.]
Sunday, June 9, 2019
The Invisible Primary
In the United States, the invisible primary, also known as the money primary, is the period between the moment when the first well-known presidential candidates with strong political support networks show interest in running for president and the point at which voters demonstrate substantial public support for them in primaries and caucuses. During the money primary, candidates raise funds for the upcoming primary elections and attempt to garner the support of political leaders and donors, as well as the party establishment. Fundraising numbers and opinion polls are used by the media to predict who the front runners for the nomination are. This is a crucial stage of a campaign for the presidency, as the initial frontrunners who raise the most money appear the strongest and will be able to raise even more money. On the other hand, members of the party establishment who find themselves losing the invisible primary, such as Mitt Romney in the 2016 race, may abandon hope of successfully running.
During the invisible primary, appeals are made and meetings held with the political elite: party leaders, major donors, fundraisers, and political action committees. In contrast to the smoke-filled room where a small group of party leaders might at the last minute, in a small meeting room at a political convention, determine the candidate, the invisible primary refers to the period of jockeying which precedes the first primaries and caucuses and even campaign announcements. The winners of the invisible primary, such as Hillary Clinton and Jeb Bush in 2016, come into the first primaries and caucuses with a full war chest of money, support from office holders, and an aura of inevitability. Winners of the invisible primary have the support of the leaders of their political party and, in turn, support the political positions of their party; they are insiders, part of the party establishment. They do not always win, as Hillary Clinton did not in 2008. There is little or no campaign advertising using TV, particularly by the candidate, during this period, although online advertising may be used to build mailing lists of grassroots supporters and small contributors.
by Wikipedia | Read more:
Georgetown Carnival, Seattle 2019
[ed. Georgetown Carnival, Seattle. 2019. Click on the "Read more" link below for more pictures. 2015 Carnival pics can be found here. All photos: markk]
Saturday, June 8, 2019
The Kill Zone
Earlier this week, Treasury Secretary Steven Mnuchin joined a growing number of public officials concerned about the impact of Internet monopolies when he called on the Justice Department to look into the power that digital platforms like Google have over the US economy. “These are issues the Justice Department needs to look at seriously,” he told CNBC, “not for any one company, but obviously as these technology companies have a greater and greater impact on the economy, I think that you have to look at the power they have.”
Mnuchin’s comments followed a 60 Minutes report that examined the enormous power Google wields over potential competitors thanks to its monopoly in online search and search advertising. “If I were starting out today, I would have no shot of building Yelp,” said Jeremy Stoppelman, co-founder and CEO of Yelp, during the segment. Yelp has long argued that Google has abused its dominance in local search to favor its own services over competitors such as itself, and is currently attempting to convince European competition authorities to launch a fresh antitrust case against the company.
“If you provide great content in one of these categories that is lucrative to Google, and seen as potentially threatening, they will snuff you out,” added Stoppelman. “They will make you disappear. They will bury you.”
The sentiment that startups effectively have no chance of competing against the “Big Five” tech giants—Alphabet, Amazon, Apple, Facebook, and Microsoft—is one that has become increasingly common among tech entrepreneurs and venture capitalists in recent years. “People are not getting funded because Amazon might one day compete with them,” one founder told The Guardian. “If it was startup versus startup, it would have been a fair fight, but startup versus Amazon and it’s game over.” As the author and media scholar Jonathan Taplin pointed out in an interview with ProMarket, the very notion that someone could start a new search engine that competes with Google “is just laughed at by the venture capital community.”
Investors and entrepreneurs, said the venture capitalist Albert Wenger during a panel discussion at the Stigler Center’s annual antitrust conference last month, are now wary of entering into direct competition with giants like Google and Facebook. Both companies, along with Amazon and Apple, effectively have a “Kill Zone” around them—areas not worth operating or investing in, since defeat is guaranteed.
Tech platforms, after all, have endless resources at their disposal to either purchase or crush new upstarts they perceive as threats. Increasingly, startups that operate in areas coveted by tech giants face a similar choice: sell—or get crushed. The Big Five have made over 436 acquisitions in the last decade, with little to no challenge from antitrust authorities. When startups refuse to sell, they find themselves facing an unlevel playing field. Snapchat, which turned down a $3 billion acquisition offer from Facebook in 2013 (and a $30 billion bid from Google in 2016), is a case in point: after it failed to acquire Snapchat, Facebook simply cloned many of Snapchat’s key features, using its vast reach to completely undercut its growth. This is not an uncommon occurrence.
“The Kill Zone is a real thing,” said Wenger, a managing partner at Union Square Ventures and an early investor in Twitter. “The scale of these companies and their impact on what can be funded, and what can succeed, is massive.” He went on to quote one angel investor who told him that he only invests “in things that are not in Facebook’s, Apple’s, Amazon’s or Google’s kill zone.”
The kill zone, noted Wenger, is not a new phenomenon. Microsoft had a similar kill zone around it when it dominated the tech industry in the late 1990s. “It was a similar playbook, where Microsoft would see, ‘What kind of things are doing well on my platform?’” he said. “Then they would just absorb those into the platform itself. That is a playbook that’s being exercised by Amazon, by Google, by Facebook, by all the big digital platforms.”
All this has profound implications for the startup ecosystem and for the future of innovation. Is the dominance of digital platforms, routinely hailed as the most innovative companies in the world, actually hindering innovation? Much of the Stigler Center panel, moderated by Fortune magazine’s executive editor Adam Lashinsky, revolved around this very question. In addition to Wenger, it featured patent expert Elvir Causevic, managing director and co-head of Houlihan Lokey’s Tech+IP Advisory practice; Glen Weyl, a principal researcher at Microsoft Research New England and a senior research scholar at Yale’s economics department and law school; and Matt Perault, director of public policy at Facebook.
While opinions as to how to address the power of digital platforms and spur innovation varied wildly, most of the panelists seemed to agree on one basic premise: the size and scope of digital platforms has become an impediment to innovation.
“Small Companies No Longer Have Access to Patent Protection”
Innovation used to be associated with small companies and entrepreneurs. There’s a reason why the garage has taken such an important place in the mythology of the tech industry: Silicon Valley, as we know it, is the product of entrepreneurs starting companies in their garages, from Bill Hewlett and Dave Packard in the late 1930s, through Steve Jobs and Steve Wozniak in the 1970s, to Larry Page and Sergey Brin in the 1990s.
But the vaunted garage is little more than a myth in today’s Silicon Valley. The rise of digital platforms has been correlated with a historic decline in startups: new business formation in the US has declined by more than 40 percent since the late 1970s and is near a 40-year low. At the same time, as the New York Times’ Farhad Manjoo pointed out last year, the technology industry has gradually become “a playground for giants.”
Many economists are naturally concerned about this decline in entrepreneurship: startups are an important driver of both jobs and innovation. A lack of startups is often associated with rigidity and a lack of economic dynamism. Another result, however, is that big firms have seemingly taken the mantle as the most innovative in the world.
“The label of innovation has been grabbed by Big Tech,” said Causevic, who argued that big tech firms use the US patent system to stifle innovation. “We’ve taken the focus off of rewarding genius and innovation to rewarding capital and scale.”
Historically, he noted, large companies used to abuse the patent system to entrench their position. But the patent system also served an important function: it provided small innovators with an effective tool to fight big firms that tried to infringe on their patents. Recent changes in US patent laws, however—in particular the America Invents Act (AIA) that was signed into law by President Obama in 2011—have created a situation where “small companies no longer have access to patent protection.” In order to deal with patent trolls, he said, the AIA has “eviscerated” the ability of small companies to enjoy patent protection, making it lucrative for big tech firms to be on the side of anti-patent enforcement.
“You have nothing to lose. You’re better off just infringing. As a matter of fact, it might be less expensive to infringe than it might be to pay royalties, given how the current case law is set up,” said Causevic. “Throughout my career, it was always the patents that made the big difference when the little guys [fought] against the big guys. Now you don’t have that.” It’s not only small companies that are affected by this, contended Causevic—even middle-market firms are at risk.
To illustrate this point, Causevic used the recent example of Apple and Immersion. Immersion, which developed the feedback technologies that are used in many wearable devices, sued Apple in 2016, alleging that Apple’s iPhones and Apple Watch devices were infringing on its haptic feedback patents. The companies reached a settlement earlier this year. “That technology was largely invented by Immersion, a middle-market company that has been around for 20 years, has 1,000 patents. Apple worked with them, paid them a license for years, but decided to stop paying and said, ‘No, we’ll just do it ourselves,’” said Causevic. “[Immersion’s] market cap dropped 60 percent and Apple did a piddly settlement with this company for peanuts. The company’s really in a lot of pain. It used to be a $500 million company.”
The larger question, said Causevic, is not really the patent system per se, which he acknowledged might be outdated, but the question of how to reward innovation and what type of innovation gets rewarded. “Do we want to reward innovation or do we want to reward capital, and network, and market power?” he asked.
A “Lack of Imagination” Among Antitrust Enforcers
Weyl, co-author of the new book Radical Markets: Uprooting Capitalism and Democracy for a Just Society, laid much of the blame on the lack of antitrust enforcement in the past 40 years. Enforcers, he said, have focused too much on consumer welfare instead of competition, and thus failed to anticipate how crucial new industries might develop. This manifested in the approval of a number of mergers that fundamentally altered the course of the digital economy: Google’s purchase of DoubleClick in 2007 and Waze in 2013, Facebook’s acquisitions of Instagram in 2012 and WhatsApp in 2014, and Microsoft’s acquisition of Skype in 2011.
“Had those companies not been absorbed,” said Weyl, “they might have changed the texture of the way that competition took place within those relevant marketplaces. In fact, the prospect of that happening was part of the basis of the funding and expansion of those companies.”
by Asher Schechter, Pro-Market | Read more:
[ed. See also: Adversarial interoperability: reviving an elegant weapon from a more civilized age to slay today's monopolies (Boing Boing).]
Friday, June 7, 2019
Kauai 'O'o
This is the song of the last male Kauai ‘O'o, singing for a mate that no longer exists. The recording, made in 1987, was the last time the song of this species was heard. It has since been declared extinct.
Overlooked No More: Elizabeth Peratrovich, Rights Advocate for Alaska Natives
It was hardly the first affront. They had grown up in a segregated Alaska: separate schools, hospitals, theaters, restaurants and cemeteries. But for Elizabeth Peratrovich and her husband, Roy, Tlingit natives, the sign they spotted one day in late 1941 in Douglas, just across the channel from downtown Juneau, was the final straw.
“No Natives Allowed” read the notice on a hotel door.
“The proprietor of Douglas Inn does not seem to realize that our Native boys are just as willing as the white boys to lay down their lives to protect the freedom that he enjoys,” they wrote in a letter to Ernest Gruening, the territory’s governor, signaling the start of their campaign to fight discrimination in Alaska.
Calling such open bias “an outrage,” the couple continued, “We will still be here to guard our beloved country while hordes of uninterested whites will be fleeing South.”
Gruening agreed with the Peratroviches, and they joined forces. In 1943, they attempted to usher an antidiscrimination bill through Alaska’s two-branch Territorial Legislature. It failed, with a tie vote of 8-8 in the House.
In the two years that followed, the Peratroviches redoubled their efforts, urging Native Alaskans to campaign for seats in the Legislature and taking their cause on the road to gain support. They even left their children in the care of an orphanage for a summer so that they could travel across the state more freely.
By the time the new bill reached the Senate floor, on Feb. 5, 1945, Congress had increased the size of the territory’s Legislature, two Natives had been elected to it, and Alaska’s House had already approved the bill. Though the odds of passage were high, the bill set off hours of passionate debate and drew so many onlookers that the crowd spilled out of the gallery doors.
Senator Allen Shattuck argued that the measure would “aggravate rather than allay” racial tensions.
“Who are these people, barely out of savagery, who want to associate with us whites with 5,000 years of recorded civilization behind us?” he was quoted as saying in Gruening’s 1973 autobiography, “Many Battles.”
When the floor was opened to public comments, Peratrovich set down her knitting needles and rose from her seat in the back.
Taking the podium, she said: “I would not have expected that I, who am barely out of savagery, would have to remind the gentlemen with 5,000 years of recorded civilization behind them of our Bill of Rights.”
She gave examples of the injustices that she and her family had faced because of their background and called on the lawmakers to act. “You as legislators,” she said, “can assert to the world that you recognize the evil of the present situation and speak your intent to help us overcome discrimination.”
Her testimony, The Daily Alaska Empire wrote, shamed the opposition into a “defensive whisper.”
The gallery broke out in a “wild burst of applause,” Gruening wrote. The 1945 Anti-Discrimination Act was passed, 11-5.
Gruening signed the bill into law on Feb. 16 — a date now celebrated by the state each year. The legislation entitled all Alaskans to “full and equal enjoyment” of public establishments, setting a misdemeanor penalty for violators. It also banned discriminatory signage based on race.
It was the first antidiscrimination act in the United States. It would be nearly 20 years before the federal Civil Rights Act would be passed, in 1964, and 14 years before Alaska would become a state.
by Carson Vaughan, NY Times | Read more:
Image: Alaska State Archives
[ed. Overlooked is a series of obituaries about remarkable people whose deaths, beginning in 1851, went unreported in The Times.]
Consumer Financial Loan-Shark Bureau
How Payday Lenders Spent $1 Million at a Trump Resort — and Cashed In
In mid-March, the payday lending industry held its annual convention at the Trump National Doral hotel outside Miami. Payday lenders offer loans on the order of a few hundred dollars, typically to low-income borrowers, who have to pay them back in a matter of weeks. The industry has long been reviled by critics for charging stratospheric interest rates — typically 400% on an annual basis — that leave customers trapped in cycles of debt.
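[ed. A quick way to see how fees of this size become a triple-digit annual rate is to annualize them over a year of back-to-back loan terms. A minimal sketch in Python follows; the $15-per-$100 fee and two-week term are assumptions commonly used to illustrate payday pricing, not figures taken from this article.]

    # Annualizing a flat per-loan fee (assumed figures, for illustration only)
    fee_per_100 = 15                    # dollars charged per $100 borrowed
    term_days = 14                      # two-week loan term

    periods_per_year = 365 / term_days  # about 26 loan terms per year
    apr = (fee_per_100 / 100) * periods_per_year * 100
    print(f"Implied annual rate: {apr:.0f}%")  # prints roughly 391%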
The industry had felt under siege during the Obama administration, as the federal government moved to clamp down. A government study found that a majority of payday loans are made to people who pay more in interest and fees than they initially borrow. Google and Facebook refuse to take the industry’s ads.
On the edge of the Doral’s grounds, as the payday convention began, a group of ministers held a protest “pray-in,” denouncing the lenders for having a “feast” while their borrowers “suffer and starve.”
But inside the hotel, in a wood-paneled bar under golden chandeliers, the mood was celebratory. Payday lenders, many dressed in golf shirts and khakis, enjoyed an open bar and mingled over bites of steak and coconut shrimp.
They had plenty to be elated about. A month earlier, Kathleen Kraninger, who had just finished her second month as director of the federal Consumer Financial Protection Bureau, had delivered what the lenders consider an epochal victory: Kraninger announced a proposal to gut a crucial rule that had been passed under her Obama-era predecessor.
Payday lenders viewed that rule as a potential death sentence for many in their industry. It would require payday lenders and others to make sure borrowers could afford to pay back their loans while also covering basic living expenses. Banks and mortgage lenders view such a step as a basic prerequisite. But the notion struck terror in the payday lenders. Their business model relies on customers — 12 million Americans take out payday loans every year, according to Pew Charitable Trusts — getting stuck in a long-term cycle of debt, experts say. A CFPB study found that three out of four payday loans go to borrowers who take out 10 or more loans a year. (...)
In Mick Mulvaney, whom Trump appointed as interim chief of the CFPB in 2017, the industry got exactly the kind of person it had hoped for. As a congressman, Mulvaney had famously derided the agency as a “sad, sick” joke.
If anything, that phrase undersold Mulvaney’s attempts to hamstring the agency as its chief. He froze new investigations, dropped enforcement actions en masse, requested a budget of $0 and seemed to mock the agency by attempting to officially re-order the words in the organization’s name.
But Mulvaney’s rhetoric sometimes exceeded his impact. His budget request was ignored, for example; the CFPB’s name change was only fleeting. And besides, Mulvaney was always a part-timer, fitting in a few days a week at the CFPB while also heading the Office of Management and Budget, and then moving to the White House as acting chief of staff.
It’s Mulvaney’s successor, Kraninger, whom the financial industry is now counting on — and the early signs suggest she’ll deliver. In addition to easing rules on payday lenders, she has continued Mulvaney’s policy of ending supervisory exams of outfits that specialize in lending to members of the military, claiming that the CFPB can do so only if Congress passes a new law granting those powers (which isn’t likely to happen anytime soon). She has also proposed a new regulation that would allow debt collectors to text and email debtors an unlimited number of times, as long as there’s an option to unsubscribe.
Enforcement activity at the bureau has plunged under Trump. The amount of monetary relief going to consumers has fallen from $43 million per week under Richard Cordray, the director appointed by Barack Obama, to $6.4 million per week under Mulvaney and is now $464,039, according to an updated analysis conducted by the Consumer Federation of America’s Christopher Peterson, a former special adviser to the bureau. (...)
Triple-digit interest rates are no laughing matter for those who take out payday loans. A sum as little as $100, combined with such rates, can lead a borrower into long-term financial dependency.
That’s what happened to Maria Dichter. Now 73, retired from the insurance industry and living in Palm Beach County, Florida, Dichter first took out a payday loan in 2011. Both she and her husband had gotten knee replacements, and he was about to get a pacemaker. She needed $100 to cover the co-pay on their medication. As is required, Dichter brought identification and her Social Security number and gave the lender a postdated check to pay what she owed. (All of this is standard for payday loans; borrowers either postdate a check or grant the lender access to their bank account.) What nobody asked her to do was show that she had the means to repay the loan. Dichter got the $100 the same day.
The relief was only temporary. Dichter soon needed to pay for more doctors’ appointments and prescriptions. She went back and got a new loan for $300 to cover the first one and provide some more cash. A few months later, she paid that off with a new $500 loan.
Dichter collects a Social Security check each month, but she has never been able to catch up. For almost eight years now, she has renewed her $500 loan every month. Each time she is charged $54 in fees and interest. That means Dichter has paid about $5,000 in interest and fees since 2011 on what is effectively one loan for $500.
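[ed. The arithmetic here checks out. A minimal sketch of the figures in the paragraph above; the exact number of monthly renewals is an assumption, since the article says only “almost eight years.”]

    # Rough check of the totals reported in the story (renewal count assumed)
    fee_per_renewal = 54      # fees and interest charged at each monthly rollover
    renewals = 8 * 12 - 4     # "almost eight years" of monthly renewals, ~92

    total_fees = fee_per_renewal * renewals
    print(f"Total fees and interest paid: ${total_fees:,}")  # $4,968, about $5,000
    print("Principal still owed: $500")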
by Anjali Tsui, ProPublica, and Alice Wilder, WNYC, Pro Publica | Read more:
Image: via
Biotech Cockaigne of the Vegan Hopeful
August 2013: The future of meat appears in London. At least, that’s how the media event I’m watching online has been billed. A hamburger made of bovine muscle cells grown in vitro is unveiled, then served to a panel of tasters while a studio audience of journalists watches. A promotional film describes the various ills that “cultured meat” promises to solve, ills caused by eating animals at industrial scale. Industrial animal agriculture possibly produces 14 to 18 percent of global emissions of greenhouse gases. The byproducts of animal agriculture can pollute waterways and soil. Livestock, especially bovine livestock, is inefficient at turning plant foods into protein. Concentrated animal feeding operations (CAFOs) are a potential source of zoonotic diseases; furthermore, subtherapeutic dosing with antibiotics to speed animals’ growth builds antibiotic resistance in pathogens that can grow in feedlots. Billions of animals suffer in our meat production infrastructure, and the moral weight of that suffering depends on whom you ask, and on his or her philosophical views about animals. Today’s event conveys the implicit promise that “cultured meat” may solve all these problems. The short promotional film concludes with the words “be part of the solution.”
A second promotional film describes how the burger was made: The process started with a biopsy of cow muscle cells, followed by careful stimulation of a stem cell–driven, natural process of muscle repair, as cells were fed with growth media under carefully calibrated laboratory conditions. Gradually, what functions as a healing process in vivo (i.e., in living animals) becomes a meat production process, in vitro. Thus, the potential of stem cells to create new tissue becomes the biological grounds for a promise about the future of protein.
But this is only a test—or, only a taste. In vitro techniques cannot yet perfectly reproduce in vivo animal muscle and fat, and thus cannot perfectly reproduce what consumers recognize as meat. Cultured meat has yet to become delicious. Nor is the technology scalable. The techniques and materials are still too expensive. The burger taste-tested in London took months of lab time to make, and the entire project (materials, technician salaries, etc.) cost more than $300,000 US. If the holy grail of cultured meat research is to develop a product that can replace “cheap meat,” that is, the kind of meat that is produced at industrial scale and sold at fast-food restaurants, then the goal seems years or decades away.
If we succeed in growing meat—meat that never had parents, meat that was never part of a complete animal body—we will do more than change human subsistence strategies forever. We will also transform our relationship with animal bodies, beginning at the level of the cell. Mark Post, the Dutch medical researcher who created the burger with the help of a team of scientists and technicians, seems hopeful and confident. He laughs good-naturedly with the journalists when they articulate their doubts. Of course, he acknowledges, it would be easier if everyone just became a vegetarian, but such a mass shift in human behavior doesn’t seem likely.
A Tale of Hope—or Hype?
October 2018: Scientists, entrepreneurs, and promoters are working to make cultured meat a reality. There is still no cultured meat on the market, but a handful of startup companies, many of them based in the San Francisco Bay area, promise that they will have a product to sell—presumably still not at the same price point as a fast-food hamburger or chicken nugget—in a matter of months or a handful of years.
I spent the years between the first in vitro hamburger unveiling and late 2018 conducting ethnographic research on the cultured meat movement, and I still cannot tell you if cultured meat will grace our tables soon. To the best of my knowledge, the two main technical challenges in cultured meat research have not yet been surmounted. One challenge is the creation of an affordably scalable growth medium not derived from animal sources (the current mix contains fetal bovine serum) and the other is the ability to create “thick” and texturally sophisticated tissue, such as that found in steak or pork chops, as opposed to growing two-dimensional sheets of cells and assembling them into meat. And beyond these technical challenges, cultured meat’s pioneers will need to find a way to make production “scale up” to the point where the cost of an individual serving of meat drops close to, or even equals, the cost of the conventional equivalent. In short, we don’t yet know what kind of technology story this is. Are we en route to success, or are we watching a cautionary tale in progress, one about hope and hype?
Much like self-driving cars, the advocates of which hope their use will reduce car crashes, cultured meat is promoted by those who believe in its practical and ethical benefits. But cultured meat is also like the self-driving car insofar as opinions vary as to whether a single technology can resolve a complex and, in some senses, social problem that involves not only engineering challenges but also the vagaries of human behavior. Like medical therapies based on stem cells, cultured meat excites the imagination and creates hope, but the hype seems to be running years or decades ahead of the reality. (Cultured meat itself is an offshoot of the effort to create tissues for transplant to human patients, an effort that goes by the name “regenerative medicine.”)
Cultured meat may one day come ashore on the high-tech equivalent of the Island of Misfit Toys, where flying cars rust next to moldering piles of food pills, but it hasn’t yet. One of the forces keeping it afloat, both financially and in the popular imagination, is many people’s deep investment in the defense of animals. The cultured meat startups are linked by a loose social network of educated professionals, often vegans or vegetarians, who believe that cultured meat may accomplish what decades of animal protection activism has not, alleviating the suffering of animals in our food system. Not all venture capital investment in cultured meat research is inspired by a desire to protect animals, of course; there are investors interested in the potential environmental “cleanliness” of cultured meat, and those angling for a profit, just as profit orientation is part of the package for any investor. But the most vocal proponents of cultured meat speak more eagerly about the defense of animals than they do about the defense of the natural environment or human health, although they readily acknowledge that cultured meat (many of them call it “clean meat,” or use other terms) happily addresses all three needs at once.
Meet the Utilitarians
In addition to resources, the advocates of cultured meat have a philosophy ready to hand. Many of them are self-described utilitarians, readers of the works of philosopher Peter Singer, in particular his 1975 book Animal Liberation. In that book, Singer followed classical utilitarian philosophers like Jeremy Bentham by arguing that the way to determine the moral standing of animals is not by assessing their intellectual capacities relative to those of most humans but by asking if animals can suffer as humans do. Answering that question in the affirmative, Singer suggested that it was “speciesist” to deny moral standing to the suffering of animals. Many regard Animal Liberation as the bible of the contemporary animal rights movement, despite the fact that the book does not defend the rights of animals per se. Contrary to the thinking of some other philosophers concerned with animals, such as Tom Regan, Singer does not assert the inherent rights of animals, or (in what philosophers term a “deontological” fashion) define the maltreatment or even the use of animals as morally wrong. “I am a vegetarian,” Singer has written, “because I am a utilitarian.” Rather than focus on the inherent worth of a human or animal life, a utilitarian will ask how that life is contoured by experiences of suffering or happiness. These notions, unlike those such as inherent worth, are the conditions a utilitarian can measure with some hope of improving the world. Whether they share Singer’s ordering of concerns (first utilitarianism, then animal protection), many of cultured meat’s promoters have taken up Singer’s approach as a philosophical support for their work.
Utilitarianism combines the following features: It is consequentialist insofar as it judges right and wrong by considering the outcome of our actions, not preoccupying itself with the nature of those actions themselves. It is a doctrine of ends, not means. It is universalist insofar as it claims to take into account every being’s interests equally. It is welfarist in that it understands and measures people’s well-being in terms of the satisfaction of their needs. And it is aggregative in that it considers everyone’s interests added together with the goal of maximizing happiness and minimizing suffering for the greatest number. Individuals count only as part of the whole. Each one counts for one, never for more than one.
If this account of utilitarianism’s parts seems schematic, it is worth saying that many utilitarian accounts of the world can seem like line drawings or blueprints. As the philosopher Bernard Williams noted, this philosophy “appeals to a frame of mind in which technical difficulty…is preferable to moral unclarity, no doubt because it is less alarming.” That is to say, for a utilitarian it is better to have a complicated job of balancing multiple interests than to be unsure what would count as a desirable outcome. Utilitarianism appeals to those who dislike moral ambiguity and to those who focus on outcomes; this characterization also applies to many actors in the world of cultured meat who eagerly anticipate an end to animal agriculture.
by Benjamin Aldes Wurgaft, The Hedgehog Review | Read more:
Image: Alarmy