Saturday, September 10, 2016

Letter of Recommendation: Glass Bricks

Like so many seemingly innocuous things, glass bricks were created to make the world a better place. They were invented at the turn of the 20th century to provide factory workers with more natural light. Soon they moved beyond the world of industry, as Art Deco architects took to their sleek modernism, using them to adorn building exteriors and divide interiors. A 1930 issue of Popular Science speculated about a future in which skyscrapers would be made almost entirely of glass bricks.

Today glass bricks are most closely associated with the decadence of 1980s architecture, which channeled the elegance and streamlined surfaces of Art Deco — an ’80s callback to the retro future imagined in the ’20s. Though they are designed to look pristinely high-gloss forever, time and dirt take their toll on most. There’s something kind of sleazy about them and not just because they show up in the background of so many scenes shot in Encino porn houses. You find them in corner bars and mini-malls, reflecting neon or LED light. They’re often installed to replace windows, providing translucency but keeping the outside firmly out. If they once signaled progress, nowadays glass bricks signify an oddly compelling sort of decline. And for me, they evoke my own Los Angeles childhood. At some point, I became completely obsessed.

My fixation was fueled, in part, by a fear that glass bricks were becoming endangered. Los Angeles has morphed in recent years. Ritzy apartment buildings and trendy hotels went up in formerly decaying areas like downtown and Hollywood, turning decrepit blocks into lavish playgrounds. This year it was announced that the Los Angeles Memorial Sports Arena, in Exposition Park, would be bulldozed to make way for a more modern stadium. The arena had been around since 1959, and in Los Angeles, a building from 1959 is considered fairly historic. But its datedness was exactly what I loved about the arena; it felt like a portal to a Los Angeles I’d never seen, whose ghosts I could sense. It was also full of glass bricks, and I feared they would be ground to crystalline dust.

I started taking long, aimless drives down major city-spanning thoroughfares like Beverly Boulevard and through suburban neighborhoods like Glendale, searching for glass bricks that I could capture with my camera phone. I started to see them everywhere in Los Angeles, and through Twitter, I learned that they were actually everywhere. A #glassbricks hashtag I started as a joke became real when people began sending me photos of their own sightings from all corners of the world: Amsterdam, Tokyo, Zurich. Last week, a friend sent me a shot of some glass bricks near Chernobyl that probably haven’t been touched since 1986. There are glass bricks at the end of the world.

by Molly Lambert, NY Times |  Read more:
Image: Coley Brown

Friday, September 9, 2016

Thursday, September 8, 2016

Wells Fargo Fined for Fraudulently Opening Accounts for Customers

[ed. It never ends. Will anyone higher up ever be prosecuted? One guess. The fact that there was a done deal before the news ever reached the public should tell you everything you need to know - either about the regulators enforcing banking regulations, or about the regulations themselves (and about Wells Fargo, which somehow thought the investigation wasn't important enough to disclose in recent regulatory filings).]

For years, Wells Fargo employees secretly issued credit cards without a customer’s consent. They created fake email accounts to sign up customers for online banking services. They set up sham accounts that customers learned about only after they started accumulating fees.

On Thursday, these illegal banking practices cost Wells Fargo $185 million in fines, including a $100 million penalty from the Consumer Financial Protection Bureau, the largest such penalty the agency has issued.

Federal banking regulators said the practices, which date back to 2011, reflected serious flaws in the internal culture and oversight at Wells Fargo, one of the nation’s largest banks. The bank has fired at least 5,300 employees who were involved.

In all, Wells Fargo employees opened roughly 1.5 million bank accounts and applied for 565,000 credit cards that may not have been authorized by customers, the regulators said in a news conference. The bank has 40 million retail customers.

Some customers noticed the deception when they were charged unexpected fees, received credit or debit cards in the mail that they did not request, or started hearing from debt collectors about accounts they did not recognize. But most of the sham accounts went unnoticed, as employees would routinely close them shortly after opening them. Wells has agreed to refund about $2.6 million in fees that may have been inappropriately charged.

Wells Fargo is famous for its culture of cross-selling products to customers — routinely asking, say, a checking account holder if she would like to take out a credit card. Regulators said the bank’s employees had been motivated to open the unauthorized accounts by compensation policies that rewarded them for opening new accounts; many current and former Wells employees told regulators they had felt extreme pressure to open as many accounts as possible.

“Unchecked incentives can lead to serious consumer harm, and that is what happened here,” said Richard Cordray, director of the Consumer Financial Protection Bureau.

Wells said the employees who were terminated included managers and other workers. A bank spokeswoman declined to say whether any senior executives had been reprimanded or fired in the scandal.

“Wells Fargo is committed to putting our customers’ interests first 100 percent of the time, and we regret and take responsibility for any instances where customers may have received a product that they did not request,” the bank said in a statement. (...)

Banking regulators said the widespread nature of the illegal behavior showed that the bank lacked the necessary controls and oversight of its employees. Ensuring that large banks have tight controls has been one of the central preoccupations of banking regulators after the mortgage crisis.

Such pervasive problems at Wells Fargo, which has headquarters in San Francisco, stand out given all of the scrutiny that has been heaped on large, systemically important banks since 2008.

“If the managers are saying, ‘We want growth; we don’t care how you get there,’ what do you expect those employees to do?” said Dan Amiram, an associate business professor at Columbia University.

It is a particularly ugly moment for Wells, one of the few large American banks that have managed to produce consistent profit increases since the financial crisis. Wells has earned a reputation on Wall Street as a tightly run ship that avoided many of the missteps of the mortgage crisis because it took fewer risks than many of its competitors. At the same time, Wells has managed to be enormously profitable, as other large banks continued to stumble because of tighter regulations and a choppy economy.

Analysts have marveled at the bank’s ability to cross-sell mortgages, credit cards and auto loans to customers. The strategy is at the core of modern-day banking: Rather than spend too much time and money recruiting new customers, sell existing customers on new products.

by Michael Corkery, NY Times |  Read more:
Image: Eric Thayer

Thomas Kaltenbach, Bundeswehrlager Kassel, 2007
via:

The Privacy Wars Are About to Get A Whole Lot Worse

[ed. I can't wait until some massive data collection operation like Google or ATT or Facebook gets their system hacked and suddenly hundreds of millions of people's web surfing habits (and other personal data) are available for searching, by anyone. You know it's going to happen eventually. Then we'll all finally know where the bear shits in the buckwheat.]

It used to be that server logs were just boring utility files whose most dramatic moments came when someone forgot to write a script to wipe out the old ones and so they were left to accumulate until they filled the computer’s hard-drive and crashed the server.
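The forgotten-cleanup failure mode is easy to picture. A minimal sketch of the kind of housekeeping script the passage alludes to (the function name and the 30-day retention window are illustrative assumptions, not any particular server's convention):

```python
import os
import time

def purge_old_logs(log_dir, max_age_days=30):
    """Delete files in log_dir older than max_age_days; return names removed."""
    cutoff = time.time() - max_age_days * 86400
    removed = []
    for name in os.listdir(log_dir):
        path = os.path.join(log_dir, name)
        # Only plain files whose last modification predates the cutoff.
        if os.path.isfile(path) and os.path.getmtime(path) < cutoff:
            os.remove(path)
            removed.append(name)
    return removed
```

Run from cron, a dozen lines like these kept log directories from filling the disk; left unwritten, the logs simply accumulated.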

Then, a series of weird accidents turned server logs into the signature motif of the 21st century, a kind of eternal, ubiquitous exhaust from our daily lives, the CO2 of the Internet: invisible, seemingly innocuous, but harmful enough, in aggregate, to destroy our world.

Here’s how that happened: first, there were cookies. People running web-servers wanted a way to interact with the people who were using them: a way, for example, to remember your preferences from visit to visit, or to identify you through several screens’ worth of interactions as you filled and cashed out a virtual shopping cart.
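Mechanically, a cookie is just a name/value pair the server sets in a response header and the browser echoes back on every later request to that site. A small illustration using Python's standard `http.cookies` module (the `lang` and `cart` names are made up for the example):

```python
from http.cookies import SimpleCookie

# Server side: remember a preference by setting a cookie in the response.
response = SimpleCookie()
response["lang"] = "en"
response["lang"]["path"] = "/"
set_cookie_header = response["lang"].OutputString()

# Browser side: on the next visit, the browser sends the cookie back in
# the request's Cookie header, and the server parses it to recall state.
request = SimpleCookie()
request.load("lang=en; cart=abc123")
print(request["lang"].value)  # prints: en
```

That round trip is all "remembering your preferences" or "keeping your shopping cart" amounts to.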

Then, Google and a few other companies came up with a business model. When Google started, no one could figure out how the company would ever repay its investors, especially as the upstart search-engine turned up its nose at the dirtiest practices of the industry, such as plastering its homepage with banner ads or, worst of all, selling the top results for common search terms.

Instead, Google and the other early ad-tech companies worked out that they could place ads on other people’s websites, and that those ads could act as a two-way conduit between web users and Google. Every page with a Google ad was able to both set and read a Google cookie with your browser (you could turn this off, but no one did), so that Google could get a pretty good picture of which websites you visited. That information, in turn, could be used to target you for ads, and the sites that placed Google ads on their pages would get a little money for each visitor. Advertisers could target different kinds of users – users who had searched for information about asbestos and lung cancer, about baby products, about wedding planning, about science fiction novels. The websites themselves became part of Google’s ‘‘inventory’’ where it could place the ads, but they also improved Google’s dossiers on web users and gave it a better story to sell to advertisers.
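The tracking logic described here reduces to a simple join: one cookie ID, many embedding sites. A toy sketch of how an ad network might assemble browsing profiles from its ad-request logs (the function and all the data are invented for illustration):

```python
from collections import defaultdict

def build_profiles(ad_requests):
    """Group the sites each tracking-cookie ID was seen on."""
    profiles = defaultdict(set)
    for cookie_id, site in ad_requests:
        profiles[cookie_id].add(site)
    return profiles

# Each tuple is one ad impression: (tracking cookie, site embedding the ad).
log = [
    ("u1", "news.example"),
    ("u1", "health.example"),
    ("u2", "shop.example"),
    ("u1", "travel.example"),
]
profiles = build_profiles(log)
```

After enough impressions, the set of sites behind each cookie ID is the "dossier" that gets matched against advertisers' targeting criteria.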

The idea caught the zeitgeist, and soon everyone was trying to figure out how to gather, aggregate, analyze, and resell data about us as we moved around the web.

Of course, there were privacy implications to all this. As early breaches and tentative litigation spread around the world, lawyers for Google and for the major publishers (and for publishing tools, the blogging tools that eventually became the ubiquitous ‘‘Content Management Systems’’ that have become the default way to publish material online) adopted boilerplate legalese, those ‘‘privacy policies’’ and ‘‘terms of service’’ and ‘‘end user license agreements’’ that are referenced at the bottom of so many of the pages you see every day, as in, ‘‘By using this website, you agree to abide by its terms of service.’’

As more and more companies twigged to the power of ‘‘surveillance capitalism,’’ these agreements proliferated, as did the need for them, because before long, everything was gathering data. As the Internet everted into the physical world and colonized our phones, we started to get a taste of what this would look like in the coming years. Apps that did innocuous things like turning your phone into a flashlight, or recording voice memos, or letting your kids join the dots on public domain clip-art, would come with ‘‘permissions’’ screens that required you to let them raid your phone for all the salient facts of your life: your phone number, e-mail address, SMSes and other messages, e-mail, location – everything that could be sensed or inferred about you by a device that you carried at all times and made privy to all your most sensitive moments.

When a backlash began, the app vendors and smartphone companies had a rebuttal ready: ‘‘You agreed to let us do this. We gave you notice of our privacy practices, and you consented.’’

This ‘‘notice and consent’’ model is absurd on its face, and yet it is surprisingly legally robust. As I write this in July of 2016, US federal appellate courts have just ruled on two cases that asked whether End User Licenses that no one read and no one understands and no one takes seriously are enforceable. The cases differed a little in their answer, but in both cases, the judges said that they were enforceable at least some of the time (and that violating them can be a felony!). These rulings come down as the entirety of America has been consumed with Pokémon Go fever, only to have a few killjoys like me point out that merely by installing the game, all those millions of players have ‘‘agreed’’ to forfeit their right to sue any of Pokémon’s corporate masters should the companies breach all that private player data. You do, however, have 30 days to opt out of this forfeiture; if Pokémon Go still exists in your timeline and you signed up for it in the past 30 days, send an e-mail to with the subject ‘‘Arbitration Opt-out Notice’’ and include in the body ‘‘a clear declaration that you are opting out of the arbitration clause in the Pokémon Go terms of service.’’

Notice and consent is an absurd legal fiction. Jonathan A. Obar and Anne Oeldorf-Hirsch, a pair of communications professors from York University and the University of Connecticut, published a working paper in 2016 called ‘‘The Biggest Lie on the Internet: Ignoring the Privacy Policies and Terms of Service Policies of Social Networking Services.’’ The paper details how the profs gave their students, who are studying license agreements and privacy, a chance to beta-test a new social network (this service was fictitious, but the students didn’t know that). To test the network, the students had to create accounts, and were given a chance to review the service’s terms of service and privacy policy, which prominently promised to give all the users’ personal data to the NSA, and demanded the students’ first-born children in return for access to the service. As you may have gathered from the paper’s title, none of the students noticed either fact, and almost none of them even glanced at the terms of service for more than a few seconds.

Indeed, you can’t examine the terms of service you interact with in any depth – it would take more than 24 hours a day just to figure out what rights you’ve given away that day. But as terrible as notice-and-consent is, at least it pretends that people should have some say in the destiny of the data that evanesces off of their lives as they move through time, space, and information.

The next generation of networked devices are literally incapable of participating in that fiction.

The coming Internet of Things – a terrible name that tells you that its proponents don’t yet know what it’s for, like ‘‘mobile phone’’ or ‘‘3D printer’’ – will put networking capability in everything: appliances, lightbulbs, TVs, cars, medical implants, shoes, and garments. Your lightbulb doesn’t need to be able to run apps or route packets, but the tiny, commodity controllers that allow smart lightswitches to control the lights anywhere (and thus allow devices like smart thermostats and phones to integrate with your lights and home security systems) will come with full-fledged computing capability by default, because that will be more cost-efficient than customizing a chip and system for every class of devices. The thing that has driven computers so relentlessly, making them cheaper, more powerful, and more ubiquitous, is their flexibility, their character of general-purposeness. That fact of general-purposeness is inescapable and wonderful and terrible, and it means that the R&D that’s put into making computers faster for aviation benefits the computers in your phone and your heart-monitor (and vice-versa). So everything’s going to have a computer.

You will ‘‘interact’’ with hundreds, then thousands, then tens of thousands of computers every day. The vast majority of these interactions will be glancing, momentary, and with computers that have no way of displaying terms of service, much less presenting you with a button to click to give your ‘‘consent’’ to them. Every TV in the sports bar where you go for a drink will have cameras and mics and will capture your image and process it through facial-recognition software and capture your speech and pass it back to a server for continuous speech recognition (to check whether you’re giving it a voice command). Every car that drives past you will have cameras that record your likeness and gait, that harvest the unique identifiers of your Bluetooth and other short-range radio devices, and send them to the cloud, where they’ll be merged and aggregated with other data from other sources.

In theory, if notice-and-consent were anything more than a polite fiction, none of this would happen. If notice-and-consent is necessary to make data-collection legal, then without notice-and-consent, the collection is illegal.

But that’s not the realpolitik of this stuff: the reality is that when every car has more sensors than a Google Streetview car, when every TV comes with a camera to let you control it with gestures, when every medical implant collects telemetry that is collected by a ‘‘services’’ business and sold to insurers and pharma companies, the argument will go, ‘‘All this stuff is both good and necessary – you can’t hold back progress!’’

It’s true that we can’t have self-driving cars that don’t look hard at their surroundings all the time, and pay especially close attention to humans to make sure that they’re not killing them. However, there’s nothing intrinsic to self-driving cars that says that the data they gather needs to be retained or further processed. Remember that for many years, the server logs that recorded all your interactions with the web were flushed as a matter of course, because no one could figure out what they were good for, apart from debugging problems when they occurred. (...)

The next iteration of this is the gadgets that will spy on us from every angle, in every way, all the time. The data that these services collect will be even more toxic in its potential to harm us. Consider that today, identity thieves merge data from several breaches in order to piece together enough information to get a duplicate deed for their victims’ houses and sell those houses out from under them; that voyeurs use untargeted attacks to seize control over people’s laptops to capture nude photos of them and then use those to blackmail their victims to perform live sex-acts on camera; that every person who ever applied for security clearance in the USA had their data stolen by Chinese spies, who broke into the Office of Personnel Management’s servers and stole more than 20,000,000 records.

The best way to secure data is never to collect it in the first place. Data that is collected is likely to leak. Data that is collected and retained is certain to leak. A house that can be controlled by voice and gesture is a house with a camera and a microphone covering every inch of its floorplan.

The IoT will rupture notice-and-consent, but without some other legal framework to replace it, it’ll be a free-for-all that ends in catastrophe.

by Cory Doctorow, Locus |  Read more:
Image: uncredited

How Global Entertainment Killed Culture

A large number of studies in recent years have looked to define the characteristics of contemporary culture within the context of the globalization of capitalism and of markets, and the extraordinary revolution in technology. One of the most incisive of these studies is Gilles Lipovetsky and Jean Serroy’s La cultura-mundo: Respuesta a una sociedad desorientada (Culture-World: Response to a Disoriented Society). It puts forward the idea that there is now an established global culture—a culture-world—that, as a result of the progressive erosion of borders due to market forces, and of scientific and technical revolutions (especially in the field of communications), is creating, for the first time in history, certain cultural values that are shared by societies and individuals across the five continents, values that can be shared equally despite different traditions, beliefs, and languages. This culture, unlike what had previously been defined as culture, is no longer elitist, erudite and exclusive, but rather a genuine “mass culture”:
Diametrically opposed to hermetic and elitist vanguard movements, this mass culture seeks to offer innovations that are accessible to the widest possible audience, which will entertain the greatest number of consumers. Its intention is to amuse and offer pleasure, to provide an easy and accessible escapism for everyone without the need for any specific educational background, without concrete and erudite references. What the culture industries invent is a culture transformed into articles of mass consumption.
This mass culture, according to the authors, is based on the predominance of image and sound over the word. The film industry, in particular Hollywood, “globalizes” movies, sending them to every country, and within each country, reaching every social group, because, like commercially available music and television, films are accessible to everyone and require no specialist background to be enjoyed. This process has been accelerated by the cybernetic revolution, the creation of social networks and the universal reach of the Internet. Not only has information broken through all barriers and become accessible to all, but almost every aspect of communication, art, politics, sport, religion, etc., has felt the reforming effects of the small screen: “The screen world has dislocated, desynchronized and deregulated the space—time of culture.”

All this is true, of course. What is not clear is whether what Lipovetsky and Serroy call the “culture-world” or mass culture (in which they include, for example, even the “culture of brands” of luxury objects), is, strictly speaking, culture, or if we are referring to essentially different things when we speak, on one hand, about an opera by Wagner or Nietzsche’s philosophy and, on the other hand, the films of Alfred Hitchcock and John Ford (two of my favorite directors), and an advertisement for Coca-Cola. They would say yes, that both categories are culture, while I think that there has been a change, or a Hegelian qualitative leap, that has turned this second category into something different from the first.

Furthermore, some assertions of La cultura-mundo seem questionable, such as the proposition that this new planetary culture has developed extreme individualism across the globe. Quite the reverse: the ways in which advertising and fashion shape and promote cultural products today are a major obstacle to the formation of independent individuals, capable of judging for themselves what they like, what they admire, or what they find disagreeable, deceitful or horrifying in these products. Rather than developing individuals, the culture-world stifles them, depriving them of lucidity and free will, causing them to react to the dominant “culture” with a conditioned, herd mentality, like Pavlov’s dogs reacting to the bell that rings for a meal.

Another of Lipovetsky and Serroy’s ideas that seems questionable is the assertion that because millions of tourists visit the Louvre, the Acropolis and the Greek amphitheaters in Sicily, then culture has lost none of its value, and still enjoys “a great legitimacy.” The authors seem not to notice that these mass visits to great museums and classic historical monuments do not illustrate a genuine interest in “high culture” (the term they use), but rather simple snobbery, because the fact of having been in these places is part of the obligations of the perfect postmodern tourist. Instead of stimulating an interest in the classical past and its arts, these visits replace any form of serious study and investigation. A quick look is enough to satisfy people that their cultural conscience is clear. These tourist visits “on the lookout for distractions” undermine the real significance of these museums and monuments, putting them on the same level as other obligations of the perfect tourist: eating pasta and dancing a tarantella in Italy, applauding flamenco and cante jondo in Andalucía, and tasting escargots, visiting the Louvre and the Folies-Bergère in Paris.

In 2010, Flammarion in Paris published Mainstream by the sociologist Frédéric Martel. This book demonstrates that, to some extent, the “new culture” or the “culture-world” that Lipovetsky and Serroy speak of is already a thing of the past, out of kilter with the frantic maelstrom of our age. Martel’s book is fascinating and terrifying in its description of the “entertainment culture” that has replaced almost everywhere what scarcely half a century ago was understood as culture. Mainstream is, in effect, an ambitious study, drawing on hundreds of interviews from many parts of the world, of what, thanks to globalization and the audiovisual revolution, is now shared by people across five continents, despite differences in languages, religions and customs.

Martel’s study does not talk about books—the only one mentioned in its several hundred pages is Dan Brown’s The Da Vinci Code, and the only woman writer mentioned is the film critic Pauline Kael—or about painting and sculpture, or about classical music and dance, or about philosophy or the humanities in general. Instead it talks exclusively about films, television programs, videogames, manga, rock, pop and rap concerts, videos and tablets and the “creative industries” that produce and promote them: that is, the entertainment enjoyed by the vast majority of people that has been replacing (and will end up finishing off) the culture of the past.

The author approves of this change, because, as a result, mainstream culture has swept away the cultural life of a small minority that had previously held a monopoly over culture; it has democratized it, putting it within everyone’s reach, and because the contents of this new culture seem to him to be perfectly attuned to modernity, to the great scientific and technological inventions of our era.

The accounts and the interviews collected by Martel, along with his own analysis, are instructive and quite representative of a reality that, until now, sociological and philosophical studies have not dared to address. The great majority of humanity does not engage with, produce or appreciate any form of culture other than what used to be considered by cultured people, disparagingly, as mere popular pastimes, with no links to the intellectual, artistic, and literary activities that were once at the heart of culture. This former culture is now dead, although it still survives in small social enclaves, without any influence on the mainstream.

The essential difference between the culture of the past and the entertainment of today is that the products of the former sought to transcend mere present time, to endure, to stay alive for future generations, while the products of the latter are made to be consumed instantly and disappear, like cake or popcorn. Tolstoy, Thomas Mann, still more Joyce and Faulkner, wrote books that looked to defeat death, outlive their authors and continue attracting and fascinating readers in the future. Brazilian soaps, Bollywood movies, and Shakira concerts do not look to exist any longer than the duration of their performance. They disappear and leave space for other equally successful and ephemeral products. Culture is entertainment and what is not entertaining is not culture.

Martel’s investigation shows that this is today a global phenomenon, something that is occurring for the first time in history, in which developed and underdeveloped countries participate, no matter how different their traditions, beliefs or systems of government, although, of course, each country and society will display certain differences in terms of detail and nuance with regard to films, soap operas, songs, manga, animation, etc.

by Mario Vargas Llosa, Literary Hub | Read more:
Image: The Truman Show

Wolf Alice


[ed. Pretty excellent band (never heard of them till today). At least somebody's still making interesting music these days. See also: Wolf Alice - Lollapalooza 2016.]

Wednesday, September 7, 2016

The Pleasures of Protest: Taking on Gentrification in Chinatown

[ed. If there's anywhere I'd like to live in Seattle (if there were anywhere I could live in Seattle), it would be the Asian district - or Beacon Hill, the outermost reaches of its influence. Sadly, those communities won't be around in their present form much longer, I'm afraid.]

On a cold night in the early winter months of 2007, I was with a group of tenants — all Latino and Chinese immigrant families — clustered together in front of their home, two buildings on Delancey Street that straddled the border between Chinatown and the Lower East Side. We were there, shivering in the cold, to protest their landlords.

Ever since they bought the two buildings in 2001, the owners of 55 Delancey and 61 Delancey Street — Nir Sela, Michael Daniel, and 55 Delancey Street Realty LLC — had been attempting to kick out the Chinese and Latino families who had lived there, but in recent months, the situation had come to a head. They had begun aggressively bringing tenants to housing court, often on trumped-up charges (one lawsuit argued that, based on the number of shoes displayed inside the apartment, it was obvious that more than one family lived there); had offered several families significant buyouts to leave; and had refused to make basic repairs. For stretches at a time, and in the coldest days of winter, there had been no heat or hot water.

That evening, huddled in our winter coats and clutching hand-made signs, we waited for the arrival of one of the owners, who had agreed to meet with us and discuss our demands.

I had been volunteering with CAAAV, a tenant organizing group in Chinatown, and in the months prior, I had spent many of my nights going from apartment to apartment, often with Zhi Qin Zheng, a resident of the building as well as an organizer at CAAAV, helping to painstakingly document their living conditions and assisting residents in calling the city’s 311 hotline so that each housing code violation would be on record.

Their apartments were cramped, even rundown, but for these families, it was home, and they wanted to stay. Over the years, each building had become a small community, one where people felt comfortable leaving their doors open and asking each other to watch their children. “If we left, where would we go?” Sau Ying Kwok, a feisty grandmother with a nimbus of frizzy hair, wondered aloud. She had become one of the more vocal leaders in the building, along with the soft-spoken You Liu Lin, a man in his middle years with a penchant for Brylcreeming his hair as well as shoving bottles of water and perfect Fuji apples into my hands whenever I knocked on his door.

I often questioned why I was there on those trips. I had moved to the city three years prior from Texas, fresh out of college and possessing a vague notion that I would put my Asian American Studies degree to use and, in the words of 1960s radicals inspired by Mao Zedong, “serve the people.”

In a way, I was continuing the tradition of those who were part of the Asian American movement of the 1960s — young, mostly college-educated Chinese, Japanese, and Filipino Americans who not only coined the term “Asian American” but also immersed themselves in ethnic enclaves like Chinatown on the east and west coasts.

In Serve The People: Making Asian America in the Long Sixties, her book chronicling the Asian American movement, Karen Ishizuka wrote that people had to become Asian American. It wasn’t about your ethnic background, but “a political identity developed out of the oppositional consciousness of the Long Sixties, in order to be seen and heard.”

But there has always been a disconnect between these Asian American activists and the people they served, who tended to be primarily working-class immigrants, a disconnect that I felt keenly. What was I, an ABC (American-born Chinese), doing in a mostly immigrant community, with my barely passable Mandarin? I didn’t really know, but I felt a complicated sense of belonging that I had never experienced before, complicated because I was, in many ways, an outsider — someone not from the neighborhood or embedded in its history, who wasn’t threaded through the day-to-day life that makes a grouping of city blocks a community. Yet the residents didn’t treat me as an outsider when they invited me into their homes; being Chinese, it seemed, was enough.

It was easy to understand why the owners would want to evict these families wholesale; they all lived in rent-stabilized apartments where rents averaged $1,000 or less, far below what the owners could charge in the hot real estate market of lower Manhattan, where people fought for the right to pay $3,000 a month for a two-bedroom apartment.

That night, I got a lesson in what some have called the pleasures of protest. When Nir Sela and his wife arrived and saw the mass of people waiting for them on the sidewalk, when they saw the cameras, they quickly turned around and walked away. We began following them, scores of people chanting, “Shame on you! Shame on you!” They quickly got into a cab and sped away. Despite the abrupt cancellation of the meeting we had planned, everyone seemed pleased, smiles on their faces.

Soon after, the tenants decided to go on a rent strike. It was a success: a few months later, the owners capitulated, agreeing to make all the necessary repairs and to end eviction proceedings, along with a payment of $3,000 to each household. Less than a year later, I would join the staff of CAAAV as a full-time housing organizer, still high off that campaign victory.

But in a city where finance capital reigns, this sense of victory wouldn’t last for long.

* * *

Chinatown as we know it today didn’t really exist until the 1970s, when, in the wake of the 1965 Immigration Act, Chinese immigrants began arriving in large numbers.

Yet as early as the 1850s, one could find a small bachelor community of Chinese men living in what was then known as Five Points (and what some today have called “America’s first slum”), a neighborhood that had arisen on top of a landfill whose residents were free blacks as well as Irish, Jewish, and Italian immigrants. Jacob Riis in his influential 1890 book, How the Other Half Lives: Studies Among the Tenements of New York, devoted an entire chapter to Chinatown, writing dismissively, “Chinatown as a spectacle is disappointing. Next door to the Bend, it has little of its outdoor stir and life, none of its gayly-colored rags or picturesque filth and poverty.” Yet the neighborhood, he noted, had already taken on the tinge of the exotic, New Yorkers believing it was rife with far more opium dens than actually existed.

The black residents fled after an anti-abolition riot; the Chinese men, sailors as well as workers who had moved from the west coast in the wake of increasingly oppressive laws and racist mob violence, stayed because they had nowhere else to go. “Residents of New York Chinatown could not cross Canal Street into Little Italy without the risk of being beaten up;” wrote John Kuo Wei Tchen, the historian and founder of the Museum of Chinese in America, “laundry men in the scattered boroughs and suburbs illegally lived in the back of their shops because they could not rent apartments.”

By the early 1960s, there were only 5,000 residents of Chinatown, mostly elderly men who lived on the blocks clustered around Columbus Park. The surrounding neighborhood was in decline, the Irish having moved away decades prior, and the Jewish and Italian immigrants who had come to define the Lower East Side having already begun to flee.

Without the 1965 Immigration Act, Chinatown would have faded away. But as tens of thousands of immigrants began flocking to New York City, the empty tenements and boarded up storefronts filled with families and small businesses, and the old garment factories once again hummed with the sound of sewing machines, this time manned by a workforce of Chinese immigrant women. Chinatown mushroomed over the next two decades, spreading until it was bordered by Soho and Tribeca to the west and the East River on the opposite end, with Delancey Street settled as the line delineating Chinatown from the Lower East Side.

According to the scholar Peter Kwong, this expansion ended by the mid-1990s, halted by the revitalization of the neighborhoods bordering Chinatown. The events of 9/11 further destabilized the neighborhood, located as it was so close to the Financial District, but, as Kwong put it in the New York Times: “The root cause of the decline of Chinatown predated the 9/11 attack; the collapse of the garment industry and years of harm done by real estate speculation had already taken their toll on the community.”

I didn’t know any of this history when I came to New York City in 2004 and moved into an apartment in central Harlem, itself a neighborhood in flux, where I paid $750 each month to live with two roommates. Like most, all I knew was that Chinatown had a lot of Chinese people, and that fact alone drew me to the neighborhood on evenings after work and on weekends. Having grown up in south Texas, I had moved in large part out of a desire to live somewhere where I could feel a sense of belonging that I hadn’t had as a child.

People expressed a lot of strange beliefs about Chinatown, ideas that became increasingly bizarre to me the more time I spent in the neighborhood. It’s often described as “gritty,” “dirty,” or “exotic.” Other commonly used descriptors are “authentic” and “unchanging.”

Those descriptions made me cringe, not only because of the casual racism underpinning them, what the scholar Lisa Lowe, in her book Immigrant Acts, calls “the gaze that seeks to exoticize [Chinatown] as antiquated artifact,” but because they miss an essential truth of the neighborhood: that what is thought of as exotic or authentic to some is simply the minutiae of life for others. (...)

And yet I too was guilty of a sort of fetishization, for I had my own foolish, romantic notions of the neighborhood, tinged with nostalgia for a home I had never had. Eating dumplings wasn’t just a meal — it was embracing my culture. During the four years that I worked in the neighborhood, I was quickly disabused of these notions by the everyday life and reality that I saw around me. I began to understand that Chinatown was a vibrant neighborhood of the present, the kind that the urban planning writer and activist Jane Jacobs described as displaying the “exuberant diversity” she believed characterized the best cities, the ones that thrived.

by Esther Wang, Longreads |  Read more:
Image: Katie Kosma

Leo Berne, Hunger, February 2016
via:

The State of the Menswear Union

[ed. I have more nice clothes to wear than I have opportunities to wear them.]

A man in his early thirties relaxes outside a barber shop on Crosby Street one humid New York afternoon. He scrolls intently through his iPhone, the square ice cubes in a cup resting by his elbow tinted brown by what little remains of his coffee.

He looks great: thick black dreads piled in a haphazardly perfect manner atop his head, an off-white linen shirt that's both stylish and functionally appropriate for the unrelenting heat, baby blue pants that hug — not squeeze — his body, canvas sneakers, no socks. He's the modern man, cool and comfortable and aesthetically aware.

The other guys wandering down Crosby Street look similar, many with skinny black jeans rolled at the ankles, the better to show off bright new Nikes. The coolest dude pairs pants that have huge holes in the knees with an oversized white tee under button-down chambray, plus a flat-brim hat. He disappears into a building that's under construction. Even the worst-dressed men — five bros loudly recounting the previous night's exploits — look pretty good. They make their athleisure tracksuit pants work with the simple shirts and sneakers they chose after waking up hungover that morning.

Yes, this is Crosby Street, one of the most fashionable blocks in New York City, where signs herald the imminent opening of a Rick Owens boutique and idle stoop-sitters could be professional models. Guys should dress well here. But the focus on clothes has spread far, far beyond Soho.

We're witnessing a fascinating, exciting, very specific moment, a "choose-your-own-adventure time of menswear, where guys are letting their freak flags fly," in the words of Jian DeLeon, senior menswear editor for trend forecasting company WGSN. Information has never been more readily available, and online shopping has lowered the barrier to entry significantly. (...)

Traditionally, conversations about men's style have been quieter than the ones about women's, happening constantly but only if you knew where to look. In the last decade or so, though, they've become easier to find. The discussion moved online in the mid-aughts, when forums like Ask Andy About Clothes and blogs like The Sartorialist started to enter the consciousness of a certain type of man. Guys geeking out about fashion could find each other, sharing tips about designers, history, whatever. Age mattered less than disposition. On the message boards and in the comments sections, no one knew or cared who was a teenager in Iowa or a thirtysomething in Manhattan. The only thing that mattered was that the poster had a smart sense of style, which meant focusing on timeless quality rather than of-the-moment trends, and offered an intelligent opinion.

Fast forward a few years, and the menswear conversation shifted to Tumblr, where you could find an endless stream of guys dressing to impress, often to the point of absurdity. This became known as #menswear, a reference to the Tumblr hashtag, and was epitomized by images of wannabe tastemakers peacocking at Pitti Uomo. (The mockumentary The Life of Pitti Peacocks features garish paisley suit jackets, absurd floral-print pants, and more in just its first half-minute; it illustrates the see-and-be-seen insanity perfectly, as do so many Instagram photos.) In response, satirical Tumblrs like Kevin Burrows and Lawrence Schlossman's Fuck Yeah Menswear cropped up, injecting a bit of fun into the increasingly self-serious #menswear movement. It was, after all, just clothes.

The ultimate distillation of this scene came with Four Pins, the Complex-owned site headed by Schlossman and his team of rabble-rousers. They took aim at anything and everything, mixing biting commentary with long explainers that placed trends in historical context. Readers had their laughs while learning about the clothes they were wearing, or at least aspired to own.

When Four Pins shut down in January, it felt like the end of an era. "It wasn't like someone was going to make their own Four Pins," says Schlossman, who now works as a brand director at the resale site Grailed. "It was more like if Four Pins can't succeed, then maybe this movement is done. It wasn't that the door was open. It was like the door was slammed shut."

Green agrees. "If ever there was a menswear punk-rock era, where it was like the Wild, Wild West — a bunch of uncool dudes talking shit and building this following that no one had ever really seen before, having fun, and making fun of these designers and men's clothing — that was it," he says. "As annoying as some of those guys are and as corny as some of them are, I think a lot of them are really witty and really smart. We made fun of it at the time, but I gotta say, I think it was special."

While #menswear might be dead, menswear has never been bigger. Online menswear sales in particular grew faster than every other category between 2010 and 2015, and show no signs of slowing down; research firm Euromonitor International speculates that the global menswear market will rise from $29 billion in 2015 to $33 billion by 2020. (By comparison, the women's clothing market actually declined by 0.9 percent annually between 2011 and 2016, according to research company IBISWorld.) One-third of men reported they'd like to spend more money on clothes in 2016 than they did in 2015, according to Rupa Ghosh, a retail analyst at Mintel.

Menswear is moving to the masses.

by Noah Davis, Racked |  Read more:
Image: Lindsay Mound

Why Luck Plays a Big Role in Making You Rich

Robert Frank was playing tennis one cold Saturday morning in Ithaca, N.Y., when his heart stopped. Sudden cardiac arrest—a short-circuit in the heart’s electrical signaling—kills 98 percent of its victims and leaves most of the rest permanently impaired.

Yet two weeks later, Frank was back on the tennis court.

How did this happen? There was a car accident a few hundred yards away from where Frank collapsed. Two ambulances responded but the injuries were minor and only one was needed. The other ambulance, usually stationed five miles away, reached Frank in minutes.

“I’m alive today because of pure dumb luck,” says Frank, a 71-year-old economics professor at Cornell University. Or you can call it a miracle. Either way, Frank can’t take credit for surviving that day. From coincidence or the divine, he got help. Nine years later, he is still grappling with the concept of luck. And, applied to his field of economics, it’s led him into some dangerous territory: wealth.

Talk about luck and money in the same sentence, he says, and prepare to deal with “unbridled anger.” U.S. Democratic Senator Elizabeth Warren of Massachusetts and President Barack Obama were pilloried for suggesting rich Americans should be grateful for what Obama called “this unbelievable American system that we have that allowed you to thrive.” Even referring to the wealthy as “the luckiest among us”—as I did a few months ago—can spark some unhinged reactions.

“There are people who just don’t want to hear about the possibility that they didn’t do it all themselves,” Frank says.

Mild-mannered and self-effacing, he isn’t about to tell the rich “you didn’t build that,” as Obama did (and likely regretted). Frank’s new book, Success and Luck: Good Fortune and the Myth of Meritocracy, is a study in diplomacy. Combining memoir with academic research, it’s an earnest argument that all of us—even the rich—would be better off recognizing how luck can lead to success. (...)

Winner-take-all markets

For more than 20 years, Frank has been studying the rise of winner-take-all markets—fields of fierce economic competition in which only a few top performers take home the bulk of the rewards. More and more of the economy is starting to look like sports or music, Frank says, where millions of people compete and the winners are paid thousands of times more than the runners-up.

Another example he gives is the humble neighborhood accountant. In the 20th century, the typical accountant was competing against nearby rivals. If you worked hard, there was a good chance of winning over the most lucrative clients in town. Today, neighborhood accountants face much more competition: Sophisticated global accounting firms can swoop in and sign up their biggest clients. Tax preparation, an accountant’s bread and butter, has been mostly swallowed up by two large players—H&R Block for storefront preparation and TurboTax online.

“Technology has enabled people who are best at what they do to extend their reach geographically,” Frank says. TurboTax was initially just one of a number of tax software programs on the market. But, as happened with search engines and social media sites, it was able to win over customers early, and its competitive advantage snowballed. TurboTax now dominates online tax preparation—thousands of local accountants replaced by one company.

In these winner-take-all markets, luck can play a huge role. A simulation conducted by Frank shows how: Imagine a tournament in which every contestant is randomly assigned a score representing their skill. In this simple scenario, the most skilled person wins. The more competitors there are, the higher the score the winner will likely have.

Now introduce chance by randomly assigning each participant a “luck” score. That score, however, can play only a tiny role in the ultimate outcome, just 2 percent compared with 98 percent allotted to skill. This minor role for chance is enough to tilt the contest away from the top-skilled people. In a simulation with 1,000 participants, the person with the top skill score prevailed only 22 percent of the time. The more competition there is, the harder it is for skill alone to win out. With 100,000 participants, the most skilled person wins just 6 percent of the time.
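Frank's tournament is simple enough to reproduce at home. Here is a minimal sketch in Python; the uniform score distributions, the seed, and the function name are assumptions of mine, since the article doesn't say exactly how Frank drew the scores:

```python
import random

def win_prob_of_top_skill(n_contestants, luck_weight=0.02, trials=500, seed=42):
    """Estimate how often the single most skilled contestant actually wins
    when performance = (1 - luck_weight) * skill + luck_weight * luck."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        # Skill and luck scores drawn uniformly from 0-100 (an assumption;
        # the article does not specify the distributions Frank used).
        skills = [rng.uniform(0, 100) for _ in range(n_contestants)]
        perfs = [(1 - luck_weight) * s + luck_weight * rng.uniform(0, 100)
                 for s in skills]
        # Did the highest-performing contestant also have the highest skill?
        if perfs.index(max(perfs)) == skills.index(max(skills)):
            wins += 1
    return wins / trials

# With 1,000 contestants and luck worth only 2 percent of performance,
# the most skilled contestant wins a minority of the rounds.
print(win_prob_of_top_skill(1000))
```

Shrinking the field (to, say, 10 contestants) lets the top skill score win far more often, which is exactly the crowding effect Frank describes: the bigger the pool, the more likely someone slightly less skilled but luckier edges out the most skilled player.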

Frank writes:

Winning a competition with a large number of contestants requires that almost everything go right. And that, in turn, means that even when luck counts for only a trivial part of overall performance, there’s rarely a winner who wasn’t also very lucky.

Winner-take-all markets can end up creating vast wealth differences between the lucky and unlucky. One person—smart, persistent, but unlucky—struggles, while an equally (or even slightly less) talented and hard-working person gets a lucky break that can reap millions, or billions, of dollars.

by Ben Steverman, Bloomberg |  Read more:
Image: William Andrew/Getty Images

Life at the Nowhere Office

You wake up and wonder: What time is it? Your little touchscreen says 2:54 a.m. Or 7:21 a.m. Or whatever. It is always anytime. And anytime is check-in time. With one ear on your pillow you check the number of likes your latest Facebook post has harvested, the Retweets of your latest birdsong, and then move on to the Inbox. After eyeballing what awaits in the day ahead, you sift through the messages and rank them by importance, returning to them when showered and fully awake.

Wherever you are, you respond to the most urgent requests and make sure to nowhere yourself by deleting your “sent from my iPhone” signature. You could be at your desk already, right? No one needs to know that you are two blocks away. You don’t want to convey that you are on the run and not giving them your full attention. So with some digital camouflaging you say: I am in a place where I can give you due consideration. At no point are we on the train, in a cafe, in bed, in the restroom. Except of course we are.

Many of us recognize this morning routine. It might seem mundane, but like any regime, it has an aesthetic. In fact, this vignette reflects the ideals of het nieuwe werken, a Dutch term meaning “the new way of working,” a reorganization of the office that promotes flexibility and “efficient” design, combining the fruits of a digitally connected world and organically formed social structures. Hailed as a “silent revolution,” it purports to liberate creative and entrepreneurial potential that would otherwise go untapped. The modern “inspired” workspace serves as essential infrastructure for this new organization of work. Not only does it accommodate these new rhythms; it makes them look good.

The interconnected values of “frictionless” dynamism, notional flattening of managerial hierarchies, and sociability that define contemporary professional work are mirrored in the spaces and gadgets that allow us to function in this rootless, diffuse way. A quick trawl through some design blogs or a richly illustrated book like The Creative Workplace quickly reveals a number of conventions of the twenty-first-century inspired workspace: open plans, glass walls, communal table-desks, high ceilings. Likewise, and thanks largely to Apple, we prefer our mobile devices shiny and monochrome. Industrial touches like unfinished plywood, subway tile, exposed brick, and Edison bulbs round out these spaces and imbue them with an aura of artisanal making, attempting to give material form to production that in all likelihood is relegated to computer screens.

Across these diverse spaces, the two most consistent design principles are openness and a banishment of personal clutter. The new office presents itself as the interior design equivalent of everyone’s friend. It is comfortable and always available, a temporary platform onto which workers alight for meetings and some deskwork before fluttering off to another meeting, the home office, another job. But importantly, leave no trace behind. Remember: You have never been here.

The luxury minimalism that defines the inspired workspace is an extension of a broader aesthetic movement that Kyle Chayka has termed Airspace. Airspace is a new International Style of sorts, a set of design conventions that has spread across the globe thanks to the homogenization of taste facilitated by social media. “It is the realm of coffee shops, bars, startup offices, and co-live/work spaces that share the same hallmarks everywhere you go ... Minimalist furniture. Craft beer and avocado toast. Reclaimed wood. Industrial lighting,” Chayka writes. Indeed, all of the spaces he lists are, explicitly or not, workspaces for the mobile, constantly collaborating knowledge worker. Airspace is essentially diffused workspace because the office has become a mobile home. We take it with us everywhere we go.

How freeing this increased mobility is remains open to debate. Flexibility is a sharp double-edged sword within contemporary work culture. On the one hand, workers often do prefer the ability to drop in and out of the physical office: Recall the outcry when Yahoo! CEO Marissa Mayer clamped down on telecommuting at the company. On the other hand, as Nikil Saval and others have noted, it’s no coincidence that the “dynamic” workplace has arisen at a time when professional work has become increasingly insecure. Dynamism and mobility are meant to be liberating, but the darker connotations of cleared desks and ephemeral presence lurk in the shadows of the creative workplace’s imported espresso machines and Aeron chairs. (...)

Perhaps no individual did more to dismantle the physical barriers of these rooms than Robert Propst, an energetic American polymath who in the 1960s began advocating for a spatial organization of the workplace that most of us would recognize as the open-plan office, developing prototypes for what he called “Action Offices.” Nikil Saval writes in Cubed: A Secret History of the Office that “Propst was one of the first designers to argue that office work was mental work and that mental effort was tied to environmental enhancement of one’s physical capabilities.” He essentially pioneered the very idea of the modern creative workspace.

The person Propst had in mind for his new, open, dynamic workspaces was the “knowledge worker,” a social figure newly articulated by management theorist Peter Drucker. As Saval tells it, knowledge workers were defined not only by their white-collar job titles, but also by a strong belief in their own mobility. Of course “mobility” didn’t have the association with precarity then that it does now. At the time, it was an exciting idea; each worker was in possession of his unique intellectual skillsets, untethered from specific institutions and free to pounce after each new opportunity as soon as it appeared.

This sense of mobility helped undermine traditional bureaucratic hierarchies and meshed perfectly with Propst’s design principles aimed at facilitating democratic, serendipitous encounters through diminished barriers and un-hierarchical gathering spaces like social tables rather than desks. Soon enough, the utopian design philosophy of Propst was co-opted and rationalized by the furniture industry, becoming a goldmine for his employer, Herman Miller. It is therefore perhaps unsurprising that Propst’s “Action Offices” eventually morphed into the grim cubicle. And yet, with the advent of het nieuwe werken and its accommodating Airspace aesthetic, the twenty-first century has rebooted his dreams of the open, inspirational, social workplace. Propst 2.0. (...)

Regardless of how we feel about this, there is a shared recognition that if one wants to express oneself in this world of hyped self-management, one needs to be hooked up to the cloud, be it for email, Facebook, or any other social media. This resignation is Heidegger’s technological determinism made manifest. Indeed, how helpless we feel when our passwords don’t work and we are locked out of the system. If we want to get any work done, we can only do so on the terms afforded by technology, which include our ever-dispersing workspaces. (...)

The result is that your office diffuses much like a gas following the laws of entropy. This anywhering of the office renders our attempts to disappear behind out-of-office replies instantly futile. Work will fill the space available to it. And with no space spared, it will find you wherever you are: not just your work office but also your home, your yoga studio, your children’s kindergarten. What is more, in addition to our physical selves, we now have to manage this professional avatar as well. And thanks to the ongoing metrification and financialization of work, we are increasingly stripped of the clutter that makes us us. Our quirks and idiosyncratic features have no use; they either cannot be quantified or would just make us look messy and thus unproductive.

It is here that the controlling nature of the new aesthetic becomes most limpid and palpable: the constant sanitization of our digital selves reflects the homogenized minimalism of Airspace. Such thorough self-regulation enacts our—apparently willing—participation in Lewis Coser’s “greedy organizations,” those that sever an individual’s social ties and “can thrive only if they are able to absorb their members fully and totally within their confines.” It is no coincidence that the pursuit of transparency within contemporary management techniques such as the 360 performance appraisal is replicated in the new aesthetic of the office. In neither place is there room for “dirt,” or “matter out of place,” as the anthropologist Mary Douglas famously put it. On the CEO’s digital dashboard we are (and want to be) little more than a sanitized number, perfectly ordered in sleek spreadsheets, proving how efficient, valuable, and productive we are and how we can be deployed as a resource toward the next project.

by Miya Tokumitsu and Joeri Mol, TNR |  Read more:
Image: Rakic / Shutterstock

Tuesday, September 6, 2016


Mario Sorrenti, Visionaire, Fall 1996
via:

The Drug of Choice For the Age of Kale

The day after Apollo 14 landed on the moon, Dennis and Terence McKenna began a trek through the Amazon with four friends who considered themselves, as Terence wrote in his book “True Hallucinations,” “refugees from a society that we thought was poisoned by its own self-hatred and inner contradictions.” They had come to South America, the land of yagé, also known as ayahuasca: an intensely hallucinogenic potion made from boiling woody Banisteriopsis caapi vines with the glossy leaves of the chacruna bush. The brothers, then in their early twenties, were grieving the recent death of their mother, and they were hungry for answers about the mysteries of the cosmos: “We had sorted through the ideological options, and we had decided to put all of our chips on the psychedelic experience.”

They started hiking near the border of Peru. As Dennis wrote, in his memoir “The Brotherhood of the Screaming Abyss,” they arrived four days later in La Chorrera, Colombia, “in our long hair, beards, bells, and beads,” accompanied by a “menagerie of sickly dogs, cats, monkeys, and birds” accumulated along the way. (The local Witoto people were cautiously amused.) There, on the banks of the Igara Paraná River, the travellers found themselves in a psychedelic paradise. There were cattle pastures dotted with Psilocybe cubensis—magic mushrooms—sprouting on dung piles; there were hammocks to lounge in while you tripped; there were Banisteriopsis caapi vines growing in the jungle. Taken together, the drugs produced hallucinations that the brothers called “vegetable television.” When they watched it, they felt they were receiving important information directly from the plants of the Amazon.

The McKennas were sure they were on to something revelatory, something that would change the course of human history. “I and my companions have been selected to understand and trigger the gestalt wave of understanding that will be the hyperspacial zeitgeist,” Dennis wrote in his journal. Their work was not always easy. During one session, the brothers experienced a flash of mutual telepathy, but then Dennis hurled his glasses and all his clothes into the jungle and, for several days, lost touch with “consensus reality.” It was a small price to pay. The “plant teachers” seemed to have given them “access to a vast database,” Dennis wrote, “the mystical library of all human and cosmic knowledge.”

If these sound like the joys and hazards of a bygone era, then you don’t know any ayahuasca users—yet. In the decades since the McKennas’ odyssey, the drug—or “medicine,” as many devotees insist that it be called—has become increasingly popular in the United States, to the point where it’s a “trendy thing right now,” as Marc Maron said recently to Susan Sarandon, on his “WTF” podcast, before they discussed what she’d learned from her latest ayahuasca experience. (“I kind of got, You should just keep your heart open all the time,” she said. “Because the whole point is to be open to the divine in every person in the world.”)

The self-help guru Tim Ferriss told me that the drug is everywhere in San Francisco, where he lives. “Ayahuasca is like having a cup of coffee here,” he said. “I have to avoid people at parties because I don’t want to listen to their latest three-hour saga of kaleidoscopic colors.”

Leanna Standish, a researcher at the University of Washington School of Medicine, estimated that “on any given night in Manhattan, there are a hundred ayahuasca ‘circles’ going on.” The main psychoactive substance in ayahuasca has been illegal since it was listed in the 1970 Controlled Substances Act, but Standish, who is the medical director of the Bastyr Integrative Oncology Research Center, recently applied for permission from the F.D.A. to do a Phase I clinical trial of the drug—which she believes could be used in treatments for cancer and Parkinson’s disease. “I am very interested in bringing this ancient medicine from the Amazon Basin into the light of science,” Standish said. She is convinced that “it’s going to change the face of Western medicine.” For now, though, she describes ayahuasca use as a “vast, unregulated global experiment.” (...)

The first American to study ayahuasca was the Harvard biologist Richard Evans Schultes, who pioneered the field of ethnobotany (and co-authored “Plants of the Gods,” with Albert Hofmann, the Swiss scientist who discovered LSD). In 1976, a graduate student of Schultes’s brought a collection of the plants back from his field research to a greenhouse at the University of Hawaii—where Dennis McKenna happened to be pursuing a master’s degree. Thanks to McKenna, some B. caapi cuttings “escaped captivity,” he told me. “I took them over to the Big Island, where my brother and his wife had purchased some land. They planted it in the forest, and it happened to like the forest—a lot. So now it’s all over the place.”

Terence McKenna died in 2000, after becoming a psychedelic folk hero for popularizing magic mushrooms in books, lectures, and instructional cassette tapes. Dennis McKenna went on to get a doctorate in botany and is now a professor at the University of Minnesota. When we spoke, he was on a book tour in Hawaii. He had been hearing about ayahuasca use in a town on the Big Island called Puna, where people call themselves “punatics.” “Everybody is making ayahuasca, taking ayahuasca,” he said. “It’s like the Wild West.”

If cocaine expressed and amplified the speedy, greedy ethos of the nineteen-eighties, ayahuasca reflects our present moment—what we might call the Age of Kale. It is a time characterized by wellness cravings, when many Americans are eager for things like mindfulness, detoxification, and organic produce, and we are willing to suffer for our soulfulness.

Ayahuasca, like kale, is no joy ride. The majority of users vomit—or, as they prefer to say, “purge.” And that’s the easy part. “Ayahuasca takes you to the swampland of your soul,” my friend Tony, a photographer in his late fifties, told me. Then he said that he wanted to do it again.

by Ariel Levy, New Yorker |  Read more:
Image: Bjorn Lie

No ‘For Sale’ Sign? Silicon Valley Buyers Aren’t Deterred

Swell-looking home you’ve got here. Ever think about selling it? How about to me, right now?

That is increasingly the approach the house-hungry are using in Silicon Valley, where the number of homes on the market is so small that would-be buyers are driven to desperation. Their solution: seek out homes that are, in theory at least, not for sale.

Sue Zweig grew up in this working-class community, back when people said it was for the newly wed and the nearly dead. Not long ago, when she was out walking her dog, she began to realize things were different. A woman pulled over, asked about houses for sale in the neighborhood and ended up spending 45 minutes poking around Ms. Zweig’s living room and kitchen.

Her four-bedroom house was not on the market then, and it was not on the market a year or so later when another eager buyer showed up. This time, Ms. Zweig, a nurse, and her husband, Steve Zweig, made a deal for $1.375 million, a seven-figure profit over what they had paid in 1987. They moved out of the house last year.

Buyers in Silicon Valley must be aggressive and innovative as well as well-heeled, especially as housing inventory here hits its lowest point in at least 20 years. In San Mateo County, which includes Redwood City, the number of homes for sale in August was 1,184. That is a drop of 62 percent from a decade ago, even as the population increased more than 70,000.

It is a microcosm of a growing national problem. The number of homes on the market in the United States has fallen on a year-over-year basis for the last 14 months, the National Association of Realtors says. When adjusted for population, the inventory of homes for sale is the lowest it has been since modern records started being kept in 1982.

Flushing out people before they are officially ready to sell — by a few weeks or a few years — has obvious benefits for buyers, but sellers say they can profit, too. It streamlines an expensive process that traditionally consumes many months.

“We probably left money on the table,” said Mr. Zweig, a retired digital printer. “But we didn’t have to list it, didn’t have to do open houses, didn’t have to stage it.”

There is a long history among Silicon Valley’s elite of buying houses that are not for sale. Mark Zuckerberg, the billionaire chief executive of Facebook, found a place he liked near San Francisco’s Mission District in 2012 and paid the owner at least twice what it was worth.

People of much more modest means are now echoing his tactics, even if they cannot extend his lavish terms.

“Technology is making people impatient,” said Steve Korn, a retired forklift facility manager who is now a real estate agent here. “No one wants a six-month slog anymore to get a new place or move on from an old place.”

Technology is also fueling this boom in a more direct way. Tech companies, especially Facebook and Google, have plans to build new campuses and hire even more workers.

That means even places like Redwood City, a longtime also-ran to neighbors like Atherton, Menlo Park and Palo Alto, are now hot. That’s a windfall for longtime residents like the Zweigs, who moved to a coastal town 220 miles south and built a new house.

Prices in their old Redwood City development have continued to soar, prompting some wishful dreams among those who remain.

Michele and Mike Sweeney put a “make me move” notice on their 2,060-square-foot house last year. That is a feature that the online real estate company Zillow offers to let owners solicit interest. Their demand was $1.9 million, significantly more than their house was worth.

“We used to say around here, ‘If it hits a million, we’re all selling,’” said Ms. Sweeney, who works for a hospital. “That was not too long ago.”

They were flooded with inquiries but did not make a deal. Now, according to Zillow, their house is rapidly approaching the price they wanted. “I asked my son, ‘Do you want to finish high school in Italy?’” Ms. Sweeney said.

by David Streitfeld, NY Times |  Read more:
Image: Gabrielle Lurie

Pete Wells Has His Knives Out

[ed. Great article, and Pete Wells is a national treasure (his visit to Señor Frog's in Times Square has to be one of the funniest reviews I've ever read). Want to be the food critic for the world's most influential newspaper in the world's most influential city? Here's how you do it.]

Pete Wells, the restaurant critic of the Times, who writes a review every week—and who occasionally writes one that creates a national hubbub about class, money, and soup—was waiting for a table not long ago at Momofuku Nishi, a modish new restaurant in Chelsea. Wells is fifty-three and soft-spoken. His balance of Apollonian and Dionysian traits is suggested by a taste for drawing delicate sketches of tiki cocktails. Since starting the job, in 2012, he has eaten out five times a week. His primary disguise strategy is “to be the least interesting person in the room,” he had told me, adding, “Which I was, for many years. It’s not a stretch.” But he does vary his appearance. At times, he’ll be unshaven, in frayed jeans; in Chelsea, he looked like a European poet—a gray wool suit over a zip-up sweater, a flat cap pulled low, nonprescription glasses. He was carrying a memoir, written by a friend, titled “Bullies.”

Wells had encouraged me to arrive just ahead of him, and to ask for the reservation for two, at nine-forty-five, under the randomly chosen name of Michael Patcher. There was half a chance that I’d be allowed to sit before he showed up. If so, then at least one aspect of the evening would have what Wells calls a “civilian” texture, even if he was recognized. (As he put it, “If we’re very lucky, we might get a bad table.”) But when Wells arrived I was still waiting to sit down. So we stood near the door, at an awkward, congested spot from which we could have reached out and taken a clam from someone’s plate of Asian-Italian noodles.

The front of the room was bare and bright, and filled with thirty-year-olds on backless stools at communal pale-wood tables—a picnic held in a cell-phone store. The noise level reminded me of Wells’s review of a Tex-Mex restaurant: “It always sounds as if somebody were telling a woman at the far end of the table that he had just found $1,000 under the menu, and the woman were shouting back that Ryan Gosling had just texted and he’s coming to the restaurant in, like, five minutes!” Wells is not peevish about discomfort. His columns make a subtle study of what counts as fun in middle age—loyalties divided between abandon and an early night. His expressions of enthusiasm often take the form of wariness swept away: Wells found joy in a conga line at Señor Frog’s, in Times Square. But after dining at Momofuku Nishi he returned to his home, in Brooklyn, and wrote in his notes about “a hurricane of noise.”

Two minutes after Wells arrived at the restaurant, it became clear that he’d been spotted. His friend Jeff Gordinier—a journalist who, until recently, reported on restaurants for the Times—had spoken with me about Wells’s chances of remaining anonymous by referring to a famous contractual demand made by Van Halen: concert promoters were asked to supply the band with a backstage bowl of M&M’s, with the brown ones removed. David Lee Roth, Van Halen’s lead singer, has said that the request was not whimsical. It helped to show whether a contract had been carefully read and, therefore, whether the band’s complex, and potentially dangerous, technical requirements were likely to have been met. Gordinier said that an ambitious New York restaurant’s ability to spot Pete Wells is a similar indicator of thoroughness: “If they don’t recognize who he is, then they are missing a very important detail, and therefore they may not be paying attention to other important details.”

In 1962, Craig Claiborne became the first person at the Times to review restaurants regularly; two decades later, he published a memoir, noting that he had “disliked the power” of being a critic. He added, “It burdened my conscience to know that the existence or demise of an establishment might depend on the praise or damnation to be found in the Times.” Much of that power remains, even as it has seeped away from reviewers of theatre and painting; Wells is a vestige of newspaper clout. And, because successful chefs now often sit atop empires, a single bad review can threaten a dozen restaurants and a thousand employees. When Wells reviewed Vaucluse, on the Upper East Side, he began by identifying the restaurant’s parent company, founded by the chef Michael White and Ahmass Fakahany, a former Merrill Lynch executive: “A critic could run out of new ways to express disappointment in Altamarea Group restaurants if Altamarea didn’t keep coming up with new ways to disappoint.”

The Momofuku Group, run by the thirty-nine-year-old chef David Chang, has in recent years expanded into fast food, overseas restaurants, and a quarterly magazine named Lucky Peach. But Momofuku Nishi was the company’s first full-scale, sit-down restaurant to open in New York in five years. A visit from Wells was a certainty. A copy of the one photograph of him that is widely available online, in which he looks like a character actor available to play sardonic police sergeants, was fixed to a wall in the restaurant’s back stairwell. Chang recently told me that, despite the profusion of opinion online, he still thought of the Times as the “judge and jury” of a new venture, if not the executioner.

In the logjam by the restaurant’s door, a young woman in a dark fitted jacket—later identified as Gabrielle Nurnberger, one of the restaurant’s managers—smiled at Wells, then turned away. Wells said to me, “Look at this,” and we watched as she strode toward the kitchen with her arms down, like a gymnast starting a run-up. (At the equivalent moment of discovery in another restaurant, I saw a manager mouth to Wells’s server “Good luck,” and place a reassuring hand on her arm.) There was increased activity in and out of the kitchen, which was half exposed to the room. We waited a few more minutes, and were then shown to a spot at the edge of the hurricane, against a wall. Our neighbors were taking photographs directly above their bowls of Ceci e Pepe. The dish, a riff on pasta cacio e pepe, using fermented chickpea paste in place of Pecorino, was central to the restaurant’s promoted identity, suggesting technical expertise in the service of amused nonconformity. (Chang told me, later, that he had conceived of the menu as a “Fuck you” to Italian cuisine.) We were given menus with wry footnotes. Wells took off his fake glasses and put on his reading glasses.

Nurnberger became our server. Wells is an unassuming man who has become used to causing a stir, and this can be disorienting: it’s odd to hear him wonder, not unreasonably, if restaurants ever think of bugging his table. But a restaurant can’t openly acknowledge him. A while ago, he happened to sit next to Jimmy Fallon, the host of the “Tonight Show,” at the counter of a sushi restaurant in the Village. Both men were recognized. As Wells recalled it, Fallon “got the overt treatment”: “big smiles and ‘Thank you for coming in’ ” and perhaps an extra dish or two. Wells’s experience was that “every dish of mine was an object of attention and worry before it got to me”—he often has a slower meal than other diners do, because dishes get done again and again until they are deemed exemplary. As usual, his water glass “was always being topped up.” But it was “as if none of this were happening.”

Experienced for the first time, this covert cosseting feels slightly melancholy, like an episode of Cold War fiction involving futile charades and a likely defenestration. Nurnberger was a gracious server but, understandably, not quite at ease. She risked overplaying her role, like Sartre’s waiter in “Being and Nothingness,” who “bends forward a little too eagerly” and voices “an interest a little too solicitous for the order of the customer.” In her effort to help, Nurnberger came close to explaining what a menu was. Rote questions about how we gentlemen were getting on—usually asked of me—had a peculiar intensity. “I’m very reluctant to break the fourth wall,” Wells had said to me earlier, speaking of restaurant staff. “But I wish there were some subtle way to say, ‘Don’t worry!’ ” He sighed—he often sighs—and added, “I can’t honestly say that. Because sometimes they should worry.”

When Wells speaks, his fingers often flutter near his temples, as if he were a stage mentalist trying to focus. He ordered several plates of food; after hesitation, he asked for a glass of white wine. He does not follow Craig Claiborne’s practice, in the nineteen-sixties, of weighing himself every day, but he has begun to think of alcohol as calories that he can skip without being professionally lax. He is not fat, but the job stands between him and leanness: he can’t turn down food. “My body is not my own,” he said.

When dishes arrived, he looked at them sternly for a moment. We talked, or shouted, about his older son’s food allergies, and about a decision, just made at the Times, to have him regularly assess restaurants outside New York. (The first of these reviews, from Los Angeles, appears online on September 6th.) He talked of his earlier career, as an editor at Details, a columnist at Food & Wine, and the dining editor of the Times, when he had opportunities to watch chefs work and ask them questions. In his current role, he’d probably leave the room if someone like Chang turned up at the same cocktail party. “The danger is getting friendly with people you should feel free to destroy,” he said, and then stopped. “That’s not really the word, but you get the idea. People you should feel free to savage, when you have to.” Over my shoulder, Wells could see into the kitchen. At the start of the evening, Chang wasn’t visible, but then he was. “He may have been airlifted,” Wells said. For the critic’s benefit, a chef-commander, summoned from a sister restaurant or a back office, may take over from a lieutenant. Though Chang’s brand is built on unconventionality, he respected the convention of the fourth wall. The two men, who were on friendly terms before Wells became a critic, made eye contact but did not acknowledge each other.

by Ian Parker, New Yorker |  Read more:
Image: Luci Gutierrez

Monday, September 5, 2016


Richard Misrach, Desert Fire #77, 1984
via: