Friday, July 4, 2014

The Fear Factor

In 2010, a respected international team published a study finding that old age generally arrives later than the dependency ratio assumes—if old age is defined as the point at which older people need permanent care, that is, when they are disabled. The demographers Warren C. Sanderson and Sergei Scherbov wrote in Science magazine, “Alternative measures that account for life-expectancy changes”—improvements in health and longevity—“show slower rates of aging than their conventional counterparts,” based on “fixed chronological ages.”

They wrote that chronological age is less useful than life expectancies in predicting national health costs, because “most of those costs occur in the last few years of life.” Sanderson and Scherbov developed a measure they called the adult disability dependency ratio, defined as the number of adults 20 and over with disabilities, divided by the number of adults 20 and over without them. In the United States, this measure will likely remain flat for the next generation, meaning that the cost of caring for the disabled is not likely to skyrocket as a result of a major increase in the number of disabled people.
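The measure they define is straightforward to compute; as a rough sketch, with population figures that are invented for illustration and not taken from the study:

```python
def adult_disability_dependency_ratio(disabled_adults, nondisabled_adults):
    # Adults 20 and over with disabilities, divided by
    # adults 20 and over without them
    return disabled_adults / nondisabled_adults

# Hypothetical figures: 30 million disabled adults, 200 million without disabilities
print(adult_disability_dependency_ratio(30e6, 200e6))  # 0.15
```

A ratio that stays flat over time, as the authors project for the United States, means disability-driven care costs grow no faster than the adult population itself.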

John Shoven, a Stanford economist, takes that idea a step further: in a scholarly paper called “New Age Thinking,” he argues that age should be defined differently from the universal convention of years since birth. “The measurement of age with different measures is not like choosing between measuring temperature on a Fahrenheit or Centigrade scale,” he warned. The reason to change how age is measured is that the connection between the universal definition of age and the alternatives he proposes is constantly changing. Because of advances in nutrition, sanitation, and other factors, as well as health care, someone who has lived a long time is no longer as old as his or her numerical age once indicated.

A man born in 1900 was expected to live until he was 51 ½ and had less than a 50 percent chance of living until he reached 65. A man born in 2000 is expected to live until he is 80 and has an 86 percent chance of reaching 65. That dramatic advance in longevity indicates that knowing how many years a person has been alive tells only so much about the person’s risk of dying.

Shoven proposes that instead of measuring age backward, as in years since birth, we measure it forward, as in years until projected death. One option is to measure age by mortality risk. A 51-year-old man in 1970 had the same mortality risk (a one percent chance of dying within the year) as a 58-year-old man in 2000: in one generation, longevity advanced by seven years for that level of risk. Another option is to measure age by remaining life expectancy, a more accessible measure because it is computed in years rather than as a percentage. In 1900, a man who reached 65 had a remaining life expectancy of about 13 years. In 2000, a man who reached 65 had a life expectancy of about 21 years.

Measuring backward yields starkly different results from measuring forward. “Consider two alternative definitions of who is elderly in the population,” Shoven writes, “those who are currently 65 or older and those who have a mortality rate of 1.5 percent or worse.” In 2007, when he wrote this paper, the two definitions coincided: 65-year-olds had a mortality rate of about 1.5 percent. According to the U.S. Census, the population of those who are 65 or older will increase from about 12.5 percent of the population to about 20.5 percent in 2050. But “the percent of the population with mortality risks higher than 1.5 percent (currently also 12.5 percent of the population) never gets above 16.5 percent,” because of what James Fries of the Stanford School of Medicine called “the compression of morbidity”—the tendency of illnesses to occur during a short period before death if the first serious illness can be postponed. That number “is projected to be just slightly below 15 percent and declining by 2050.”

By the conventional measure of years since birth, the population considered elderly is expected to grow by 64 percent. By Shoven’s measure, on the other hand, it is expected to grow by just 32 percent. “The point,” he says, “is the great aging of our society is partly a straightforward consequence of how we measure age.”
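Both growth figures follow directly from the population shares quoted above; a quick arithmetic check (shares are percentages of the total population, as given in the piece):

```python
def percent_growth(start_share, end_share):
    # Growth of a population share, expressed as a percentage of its starting value
    return (end_share - start_share) / start_share * 100

# Conventional measure: the 65-and-over share rises from 12.5% to 20.5%
print(round(percent_growth(12.5, 20.5)))  # 64

# Shoven's measure: the share with mortality risk of 1.5% or worse peaks near 16.5%
print(round(percent_growth(12.5, 16.5)))  # 32
```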

To Laura Carstensen, a psychologist who directs the Stanford Center on Longevity, the striking advance in lifespan requires “us to answer a uniquely twenty-first-century question: What are we going to do with super-sized lives?” In her book A Long Bright Future, she envisions a transformation in American culture and society that would “expand youth and middle age” as well as old age, in “a new model for longer life” that would “harness the best of each stage at its natural peak.”

She proposes that young adults should ease into the work force, “working fewer hours during the years that they’re caring for young children, completing their educations, and trying to find the right careers.” Around 40, full-time work life would begin, when people “have developed the emotional stability that guides them as leaders.” Older workers, rather than “vaulting into full retirement on their sixty-fifth birthdays,” would continue to work for more years but for fewer hours, and retirement “could be the pinnacle of life, rather than its ‘leftovers.’ ”

Carstensen’s proposal rests on findings in her work about the capabilities of older workers. She learned that they are generally more stable emotionally than younger workers and better at dealing with stress, and that while younger workers, by and large, pick up new information faster, older workers often have wider knowledge and more expertise. One important study by a group at the Rush University Medical Center casts doubt even on the cognitive advantage of younger workers. The decline in cognitive processing speed found in older workers turns out to be negligible when people who later developed Alzheimer’s disease are removed from the group studied. That would include one out of every nine people who are 65 and over.

Carstensen and others are building on the work of the late Robert N. Butler, a psychiatrist whose biographer described him as a “visionary of healthy aging.” The founding director of the National Institute on Aging at the National Institutes of Health, Butler believed that the extension of American lives—especially the extension of the healthy years—requires new thinking about some of America’s basic institutions. “Many of our economic, political, ethical, health, and other institutions, such as education and work life, have been rendered obsolete by the added years of life for so many citizens,” Butler wrote in his 2008 book, The Longevity Revolution.

Butler was a realist about the discrimination that older Americans can face in addition to declines in physical capability, health, and cognitive ability. He coined the term ageism for this form of discrimination and catalogued how it can manifest itself in problems finding appropriate work, housing, transportation, and satisfying other basic needs. But Butler was an optimist, convinced that many healthy older Americans represent not a liability but a great asset of experience, skill, and drive that the country should learn how to exploit.

In a nation whose motto is E pluribus unum, a fundamental disagreement about social policy in recent decades has been about how policymakers should reinforce the mutual support called for in the motto. They could emphasize the value of older Americans working on behalf of children in education, for example, and younger Americans supporting older ones who need help. Or policymakers could strive to ease the allegedly large conflict between generations over the allocation of scarce resources. The shorthand for this difference of opinion in our splintered political culture is “warfare” versus “interdependence” between the boomer generation and the generations that follow. Our emphasis should be on generational interdependence.

by Lincoln Caplan, American Scholar |  Read more:
Image: David Herbick

[ed. I always keep a fresh supply of Simply Lemonade on hand (because it's so good!). Enjoy your 4th of July.]

Zero 7



[ed. See also: this excellent Skeye cover.]

David Hockney, A Lawn Sprinkler, 1967
via:

Jessica Barthel
via:

Thursday, July 3, 2014

Out West, There’s a Showdown Between Green Chili And Kalua Pig

This is review No. 8 of 16 in the first round of our competition. Each review will compare four burritos, with my favorite advancing to Round 2.

I’m rating the burritos on five attributes, each worth 20 points. The scoring will resemble a standard bell curve; the very highest totals will be reserved for exceptional burritos, while a score of 60 is still a burrito I would want to eat regularly.

The final competition in the West is a face-off among restaurants from the far corners of the region, from kalua pig in Hawaii to green chili and pork in El Paso, Texas.

Kono's, Haleiwa, Hawaii

I landed at the airport in Honolulu, my first time in Hawaii, and headed straight to the marketplace in historic downtown Haleiwa on the North Shore of Oahu island. Tchotchkes at the airport’s shops spoke of the joys of “island time,” as had the websites for hotels and rentals I’d seen before I came. Although I was in a hurry, just 36 hours on the island, I was determined to slow those hours down and make the most of them. I broke out of the oppressive Honolulu traffic and made it to Haleiwa. The air was warm and thick with humidity, slowing me down already.

Inside Kono’s, surfboards from the young all-stars of the North Shore (John John Florence, Kiron Jabour, Eli Olson) hang on the wall, with handwritten dedications to the restaurant and its stellar burritos. A giant blackboard behind the counter describes the menu, and a pulley system takes order tickets from the counter to the kitchen.

I ordered a lemonade and a Pig Bomber. The lemon juicer was broken, so there was no lemonade. The receipt machine was broken, too, though I was probably the only patron in weeks to request a receipt. Kono’s was out of iced coffee as well, so I settled on a milkshake, which came in tropical fruit flavors and the standard chocolate, vanilla and peanut butter. I took a seat outside and waited. When a voice on the 1950s drive-in-style microphone called out for “the person who ordered a shake a few minutes ago,” I went inside to find out the restaurant was also out of ice cream.

It seemed a small price to pay for island life. Who was I to be frustrated? I was there for the burrito, which made it to my picnic table without incident.

The Pig Bomber was filled with burgundy shreds of kalua pig, caramelized onions, sweet and tangy guava barbecue sauce, jack and cheddar cheese, and plain rice. Unfortunately the distribution of ingredients was a bit erratic, meaning too many bites of plain rice, but the combination of flavors was magical where the fat, round rice kernels had soaked up the bright guava and kalua juices. After nearly two weeks in the land of expertly handmade tortillas, the run-of-the-mill tortillas at Kono’s were clearly utilitarian, and although they did little to enhance the burrito’s flavor, they were better than most grocery-store varieties. The salty shreds of kalua pig were offset by the sweetness of the onions and guava barbecue; even now, back on the mainland, I can close my eyes and taste that flavor combination.

I ate at Kono’s again the next morning, to try a breakfast burrito, before returning to Honolulu to catch my flight. The Breakfast Bombers are stuffed with gobs of fluffy egg, kalua pig and heavily spiced potato; I added avocado for good measure. I can understand why it’s a favorite among surfers in need of a heavy dose of calories, but it’s also delicious. (...)

Rancho Bravo, Seattle, Washington

Like so many, it started as a taco truck. Rancho Bravo then added a brick-and-mortar spot in downtown Seattle, taking over where a fast-food chain had been. Employees use the old drive-thru, indicated by the wraparound driveway and remnant “Clearance 8’ 10” ” sign, for parking. For several blocks in either direction, hip bars and coffee shops line the street on one side and a lush park sits on the other; Rancho Bravo seems to be from a different era. There was no sign anywhere on the building indicating the occupant, but judging by the crowd, its location is not a secret.

The Pacific Northwest knows Asian flavors almost the way the Southwest knows Mexican ones. It seems like there’s bánh mì on every block, pho on every corner and hand-pulled Chinese noodles available 24 hours a day. It’s not the same with good Mexican food, and our Burrito Selection Committee struggled while choosing a burrito in Oregon or Washington. Online sources varied wildly: Some lauded Rancho Bravo as the real deal in an area with few options, but others begrudgingly accepted it as good enough only to sop up the alcohol at the end of a long night. I had no idea what to expect.

I placed my order and walked toward the end of the counter to pick out salsas. Before I had time to make sense of the four on offer (two red, two green — a hot and mild version for each color), my order was ready. It couldn’t have been more than 30 seconds – impressive, though slightly disconcerting. I got a tiny paper cup of each salsa and took a seat at a table outside to catch the last sunbeams before the night cold set in.

The Bravo Burrito comes with sour cream, grilled onion, cheese, rice, pinto beans, cilantro and tomato. I ordered it with al pastor, which was cut in tiny little cubes and charred all around. It was also salty, and I’m not sure where the pineapple in the marinade disappeared to, but I washed each bite down with naturally sweet agua fresca de piña, so I didn’t miss it much. The tortilla was griddled, but the burrito started to fall apart before I was halfway through. The grilled onions were brown and flavorful, the pintos flush with juice but heavy on the salt and a little mushy.

by Anna Maria Barry-Jester, FiveThirtyEight |  Read more:
Images: uncredited 

Machines v. Lawyers

Law schools are in crisis, facing their most substantial decline in enrollment in decades, if not in the history of legal education. Applications have fallen over 40 percent since 2004. The legal workplace is troubled, too. Benjamin Barton, of the University of Tennessee College of Law, has shown that attorneys in “small law,” such as solo practitioners, have been hurting for a decade. Attorney job growth has been flat; partner incomes at large firms have recently recovered from the economic downturn, but the going rate for associates, even at the best firms, has stagnated since 2007.

Some observers, not implausibly, blame the recession for these developments. But the plight of legal education and of the attorney workplace is also a harbinger of a looming transformation in the legal profession. Law is, in effect, an information technology—a code that regulates social life. And as the machinery of information technology grows exponentially in power, the legal profession faces a great disruption not unlike that already experienced by journalism, which has seen employment drop by about a third and the market value of newspapers devastated. The effects on law will take longer to play themselves out, but they will likely be even greater because of the central role that lawyers play in public life.

The growing role of machine intelligence will create new competition in the legal profession and reduce the incomes of many lawyers. The job category that the Bureau of Labor Statistics calls “other legal services”—which includes the use of technology to help perform legal tasks—has already been surging, over 7 percent per year from 1999 to 2010. As a consequence, the law-school crisis will deepen, forcing some schools to close and others to reduce tuitions. While lawyers and law professors may mourn the loss of more lucrative professional opportunities, consumers of modest means will enjoy access to previously cost-prohibitive services.

A decline in the clout of law schools and lawyers could have potentially broader political effects. For the last half-century, many law professors and lawyers have pressed for more government intervention in the economy. This isn’t surprising. Lawyers in the modern regulatory state reap rewards from big government because their expertise is needed to understand and comply with (or exploit) complicated and ever-changing rules. In contrast, the entrepreneurs and innovators driving our computational revolution benefit more from a stable regulatory regime and limited government. As they replace lawyers in influence, they’re likely to shape a politics more friendly to markets and less so to regulation. (...)

Five key areas of law now face encroachment by this machine intelligence. Some invasions are imminent, and others more distant but no less likely.

by John O. McGinnis, City Journal |  Read more:
Image: Arnold Roth

This Fish Could Save the Caribbean Coral Reefs


The International Union for Conservation of Nature has released a massive report on the health of Caribbean coral reefs. Based on data collected from 35,000 surveys spanning 42 years, it is the most comprehensive study on the reefs ever published.

The bad news is that coral reefs are declining at a breakneck pace. Only about a sixth of the reefs’ original coverage has survived the last few decades, and the remaining reefs could be edged out in as little as 20 years.

The good news is that there may be a very simple answer: don’t kill parrotfish. The study found that these keystone herbivores disproportionately contribute to the health of their host reefs by feeding on coral-suffocating algae.

The IUCN cited the declining parrotfish population, as well as other algae grazers like sea urchins, as the key driver behind reef collapse. It even trumps the negative effects of climate change on tropical reefs, though that may change over the coming decades.

by Becky Ferreira, Motherboard |  Read more:
Image: Phil's 1stPix

Rude Hooker


From the movie High Road (2011)
[ed. Hilarious. The audio quality is kind of rough. You may need to turn up the sound a bit.]


Candy Chang - Self-evaluation in transit - 2006
Sidewalk Psychiatry
via:

Wednesday, July 2, 2014

Fastest-Growing Metro Area in U.S. Has No Crime or Kids


[ed. Sounds like hell on earth to me. Who'd want to isolate themselves from grandchildren, dogs, young people and families? Different ethnic cultures and businesses? New experiences and new perspectives? A lot of people, apparently. But hey, at least there's no crime (and it's clean!)]  

For Jerry Conkle, life in America’s fastest-growing metropolitan area moves as slowly as the golf carts that meander through his palm-lined neighborhood at dusk. Most days, he wakes early, reads the newspaper, and then hops into his four-wheeled buggy for a 20-mile-per-hour ride to one of the 42 golf courses that surround his home.

“It’s like an adult Disney World,” Conkle, 77, said of The Villages, Florida, whose expansion has come with virtually no crime, traffic, pollution -- or children.

The mix has attracted flocks of senior citizens, making The Villages the world’s largest retirement community. Its population of 110,000 has more than quadrupled since 2000, U.S. Census Bureau data show. It rose 5.2 percent last year, on par with megacities like Lagos, Nigeria, and Dhaka, Bangladesh.

That the most rapidly expanding U.S. metro area is a Manhattan-sized retirement village -- with more golf carts than New York has taxis -- highlights the transformation of the world’s demographic profile. The over-60 set -- which the United Nations projects will almost triple to 2 billion by 2050 -- offers opportunity to marketers and homebuilders even as it confounds governments that must care for an aging populace.

“A lot of communities see seniors as a huge benefit -- they contribute to the tax base and the local economy,” said William Frey, a demographer and senior fellow at the Brookings Institution in Washington. “But these people are going to get older, and they’re going to have health needs and service needs.”

Few have benefited from the spending power of retirees more than H. Gary Morse, who developed The Villages. The Holding Company of the Villages Ltd., owned by Morse and his family, has sold more than 50,000 new homes since 1986, generating $9.9 billion in revenue, according to disclosures in municipal-bond filings.

The Villages, which has rules governing everything from how long children can visit to how many pet fish residents can keep, has helped Morse build a family fortune worth $2.9 billion, according to the Bloomberg Billionaires Index.

In addition to selling homes, Morse, 77, and his family own the local newspaper, a radio station and a television channel.

They also hold a controlling interest in Citizens First Bank, which provides mortgages. The holding company is the landlord of more than 4.5 million square feet of commercial real estate, including dozens of restaurants and retailers.

“They own everything,” said Andrew D. Blechman, author of “Leisureville,” a book about The Villages and other retirement communities that ranks Morse’s as the biggest. “You basically have a city of 100,000 people, owned by a company.”

by Toluse Olorunnipa, Bloomberg | Read more:
Image: Bloomberg

The Power of Two

In the fall of 1966, during a stretch of nine weeks away from the Beatles, John Lennon wrote a song. He was in rural Spain at the time, on the set of a movie called How I Won the War, but the lyrics cast back to an icon of his boyhood in Liverpool: the Strawberry Field children’s home, whose sprawling grounds he’d often explored with his gang and visited with his Aunt Mimi. In late November, the Beatles began work on the song at EMI Studios, on Abbey Road in London. After four weeks and scores of session hours, the band had a final cut of “Strawberry Fields Forever.” That was December 22.

On December 29, Paul McCartney brought in a song that took listeners back to another icon of Liverpool: Penny Lane, a traffic roundabout and popular meeting spot near his home. This sort of call-and-response was no anomaly. He and John, Paul said later, had a habit of “answering” each other’s songs. “He’d write ‘Strawberry Fields,’ ” Paul explained. “I’d go away and write ‘Penny Lane’ … to compete with each other. But it was very friendly competition.”

It’s a famous anecdote. Paul, of course, was stressing the collaborative nature of his partnership with John (he went on to note that their competition made them “better and better all the time”). But in this vignette, as in so many from the Beatles years, it’s easy to get distracted by the idea of John and Paul composing independently. The notion that the two need to be understood as individual creators, in fact, has become the contemporary “smart” take on them. “Although most of the songs on any given Beatles album are usually credited to the Lennon-McCartney songwriting team,” Wikipedia declares, “that description is often misleading.” Entries on the site about individual Beatles songs take care to assert their “true” author. Even the superb rock critic Greg Kot once succumbed to this folly. (...)

For centuries, the myth of the lone genius has towered over us, its shadow obscuring the way creative work really gets done. The attempts to pick apart the Lennon-McCartney partnership reveal just how misleading that myth can be, because John and Paul were so obviously more creative as a pair than as individuals, even if at times they appeared to work in opposition to each other. The lone-genius myth prevents us from grappling with a series of paradoxes about creative pairs: that distance doesn’t impede intimacy, and is often a crucial ingredient of it; that competition and collaboration are often entwined. Only when we explore this terrain can we grasp how such pairs as Steve Jobs and Steve Wozniak, William and Dorothy Wordsworth, and Martin Luther King Jr. and Ralph Abernathy all managed to do such creative work. The essence of their achievements, it turns out, was relational. If that seems far-fetched, it’s because our cultural obsession with the individual has obscured the power of the creative pair.

John and Paul epitomize this power. Geoff Emerick—who served as the principal engineer for EMI on Revolver, Sgt. Pepper’s Lonely Hearts Club Band, some of The White Album, and Abbey Road—recognized from the outset that the two formed a single creative being. “Even from the earliest days,” he wrote in his memoir, Here, There and Everywhere, “I always felt that the artist was John Lennon and Paul McCartney, not the Beatles.”

One reason it's so tempting to try to cleave John and Paul apart is that the distinctions between them were so stark. Observing the pair through the control-room glass at Abbey Road’s Studio Two, Emerick was fascinated by their odd-couple quality:
Paul was meticulous and organized: he always carried a notebook around with him, in which he methodically wrote down lyrics and chord changes in his neat handwriting. In contrast, John seemed to live in chaos: he was constantly searching for scraps of paper that he’d hurriedly scribbled ideas on. Paul was a natural communicator; John couldn’t articulate his ideas well. Paul was the diplomat; John was the agitator. Paul was soft-spoken and almost unfailingly polite; John could be a right loudmouth and quite rude. Paul was willing to put in long hours to get a part right; John was impatient, always ready to move on to the next thing. Paul usually knew exactly what he wanted and would often take offense at criticism; John was much more thick-skinned and was open to hearing what others had to say. In fact, unless he felt especially strongly about something, he was usually amenable to change.
The diplomat and the agitator. The neatnik and the whirling dervish. Spending time with Paul and John, one couldn’t help but be struck by these sorts of differences. “John needed Paul’s attention to detail and persistence,” Cynthia Lennon, John’s first wife, said. “Paul needed John’s anarchic, lateral thinking.”

by Joshua Wolf Shenk, The Atlantic | Read more:
Image: Robert Whitaker

The Stress of Ageing

How do I knock off thirty years from my age?

Faust, the protagonist in Johann Wolfgang von Goethe’s famous play, poses this question to Mephistopheles in the chapter Hexenküche (Witches’ kitchen). Mephistopheles provides some pretty good advice – considering that he is the devil and this fictitious exchange takes place in the dark Middle Ages: (...)

Here is the paraphrased essence of the devil’s advice: Seek out a life of moderation, stop being lazy, exercise regularly by ploughing the field and avoid unhealthy foods!

How does the great scholar and scientist Faust respond to these commonsense suggestions?

Thanks, but no thanks. Faust does not like manual labor and is quite happy with his current lifestyle, so he instead opts for plan B – a magic youth potion. (...)

At the 64th Lindau Nobel Laureate meeting, Elizabeth Blackburn reviewed the history of how she and her colleagues identified the role of telomeres and telomerase in the cellular aging process, but also presented newer data of how measuring the length of telomeres in a blood sample can predict one’s propensity for longevity and health. It makes intuitive and theoretical sense that having long telomeres would be a good thing but it is nice to have real-world data collected from thousands of humans confirming that this is indeed the case. A prospective study collected blood samples and measured the mean telomere length of white blood cells in 787 participants and followed them for 10 years to see who would develop cancer. Telomere length was inversely correlated with likelihood of developing cancer and dying from cancer. The individuals in the shortest telomere group were three times more likely to develop cancer than the longest telomere group within the ten year observation period! A similar correlation between long telomeres and less disease also exists for cardiovascular disease.
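The three-fold difference reported here is a relative risk: the incidence of cancer in one group divided by the incidence in a reference group. A minimal sketch of the calculation, with invented counts for illustration (not the study’s actual data):

```python
def relative_risk(cases_exposed, n_exposed, cases_reference, n_reference):
    # Incidence in the exposed group divided by incidence in the reference group
    return (cases_exposed / n_exposed) / (cases_reference / n_reference)

# Hypothetical counts: 30 cancers among 260 short-telomere participants
# versus 10 cancers among 260 long-telomere participants over ten years
print(round(relative_risk(30, 260, 10, 260)))  # 3
```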

Dr. Blackburn was quick to point out that these correlations do not necessarily mean that there is a direct cause and effect relationship. In fact, increasing telomerase levels ought to lengthen telomeres, but in the case of cancer, too much telomerase can be just as bad as too little. Too much telomerase can help confer immortality onto cancer cells and actually increase the likelihood of cancer, whereas too little telomerase can also increase cancer by depleting the healthy regenerative potential of the body. To reduce the risk of cancer we need an ideal level of telomerase, with not a whole lot of room for error. This clarifies that “telomerase shots” are not the magical anti-aging potion that Faust and so many other humans have sought throughout history.

Why is it that telomere lengths are such good predictors of longevity, but too much telomerase can be bad for you? The answer is probably that telomere lengths measured in the white blood cells reflect a broad range of factors, such as our genetic makeup but also the history of a cell. Some of us may be lucky because we are genetically endowed with a slightly higher telomerase activity or longer telomeres, but the environment also plays a major role in regulating telomeres. If our cells are exposed to a lot of stress and injury – even at a young age – then they are forced to divide more often and shorten their telomeres. The telomere length measurements which predict health and longevity are snapshots taken at a certain point in time and cannot distinguish between inherited traits which confer the gift of longer telomeres to some and the lack of environmental stressors which may have allowed cells to maintain long telomeres.

by Jalees Rehman, Lindau Nobel Laureate Meetings | Read more:
Image: Tiziano Vecellio: Three Ages of Man

Stoop Stories

My black friends call it Baldamore, Harm City or Bodymore Murderland. My white friends call it Balti-mo, Charm City or Smalltimore while falling in love with the quaint pubs, trendy cafés and distinctive little shops. I just call it home.

We all love Baltimore, Maryland. It’s one of those places that people never leave – literally. I know people, blacks and whites, who have been residents for 30-plus years and haven’t even been as north as Philly or as south as DC.

Baltimore is one of the few major metropolitan cities with a small-town feel. (...)

I went to all-black schools, lived in an all-black neighbourhood, and had almost no interactions with whites other than teachers and housing police until college, where I got my first introduction to the other Baltimore.

My SAT scores and grades were exceptional for an east Baltimore kid. This gained me acceptance into schools I probably wouldn’t have been admitted to if I weren’t a ghetto kid. Thirsty for a new experience, I wanted to go to an out-of-state college. But my plans were derailed when, months before my high-school graduation in 2000, my brother Bip and my close friend DI were murdered. I became severely depressed and rejected the idea of school.

Most of my family and friends came around in an effort to get me back on track. My best friend Dre hit my crib every day.

I met Dre way back in the nineties. His mom sucked dick for crack until she became too hideous to touch. Then she caught AIDS and died.

Dre’s my age. He had so many holes in his shoes that his feet were bruised. I started giving him clothes that I didn’t want, and he stayed with us most nights. We became brothers.

At 13, Dre started hustling drugs for Bip and never looked back. He loved his job. Dre was organised, he recruited, and he outworked everyone else on the corner. Like a little Bip, Dre beat the sun to work every morning: 4am every day in the blistering cold, with a fist full of loose vials. His workload tripled after Bip passed, but he called every day.

‘D, how you holdin’ up, shorty?’ said Dre.

‘I don’t even know. Man, I been in this house for weeks,’ I replied.

‘Naw, nigga, get out. Get a cut, nigga, go do some shit! Least you still alive!’

‘You right,’ I said as I sat on the edge of my bed. ‘Wet floor’ signs were needed for my tears.

‘What the fuck, Yo, you cry everyday?’ Dre said.

‘Naw, well no, shit. I dunno.’

‘Yo anyway, I’m gonna murder dat nigga that popped Bip. Ricky Black, bitch ass. So go live, nigga, get some new clothes, pussy or sumthin’.’

I picked my head up for the first time in days. I didn’t know my brother even had static with Ricky Black. They played ball together a week before Bip died. But it didn’t matter if Dre killed Ricky, or I did, because someone would eventually.

Murder made Dre smile theatrically; he leapt from his seat. ‘Nigga, I keep the ratchet on me,’ he said, lifting his sweatshirt to show me the gun gleaming on his waist.

I told him he was crazy, but I didn’t care. I wouldn’t commit that murder – I’m not a killer. Or am I? I am capable of hate, and I am a direct product of this culture of retaliation – a culture that won’t let me sleep, eat or rest until I know that Bip’s killer is dead.

‘Be careful,’ I said.

‘You should think about school, D,’ said Dre on his way out the door. ‘Bip would like that.’

He was right. My brother always wanted me to attend college: I owed Bip that.

I decided to stay in state to be close to family, so I attended Loyola University, a local school on the edge of the city.

I always thought college would be like that TV show, A Different World. Dimed-out Whitney Gilberts and Denise Huxtables hanging by my dorm – young, pure and making a difference. I’d be in Jordans and Jordan jerseys or Cosby sweaters like Ron Johnson and Dwayne Wayne, getting As and living that black intellectual life on a beautiful campus. No row homes, hood-rats, housing police or gunshots: just pizza, good girls and opportunity. I could even graduate and be ‘The Dude Who Saves the Hood!’ (...)

I wore six braids straight back like the basketball player Allen Iverson, real Gucci sweat suits, and a $15,000 mixture of my and Bip’s old jewellery. The other students looked at me like I was an alien. I’d walk up on a student and clearly say: ‘Excuse me, where is the book store?’ And they’d look back with a twisted face, like: ‘I don’t understand you. What are you saying?’ And I had this dance with multiple students every day until I mastered my ‘Carlton from The Fresh Prince of Bel-Air’ voice.

by D Watkins, Aeon | Read more:
Image: Stacey Watkins

Tuesday, July 1, 2014

Rickie Lee Jones

Hippie Roots & The Perennial Subculture

In 1906 Bill Pester first set foot on American soil, having left Saxony, Germany, that same year at age 19 to avoid military service. With his long hair, beard and lebensreform background he wasted no time in heading to California to begin his new life.

He settled in majestic Palm Canyon in the San Jacinto Mountains near Palm Springs, California, and built himself a palm hut by the flowing stream and palm grove.

Bill spent his time exploring the desert canyons, caves and waterfalls, but was also an avid reader and writer. He earned some of his living making walking sticks from palm blossom stalks, selling postcards with lebensreform health tips, and charging people 10 cents to look through his telescope while he gave lectures on astronomy.

He made his own sandals, had a wonderful collection of Indian pottery and artifacts, played slide guitar, lived on raw fruits and vegetables and managed to spend most of his time naked under the California sunshine.

During the time when Bill lived near Palm Springs he was on Cahuilla Indian land, with permission from the local tribe, who had great admiration for him. His name even appeared on the 1920 census with the Indians, and in 1995 an American Indian woman, Millie Fischer, published a small booklet about Palm Canyon that included a chapter on Pester.

The many photos of Pester clearly reveal the strong link between the 19th-century German reformers and the flower children of the 1960s: long hair and beards, bare feet or sandals, guitars, love of nature, draft dodging, simple living and an aversion to rigid political structure. Undoubtedly Bill Pester introduced a new human type to California and was a mentor for many of the American Nature Boys.

In 1914 another German immigrant, Professor Arnold Ehret, arrived in California. The philosophy he preached had a powerful influence on various aspects of American culture. Ehret advocated fasting, raw foods, nude sun bathing and letting your hair and beard grow un-trimmed. His "Rational Fasting" (1914) and "Mucus-less Diet" (1922) were literary standbys within hippie circles in San Francisco and Los Angeles in the 1960s.

The husband and wife team of John and Vera Richter first opened their raw-foods cafeteria, the "Eutropheon", in 1917, and during its lifetime it hosted thousands of customers and taught many people how to prepare such raw treats as sun-dried bread, salads, dressings, soups, beverages and many other healthy alternatives to the typical Los Angeles cuisine of the 1920s–1940s.

John’s powerful lectures were attended by many young health enthusiasts, who later went on to become well known health teachers and authors, and Vera’s recipe book was the precursor to many of the modern Live-Food recipe books.

Some of the young employees of the Eutropheon were Americans who had adopted the German Naturmensch and Lebensreform image and philosophy, wearing their hair and beards long and feeding exclusively on raw fruits and vegetables. The "Nature Boys" came from all over America but usually ended up in southern California. Some of the familiar ones were Gypsy Jean, eden ahbez, Maximilian Sikinger, Bob Wallace, Emile Zimmerman, Gypsy Boots, Buddy Rose, Fred Bushnoff and Conrad. This was decades before the Beats or Hippies, and their influence was very substantial. In "On The Road" Kerouac noted that while passing through Los Angeles in the summer of 1947 he saw "an occasional Nature Boy saint in beard and sandals".

But in the spring of 1948 eden ahbez became an internationally recognized personality when his song "Nature Boy" was recorded by Nat King Cole. Photos and a story of eden and his wife Anna appeared in Life, Time and Newsweek magazines that year.

Born in Brooklyn, New York, on April 15, 1908, "ahbez" had walked across America four times, hopped freight trains and lived in a cave in Tahquitz Canyon before he penned his #1 hit tune, which was on the hit parade for 15 weeks.

The song itself was part autobiographical but was also a nod to his German mentor Bill Pester who was 23 years his senior and had been a Nature Boy for decades when eden encountered him in the Coachella Valley of southern California.

Another one of the Nature Boys, Maximillian Sikinger, was born in Augsburg, Germany, in 1913 and spent most of his childhood and youth living wild in the environs of various European cities. Through his wanderings, personal contacts and outdoor living he developed a keen interest in various aspects of natural healing: nutrition, water cure, fasting, sitz baths, deep breathing and sunshine.

Max left Europe in 1935 at age 22, arrived in America, then eventually made his way west to California, where he traveled with the Nature Boys, who valued his introspective and philosophical ideas very highly. Maximillian’s world travels and rugged background had given him deep insight into many of life’s puzzles.

But the one Nature Boy to pass the torch from the old era (circa 1930s–40s) into the 1960s hippie generation was Gypsy Boots.

Born in San Francisco in 1916 to Russian Jewish parents, "Boots" grew up in the San Francisco area, where he quit school at an early age to travel and live a life close to nature. He met Maximillian on the beach at Kelley’s Cove in 1935, and it was then that his life began to change. Boots noted in his autobiography: "It was with Max that I first experimented with fasting and special diets, and also learned much about yoga".

In the 1940’s Boots lived wild in Tahquitz Canyon with all of the Nature Boys, bathing in the cool mountain water, eating fruits and vegetables, sleeping on rocks or in caves, hiking and selling produce in Palm Springs.

In 1953 he married Lois Bloemker, settled near Griffith Park in Los Angeles and had three sons. In 1958 he opened his "Health Hut" in Hollywood, which was a big hit, and shortly thereafter he began his career as a serious health teacher and example of optimum living.

In the early 1960s he appeared on the Steve Allen show over 25 times, to an audience of some 25 million households. Steve Allen had originally started the "Tonight" show, then began his own show featuring guests like Elvis Presley, Jack Kerouac, Frank Zappa and the psychedelic band Blue Cheer.

When the Beatles and Rolling Stones arrived in Los Angeles in the mid-1960s, their "pudding basin" hairstyles seemed tame when compared to a local rock band, "The Seeds", who wore shoulder-length hair, thanks to the influence of Gypsy Boots and his ilk. "Seeds" singer Sky Saxon, a vegetarian, had invented a new type of music: "Flower Punk". Even Jimi Hendrix had a front row seat at a Seeds concert, and the Doors played second bill on a Seeds tour.

When the Love-Ins began in Griffith Park in 1966, some of the Flower Children who were stoned on Owsley acid looked up in the big trees to see Gypsy Boots swinging and climbing from branch to limb, then exclaimed: "What’s that guy on… I’d sure like to have a hit of that!" But Boots’ "high" was always induced by his sun-charged foods like figs and grapes, as well as his fitness regime.

by Gordon Kennedy & Kody Ryan, Hippie.com | Read more:
Images: Palm Springs Art Museum and Gypsy Boots

Misty Copeland