It’s National Eating Disorder Awareness Week, and once again, I’m not participating


(Credit: Salon/Mireia Triguero Roura)

Even vegetarians lapse in California — such is the pull of In-N-Out Burger. My first time in the state, on the ludicrous heels of the foie gras supplement at The French Laundry, I ordered fries and a cheeseburger (Animal Style) at the zoo passing for a fast food chain at Fisherman’s Wharf in San Francisco. That’s a long way of saying that I wasn’t surprised a few days ago when my friend, admittedly not a burger fan, wanted to hit the drive-thru before the end of her trip to the West Coast.

“I still worry about you,” she said, as we left In-N-Out.

She was referring to my eating disorder. I watched the road. She’d ordered food; I’d ordered a Diet Coke.

“I’m the healthiest I’ve ever been,” I said.

That was true, and I made a point of both showing and telling its trueness: I stole a few fries and chattered about how good fountain soda could be but how weird it was going to make me feel since I’d basically cold-turkeyed the habit a decade ago, when I learned from a dietician’s handout that artificial sweeteners caused lab rats to binge.

Still, I felt guilty. Because I couldn’t say to my friend, “I’m fine, but last week I was asking my psychologist why not eating is really such a big deal.” Because I couldn’t say, “I still weigh myself twice a day, but I don’t write it down anymore, so that’s something, right?” Because I couldn’t say, “Ordering a Diet Coke when I’m still full from brunch is a responsible choice for me, even though it might not seem like it.” Because I couldn’t say, “I’m a food-lover (see: The French Laundry; see: In-N-Out) who is resigned to also being kind-of non-actively eating disordered weird about food.” Because I couldn’t say, “Doesn’t that just mean human?”

Because sometimes I eat three balanced meals and three snacks like the meal plan I was prescribed a decade ago when I spent a winter break doing in-patient treatment for my purging anorexia and other times I eat many cups of raw vegetables or popcorn for lunch. Because according to my current therapist, popcorn is a snack, not a meal; popcorn is a problem. Because one of my new year’s resolutions is to work out every day and, before my friend’s visit, I didn’t miss once. Because I go shopping a week later and devour the compliments of saleswomen plying me with sweet nothings about my waist. Because my waist, objectively, as in according to the pink tape measure I still keep in my makeup bag, isn’t actually that good, not even with everyday exercise, not even with cups of raw vegetables, not even with six feedings a day. Because I know what comes up at the end of the month.

The last week of February is designated by the National Eating Disorder Association (NEDA) as National Eating Disorder Awareness Week. Once again I won’t be participating. There are options. I could be walking for the cause along the beach in Santa Monica, showing my solidarity with a leisurely calorie burn. I could be championing my “Recovery Heroes” (though at the time of writing this essay, I comprise 1/452 of the video’s viewership). I could be showing my support on a T-shirt: “Strong beautiful me” printed on the “longer length, easy fit” scoop neck from Aerie, Aeropostale’s sister store; 100 percent of the profits go to NEDA. The organization encourages its Twitter followers to “wear your support for #bodypositivity and #recovery on your sleeve — literally.”

But I won’t do any of that. (Plus: Body positivity, NEDA? Really? I imagine all the ED teens who shop at Aerie, deciding they’ll buy that shirt a size too small and use it as thinspo to starve.)

No matter that my doctors, dentists, therapists, nutritionists, colleagues, students, professors, yoga teachers, best friends, foes, family and husband (and probably even my Chihuahua) know I’ve struggled with food since 1998. No matter that I’ve written articles about the need for better literature about eating disorders or the importance of recognizing body neutrality, the recognition that — whether they’re weak, strong, fat, skinny, toned, tanned, turgid, curvy, zaftig — bodies don’t need to be beautiful. (Didn’t Lucy Grealy say this twenty-something years ago?) (But, also, maybe our culture’s limited vocabulary, the one that leads to overreliance on tired words — beauty, beautiful — is part of the self-as-physical obsession that NEDA, I think, probably would admit to identifying as a problem not unrelated to eating disorders.) No matter that my first novel is filled with eating disorders, just like the first poem I wrote more than 20 years ago was filled with eating disorders. Even though my world has become bigger than the mirror and the scale and the measuring cups and the 00s or the 23s or the XXSs, I still believe in the importance of representing that shrinking world. Because even when the world becomes bigger, it is and it isn’t.

Even without amenorrhea or purging or calves so knotted from the gym that sleep is impossible or a lumpy bed at a decent treatment facility, I still have an eating disorder.

And every year, NEDA Week confuses awareness with absolution. I remember when I first learned about it: I was 16, marking three years with anorexia, and I couldn’t make myself throw up for all the size 1 Bubblegum pants in the world. The Student Council at my high school had taped up images of a waify brunette, fingers jammed down her throat. The pictures were everywhere — in the lunchroom, in the library, in the bathrooms where they taunted me on the backs of stall doors.

I hated those pictures. So did a friend of mine at the time, a boy I knew from art class, who liked to chew and spit oatmeal cream pies over the studio trash. We tore all the posters down and covered them with tempera paint.

We were angry. If I were to walk to the pier in Santa Monica, I would be angry. No, I wouldn’t be angry like I was when I was 16, when an eating disorder felt like my private property and those posters represented an invasion, a ham-fisted trespass that irritated and triggered me. They were, my friend and I decided with our hands streaked with paint, in poor taste. Haha, we laughed, poor taste. We slugged our Diet Pepsis.

Now, though, NEDA makes me angry because it makes me feel trapped, excluded from a club to which I thought I’d earned a lifetime membership. When my friend and I muddied that skinny model on those photocopies, we were experiencing our eating disorders the same way; it didn’t matter that I’d never chewed and spat and that he hated to run. Our intention was the same, very ’90s, very Fiona Apple: making a mistake, being bad on purpose. I’m not that 16-year-old anymore — and I’m not the waif on that poster — but I don’t see myself as the face of NEDA either. None of the people I know with eating disorders joined that club. They got better or they didn’t, but, either way, they kept quiet.

The language of “body positivity” and “recovery,” as optimistic as it is, doesn’t leave room at the table for the vast spectrum of people whose eating disorders — in all their messy, vacillating, tentative particularities — don’t let them identify or want to identify as recovered, who don’t want to disavow the disorder that has shaped their relationships, their worldviews, their lives. Just as the victory of reaching a goal weight proved disappointing, so too do the victories of recovery. I remember leaving the hospital and counting days without a behavior: no binging, no purging, no overexercising. When I purged, I called it a slip and tried to get counting again. It was exhausting and boring, the focus on counting. It felt like the same old thing all over again.

Now, a decade out of the hospital, I’m hesitant to publicly disavow a major part of my life and my identity. Without an eating disorder I wouldn’t have become a writer. Without an eating disorder I probably wouldn’t have learned to revere, to consider, to enjoy food. Isn’t that recovery enough — being able to enjoy the foie gras at The French Laundry, even if I feel a little gross admitting it?

I imagine there are many like me, people with eating disorders whose to-do list doesn’t include recover. Maybe they grew up reading Marya Hornbacher’s “Wasted” like I did. Maybe they internalized Hornbacher’s “state of mutual antagonism” with her eating disorder, the one that leaves you shaking at the end of her memoir, the one of which Caroline Knapp in the New York Times wrote: “This lack of resolution is both the book’s strength and its weakness.” Maybe those loose ends seemed true to the 13-year-old I was when I first decided I wanted an eating disorder — like, if the toil of my illness had to end, at least there would be a healthy kind of perspective, a freak oasis of moderation. I could have an eating disorder, of some kind, forever. It’s terrifying and comforting.

Given NEDA’s “It’s time to talk about it” campaign (#NEDAwareness), I hope that our cultural dialogue about this subject can leave room for people whose relationships with their eating disorders are marked not only by admission but ambiguity. I don’t identify as recovered because to do so would be lying. To be recovered would be to have been given an esophageal transplant, a heart transplant, a brain replacement. Recovery would mean swapping out the consciousness that has given me the strength to get through some shit, to process the world on my own terms.

Today was a regular food day: I weighed myself three times, exercised for 45 minutes, ate three balanced meals, and two squares of milk chocolate for a snack. The last part I feel not-great about, but also it was fine.


Immigration as “leverage against Muslims”: Trevor Aaronson reveals FBI tactics on informants


Immigration and intelligence officials have been working closely together, using a carrot-and-stick approach with potential informants, according to The Intercept, which reported on classified FBI documents.

Worse yet, immigration officials have been quick to discard informants who were no longer valuable, Intercept contributing writer Trevor Aaronson told Salon.

“What hadn’t previously been disclosed is that under the FBI’s guidelines they are required to turn [informants] over to ICE by providing the location of the informant to immigration authorities as soon as that person is no longer useful,” Aaronson said. “It creates a situation where informants are incentivized to provide any sort of information.

“People argue that it would be beneficial for people [to] just make stuff up just to be able to provide their FBI agent with enough information to be able to continue to receive the benefit that they are receiving as a result of this relationship.”

Informants have been able to profit from their connections, meanwhile, without making it seem that they’re actually acting as paid informants.

“The FBI guidelines allow for FBI agents to reimburse all sorts of expenses to FBI informants,” Aaronson said. “These could be medical bills, living costs, they could be car payments. An FBI informant can be paid up to $100,000 per year with just [the approval of] the lower-level special agent in charge, the person who’s running the lower-level office in Kansas City, for example.”

Aaronson noted that one informant was paid nearly $5 million by the FBI and other government agencies. And the FBI and intelligence agencies have played along, hoping it would stop future attacks. The FBI hasn’t publicly claimed to have stopped a terrorist attack through these measures.

Even if no attack was stopped, there are reasons why the FBI continues the practice.

“What the FBI says is that, by finding these guys whose capacity was questionable, they’re creating a hostile environment for terrorism,” Aaronson noted.


Cater-waitering the Oscars: My glamorous night at the Academy Awards, feeding canapés to hungry stars


(Credit: Getty/Salon/Mireia Triguero Roura)

It was 2013 and I was between jobs as a production coordinator on low-budget made-for-TV Christmas movies. I was running out of money fast, but with a decade of food service experience behind me and a few handy connections, I managed to snag a position as a cater waiter for the 85th Academy Awards. It began, as most glamorous assignments do, in a parking garage somewhere south of Sunset Boulevard.

12:30 p.m.: Check-in with a check-up

Working the Oscars is the Holy Grail of catering gigs — it’s reserved for SEAL team-level slingers of hors d’oeuvres. I’m standing in a dark and drafty parking garage, where I’ll be vetted before being allowed anywhere near the Dolby Theatre on Hollywood Boulevard. An exhausted woman in an Arizona State sweatshirt inspects my shirt for proper starching before handing me a questionnaire in which I am asked to detail my gastrointestinal health:

“Have you, in the last two weeks, experienced any of the following?

Check yes or no: Nausea? Diarrhea? Excessive gas? Belching?”

Listen, I understand that no one wants Amy Adams to get infected with embarrassing bathroom issues on her big night, but detailing my bowel habits to a non-medical professional in the concrete shadows of a parking garage steals a small part of my ego that I will never get back.

But it’s Oscar Sunday. I shake it off.

2:02 p.m.: The streets of Hollywood

I’m on a bus filled with the other A-team cater waiters and we’re barreling down Hollywood Boulevard. The streets are barricaded and gawkers ogle us from the other side of eight-foot-high chain link fences. I’ve never been chauffeured down a barricaded street and it dawns on me this is the closest I’ll ever get to feeling like the Pope.

2:42 p.m.: The Loews Hollywood Hotel

The bus drops us off at the back service entrance and we’re ushered through a purgatory of fluorescent-lit back hallways and service elevators until we emerge on the rooftop of the Loews Hollywood Hotel. I pause just outside the service elevator to take in the Hollywood sign gleaming in the near distance. Its crooked white letters seem to tip their hats to me as “Somewhere Over the Rainbow” plays over the PA system. The memory of that ego-snatching gastrointestinal questionnaire begins to melt away under the cloudless blue sky. Hollywood, baby.

A catering manager approaches me. She has the air of a Virgin Airlines stewardess: polished, hospitable and better-looking than anyone else in the vicinity. She hands me a Diet Coke and a bag of Popchips.

“Take a load off,” she says and waves me toward the aquamarine pool where dozens of cater waiters clad in identical black button-down shirts and clunky yet practical Dansko clogs have already settled into chaise longues.

I’m guessing you’ve never considered the question: “Where do good cater waiters go when they die?” But I’m going to answer that question for you anyway: They go to the rooftop of the Loews Hollywood Hotel on Oscar Sunday. It is the Happy Hunting Ground of the Cater Waiter. If you’ve ever had the soul-crushing experience of being forced to smile at the one-percent while offering grilled cheese sliders topped with fig compote on a silver platter, only to go home at the end of the night to face the reality of your empty bank account and shattered Hollywood dreams, then this? This is your very great reward.

I settle into one of the chaise longues. To my right, a James Dean type catches my eye. As if taking a cue from a John Hughes movie, he lowers his sunglasses to get a better look at me. Before sliding his sunglasses back up, he gives me a suave two-fingered salute. Zowie.

On my left, a cherubic young man takes a lint roller out of his pocket no fewer than three times to de-lint his apron. His clogs are spotless, polished to a fine sheen. He looks at me with dead seriousness: “I’m so ready for this.”

I am too. I’m ready for the catering gig of a lifetime.

3:58 p.m.: The back hallways leading to the Dolby Theatre

With 50 or so other catering servers, I make my way through more fluorescent-lit back hallways, only this time they lead from the hotel to the core of the Dolby Theatre. As we march silently toward our hallowed call of duty, maintenance workers pause their sweeping and mopping to applaud us. Damn. Even they know this is going to be a catering tour de force.

4:17 p.m.: Lobby of the Dolby Theatre

It is the calm before the storm. The lobby is empty. I’m holding a heavy platter of lavash bread, topped with smoked salmon, that’s been cut into the shape of an Oscar statuette. To my left and right are cater waiters holding trays of champagne. My anticipation is high. I’ve been watching the ceremonies since I was a little kid and now suddenly here I am, actually at the Oscars. I attempt to steady my nerves.

Chillax, I counsel myself. All you have to do is hand out these weird Oscar-shaped hors d’oeuvres. It’s not like you have to make a speech tonight. I take a deep breath and wait for the A-list to arrive.

And I wait.

And I wait.

I wait for a solid hour holding the tray with my arm bent at a 90-degree angle. My elbows are locking up and I can’t stop thinking about the recessed lighting in the lobby. Are those LED bulbs? LED bulbs make everything look depressingly Soviet. Very unflattering design choice, in my opinion. Why would they make Amy Adams frolic under LED bulbs on her big night? I hear the cater waiter to my left sigh loudly. I look over and notice how the LED lights really bring out the bags under her eyes, make her look —

And just then, the doors swing open. A river of Academy Award-nominated actors streams toward me in an hors d’oeuvre-driven stampede. I had grossly underestimated how hungry celebrities would be after having starved themselves for three weeks to fit into their designer gowns and tuxes.

“Here you go, Joaquin. Reese, would you like a second helping? Of course I have a napkin for you, Nicole!”

Riding high, I waltz through the lobby, smoked salmon in hand. I am popular. I am powerful. I am a purveyor of fine snacks to the stars.

At one point, I spot Jessica Chastain lingering coolly some 20 feet away, utterly uninterested in smoked salmon cut into the shape of an Oscar. Her skin is like fine china and her copper hair frames her face in soft waves. She throws her head back in an easy laugh, exposing a set of perfectly pearly teeth. And even though she’s standing quite still, her rose gold gown appears to sashay around her.

I am suddenly nicked with a razor-sharp self-consciousness. Everything comes into sharp relief: the black necktie I’m wearing that makes me look like a boy, the gazpacho stains on my apron, the blonde hair pulled back into a severe bun with two inches of exposed dark roots, the smudged makeup already melting off my face.

This moment steals another small part of my ego that I will never get back.

You know what? This isn’t the Happy Hunting Ground of the cater waiter after all. Being Jessica Chastain is the Happy Hunting Ground of the cater waiter.

You always have it good until you realize someone else has it better. And this becomes most obvious when mere mortal cater waiters compare themselves to immortal Hollywood stars.

12:30 a.m.: In the back hallways and service areas of the theater

It’s hour 12. My self-respect is sinking to new lows. I’m crumbling under the drudgery of stacking champagne glass racks and scraping bits of dried smoked salmon into trash cans. I’m ready to bash my head in with an Oscar statuette. I concentrate on looking busy and concerned so that no one will ask me to haul a big black bag of garbage out to the dumpster. Oscars or not, garbage is garbage. I’ve already been to the dumpster three times.

1:17 a.m.: Back on the bus to the parking garage

The handsome James Dean type is sitting next to me. His name is Abel and he looks every bit as disheveled, disillusioned and disheartened as I feel.

He looks at me, perplexed. “I have a college degree,” he says.

“What in?” I ask.

He sighs. “Creative writing.”

I pat him on the knee. “I have one too. Theatre.”

He laughs. For the first time today, I feel a true rapport with someone, I feel like myself.

Abel Greenbaum tells me he is working on a novel, but also dreams of one day opening a “sexy haunted house.” A “sexy haunted house” is truly the last thing I am interested in and I pray to God he won’t elaborate. But I tell him it sounds “very cool” anyway. I’m willing to lie a little if it means I can top off this day with a little quality commiseration.

1:36 a.m.: The drive home

I head south on the 101, the lights of downtown LA’s skyscrapers towering ahead of me. The rooftop of the Loews Hollywood Hotel already feels like a dim memory, a lost beacon of hope. Reality, as always, has proven itself to be a lot less glamorous — filled with LED bulbs and compromised egos and the fact that there are people who think that cutting smoked salmon into the shape of Oscar statuettes is a good use of time.

Almost nothing is all it’s cracked up to be. Except, possibly, for being Jessica Chastain. Being Jessica Chastain might be all it’s cracked up to be.


There’s no going back: Racial, ethnic diversity is on the rise in American communities


(Credit: Getty/Ryan McBride)

Racial and ethnic diversity is no longer confined to big cities or the east and west coasts of the United States.

In the 2016 U.S. presidential election, racially and ethnically diverse metropolitan areas were more likely to vote for Hillary Clinton. Whiter metro and rural areas supported Donald Trump. This pattern reinforced the stereotype of “white rural” versus “minority urban” areas.

However, our research shows that the populations of communities throughout the nation are being transformed. The share of racial and ethnic minorities is increasing rapidly and irreversibly. These changes will have major impacts on the economy, social cohesion, education and other important parts of American life.

Nearly all communities are becoming more diverse

In everyday language, “diversity” often refers to racial and ethnic variation. But demographers have developed a mathematical definition of this concept: The greater the number of racial-ethnic groups in the community, and the more equal in size the groups are, the greater the diversity. Using this definition, we have estimated that diversity has increased in 98 percent of all metropolitan areas and 97 percent of smaller cities in the United States since 1980.
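The article doesn’t spell out the formula behind this definition, but one standard index that matches the description is a Simpson-style diversity measure: the probability that two residents chosen at random belong to different racial-ethnic groups. The short Python sketch below is illustrative only; the function name and the example population shares are assumptions, not figures from the researchers’ study.

```python
# Illustrative sketch of a Simpson-style diversity index: the chance that
# two randomly chosen residents belong to different racial-ethnic groups.
# The example shares are made up, not drawn from the study described above.

def diversity_index(shares):
    """Return 1 - sum(p^2) for a list of group population shares."""
    total = sum(shares)
    proportions = [s / total for s in shares]
    return 1 - sum(p * p for p in proportions)

# A community split evenly among four groups scores higher than one
# dominated by a single group, matching the definition in the text.
print(diversity_index([0.25, 0.25, 0.25, 0.25]))  # 0.75: high diversity
print(diversity_index([0.85, 0.10, 0.03, 0.02]))  # about 0.27: low diversity
```

The index rises both when more groups are present and when their shares are more nearly equal, which is exactly the property the demographers’ definition describes.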

The trend is not limited to urban America. Dramatic increases are evident in rural places as well. Nine out of 10 rural places experienced increases in diversity between 1990 and 2010, and these changes occurred in every region of the country. Even within metropolitan settings, the traditional divide between diverse cities and white suburbia has been eroded. Immigrant-rich suburbs are rising around cities like Los Angeles and Washington, D.C., which rival urban enclaves as destinations for Asians and Latinos.

Of course, some communities have changed more than others. Despite these differences, a common trend is for a place’s racial-ethnic composition to change from white dominance to a multigroup mix, with some combination of whites, blacks, Latinos and Asians. This led to an increase in “no-majority” communities — including more than 1,100 cities and towns, 110 counties and four states: California, Texas, New Mexico and Hawaii. In these places, none of the major racial-ethnic groups constitutes as much as 50 percent of all residents.

Immigration and diversity

The racial and ethnic diversity we see today stems from the large and sustained wave of immigration that followed the Immigration and Nationality Act of 1965. Between 1965 and 2015, the proportion of non-Hispanic whites in the country dropped from 84 to 62 percent, while the shares of Hispanics and Asians rose. The Pew Research Center found that these changes were largely driven by immigration, not births. Only one-third of Hispanics and one-tenth of Asians would be living in the United States in 2015 had there been no immigration since 1965. Today, Hispanics account for 18 percent and Asians 6 percent of the U.S. population.

Domestic and international migration during the 1990s and 2000s also contributed to the spread of diversity across American communities. Racial and ethnic minorities tended to move to whiter areas, and white young adults tended to move to more diverse urban areas. Notably, Latino immigrants were first concentrated in just a handful of states such as California, Texas, Florida, Illinois and New York. They started to spread across the country during the 1990s to areas known as “new destinations,” like North Carolina, Georgia and Iowa.

By that time, many Hispanic immigrants had acquired legal status and were free to move to new job opportunities in agriculture, construction and manufacturing in the Southeast and Midwest, as well as service sector jobs in high-amenity vacation destinations, such as in Colorado.

Diversity is now self-sustaining

Despite the initial importance of migration, racial and ethnic diversity is now self-sustaining. Minority groups will soon be maintained by “natural increase,” when births exceed deaths, rather than by new immigration.

This is especially true for Hispanics. According to the Pew Research Center study mentioned earlier, about a quarter of the U.S. population is projected to be Hispanic by 2065, up from 18 percent in 2015. This trend would not change if immigration somehow were halted completely after 2015, the final year in Pew’s study. The sustainability of the Latino population is even evident in rural and urban areas in the Southeast and Midwest, where natural increase in the Latino population, rather than international or domestic migration, is now responsible for more than half of Hispanic growth.

But, how can the share of Hispanics continue to grow without new immigration?

A small part of the answer is that Latinos have slightly more children than non-Hispanic whites. On average, Hispanic women have 2.1 children compared with 1.8 among non-Hispanic white women. However, fertility among Hispanic women declines with each new generation in the United States, so this factor is unlikely to play a major role in the long run.

The main engine of America’s future diversity gains will be “cohort succession,” a process in which older majority-white generations are replaced by younger minority-majority generations. Charts we created from U.S. Census Bureau population projections show that children and young adults, many of whom are the children of immigrants, are currently much more diverse than older adults.

Fast-forward to 2050. Today’s older generations will have died. The more diverse younger generations will have grown up and had their own diverse children and grandchildren.

The seeds for future gains in diversity have already been planted.

Fear and distrust

Many Americans respond to these changes with fear and distrust. Some whites have an aversion to living near people of color. A small number of no-majority places and other highly diverse municipalities and neighborhoods like the Chicago suburb of Calumet Park and the Los Angeles suburbs of Lynwood and Monterey Park have already become more homogeneous, as one minority group has grown and whites have moved away. These places are exceptions to the trend of growing diversity, but other communities may follow suit. Some people want to “turn back the clock” by limiting immigration, a sentiment Donald Trump tapped into during his presidential campaign.

Trump described black and Hispanic communities as impoverished, dangerous inner-city neighborhoods. This was an exaggeration, but it may have stoked rural white voters’ fears of racial-ethnic diversity.

Although all-minority communities are often disadvantaged, communities with high levels of diversity with a mixture of racial and ethnic groups do not fit Trump’s image. Highly diverse communities are more common in coastal states and across the South. They have larger populations and a critical mass of foreign-born inhabitants, both of which contribute to their reputation as comfort zones for minorities and immigrants.

Diverse communities also tend to offer attractive housing and labor market opportunities, including an abundant rental stock, higher median income and job opportunities in a variety of occupations. Some are also hubs for government or military jobs. Overall, the evidence suggests that highly diverse communities are good places to live and often support industries that employ immigrants and racial and ethnic minorities.

Throughout history, notions of who belongs in American society have expanded again and again to incorporate new groups. History could repeat itself for today’s immigrants if they are given a fair chance. Many people fear immigrants and the social burdens they seem to bring with them, including poverty, limited education and low English proficiency. But this overlooks the many contributions immigrants make, and the fact that immigrants’ socioeconomic disadvantages will almost certainly diminish if they are given equal opportunities in U.S. schools and workplaces.

The Conversation

Jennifer Van Hook is a liberal arts research professor of sociology and demography at Pennsylvania State University. Barrett Lee is a professor of sociology and demography at Pennsylvania State University.


Real women are still expected to cook: From sitcoms to the Food Network, the “angel in the kitchen” pressure on women prevails


(Credit: CW/ABC/CBS)

Dinner is served. Eighteen years ago, this straightforward statement introduced second-season “Sex and the City” viewers to a Carrie Bradshaw who is absolutely, and proudly, useless in the kitchen. Following this declaration, which is made while wearing a slinky black dress, she throws a basket of torn bread chunks onto Mr. Big’s (Chris Noth) kitchen counter. We’re talking generous, stale-looking pieces that one might toss, in a similar fashion, into a duck pond.

Carrie (Sarah Jessica Parker) says fondue is the only dish she has mastered, as she punctures a bite of sourdough with a fork and dips it into a small pot of melted cheese. Big, who looks strangled enough by his awkward black turtleneck, chokes it down. Carrie tastes and agrees it’s horrible.

They laugh it off and agree to go to a restaurant instead, leaving behind a coagulating pool of orange fondue and an uncorked bottle of red wine.

It’s a relatively short scene in the midst of an episode that has a lot going on. Charlotte develops a girl crush on some “power lesbians,” Miranda dates a guy who can’t perform without porn, Samantha’s personal trainer Thor gives her a lightning bolt-shaped shave down there. But again, it establishes something both simple and essential to the development of Carrie as a character:

She can’t cook and she doesn’t care.

The inept female home cook is a common trope on television — from Lucy Ricardo to Lorelai Gilmore. It’s one that has signaled both societal shifts and stagnations in how we view traditional femininity throughout the decades, and recently it has even made its way onto reality food TV.

Emily Contois is a doctoral candidate in American studies at Brown University, where she explores the connections between food, nutrition and identity in the everyday American experience and popular culture. Her dissertation, titled “The Dudification of Diet: Food Masculinities in Twenty-First-Century America,” examines how media representations of food, cooking and dieting construct and negotiate masculinity in our current culture.

She said that even though it is a Victorian relic, the connection between femininity, food and food labor, like cooking, remains strong.

“In many ways, cooking continues to be viewed as part of ‘successful’ femininity. The idea that ‘real’ women cook — and cook well — is still a dominant social convention, even as we’ve seen significant social and demographic shifts,” Contois said. “Media, like scripted TV, both shape and reflect these social trends in that they repeat and reinforce these rather antiquated ideas about gender and food.”

If a woman cooks passably well on television, it’s probably not going to end up as a pivotal point in an episode or a series because it is simply expected. In other words, compare June Cleaver with Lucy Ricardo. One serves Sunday supper with no fanfare; the other is cornered in her kitchen by a gigantic loaf of bread baked with too much yeast.

For that reason, it’s interesting to note the context of the Lucy versus the mammoth breadstick scene. It appears in the 1951 season 1 episode called “Pioneer Women,” in which Lucy (Lucille Ball) and Ethel (Vivian Vance) stage a revolt over their domestic duties and beg their husbands for modern conveniences like dishwashers — or in Ethel’s case, a pair of rubber gloves.

Ricky (Desi Arnaz) and Fred (William Frawley) bet they can survive longer than “the girls” without using anything invented after 1900, including electricity.

Lucy’s discontentment in this episode is part of a greater theme present in the series. She wants to be in the show, too; she wants an identity separate from her role as a homemaker. It’s a constant endeavor that always ends with a laugh.

In the book “Women Watching Television: Gender, Class, and Generation in the American Television Experience,” Andrea Press separates television eras into prefeminist, feminist and postfeminist. Lucy Ricardo would be classified as a prefeminist character, one whose identity and femininity are tied to her value as a homemaker.

Despite the fact that Ball pushes some definite gender boundaries when it comes to her use of physical comedy and vocal resistance to the restraints Ricky places on her daily routine, Press framed Lucy’s comedic failures at the end of each episode — like the badly baked bread — as a process of “domestication,” whereby Ricky welcomes her back, with open arms, into her rightful place. (Keep in mind: This was the 1950s.)

This intertwines with an idea that Contois said is also common in the TV portrayal of female home cooks.

“Another theme to consider is the idea of  ‘food is love’ — that cooking for others, feeding others, is a culturally salient way that women in the role of mother, wife, daughter, etcetera, are expected to show love and affection,” she said.

Perhaps my favorite example of this theme comes from the 1970s series “Little House on the Prairie.”

In the sixth season episode “Back to School: Part 2,” the dashing Almanzo Wilder (Dean Butler) has recently moved into town, and both the independent Laura Ingalls (Melissa Gilbert) and her rival, mean-spirited Nellie Olesen (Alison Arngrim), vie for his attention.

With some help from her overbearing mother, Harriet (Katherine MacGregor), Nellie seemingly gets ahead by inviting Almanzo to their family restaurant for dinner. (Remember: As Harriet says, the way to a man’s heart is through his stomach.)

But here’s where things get interesting: Neither Harriet nor Nellie can cook.

In fact, Laura’s mother, Caroline (Karen Grassle) — occasionally lauded as a great cook throughout the series — is actually the chef at the restaurant. Harriet just assumes she could slip Caroline a little extra cash to make a dinner for which Nellie could take credit (the prairie days’ version of sliding takeout onto a real plate for a date). But Caroline refuses to work on the “Lord’s day.” Laura chimes in, saying she would be glad to do the Lord’s work by helping a friend and offers to pre-make some cinnamon chicken, Almanzo’s favorite.

Laura, who apparently has inherited some level of cooking prowess from her mother, makes a good show of it — but ultimately swaps out the cinnamon for an obscene amount of cayenne pepper, effectively sabotaging date night.

While Almanzo and Nellie spend the next few minutes coughing up a lung and wiping their eyes, there’s a not-so-subtle message underpinning the situation: Nellie is not wife material at all.

Twenty-three episodes later, Almanzo proposes to Laura.

Fast-forward, though, into the late ’90s through today. We’re looking again at our Carrie Bradshaws and Lorelai Gilmores — women who, with their lackluster cooking skills and resistance to recipes, occupy spaces of feminism (or, according to Press, postfeminism), however flawed.

Postfeminism, Press explained, is characterized by “a clear and constant undercutting of the ideals and visions of liberal feminism, which stressed the need for women to achieve equality with men in the workplace, the home, and the bedroom.”

She wrote, “On postfeminist television, women’s family role is normally emphasized; or if it is not, this narrative fact commands a great deal of attention on the show.” Again, the fact that Carrie can’t cook and chooses not to do so is a theme mentioned over and over throughout the series and her various on-screen relationships. Despite Carrie having some modern attitudes toward sexuality and gender roles, this is a trait ostensibly included to make viewers consider her worth as a partner. We see a similar line of writing in “Gilmore Girls” regarding Lorelai’s role as a mother.

“So, yes, presenting a female TV character as a ‘bad cook’ definitely shapes that character’s gender identity and how it does or does not achieve social ideals,” Contois said. “These depictions aren’t always negative, but they are often ambivalent.”

And these depictions have now reached reality food television, perhaps most notably on Food Network’s “Worst Cooks in America.” The premise of the show is pretty simple. Two heralded celebrity chefs — this season it’s Rachael Ray and Anne Burrell — take on the task of transforming useless home cooks into seasoned semipros.

“I’ll say that ‘Worst Cooks’ fits into almost a two-decade tradition at Food Network of an underlying theme that anybody should be able to cook,” said Allen Salkin, author of the 2013 book “From Scratch,” which delves into the history of the Food Network.

And now it’s a theme that’s being amped up for modern reality TV audiences by featuring particularly inept contestants, both male and female. While the show may strike some as inane or mildly exploitative, Salkin said it’s important to remember that the job of Food Network is not to educate people about food or improve the way Americans eat.

“Rather, the business of Food Network is to sell you toothpaste and Acuras and every consumer product that pays for advertising,” Salkin said. “So all the network cares about is eyeballs.”

What viewers see are common tropes surrounding women in the kitchen. In the one-on-one camera interviews this season, each contestant gives a reason why he or she would like to become a better cook.

One contestant named Brittany moans, “How am I going to get a husband if I don’t know how to cook? I wanna cook for my boo.” Another named Mandy shows a photo of her 19-month-old daughter, Ryland. “Everything I serve Ryland is prepackaged and premade,” she says, a fact that Mandy is not proud of.  

According to Contois, advertisers know the angst surrounding cooking and food is a weak spot for many women.

“Katherine Parkin writes about how food advertisers exploited this idea for decades, particularly in making women feel deeply guilty about whether they were cooking for their families,” she said. “And if convenience food products ‘counted’ as cooking — and as love.”

And therein lies one of the major distinctions between the answers given by the male and female “Worst Cooks” contestants. Overwhelmingly, the men want to learn to cook for themselves (save for a dad or two in the group), whereas the women wish to learn to cook real food for their current or prospective loved ones.

On “Worst Cooks,” we hear stories of real women who are useless in the kitchen but who want to improve for the sake of their children, love life or reputations as functioning adults — stories that so clearly mimic themes which, as Contois said, have been repeated and reinforced through years of television. It’s a weekly reminder that while we might be in a new decade, sometimes we’re still tied to old narratives.


Opiate for the Masses: Jimmy Kimmel may be the right prescription for a politically charged Oscar night

Covered Oscars statues rest under a tent, to guard against rain, along the red carpet ahead of the 86th Academy Awards in Hollywood, California

(Credit: Reuters/Adrees Latif)

Jimmy Kimmel is hours away from strolling into one of the toughest performance evaluations in showbiz: Oscar host. The late night star told Variety that shortly after he was tapped for the gig, he shared the news with his wife with the same clenched cautiousness a person would use to break the news of a car wreck.

Part of that was shtick, mind you; approaching everything with a smidgen of a disconsolate air is Kimmel’s signature, brandished to humorous effect on his ABC late night show. “Jimmy Kimmel Live!” viewers know he’s a showman, just one who always looks slightly uncomfortable in his suit.

In the pantheon of Oscar, this sets Kimmel up to be the human equivalent of Valium. Spending time with him will be pleasant. He’s easy to laugh with and doesn’t get in the way of whatever happens to be in your face. If he stumbles, people probably won’t care all that much. Not with all the beautiful people to ogle.

Weeks, if not days, after this year’s Oscars are in the books, Kimmel’s missteps will be all but forgotten and most of what we’ll remember are the night’s biggest victors. Should Sunday night fall together thusly, Kimmel will have triumphed.

That said, opinion as to what goes into a successful, effective awards show hosting experience varies wildly and depends upon how each show’s chosen ringmaster chooses to treat his or her job.

The Oscars are Hollywood’s greatest holiday, a celebration of haute couture and self-congratulation as actors fine-tune aggrandizing speeches about the vital importance of film, and the industry that made them famous. The best hosts find a balance between popping those glistening bubbles of braggadocio and giving them more air to help them sail.

Kimmel handily struck that balance when he hosted the 68th Primetime Emmy Awards, which aired on ABC, the network home of Sunday’s Oscars telecast. As I wrote back then, his comedic accessibility lent the telecast a welcome measure of agility, moving everything along with a hosting style that was edgy at times but never unkind. This represented a drastic improvement from his 2012 virgin voyage with Emmy.

Nevertheless, the 2016 Emmys broadcast netted only 11.3 million viewers, making it the lowest-rated in TV history. With millions of people watching the ceremony around the globe, Kimmel’s first outing with Oscar will be seen, and judged, by tens of millions more people. Last year’s Oscars show had an audience of 34 million, making it the least-watched in eight years.

Legendary Oscar captains are a rare breed and tend to be well-matched for the times in which they were chosen to helm. Billy Crystal set the standard for all modern hosts to follow; he was the candidate of choice through the Clinton years, hosting five of his eight times between 1992 and 2000. Accordingly, he approached the role as if his job was to help America bump up and keep dancing until the end of the party.

Hugh Jackman brought Broadway elegance to the 2009 Academy Awards telecast to rave reviews. In fact, over the past decade, only Jackman and two-time hosts Ellen DeGeneres and Chris Rock have emerged from their times at the wheel smelling like roses. We know this because those of us who can recall their efforts without the benefit of a YouTube refresher don’t grimace at the memory.

Mention Seth MacFarlane, James Franco and Anne Hathaway, however, and the kindest word that comes to mind is “excruciating.” Bombing on Hollywood’s grandest prom night did not adversely affect the careers of any of those performers, but each of their turns was viewed less as the proverbial feather in the caps of their careers than as, say, a fascinator sculpted out of turds.

These unfortunates made the mistake of attempting to be a part of the show, something very few performers can get away with. Only two have done so unequivocally: DeGeneres, who flooded Twitter with her famous Oscars selfie, and Crystal, whose show opening montages transformed the audience’s expectation of what an awards show could be.

Otherwise, the finest service a host can offer is to make everything as painless as possible for everyone involved. Kimmel is incredibly skilled at this, because he’s been doing it almost every night for the last 14 years.

At the same time, he is one of the few hosts who actually has a chance to make his mark . . . in a good way.

If each Oscar ceremony’s field of nominees and winners reflects the tone of the year that preceded it, each host must somehow shape his or her approach accordingly, finding the appropriate prescriptive mix for the moment.

And the 89th Oscars is going to be a uniquely rough telecast to wrangle. Many are expecting criticism of Donald Trump and his administration to overshadow the evening. United Talent Agency canceled its Oscar fete and donated $250,000 to the ACLU. Trump’s travel ban targeting majority-Muslim countries resulted in director Asghar Farhadi, whose movie “The Salesman” is a nominee for best foreign-language film, declining to attend the ceremony.

Kimmel, for his part, hasn’t noticeably hit harder at Trump in his late-night monologues than he did before, opting to maintain the course his show has always followed. Indeed, Kimmel gets a Japanese economy car’s worth of mileage out of being the underdog next to A-listers who play along with his low-key digs. He often makes himself the butt of the joke, a man steamrolled by the shenanigans of people more powerful than he.

The exception is a popular recurring bit on his show involving Kimmel’s fake feud with Matt Damon, the core of which is Kimmel’s ersatz glee at cruelly trolling Damon, to which Damon responds with elaborate displays of schadenfreude. Some version of this is all but guaranteed to take center stage at some point on Sunday night, as it did at the Emmys.

A portion of Kimmel’s Everyman act is something of a sham, but his touch could work well on Sunday, and he’s an expert at humanizing celebrities. His “Mean Tweets” viral videos, in which celebrities read ad hominem attacks directed at them via Twitter, are another quality Kimmel product tailor-made for Oscar night.

He’s also the perennial third-place finisher in a late night ratings war that Stephen Colbert has been winning lately by taking a more aggressive stance against the president and his policies.

Then again, at the Emmys Kimmel famously referred to the woman who is now our nation’s First Lady as “Malaria”. He also told “The Celebrity Apprentice” creator and executive producer Mark Burnett, “If Donald Trump gets elected, and he builds that wall, the first person we’re throwing over it is Mark Burnett.”

Kimmel can throw some punches if he needs to. But don’t be surprised if the host opts to let the stars stick and move instead. There will be plenty of that going around, no doubt. Besides, if winning a positive review is all but impossible in the simplest of times, a negative reaction is all but guaranteed this year.

Regardless of what happens, it will give Kimmel stupendous material for his show. That alone makes his effort worthwhile. As long as Kimmel’s Oscar efforts aren’t memorable for the wrong reasons, he’ll be a winner.


California’s rain may shed light on new questions about what causes earthquakes


A sign is submerged in the water from Coyote Creek Tuesday, Feb. 21, 2017, in Morgan Hill, Calif. Rains have saturated once drought-stricken California but have created chaos for residents hit hard by the storms. The latest downpours swelled waterways to flood levels and left about half the state under flood, wind and snow advisories. (Credit: AP Photo/Marcio Jose Sanchez)

In recent weeks, California has experienced unusually heavy rainfall. California is also earthquake-prone, hosting the great San Andreas fault zone.

If there is an unusual surge of earthquakes in the near future – allowing time for the rain to percolate deep into faults – California may well become an interesting laboratory to study possible connections between weather and earthquakes. The effect is likely to be subtle and will require sophisticated computer modeling and statistical analysis.

Earthquakes are triggered by a tiny additional increment of stress added to a fault already loaded almost to breaking point. Many processes can provide this tiny increment of stress, including the movement of tectonic plates, a melting icecap and even human activities.

For example, injecting water into boreholes – either for waste disposal or to drive residual oil out of depleted reservoirs – is particularly likely to trigger earthquakes.

This is because water pressure in the fault zone is important in controlling when a geological fault slips. Fault zones invariably contain groundwater, and if the pressure of this water increases, the fault may become “unclamped.” The two sides are then free to slip past each other, causing an earthquake.
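The article doesn’t give the underlying equation, but the textbook way to express this “unclamping” is the Coulomb failure criterion with effective stress: a fault’s frictional resistance scales with the normal stress squeezing it shut minus the pore-water pressure inside it. The Python sketch below uses made-up numbers purely to illustrate how a small rise in water pressure can push an already critically stressed fault past its failure threshold; it is not a model taken from the research described here.

```python
# Simplified Coulomb-failure sketch of fault "unclamping." A fault can slip
# once shear stress exceeds cohesion plus friction times the effective
# normal stress (normal stress minus pore pressure). Numbers are illustrative.

def coulomb_stress_margin(shear, normal, pore_pressure, friction=0.6, cohesion=0.0):
    """Distance from failure in MPa; zero or below means the fault can slip."""
    effective_normal = normal - pore_pressure
    return cohesion + friction * effective_normal - shear

# A fault loaded almost to breaking point...
print(coulomb_stress_margin(shear=59.0, normal=100.0, pore_pressure=0.0))  # 1.0 MPa of margin: clamped
# ...needs only a modest rise in groundwater pressure to be unclamped.
print(coulomb_stress_margin(shear=59.0, normal=100.0, pore_pressure=2.0))  # about -0.2 MPa: slip is possible
```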

Hydrological changes do not need to be sudden or large to change the water pressure in a fault zone. As aquifers are depleted for irrigation, the water table slowly drops, which may also trigger earthquakes. It is thus unsurprising that extreme rainfall events might also encourage earthquakes. A number of instances of this have been flagged by scientists. For example, swarms of earthquakes in 2002 followed intense rainfall around Mt. Hochstaufen in Germany and the Muotatal and Riemenstalden regions of Switzerland.

Any study of the relationship between weather and earthquakes is likely to take time, and the results are likely to be controversial. In the meantime, now is a good time to check that your gas heater is earthquake-secure and your emergency drinking water is fresh. After all, a “big one” could come at any time.

Triggering earthquakes

People knew we could induce earthquakes before we knew what they were. As soon as people started to dig minerals out of the ground, rockfalls and tunnel collapses must have become recognized hazards.

Today, earthquakes caused by humans occur on a much greater scale. Events over the last century have shown mining is just one of many industrial activities that can induce earthquakes large enough to cause significant damage and death. Filling of water reservoirs behind dams, extraction of oil and gas, and geothermal energy production are just a few of the modern industrial activities shown to induce earthquakes.

As more and more types of industrial activity were recognized to be potentially seismogenic, the Nederlandse Aardolie Maatschappij BV, an oil and gas company based in the Netherlands, commissioned us to conduct a comprehensive global review of all human-induced earthquakes.

Our work assembled a rich picture from the hundreds of jigsaw pieces scattered throughout the national and international scientific literature of many nations. The sheer breadth of industrial activity we found to be potentially seismogenic came as a surprise to many scientists. As the scale of industry grows, the problem of induced earthquakes is increasing also.

In addition, we found that, because small earthquakes can trigger larger ones, industrial activity has the potential, on rare occasions, to induce extremely large, damaging events.

How humans induce earthquakes

As part of our review we assembled a database of cases that is, to our knowledge, the fullest drawn up to date. In January, we released this database publicly. We hope it will inform citizens about the subject and stimulate scientific research into how to manage this very new challenge to human ingenuity.

Our survey showed mining-related activity accounts for the largest number of cases in our database.

Initially, mining technology was primitive. Mines were small and relatively shallow. Collapse events would have been minor – though this might have been little comfort to anyone caught in one.

But modern mines exist on a totally different scale. Precious minerals are extracted from mines that may be over two miles deep or extend several miles offshore under the oceans. The total amount of rock removed by mining worldwide now amounts to several tens of billions of tons per year. That’s double what it was 15 years ago – and it’s set to double again over the next 15. Meanwhile, much of the coal that fuels the world’s industry has already been exhausted from shallow layers, and mines must become bigger and deeper to satisfy demand.

As mines expand, mining-related earthquakes become bigger and more frequent. Damage and fatalities, too, scale up. Hundreds of deaths have occurred in coal and mineral mines over the last few decades as a result of earthquakes up to magnitude 6.1 that have been induced.

Other activities that might induce earthquakes include the erection of heavy superstructures. The 700,000-ton Taipei 101 building, raised in Taiwan in the early 2000s, was blamed for the increasing frequency and size of nearby earthquakes.

Since the early 20th century, it has been clear that filling large water reservoirs can induce potentially dangerous earthquakes. This came into tragic focus in 1967 when, just five years after the 32-mile-long Koyna reservoir in west India was filled, a magnitude 6.3 earthquake struck, killing at least 180 people and damaging the dam.

Throughout the following decades, ongoing cyclic earthquake activity accompanied rises and falls in the annual reservoir-level cycle. An earthquake larger than magnitude 5 occurs there on average every four years. Our report found that, to date, some 170 reservoirs the world over have reportedly induced earthquake activity.

The production of oil and gas was implicated in several destructive earthquakes in the magnitude 6 range in California. This industry is becoming increasingly seismogenic as oil and gas fields become depleted. In such fields, in addition to mass removal by production, fluids are also injected to flush out the last of the hydrocarbons and to dispose of the large quantities of salt water that accompany production in expiring fields.

A relatively new technology in oil and gas is shale-gas hydraulic fracturing, or fracking, which by its very nature generates small earthquakes as the rock fractures. Occasionally, this can lead to a larger-magnitude earthquake if the injected fluids leak into a fault that is already stressed by geological processes.

The largest fracking-related earthquake that has so far been reported occurred in Canada, with a magnitude of 4.6. In Oklahoma, multiple processes are underway simultaneously, including oil and gas production, wastewater disposal and fracking. There, earthquakes as large as magnitude 5.7 have rattled skyscrapers that were erected long before such seismicity was expected. If such an earthquake is induced in Europe in the future, it could be felt in the capital cities of several nations.

Our research shows that production of geothermal steam and water has been associated with earthquakes up to magnitude 6.6 in the Cerro Prieto Field, Mexico. Geothermal energy is not renewable by natural processes on the timescale of a human lifetime, so water must be reinjected underground to ensure a continuous supply. This process appears to be even more seismogenic than production. There are numerous examples of earthquake swarms accompanying water injection into boreholes, such as at The Geysers, California.

Other materials pumped underground, including carbon dioxide and natural gas, also cause seismic activity. A recent project to store 25 percent of Spain’s natural gas requirements in an old, abandoned offshore oilfield resulted in the immediate onset of vigorous earthquake activity with events up to magnitude 4.3. The threat that this posed to public safety necessitated abandonment of this US$1.8 billion project.

What this means for the future

Nowadays, earthquakes induced by large industrial projects no longer meet with surprise or even denial. On the contrary, when an event occurs, the tendency may be to look for an industrial project to blame. In 2008, an earthquake in the magnitude 8 range struck Ngawa Prefecture, China, killing about 90,000 people, devastating over 100 towns, and collapsing houses, roads and bridges. Attention quickly turned to the nearby Zipingpu Dam, whose reservoir had been filled just a few months previously, although the link between the earthquake and the reservoir has yet to be proven.

The minimum amount of stress loading scientists think is needed to induce earthquakes is creeping steadily downward. The great Three Gorges Dam in China, which now impounds 10 cubic miles of water, has already been associated with earthquakes as large as magnitude 4.6 and is under careful surveillance.

Scientists are now presented with some exciting challenges. Earthquakes can produce a “butterfly effect”: Small changes can have a large impact. Thus, not only can a plethora of human activities load Earth’s crust with stress, but just tiny additions can become the last straw that breaks the camel’s back, precipitating great earthquakes that release the accumulated stress loaded onto geological faults by centuries of geological processes. Whether or when that stress would have been released naturally in an earthquake is a challenging question.

An earthquake in the magnitude 5 range releases as much energy as the atomic bomb dropped on Hiroshima in 1945. An earthquake in the magnitude 7 range releases as much energy as the largest nuclear weapon ever tested, the Tsar Bomba detonated by the Soviet Union in 1961. The risk of inducing such earthquakes is extremely small, but the consequences, were it to happen, would be extremely large. This poses a health and safety issue that may be unique in industry in terms of the maximum size of disaster that could, in theory, occur. However, rare and devastating earthquakes are a fact of life on our dynamic planet, regardless of human activity.
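To put those comparisons on a common footing, a standard seismological rule of thumb (the Gutenberg-Richter energy-magnitude relation, which is not cited in the article and is offered here only as a rough guide) links the radiated seismic energy $E$, in joules, to the magnitude $M$:

$$\log_{10} E \approx 1.5M + 4.8$$

Each whole step in magnitude therefore corresponds to roughly a 32-fold increase in released energy, so the two steps separating the magnitude 5 and magnitude 7 ranges amount to a factor of about 1,000.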

Our work suggests that the only evidence-based way to limit the size of potential earthquakes may be to limit the scale of the projects themselves. In practice, this would mean smaller mines and reservoirs, less extraction of minerals, oil and gas, shallower boreholes and smaller injected volumes. A balance must be struck between the growing need for energy and resources and the level of risk that is acceptable for each individual project.

This article is an updated version of one published on Jan. 22, 2017.

The Conversation

Gillian Foulger, Professor of Geophysics, Durham University; Jon Gluyas, Durham University; and Miles Wilson, Ph.D. Student in the Department of Earth Sciences, Durham University


Sharp vision: New glasses help the legally blind see


In this photo taken Thursday, Feb. 2, 2017, Yvonne Felix wears eSight electronic glasses and looks around Union Square during a visit to San Francisco. The glasses enable the legally blind to see. Felix was diagnosed with Stargardt’s disease after being hit by a car at the age of seven. (AP Photo/Eric Risberg) (Credit: AP)

SAN FRANCISCO  — Jeff Regan was born with underdeveloped optic nerves and had spent most of his life in a blur. Then four years ago, he donned an unwieldy headset made by a Toronto company called eSight.

Suddenly, Regan could read a newspaper while eating breakfast and make out the faces of his co-workers from across the room. He’s been able to attend plays and watch what’s happening on stage, without having to guess why people around him were laughing.

“These glasses have made my life so much better,” said Regan, 48, a Canadian engineer who lives in London, Ontario.

The headsets from eSight transmit images from a forward-facing camera to small internal screens — one for each eye — in a way that beams the video into the wearer’s peripheral vision. That turns out to be all that some people with limited vision, even legal blindness, need to see things they never could before. That’s because many visual impairments degrade central vision while leaving peripheral vision largely intact.

Although eSight’s glasses won’t help people with total blindness, they could still be a huge deal for the millions of people whose vision is so impaired that it can’t be corrected with ordinary lenses.

Eye test

But eSight still needs to clear a few minor hurdles.

Among them: proving the glasses are safe and effective for the legally blind. While eSight’s headsets don’t require the approval of health regulators — they fall into the same low-risk category as dental floss — there’s not yet firm evidence of their benefits. The company is funding clinical trials to provide that proof.

The headsets also carry an eye-popping price tag. The latest version of the glasses, released just last week, sells for about $10,000. While that’s $5,000 less than its predecessor, it’s still a lot for people who often have trouble getting high-paying jobs because they can’t see.

Insurers won’t cover the cost; they consider the glasses an “assistive” technology similar to hearing aids.

ESight CEO Brian Mech said the latest improvements might help insurers overcome their short-sighted view of his product. Mech argues that it would be more cost-effective for insurers to pay for the headsets, even in part, than to cover more expensive surgical procedures that may restore some sight to the visually impaired.

New glasses

The latest version of eSight’s technology, built with investments of $32 million over the past decade, is a gadget that vaguely resembles the visor worn by the blind “Star Trek” character Geordi La Forge, played by LeVar Burton.

The third-generation model lets wearers magnify the video feed up to 24 times, compared to just 14 times in earlier models. There’s a hand control for adjusting brightness and contrast. The new glasses also come with a more powerful high-definition camera.

ESight believes that about 200 million people worldwide with visual acuity of 20/70 to 20/1200 could be potential candidates for its glasses. That number includes people with a variety of disabling eye conditions such as macular degeneration, diabetic retinopathy, ocular albinism, Stargardt’s disease, or, like Regan, optic nerve hypoplasia.

So far, though, the company has sold only about 1,000 headsets, despite the testimonials of wearers who’ve become true believers.

Take, for instance, Yvonne Felix, an artist who now works as an advocate for eSight after seeing the previously indistinguishable faces of her husband and two sons for the first time via its glasses. Others, ranging from kids to senior citizens, have worn the gadgets to golf, watch football or just perform daily tasks such as reading nutrition labels.

Eyeing the competition 

ESight isn’t the only company focused on helping the legally blind. Other companies working on high-tech glasses and related tools include Aira, Orcam, ThirdEye, NuEyes and Microsoft.

But most of them are doing something very different. While their approaches also involve cameras attached to glasses, they don’t magnify live video. Instead, they take still images, analyze them with image recognition software and then generate an automated voice that describes what the wearer is looking at — anything from a child to words written on a page.

Samuel Markowitz, a University of Toronto professor of ophthalmology, says that eSight’s glasses are the most versatile option for the legally blind currently available, as they can improve vision at near and far distances, plus everything in between.

Markowitz is one of the researchers from five universities and the Center for Retina and Macular Disease that recently completed a clinical trial of eSight’s second-generation glasses. Although the results won’t be released until later this year, Markowitz said the trials found little risk to the glasses. The biggest hazard, he said, is the possibility of tripping and falling while walking with the glasses covering the eyes.

The device “is meant to be used while in a stationary situation, either sitting or standing, for looking around at the environment,” Markowitz said.


Angst in the Church of America the Redeemer


(Credit: ideabug via iStock)

Apart from being a police officer, firefighter, or soldier engaged in one of this nation’s endless wars, writing a column for a major American newspaper has got to be one of the toughest and most unforgiving jobs there is.  The pay may be decent (at least if your gig is with one of the major papers in New York or Washington), but the pressures to perform on cue are undoubtedly relentless.

Anyone who has ever tried cramming a coherent and ostensibly insightful argument into a mere 750 words knows what I’m talking about.  Writing op-eds does not perhaps qualify as high art.  Yet, like tying flies or knitting sweaters, it requires no small amount of skill.  Performing the trick week in and week out without too obviously recycling the same ideas over and over again — or at least while disguising repetitions and concealing inconsistencies — requires notable gifts.

David Brooks of the New York Times is a gifted columnist.  Among contemporary journalists, he is our Walter Lippmann, the closest thing we have to an establishment-approved public intellectual.  As was the case with Lippmann, Brooks works hard to suppress the temptation to rant.  He shuns raw partisanship.  In his frequent radio and television appearances, he speaks in measured tones.  Dry humor and ironic references abound.  And like Lippmann, when circumstances change, he makes at least a show of adjusting his views accordingly.

For all that, Brooks remains an ideologue.  In his columns, and even more so in his weekly appearances on NPR and PBS, he plays the role of the thoughtful, non-screaming conservative, his very presence affirming the ideological balance that, until November 8th of last year, was a prized hallmark of “respectable” journalism.  Just as that balance always involved considerable posturing, so, too, with the ostensible conservatism of David Brooks: it’s an act.

Praying at the Altar of American Greatness

In terms of confessional fealty, his true allegiance is not to conservatism as such, but to the Church of America the Redeemer.  This is a virtual congregation, albeit one possessing many of the attributes of a more traditional religion.  The Church has its own Holy Scripture, authenticated on July 4, 1776, at a gathering of 56 prophets.  And it has its own saints, prominent among them the Good Thomas Jefferson, chief author of the sacred text (not the Bad Thomas Jefferson who owned and impregnated slaves); Abraham Lincoln, who freed said slaves and thereby suffered martyrdom (on Good Friday no less); and, of course, the duly canonized figures most credited with saving the world itself from evil: Winston Churchill and Franklin Roosevelt, their status akin to that of saints Peter and Paul in Christianity.  The Church of America the Redeemer even has its own Jerusalem, located on the banks of the Potomac, and its own hierarchy, its members situated nearby in High Temples of varying architectural distinction.

This ecumenical enterprise does not prize theological rigor. When it comes to shalts and shalt nots, it tends to be flexible, if not altogether squishy. It demands of the faithful just one thing: a fervent belief in America’s mission to remake the world in its own image. Although in times of crisis Brooks has occasionally gone a bit wobbly, he remains at heart a true believer.

In a March 1997 piece for The Weekly Standard, his then-employer, he summarized his credo.  Entitled “A Return to National Greatness,” the essay opened with a glowing tribute to the Library of Congress and, in particular, to the building completed precisely a century earlier to house its many books and artifacts.  According to Brooks, the structure itself embodied the aspirations defining America’s enduring purpose.  He called particular attention to the dome above the main reading room decorated with a dozen “monumental figures” representing the advance of civilization and culminating in a figure representing America itself.  Contemplating the imagery, Brooks rhapsodized:

“The theory of history depicted in this mural gave America impressive historical roots, a spiritual connection to the centuries. And it assigned a specific historic role to America as the latest successor to Jerusalem, Athens, and Rome. In the procession of civilization, certain nations rise up to make extraordinary contributions… At the dawn of the 20th century, America was to take its turn at global supremacy.  It was America’s task to take the grandeur of past civilizations, modernize it, and democratize it.  This common destiny would unify diverse Americans and give them a great national purpose.”

This February, 20 years later, in a column with an identical title, but this time appearing in the pages of his present employer, the New York Times, Brooks revisited this theme.  Again, he began with a paean to the Library of Congress and its spectacular dome with its series of “monumental figures” that placed America “at the vanguard of the great human march of progress.”  For Brooks, those 12 allegorical figures convey a profound truth.

“America is the grateful inheritor of other people’s gifts.  It has a spiritual connection to all people in all places, but also an exceptional role.  America culminates history.  It advances a way of life and a democratic model that will provide people everywhere with dignity.  The things Americans do are not for themselves only, but for all mankind.”

In 1997, in the midst of the Clinton presidency, Brooks had written that “America’s mission was to advance civilization itself.”  In 2017, as Donald Trump gained entry into the Oval Office, he embellished and expanded that mission, describing a nation “assigned by providence to spread democracy and prosperity; to welcome the stranger; to be brother and sister to the whole human race.”

Back in 1997, “a moment of world supremacy unlike any other,” Brooks had worried that his countrymen might not seize the opportunity that was presenting itself.  On the cusp of the twenty-first century, he worried that Americans had “discarded their pursuit of national greatness in just about every particular.”  The times called for a leader like Theodore Roosevelt, who wielded that classic “big stick” and undertook monster projects like the Panama Canal.  Yet Americans were stuck instead with Bill Clinton, a small-bore triangulator.  “We no longer look at history as a succession of golden ages,” Brooks lamented.  “And, save in the speeches of politicians who usually have no clue what they are talking about,” America was no longer fulfilling its “special role as the vanguard of civilization.”

By early 2017, with Donald Trump in the White House and Steve Bannon whispering in his ear, matters had become worse still.  Americans had seemingly abandoned their calling outright.  “The Trump and Bannon anschluss has exposed the hollowness of our patriotism,” wrote Brooks, inserting the now-obligatory reference to Nazi Germany.  The November 2016 presidential election had “exposed how attenuated our vision of national greatness has become and how easy it was for Trump and Bannon to replace a youthful vision of American greatness with a reactionary, alien one.”  That vision now threatens to leave America as “just another nation, hunkered down in a fearful world.”

What exactly happened between 1997 and 2017, you might ask?  What occurred during that “moment of world supremacy” to reduce the United States from a nation summoned to redeem humankind to one hunkered down in fear?

Trust Brooks to have at hand a brow-furrowing explanation.  The fault, he explains, lies with an “educational system that doesn’t teach civilizational history or real American history but instead a shapeless multiculturalism,” as well as with “an intellectual culture that can’t imagine providence.”  Brooks blames “people on the left who are uncomfortable with patriotism and people on the right who are uncomfortable with the federal government that is necessary to lead our project.”

An America that no longer believes in itself — that’s the problem. In effect, Brooks revises Norma Desmond’s famous complaint about the movies, now repurposed to diagnose an ailing nation: it’s the politics that got small.

Nowhere does he consider the possibility that his formula for “national greatness” just might be so much hooey. Between 1997 and 2017, after all, egged on by people like David Brooks, Americans took a stab at “greatness,” with the execrable Donald Trump now numbering among the eventual results.

Invading Greatness

Say what you will about the shortcomings of the American educational system and the country’s intellectual culture, they had far less to do with creating Trump than did popular revulsion prompted by specific policies that Brooks, among others, enthusiastically promoted. Not that he is inclined to tally up the consequences. Only as a sort of postscript to his litany of contemporary American ailments does he refer even in passing to what he calls the “humiliations of Iraq.”

A great phrase, that. Yet much like, say, the “tragedy of Vietnam” or the “crisis of Watergate,” it conceals more than it reveals.  Here, in short, is a succinct historical reference that cries out for further explanation. It bursts at the seams with implications demanding to be unpacked, weighed, and scrutinized.  Brooks shrugs off Iraq as a minor embarrassment, the equivalent of having shown up at a dinner party wearing the wrong clothes.

Under the circumstances, it’s easy to forget that, back in 2003, he and other members of the Church of America the Redeemer devoutly supported the invasion of Iraq.  They welcomed war.  They urged it. They did so not because Saddam Hussein was uniquely evil — although he was evil enough — but because they saw in such a war the means for the United States to accomplish its salvific mission.  Toppling Saddam and transforming Iraq would provide the mechanism for affirming and renewing America’s “national greatness.”

Anyone daring to disagree with that proposition they denounced as craven or cowardly.  Writing at the time, Brooks disparaged those opposing the war as mere “marchers.” They were effete, pretentious, ineffective, and absurd.  “These people are always in the streets with their banners and puppets.  They march against the IMF and World Bank one day, and against whatever war happens to be going on the next… They just march against.”

Perhaps space constraints did not permit Brooks in his recent column to spell out the “humiliations” that resulted and that even today continue to accumulate.  Here in any event is a brief inventory of what that euphemism conceals: thousands of Americans needlessly killed; tens of thousands grievously wounded in body or spirit; trillions of dollars wasted; millions of Iraqis dead, injured, or displaced; this nation’s moral standing compromised by its resort to torture, kidnapping, assassination, and other perversions; a region thrown into chaos and threatened by radical terrorist entities like the Islamic State that U.S. military actions helped foster.  And now, if only as an oblique second-order bonus, we have Donald Trump’s elevation to the presidency to boot.

In refusing to reckon with the results of the war he once so ardently endorsed, Brooks is hardly alone.  Members of the Church of America the Redeemer, Democrats and Republicans alike, are demonstrably incapable of rendering an honest accounting of what their missionary efforts have yielded.

Brooks belongs, or once did, to the Church’s neoconservative branch. But liberals such as Bill Clinton, along with his secretary of state Madeleine Albright, were congregants in good standing, as were Barack Obama and his secretary of state Hillary Clinton.  So, too, are putative conservatives like Senators John McCain, Ted Cruz, and Marco Rubio, all of them subscribing to the belief in the singularity and indispensability of the United States as the chief engine of history, now and forever.

Back in April 2003, confident that the fall of Baghdad had ended the Iraq War, Brooks predicted that “no day will come when the enemies of this endeavor turn around and say, ‘We were wrong. Bush was right.’” Rather than admitting error, he continued, the war’s opponents “will just extend their forebodings into a more distant future.”

Yet it is the war’s proponents who, in the intervening years, have choked on admitting that they were wrong. Or when making such an admission, as did both John Kerry and Hillary Clinton while running for president, they write it off as an aberration, a momentary lapse in judgment of no particular significance, like having guessed wrong on a TV quiz show.

Rather than requiring acts of contrition, the Church of America the Redeemer has long promulgated a doctrine of self-forgiveness, freely available to all adherents all the time. “You think our country’s so innocent?” the nation’s 45th president recently barked at a TV host who had the temerity to ask how he could have kind words for the likes of Russian President Vladimir Putin. Observers professed shock that a sitting president would openly question American innocence.

In fact, Trump’s response and the kerfuffle that ensued both missed the point. No serious person believes that the United States is “innocent.” Worshipers in the Church of America the Redeemer do firmly believe, however, that America’s transgressions, unlike those of other countries, don’t count against it. Once committed, such sins are simply to be set aside and then expunged, a process that allows American politicians and pundits to condemn a “killer” like Putin with a perfectly clear conscience while demanding that Donald Trump do the same.

What the Russian president has done in Crimea, Ukraine, and Syria qualifies as criminal. What American presidents have done in Iraq, Afghanistan, and Libya qualifies as incidental and, above all, beside the point.

Rather than confronting the havoc and bloodshed to which the United States has contributed, those who worship in the Church of America the Redeemer keep their eyes fixed on the far horizon and the work still to be done in aligning the world with American expectations. At least they would, were it not for the arrival at center stage of a manifestly false prophet who, in promising to “make America great again,” inverts all that “national greatness” is meant to signify.

For Brooks and his fellow believers, the call to “greatness” emanates from faraway precincts — in the Middle East, East Asia, and Eastern Europe.  For Trump, the key to “greatness” lies in keeping faraway places and the people who live there as faraway as possible. Brooks et al. see a world that needs saving and believe that it’s America’s calling to do just that.  In Trump’s view, saving others is not a peculiarly American responsibility. Events beyond our borders matter only to the extent that they affect America’s well-being. Trump worships in the Church of America First, or at least pretends to do so in order to impress his followers.

That Donald Trump inhabits a universe of his own devising, constructed of carefully arranged alt-facts, is no doubt the case. Yet, in truth, much the same can be said of David Brooks and others sharing his view of a country providentially charged to serve as the “successor to Jerusalem, Athens, and Rome.” In fact, this conception of America’s purpose expresses not the intent of providence, which is inherently ambiguous, but their own arrogance and conceit. Out of that conceit comes much mischief. And in the wake of mischief come charlatans like Donald Trump.


Dance scenes in movies can be tricky, but sometimes magical


This image released by Lionsgate shows a dance scene from the Oscar-nominated film, “La La Land.” It’s not easy to stage a successful dance scene for the cameras, especially on a highway interchange, but when such a scene works, it can be memorable. (Dale Robinette/Lionsgate via AP) (Credit: AP)

Oscar front-runner “La La Land” opens with a bang, or should we say a burst — of leaps and pirouettes, not to mention bicycles sashaying along the roofs of automobiles. It’s not easy to stage a successful dance scene for the cameras — especially on a highway interchange — but when such a scene works, it can be memorable.

One secret, says “La La Land” choreographer Mandy Moore, is not to compete with the camera, but in a sense, to find a way to dance WITH it. “When it’s done right, it’s this perfect marriage of how the camera is moving in conjunction and collaboration with the movement of the dancer,” she says.

Dancing on a stage is three-dimensional; on a screen, you lose an entire dimension. But what you can do is use the camera to convey emotion in a dancer in ways you can’t onstage. “You can see how dance changes the person — that’s a key,” says Wendy Perron, former editor in chief of Dance Magazine and author of “Through the Eyes of a Dancer.”

Because everyone has their favorite dance moments in movies, and because the Oscars are coming, and because, hey, it’s just fun to remember this stuff (all available online), here are a few scenes where the cameras helped create dance magic:

Yep, it was heaven

“I’m in heaven,” Fred Astaire sings to Ginger Rogers, warbling Irving Berlin’s “Cheek to Cheek” in the 1935 film “Top Hat.” And so are we. “Fred is so cool and she’s so coy,” Moore notes, adding that the scene is so successful because it tells a story through movement. “They’re almost a little icy the way they start, and then just this beautiful way that they open up through the performance, and they’re just so free and gorgeous through dancing together,” she says. Check out those swoon-worthy twirling lifts toward the end.

Log-spinning and arm wrestling

There’s some real gymnastics in the rip-roaring choreography by Michael Kidd in the 1954 film, “Seven Brides for Seven Brothers.” The big dance in the barn — with guys competing for the gals — is a showstopper. Moore loves that this dance story is told without lyrics. “These days, we’re so used to being spoon-fed what we’re supposed to feel,” she says. Check out that guy on the spinning log, not to mention what can best be described as a balance beam routine that includes arm-wrestling.

Dancin’ in the rain

Of course, Kelly’s rain-drenched virtuoso performance in the title song of “Singin’ in the Rain” (1952) is a wonder — especially when you consider that, according to movie lore, he had a bad cold and fever. Then there are Donald O’Connor’s athletics — including wall-climbing somersaults — in “Make ‘Em Laugh.” But let’s consider the recently departed Debbie Reynolds, who at age 19 had no dance training, and somehow held her own, expertly tapping away with Kelly and O’Connor in the joyous “Good Morning” — which she has said made her feet bleed.

Mambo in the gym

There’s no debating the brilliance of Jerome Robbins’ choreography for “West Side Story” (1961). But which dance scene gets top billing? For Moore, it’s that opening with the Jets and Sharks and those snapping fingers. “You just do that snap and a little jump and everybody instantly knows it’s ‘West Side Story,’” she says. For Perron, it’s the Mambo dance at the gym, where Maria (Natalie Wood) and Tony (Richard Beymer) fall in love. Especially that cinematic moment “when all the others blur out, and Tony and Maria come into focus, and it’s just an amazing falling-in-love moment. The music slows down, and there’s an inevitability about their coming together and ignoring the whole world.”

The magic of Mikhail

You can dispute the overall quality of the 1985 “White Nights,” but here’s one thing you can’t dispute: the dancing prowess of Mikhail Baryshnikov and Gregory Hines. The two, who both play defectors (it’s complicated), have silly dialogue but compelling dancing, together and apart. And, if you only have 2 minutes on your hands, search for “Baryshnikov” and “11 pirouettes.” For 11 rubles, he does what is really one single pirouette with 11 revolutions — perfectly. In street clothes.

Step in time

They’re doing a high-profile “Mary Poppins” sequel, but for many it will be hard to match some of the memories of the 1964 original, like Julie Andrews and Dick Van Dyke dancing in “Step in Time” — that joyful chimney sweep scene on the London rooftops. “It takes the dirty, sooty experience of working on chimneys and makes it magical,” says Perron.

Travolta trifecta 

You gotta hand it to John Travolta, who’s provided more than his share of lasting dance memories. First there was “Saturday Night Fever” (1977), where the actor earned big-screen fame as Tony Manero, king of the disco floor and champion of the strut. Only a year later he tore up the gym floor in “Grease,” co-starring Olivia Newton-John. And in 1994, there was that understated — but unforgettable — twist contest with Uma Thurman in “Pulp Fiction.”

You know, that lift 

No one leaves Baby off a list. Before Emma Stone and Ryan Gosling did “La La Land,” they did “Crazy, Stupid, Love” (2011), in which they recreated the “Dirty Dancing” lift made famous by Patrick Swayze and Jennifer Grey in 1987. You know the one. Enough said.

A storm of dancing

If you watch one dance clip, let it be this: the Nicholas Brothers, Fayard and Harold, in their have-to-see-it-to-believe-it performance in “Stormy Weather” (1943). It’s not just that the brothers, who overcame racial hurdles to earn fame for their astounding talent, tap and twirl and jump onto tables; they jump into full splits, too, in moves that look like they would be horribly painful. At the end, they leapfrog over each other down a staircase, landing in splits each time. And then they get up and smile.
