Breaking up with my type: How I learned to stop worrying and let myself loathe the men I once desired

(Credit: pio3 via Shutterstock)

Realizing that you hate the people you want to sleep with is the adult version of learning that Santa Claus and the Easter Bunny and the tooth fairy don’t exist, all at once. It’s disappointing. But at some point you have to stop pretending.

For most of my 20s I had a type: tall, unshaven, record-obsessed hipster men who treated me like trash. Not the kind of trash you tie up neatly and respectfully bring to the curb every day. The kind of trash that gets stuck at the bottom of a dumpster soaked in some mysterious liquid and you pretend you don’t see because it’s too scary to acknowledge that you might one day have to confront its grossness.

I could pick them out of a bar like a game hunter on safari. Though to be fair, they weren’t hard to spot. My type wore, exclusively, distressed denim shirts, tight black jeans and colorless canvas sneakers. They were attractive but not in your typical way, in a way where their big nose or lanky build or asymmetrical haircut made me think I had discovered gold. And they always shared a name with at least one of The Beatles (if Ringo had been named Mike).

Because I am an idealist, my type was always about my age. I have a hard time opting into the current system whereby men can live half an adult life without considering another person’s emotions — until they turn 40, at which point they decide to start caring and then can freely choose from millions of unsuspecting women of any legal age. It’s a rule of principle, not desire, and so it is more or less useless. I love salt-and-pepper-haired men, with distinguished creases and futon-less living rooms. But while this rule is probably the single biggest factor keeping me single as I continue to catch, then recover from, 30-somethings like colds, I refuse to be part of any system that perpetuates a cycle of inequality.

Men of my type were very confident. Not brazenly confident, like loud men in suits named Blake can be. Subtly confident, in a way where their opinions on “Pet Sounds,” Lars von Trier, “Infinite Jest” and other things I taught myself to care about were indisputable. In a way where they could talk for hours about a particular photography project without ever thinking to ask if the other person was interested because of course they were. They weren’t loud as much as they were firm, perpetually in their own heads with unwavering conviction about what they liked and didn’t, and I loved that about them. I envied it.

I was always a tomboy. I liked cars and bikes and anything that got me dirty. But at what seemed like an arbitrary age to a girl who never really grew breasts, it came to my attention that girls shouldn’t like bikes and cars and things that got them dirty. And so the pattern began. Sneakers? No. Curly hair? No. Seconds of spaghetti? No way. I’m pretty sure I spent the next two decades trying to figure out what the hell I was supposed to want. And so for a long time my type was someone who could help me do that.

But my type would always end it by letting me know he “just wasn’t feeling it,” which usually coincided with some mid-tier life event like his 30th birthday or his best friend’s wedding. These swift and silent breakups would leave me so utterly disconnected from what I assumed to be reality, where actions and words represented the thoughts of the people who performed them, and so completely distrusting of my ability to read social signals, like our late nights sharing secrets and his telling me he cared, that eventually I decided to stop dating altogether.

It was then, when I gave up on men completely, that I discovered the treasure — the pearls — that are 30-something female friendships. Around 30, I realized, was when single women got better and single men got worse. It’s the age when women have internalized and learned to deal with the injustice that comes with their gender; they get stronger and give fewer shits. Men, on the other hand, learn that their wrinkles are by some weird miracle considered attractive, as are their dad-like bodies, and that essentially the limits of time as we know it do not apply to them. They get spoiled.

My relationships with women were like a whole other species compared with my romantic flings. We traded honest stories of struggle. We empathized with years of pushing ourselves to be more aggressive with the men we worked with and more chill with the men we slept with. We’d learned to manipulate and contort our feelings so many times we were lion tamers of emotion. When we finished a bag of Kettle Chips in one sitting we reminded one another that we deserved it. We shared tips on body-hair removal and fears of infertility. I learned what real conversation felt like. We asked questions, admitted flaws; we listened to one another and let ourselves be vulnerable.

But emotional satisfaction alone can last only so long.

As I exchanged awkward hellos with my first real date in years, I was hopeful. He looked the part. But as soon as he started speaking things went haywire. He spoke unaffectedly for what felt like an hour about why bourbon was better than whiskey and didn’t seem to care or even notice that I would have rather been clearing glasses behind the bar than listening to his explanation.

The next date’s loud voice cut through the crowded cafe as he described his job with pride. I saw my former self like a ghost in some Christmas special, hanging on his words. But now the tone that once seemed so wise and admirable — because men are taught to brag and women are taught to smile — felt brash and unreflective to me. I could see him trying to feign interest in my job just enough to get another drink, but could tell by the shift in his eyes as soon as I started that anything in the room was more interesting than the part of the conversation he had nothing to do with. I felt like I had X-ray vision glasses — some superpower that allowed me to see beyond his artsy attire and firm opinions into a skeleton of self-importance.

On each date I found myself apologizing for our conversation desert and wondered how many times in a year, or in a decade even, it occurred to these tall men with messy hair and ratty tees to apologize to anyone for anything. If they ever felt an unbearable need to make every interaction pleasant for the other person, to the point of sacrificing their own interests one by one. If they ever experienced the overwhelming guilt that came from talking about themselves for more than two minutes without asking a deluge of reparatory questions, like the useless prompts I dished out to keep them interested and comfortable. The attraction first seeped out of me like a leaky pipe, then gushed out like an open fire hydrant.

Eventually, I met a man I liked. He seemed in touch with his feelings, self-aware and willing to admit weakness. After hours of juicy conversation, we broached the subject of sexual reciprocation.

“I’ve just been put in positions, over and over, where I’ve felt pressure to please men without any consideration of my own needs. So it’s kind of a trigger if a guy expects sex,” I explained, proud that I was finally able to articulate years of repression, thanks to my girlfriends.

He rolled his eyes.

I’ve grown very familiar with the eye roll since I’ve started dating again. Here are some words that will generally elicit a roll: Hillary, Clinton, feminist, patriarchy, Beyoncé, fertility, privilege, consent, “Ghostbusters.”

Sometimes I wish I could erase my memories of the wasted hours I spent listening to their stories and quieting my own. But ignorance is not bliss.

Instead, over the past few years I’ve begun the process of redefining my sexual attraction. Ripping out all my Pavlovian responses like remodeling a house — a real fixer-upper. But I’m even pulling out the goddamn foundation. I’m wondering if maybe I don’t even want to date men. Maybe if I dig deep enough my love for women can move from the emotional to the physical.

Reconsidering and redefining my type from scratch at a time when most people my age are having babies feels like starring in the bizarro version of the movie “Big” — an adult trapped in an adolescent’s body. It’s terrifying and confusing and lonely. But any amount of discomfort is better than the alternative: sitting across the table, pretending to listen raptly for even one more minute to the men I called my type.


Defined by her dress size: “This Is Us” is NBC’s best new show — and it has a fat-shaming problem

Chrissy Metz in “This Is Us” (Credit: NBC/Ron Batzdorff/nbc)

It’s the worst part of one of fall’s best new shows.

NBC’s “This Is Us,” created by Dan Fogelman (“Crazy, Stupid, Love”), picks up where the recently concluded “Parenthood” left off. The show is a heartfelt story of intergenerational families dealing with everyday difficulties, one that mixes laughter with tears. (Read Salon TV critic Melanie McFarland’s review.) In the pilot, viewers are introduced to Randall (Sterling K. Brown), an adoptee who has recently tracked down the addict father who abandoned him as a newborn. There’s also Jack (Milo Ventimiglia) and Rebecca (Mandy Moore), a young couple who are expecting triplets.

Lastly, Kevin (Justin Hartley) and Kate (Chrissy Metz) are twins with little in common except a womb: He’s the handsome star of a broad laugh-track comedy he hates (called “The Manny”), and she’s overweight.

What distinguishes “This Is Us,” which has already proven to be a breakout hit for the network after two episodes, is the show’s raw empathy for its characters. Rebecca, after giving birth, is frustrated that her husband isn’t pulling his weight around the house. Jack would rather be the “Cool Dad” than lend a hand when she needs it. She’s a 10, Rebecca tells him, and she needs her partner to be at her level. He tells his wife that he’s ready to be an 11 for her — if that’s what it takes to keep their family together.

Brown, who just won an Emmy for “The People v. O.J. Simpson,” is particularly strong as a man divided by his own feelings. Randall knows he should tell his deadbeat dad to get lost. Why then does he invite the man into his home?

Most of the characters in “This Is Us” are flawed and complex, acting in ways that are contradictory yet relatable. It’s too bad that the show doesn’t extend the same courtesy to Kate, who is solely defined by her weight. The other characters in the show have goals, dreams and aspirations. Kevin, who ends up a viral sensation after having an expletive-laden meltdown on his show, wants to do something more meaningful than star in a crappy sitcom. Maybe he’ll pursue Broadway, he wonders aloud.

The only thing viewers know about Kate, the least developed person on the show so far, is that she’s fat and she hates it. After years of being unhappy with the person she sees in the mirror, Kate vows to lose the pounds.

There’s nothing wrong with telling stories about people who don’t like the way they look. “This Is Us,” however, is sending the wrong message to a population that’s already widely stigmatized and shamed on television. Although shows like “The Mindy Project,” “Drop Dead Diva,” “Roseanne,” and “Parks and Recreation” have helped further positive visibility for people of non-sample size, characters with larger bodies are frequently mocked — depicted as nothing more than walking punchlines.

When TV writers and producers portray women like Kate as laughable or pathetic, they are sending the message that people who look like her should feel bad about it. “This Is Us” ends up hurting the very population it wants to represent.

Kate’s story is a personal one for Fogelman. The prolific writer/producer, who is also behind TV’s “Galavant” and “Pitch,” based the character on his sister, who has struggled with negative body image her entire life. And Metz said that the storyline also resonates with her journey as a plus-size woman. “A lot of people [have these issues], and I have myself,” the 37-year-old actress told SheKnows. “So this is a story that needs to be told because there are people who find their self-worth in the number on the scale.”

Weight-loss narratives abound in a society obsessed with the spectacle of big bodies (see: “The Biggest Loser”), but these storylines are rarely handled with care. On the sixth season of the long-running NBC sitcom “Friends,” Monica Geller (Courteney Cox) is revealed to be a teenage overeater when the Central Perk crew watches a home video of Monica attending her prom. Attempting to explain away the difference in her appearance, Monica says that the camera adds 10 pounds. “So how many cameras are actually on you?” Chandler quips. Stunned by what he’s seeing onscreen, Joey stammers, “Some girl ate Monica!”

Fox’s “New Girl” also revealed its resident metrosexual, Schmidt (Max Greenfield), to have been much heavier in college. “Fat Schmidt” has become a favorite running gag of the six-year-old sitcom, and has even been turned into a series of GIFs.

When audiences are invited to laugh at fat people, we see them as less human. The Farrelly Brothers’ “Shallow Hal,” perhaps the most egregious example of fat-shaming in pop culture, wants to have it both ways. A chauvinistic jerk, Hal (Jack Black), is cursed with only seeing others’ inner beauty after a visit to guru Tony Robbins. When he meets Rosemary (Gwyneth Paltrow), he pictures her as thin and attractive  — someone who looks like, well, Gwyneth Paltrow. His friends, however, treat her with disgust and contempt, as if she were an untouchable.

Hal might see the gorgeous woman inside, but the film continually underscores that Rosemary’s body is her reality. When she sits down on a chair to eat lunch, Rosemary breaks it into a million pieces. When she and Hal enjoy a romantic boat ride, the gondola tips like an aquatic seesaw.

Following a wave of body-positive programs like “My Mad Fat Diary” and “Huge,” “This Is Us” represents a step backward, a show that asks us to both root for Kate and pity her.

Viewers meet Kate, who doesn’t seem to have a job to speak of, after she has fallen down in her bathroom. Like the old lady in the LifeCall commercial, Kate calls her brother for help. When she goes to the fridge to eat her feelings later in the episode, there’s a note warning her that she made a promise to herself to get healthy. She rips off the Post-it, revealing an identical one underneath.

The audience is meant to find these moments both sad and funny, underscoring that the character isn’t someone we should want to be like. Kate is a person the viewer should feel sorry for.

Things improve when she meets a self-described “fat friend” in her support group, Toby (Chris Sullivan), who has a more positive outlook on his size. Toby, who regularly pokes fun at the other members of their circle, openly states that he may never be thin. But after Toby makes his romantic interest in Kate clear, she rebuffs her new acquaintance. “I can’t fall for a fat person right now,” Kate says. “Guess I’ll lose the weight then,” he responds.

The blossoming couple’s interactions are warm and deeply felt, and it’s refreshing to see plus-size characters — especially ones falling in love — portrayed onscreen at all. (“Mike and Molly,” the CBS show starring Melissa McCarthy that ended in May, was an exception to that rule.) It’s disappointing, though, for “This Is Us” to treat these moments of Kate grappling with her self-worth as the only ones from the vastness of her lived experience that matter. “It’s always going to be about the weight for me,” Kate tells Toby. “It’s been my story ever since I was a little girl, and every moment that I’m not thinking about it, I’m thinking about it.”

By treating Kate’s weight as a shameful burden — as well as her only personality trait — ”This Is Us” serves to reinforce the harmful messages people of size are constantly fed on television, often from a young age.

A 2014 study in the International Journal of Eating Disorders showed that children are most vulnerable to fat-shaming in media. Marla Eisenberg of the University of Minnesota and her team found that programs aimed at a youth audience (e.g., “SpongeBob SquarePants” and “iCarly”) were the most likely to contain negative messages around body image. Of the 30 shows Eisenberg surveyed, there were 66 instances of characters making shaming comments about others’ appearance. The overall rate of body negativity was 52 percent higher in kids’ shows than in ones created for adults.

If children begin internalizing these messages at a young age, what impact does it have on them? Spoiler: It’s not good.

Researchers from Harvard Medical School famously studied the effect of U.S. television shows like “Melrose Place” and “Beverly Hills 90210” on the indigenous culture of Fiji, where “you’ve gained weight” is considered a compliment. Three years after the soaps began broadcasting in the island nation, the rate of eating disorders skyrocketed. In 1998, 15 percent of young girls “reported that they had induced vomiting to control their weight . . . compared with 3 percent in the 1995 survey,” as the New York Times reported. Overnight, bulimia went from being virtually unheard of to a national epidemic.

Shows that promote one definition of female beauty — looking like Heather Locklear, not Chrissy Metz — do little to help anyone be healthier. Numerous studies have shown that fat-shaming simply isn’t effective and only serves to make the problem worse. People who are discriminated against because of their size often report feeling discouraged from exercising or eating healthy, thus increasing the likelihood that they will gain additional weight. Promoting negative body image, it would seem, only creates more negativity.

In the case of actor Wentworth Miller, being body-shamed on the Internet only added to his feelings of depression and worthlessness. The openly gay actor became a meme when a before-and-after photo of the extra pounds he gained after leaving Fox’s “Prison Break” went viral. Miller, who was suicidal at the time, said that the mockery only exacerbated his isolation.

“This Is Us,” which airs its third episode Wednesday, still has time to get it right. Metz explained that the goal with Kate’s storyline is to examine the feelings of insecurity that drive her to use overeating as a source of comfort. “While I know that food is not the issue, it’s the symptom,” Metz said. “People fill voids in their lives, and Kate happens to fill it with food.” That may be true, but the Kates of the world deserve to have the multiplicity of their struggles represented, instead of repeating the same old story. Fitting into a little black dress is great, but being seen as a whole person is even better.


Is the DEA high? The agency’s emergency ban on kratom has to make you wonder what they’re smoking

(Credit: Getty/Joe Raedle)

How insane is America’s drug war? Look no further than the Southeast Asian plant known as kratom, which the Drug Enforcement Administration recently announced it would be temporarily adding to the list of Schedule 1 substances, along with heroin, LSD, cocaine and marijuana. This emergency ban, which the DEA justified by calling the plant an “imminent public health and safety threat,” may go into effect as early as this weekend, and can last up to three years before becoming permanent or being reversed.

Kratom, which is related to coffee, has been used for therapeutic purposes across Indochina for centuries, and has become increasingly popular in the United States over the past several years for treating chronic pain, depression, anxiety, PTSD and a variety of other ailments (veterans have been particularly vocal about the plant benefiting their lives and getting them off a plethora of pharmaceuticals). Kratom has also been reported to help recovering opiate and heroin addicts, even though it can be mildly addictive itself if used on a daily basis — not unlike coffee.

Online forums suggest that it is used predominantly by adults as a therapeutic herb, and a recent survey by the Pain News Network, which polled over 6,000 kratom users, found that over 50 percent use it for acute and/or chronic pain, almost 15 percent for anxiety, 10 percent for opioid dependency, and less than 2 percent for recreational use or curiosity. It is particularly popular for those suffering from back/spine pain, migraines and fibromyalgia.

The Schedule 1 classification is supposed to be reserved for substances that are considered to have high potential for abuse/addiction and no medicinal value, which any rational observer can see is not the case with kratom (even the DEA spokesman has admitted this much — but more on that later). As with cannabis, kratom does not appear to cause overdose or death when too much is taken because it does not slow down breathing as opiates do.

In The Verge, Alessandra Potenza reports on a study conducted by Edward Boyer, a professor of emergency medicine at the University of Massachusetts Medical School, who says that kratom looks like it could be a promising alternative to prescription opiates:

“The plant acts like an opioid painkiller without one of the worst side effects: difficulty breathing. In opioid overdoses, patients stop breathing. But when rats were given kratom’s major chemical compound (called mitragynine) in substantial doses, they breathed freely. The results suggest that kratom could one day be developed into a pain medication that doesn’t pose the same risks as opioids. ‘I think it’s worthy of additional scientific research,’ Boyer says.”

In its letter of intent, the DEA pointed to just 15 known deaths that involved kratom (to put that number in perspective: roughly 88,000 people die each year from alcohol, which has high potential for abuse and little to no medicinal value). But even this small number is misleading, since almost all of these cases involved high doses of other dangerous substances in the subjects’ systems. Most kratom users report that consuming too much of the plant, which has an unpleasant taste and can give an energizing effect in lower doses, simply causes nausea and vomiting. Unfortunately, prohibition will hinder further scientific research that could establish better understanding of the plant’s safety and therapeutic benefits.

The claim that kratom is an imminent public health and safety threat is so hasty and ill-considered that it makes you wonder whether they’re getting high on their own supply over at the DEA. Indeed, policy experts seem to overwhelmingly agree that this knee-jerk measure will only worsen the current heroin and opiate epidemic in America by forcing many kratom users to turn to more dangerous and addictive drugs in treating their pain — thus exacerbating a legitimate public health crisis.

The agency’s spokesman, Melvin Patterson, has responded to the public backlash — which includes a White House petition with over 135,000 signatures and a letter from 45 members of Congress calling on the DEA to delay this “hasty decision” — with feigned sympathy and doublespeak.

“I don’t see it being Schedule 2 [or higher] because that would be a drug that’s highly addictive. Kratom’s at a point where it needs to be recognized as medicine,” Patterson told the Washington Post, seemingly contradicting his employer’s position. “I want the kratom community to know that the DEA does hear them,” he continued. “Our goal is to make sure this is available to all of them.”

Right — what better way to make kratom available than to make it illegal!

This entire episode provides an unsettling insight into the drug warrior’s irrational and authoritarian mindset. Of course, the DEA’s plan to ban kratom shouldn’t have come as a complete surprise (after all, just a few weeks earlier the agency announced that it would be keeping cannabis — which is now widely accepted as medicinal — as a Schedule 1 drug). But the impassioned response from the public has been encouraging. As noted above, 45 members of Congress have signed on to a letter asking the DEA to delay the emergency scheduling, while President Obama will have to respond to a White House petition asking him to stop this measure within the next month or so. The American people — the majority of whom now support the legalization of marijuana — seem to be waking up to how disastrous, counterproductive and harmful the war on drugs has been.

But the DEA depends on the drug war — the agency literally can’t exist without it — just as the criminal drug lord’s existence depends on the prohibition of drugs. Thus, the vicious circle will likely continue until the public stages an intervention.


Epic fashion fail: How Lands’ End tried to be young and stylish — and fell flat on its face

Federica Marchionni (Credit: AP/Richard Drew)

To America’s young urban fashionistas, Lands’ End, if they knew it at all, was as hip as dad jeans. So it came as some surprise when the specialty apparel retailer built its strategy around attracting these young, cool customers with some cosmopolitan flair. What was far less surprising was the result — it was a match that went over as well as a glass of French Bordeaux with a bowl of Midwestern cheese curds.

In one of the year’s most striking corporate about-faces, the board of the Wisconsin-based clothing company announced this week it had parted ways with CEO Federica Marchionni just 19 months after hiring her. While the company hasn’t commented publicly on the reasons for her dismissal, insiders told the Wall Street Journal that employees never warmed to the former Ferrari and Dolce & Gabbana executive, and felt she was moving too fast into uncharted territory in a bid to go after younger, cooler adults. Of course, that was why Marchionni was hired. The question is: Why?

“Bringing her in was a catastrophe,” Howard Davidowitz, chairman of Davidowitz & Associates, a national retail consulting and investment banking firm, told Salon. “Dolce & Gabbana sells lovely $2,000 scarves on Fifth Avenue, but Lands’ End is Middle America. That’s what they stand for and that’s who they are.”

Like other retailers, Lands’ End looked toward the future and saw uncertainty as its core clientele of Baby Boomers began to drift toward retirement and beyond. Also like many others, it decided the lucrative young-adult demographic was the answer to its problems. Although the company knew enough to see that its team was not built to lure a base of youthful, fashion-conscious consumers — hence the hiring of Marchionni — that’s about the only foresight Lands’ End can be commended for: It failed to grasp how elusive millennial consumers can be, whether it was capable of producing trend-forward goods, and what the effect on its existing customer base would be.

Marchionni took cues from the pages of fashion magazines and runway shows to target the crowded market of young, urban-dwelling H&M, Zara and Forever 21 shoppers. Over the past year, the glamorous 44-year-old who speaks in an Italian accent plowed ahead aggressively with her vision, adding high heels and brightly colored slimmer-fit dresses to the company’s lineup, hiring tanned pouty-face models for pricey photo shoots in the tropics, and implementing separate marketing campaigns targeting two radically different demographics. She made Lands’ End wear two hats: one for flyover-state Midwestern parents and the other for younger, childless millennials (Americans born between 1980 and 2000).

But Marchionni’s strategy never took off. Last year, Lands’ End reported a 9 percent plunge in sales, to $1.42 billion, and its first loss in years. The company’s situation hasn’t improved in 2016. It posted a loss of $7.7 million for the six months that ended July 29, compared with a profit of $9.2 million a year earlier. Sales fell nearly 8 percent, to $565 million, over that period. The stock market, in particular, had no faith in the new direction, as Lands’ End shares plunged 57 percent under Marchionni’s leadership.

Paul McElhone, dean of the Mihalcheon School of Business at Concordia University in Edmonton, Canada, says Lands’ End should have pivoted more gradually.

“They should have started with maybe 10 percent of the line that’s geared towards millennials to capture the attention of people looking at their website,” McElhone told Salon. “Along with that 10 percent they should have had an aggressive advertising campaign, and then grown the business based on the measurements of how it was received.”

Once the country’s largest mail-order apparel company, Lands’ End grew its business around modestly priced, practical apparel items aimed at busy parents, and it continues to be popular with this demographic. Amid the rise of e-commerce, Lands’ End was also uniquely positioned to transition smoothly from mail-order catalogues to a web-based business, thanks to its well-established warehousing and order-fulfillment infrastructure. But instead of switching gears to online, Lands’ End opted for a takeover. It was gobbled up by Sears Roebuck in 2002 for $2.2 billion in a deal that married the brand to a troubled department store chain. That marriage was disastrous, and Lands’ End was spun off in 2014. The company’s first attempt at reinvigorating the brand was not to focus on its strengths but rather to hire Marchionni in the hopes of a quick turnaround. 

Now, Lands’ End has to correct course yet again, and Davidowitz counsels the company to look for executives who know the Lands’ End demographic.

“If [instead of Marchionni] they would have picked someone from Kohl’s or the apparel division of Target, I would have applauded,” he said. “Those are folks who would understand Lands’ End customers. Someone from those two companies, from a demographic point of view, would be perfect.”

On Thursday, Lands’ End announced it had hired an outside firm to find Marchionni’s replacement, perhaps an acknowledgment that, at present, the company is lost at sea and unsure of whom and what it needs.



Is “grit” really the appropriate response to “coddling” in the classroom?

(Credit: Alan Bailey via Shutterstock/Salon)

This piece originally appeared on The Conversation.

In the same way that actual grit accumulates in the cracks and crevices of the landscape, our cultural insistence on possessing grit has gradually come to the forefront of child-rearing and education reform.

In 2012, Paul Tough’s book on the topic, “How Children Succeed: Grit, Curiosity and the Hidden Power of Character,” was a critical and commercial success, earning positive acclaim from Kirkus Reviews, The Economist, The New York Times, Slate – and even former Secretary of Education Arne Duncan.

And last year, in a column for The Washington Post, Judy Holland, editor and founder of ParentInsider.com, wrote that the “coddled kids” of the “‘self-esteem’ movement in the 1980s” produced children who were “softer, slower and less likely to persevere.”

“Grit is defined as passion and perseverance in pursuit of long-term goals,” she continued. “Grit determines who survives at West Point, who finals at the National Spelling Bee, and who is tough enough not to be a quitter.”

Recent academic studies on grit include the education-leadership dissertation project of New England College’s Austin Garofalo, titled “Teaching the Character Competencies of Growth Mindset and Grit To Increase Student Motivation in the Classroom,” and UMass Dartmouth professor Kenneth J. Saltman’s “The Austerity School: Grit, Character, and the Privatization of Public Education.”

These articulations of grit frame it as an essential characteristic for healthy, productive maturation – and certainly a necessary component for academic success.

As someone who specializes in children’s literature and cultural attitudes toward childhood, I’ve been interested in this insistence on fostering grit. I’ve also taught writing and literature over the past year to West Point cadets, who, it seems, must learn how to acquire this somewhat elusive quality.

But I can’t help but wonder if we’re talking about grit in an unproductive way. And maybe one of the problems is that it’s presented as a concept: abstract, indeterminate and somewhat magical or mysterious.

How can we define grit, or the idea behind it, in a way that means something? What if we’re not framing the discussion of grit in the right way, since grit can mean something entirely different for a kid living on Chicago’s South Side than it does for a kid living in the suburbs?

A slippery buzzword?
In 2014, National Public Radio’s Tovia Smith looked at how educators and researchers are using the concept of grit in the classroom. She interviewed MacArthur Genius Grant recipient Angela Duckworth, associate professor of psychology at the University of Pennsylvania and author of “Grit: The Power of Passion and Perseverance,” which was published in May. In it, she considers how teaching grit can revolutionize students’ educational development.

“This quality of being able to sustain your passions, and also work really hard at them, over really disappointingly long periods of time, that’s grit,” Duckworth told Smith in the NPR segment. Expanding on the national significance of grit, Duckworth added, “It’s a very, I think, American idea in some ways – really pursuing something against all odds.”

But more recently, Duckworth has backtracked from some of her earlier advocacy. In March she told NPR’s Anya Kamenetz that the “enthusiasm” for grit “is getting ahead of the science.” And Duckworth has since resigned from the board of a California education group that’s working to find a way to measure grit.

As Kamenetz notes, part of the problem with buzzwords like “grit” – and the attempt to measure or implement them in the classroom – “is inherent in the slippery language we use to describe them.”

Is grit something that can even be taught? Can we measure it? Is it a trait or a skill? If a quality like grit is a trait, then it may be genetic, which would make it difficult to simply instill in kids. If it’s a skill or habit, only then can it be coached or taught.

Grit’s place in children’s literature
The Oxford English Dictionary tells us that grit – the kind that describes “firmness or solidity of character; indomitable spirit or pluck; stamina” – originated as American slang in the early 19th century. It’s easy to see its kinship to the other definition of grit: “minute particles of stone or sand, as produced by attrition or disintegration.”

It’s come to represent a refusal to give up, no matter the odds – a refusal to wash away, break down or completely dissolve.

American children’s literature has long had “gritty” protagonists: characters who’ve arguably instilled moralistic values of bravery, industry and integrity in generations of readers.

In the late 19th and early 20th centuries, another word featured in the Oxford English Dictionary’s “grit” definition figured more prominently in mainstream children’s literature – pluck.

Mark Twain’s Tom Sawyer and Huck Finn both exhibited pluck, seen in their wily charm, adventurous spirit and underlying moral conscience. But the notion of pluck, grit’s forefather, was largely popularized in Horatio Alger’s stories, which are known for their hardworking young male protagonists trying to eke out livings and educate themselves within the American urban landscape.

“Dick knew he must study hard, and he dreaded it,” Alger wrote in his landmark text, “Ragged Dick.” “But Dick had good pluck. He meant to learn, nevertheless, and resolved to buy a book with his first spare earnings.”

Though he hates it, Dick studies hard because he believes he needs an education “to win a respectable position in the world.”

The determined, plucky child figure arguably evolved into one of grit through Mattie Ross in “True Grit,” Charles Portis’ 1968 western novel of revenge set in the late 19th century.

The novel quickly establishes Mattie’s resilience and resolve, which solidify after the murder of Mattie’s father. Mattie, reflecting on her doggedness, says, “People do not give it credence that a fourteen-year-old girl could leave home and go off in the wintertime to avenge her father’s blood.”

Grit to what end?
Mattie Ross and Horatio Alger’s clever street boys helped shape an American ideal of youthful grit. But these fictional characters asserted their grit because they had goals. What good is grit if you feel like you have nothing to strive for?

In early children’s literature for African-Americans, publications such as W.E.B. Du Bois’ monthly youth magazine The Brownies’ Book attempted to also give its young readers an idea of what they could achieve. While much of American children’s literature during the turn of the last century – and even today – filters ideas of grit through the perspective of the middle-class white child, The Brownies’ Book specifically addressed the lives and experiences of African-American children. First published in 1920, the magazine encouraged African-American children to fully embrace their cultural identities, participate in their communities and become citizens of the world.

But that was 1920, during the dawn of the Harlem Renaissance, a time when the work of African-American artists, activists and thinkers brought newfound optimism to the push for racial equality and cultural pride. Over the course of the 20th century, circumstances for many children of minority communities changed. As Atlantic writer Ta-Nehisi Coates has explained, a public policy of ghettoization has left many urban school districts impoverished and underserved, with few examples of hope or achievement outside the drug trade. Yes, kids could develop grit – they could find confidence, diligence and resilience outside the law – a version of grit demonized by mainstream society.

David Simon’s Baltimore-set HBO series “The Wire” illustrates the narrow possibilities for black kids growing up in the city. Grit, as depicted in “The Wire,” comes via success in the drug trade. This kind of grit has the bottom line of economic gain. It’s not about a search for identity, cultural understanding or artistry because kids don’t think they have the same opportunities and potential highlighted in the issues of The Brownies’ Book.

A 2014 study from the U.S. Department of Education Office for Civil Rights found that in America, there still exists a pattern of racial inequality in public schools, whether it’s course offerings, teacher performance or student expulsion. These statistics – the same as those echoed in “The Wire” – leave many somber, dejected, angry or, too often, complacent.

So how can students have – or learn – grit when all kids face different realities – different struggles, different dreams and different social structures?

Yes, it’s important to reevaluate the education system, as monumental a task as that may be. But all institutional or systemic change starts with the individual.

“A lot of what ‘The Wire’ was about sounds cynical to people,” Simon said in a 2009 Vice interview. “I think it’s very cynical about institutions and their ability to reform. I don’t deny that, but I don’t think it’s at all cynical about people.”

Maybe the first step is to think of grit not as something to cultivate in students. Instead, maybe grit is the debris – the dream – that lingers. If children and young adults get that piece of grit stuck to them, they’ll be motivated to keep going until the grit is gone.

Perhaps the job of adults, then, isn’t to tell kids to buckle down and work through adversity. It’s about opening their eyes to the innumerable possibilities before them – so they’ll want to persevere in the first place.


This is not a humblebrag: My kid won’t eat junk food and it’s all my fault

Chicken Nuggets and Fries

(Credit: Getty/Juanmonino)

“Chicken nuggets or fish sticks?” I pulled the plastic bags out of the freezer and held them up. By Thursday night after a long day at work during my busy season, I had no shame.

My 5-year-old, Colin (not his real name), looked up from his Legos, strewn across the kitchen floor. “I don’t like those. Can I have salmon?”

I sighed and tossed the bags back in the freezer, knowing I had only myself to blame for his pickiness. When I was pregnant I had bought all the books, read all the blog posts and determined to do it “right.” Breast-feeding until he was 2? Check. Making his baby food from scratch? Check.

I knew the statistics on childhood obesity and how bad junk food was for my kid. Once he transitioned to solids, high-fructose corn syrup was the enemy. Friends warned me that I shouldn’t be so strict with slightly patronizing “first-time mom” comments and “you’ll change your mind once he’s born.” If anything, their comments made me only more stubbornly determined to see it through.

A recovering anorexic, I had spent most of my teen years in a battle with a scale or on a treadmill at the gym. At my lowest point I had weighed 95 pounds, which at 5 feet 3 inches tall, was well below a healthy weight. My ex-husband had been a self-described “fat kid,” teased and tormented in his teen years until he lost weight during his 20s. We both knew that we had issues around food and agreed that we didn’t want to pass on those issues to our kid. No “you have to clear your plate.” No “there are starving children in Africa!” No using food as a reward, like promising him ice cream for cleaning his room.

I think it’s a natural thing: No matter how good your childhood was, you want to improve on it. Anything that bugged you when you were a kid, you’re going to do differently with your own. Until the day you hear the words “I’m not your servant!” coming out of your mouth when your kid refuses to pick up his toys (again). Reality took care of many of our plans, but not those surrounding food. For the first 10 months of his life, I was a stay-at-home mom so it was easier to breast-feed, make meals from scratch and put together healthy snacks.

Even after I went back to work, I would cook healthy dinners while my husband played video games on the couch and claimed to be keeping an eye on Colin. But he never helped with food prep, though he certainly criticized my “bland” palate and “boring” meals. I did all the grocery shopping, meal preparation and the dishes afterward. My husband’s lack of participation extended to every aspect of our marriage. I finally had enough and left him shortly after Colin turned 3.

Divorced, back at work full time and with a toddler, I admitted that I couldn’t make meals from scratch every night. We had been eating out too much, to the point where my kid knew the term “happy hour” because I had figured out how to feed us both on $5 appetizers. Something had to give. On my next grocery run, into the basket went chicken nuggets: all-natural and 100 percent chicken breast, of course. I still had standards.

The next night I laid them out on the baking sheet, put them in the oven for 10 minutes and served them up, sure that Colin would gobble them down.

He wouldn’t eat them.

Not even when I squeezed a hefty dollop of ketchup (his favorite) onto the plate. He ate the microwave-in-a-bag green beans, two cups worth, and turned up his nose at the protein. I sliced up some cheese for him and ate the nuggets myself.

The next week I tried fish sticks, with the same results. Hamburger patties? Nope. Meals in a bag? He spit them out. In my quest to make sure he would eat all his fruits and vegetables, I had turned my kid into a different kind of picky eater: He’ll eat kale chips, but not super-easy, working-mom-needs-a-break food.

As I made yet another grilled cheese sandwich with carrot sticks one night — the only easy food he would eat — I admitted to myself that I should have fed him more junk food when he was younger. I hadn’t planned on being a single mom when he was born. I hadn’t planned on being the sole person responsible for feeding us. And I had unknowingly made my life harder in the future by being so dogmatic about food choices in the past.

Now five years into parenthood, the only advice I have for first-timers is this: Give yourself permission to change your mind. Circumstances change. Your family’s needs can change. Other than always using a car seat and looking both ways before crossing the street, there are very few parenting decisions that will irrevocably screw up your kid. The occasional bag of chips or trip through the drive-through isn’t on that list.

We’ve found compromises. He’ll eat hummus and pita chips, and I’ve decided to call that good on protein. I have upped my budget for eating out and accepted that I’m going to have to pay an extra $4 for a measly cup of broccoli with the kids’ meal. If I complain about his pickiness on Facebook, I’m accused of humblebragging, so I keep my mouth shut and we muddle along.

If I’d had a crystal ball when he was younger, I wouldn’t have been so strict with food choices. You don’t know where you’re going to be in five years, so do yourself a favor. Instead of viewing parenting choices as right or wrong, ask yourself: Does this work for us now? Will this make my life harder in the future? And don’t beat yourself up if dinner comes out of the microwave every once in a while.


Native apps: How digital technology is helping to preserve fading Native American languages

Mulaka: Origin Tribes

A still from the concept trailer for Mulaka: Origin Tribes. (Credit: YouTube/Lienzo)

When you’re from a sovereign culture within a larger-looming cultural environment — as Native Americans living on and off reservations are — sometimes it’s hard to hold on to your language and mores. Children in particular feel the pull of Western things: language, clothes, toys, media and technology. For this reason, it’s especially important for Native American children to find language tools in modern media. Typically, it is the elders — or years of research — that hold the lost words and dialects.

As Richard Littlebear, president of Montana’s Chief Dull Knife College, wrote in a recent research paper, the present generation of fluent speakers of Native languages “needs to honor the preceding generations by strengthening those languages so they remain beyond the seventh generation.”

Reached at his office, Dr. Littlebear applauded recent developments in language and culture retention technology. Natives in academia like Littlebear keep a watchful eye on software and other products that can be used in schools to teach lost and diminishing languages.

Nakay Flotte of Harvard University’s Native American Program (HUNAP) wrote to me about a trend in tribal video game and app development that started around 2013.

Navajo Toddler is one such game developed by Tinkr Labs, which for several years has created programming for Natives to learn or maintain their original languages. It was designed for children ages 2 to 9, is available in the Apple App Store and teaches elements of the Navajo language with interactive visual flashcards, audio and gameplay.

Tinkr Labs co-founder Israel Shortman, a member of the Navajo Nation, created Navajo Toddler. Upon its release, he promised that more games would follow. “I saw culture and language not being passed down, and wanted to reach kids in an everyday way,” he said. “That meant using technology and sharing information through smartphones.” The initial release of the app included three categories: food, numbers and body parts. As they saw images, children could learn the basic words for each category, and audio of the word in Navajo would play simultaneously.

“It has been a unique experience working with the Lakota language — and a learning process,” Shortman told the Falmouth Institute’s Spoken First series in 2013. His daughter had been studying the Lakota language, he said, and was learning other indigenous languages as well.

Shortman, a software engineer, recently spoke to the cultural challenges of working with Native American topics, and the need to be sensitive to cultural rules — such as when a tale must only be told at a certain time of year. He is now working with an illustrator cousin on a storytelling app for young children that relates traditional Navajo tales. One such story, called “Coyote and the Stars,” is a Navajo legend that is supposed to be told in winter. To address this, the games will be open-source software, but will only show up as available for download in the appropriate season.

Shortman is also putting together a video game called Twin Warriors based on a Navajo birth-story, where twin boys must face a series of otherworldly monsters and ancients to prevail in life. Designed for kids ages 6 to 12, the game will be available in the App Store and for Android in February.

Around the same time Shortman released Navajo Toddler, the first indigenous-owned video game developer and publisher in the United States emerged. Upper One Games in Alaska created a game called Never Alone, about a girl lost in a blizzard who must rely upon the friendship of an Arctic fox. This highlights the Alaskan Native philosophy of interdependence between species, among other cultural ideas.

Farther afield, the Mexican developer Lienzo created a 3-D environment video game called Mulaka — Origin Tribes, modeling the mythology of the Tarahumara culture. This game was inspired by the stories of the Tarahumara people, and set in their ancestral lands in the northern state of Chihuahua. Through representations of cultural deities and customs, Tarahumara youth can both learn and share their heritage. The Tarahumara have already incorporated some ancient practices into modern life, so it is helpful that the game features narration in the Tarahumara language.

A surge of fonts for Native American languages has been created over the last 10 years or so by various tribes. These fonts can enable the teaching of traditional language, sometimes in nations where the original language has been lost entirely.

The Digohweli Cherokee Unicode font, for example, was designed to be an easy-to-read, all-purpose font. One can use it for web text or print. This is especially remarkable because many Native tribes did not have written language in the past, and some still do not. Starting in 1809, a Cherokee called Sequoyah spent 12 years developing an 86-symbol syllabary (characters representing syllables), for which he was investigated by the tribal chief and a tribal court. Ultimately the court accepted that the written language was a useful innovation.

Today, the Cherokee are one of the largest Native American tribes, as well as one of the wealthiest. This has led to relationships with technology companies that support cultural and language retention efforts. One can access Microsoft Office Suite, Google, Facebook and Wikipedia in Cherokee. Poorer and smaller tribes have not been able to use technology so handily: the UNESCO Atlas of the World’s Languages in Danger database lists many Native languages as “critically endangered,” “vulnerable” or “extinct” — the last designation meaning that no living person speaks the language fluently.

In Washington state, a type designer named Juliet Shen worked with the Tulalip tribe to create a font for their Lushootseed language. Lushootseed was barely retained at the time, and just five tribal elders still spoke the language. As in many other tribes, Tulalip children were sent to boarding schools in the 1920s and forbidden to speak their language. The idea of creating a usable type font for their language was revolutionary for a people who had not had a predominant spoken-language culture in decades, and had never had a written language.

In Arizona, the Diné people, whose population runs in the hundreds of thousands, have also created a font. Theirs was one of the first Native (Navajo) fonts developed, and it is widely used today for digital communication by both younger and older people eager to learn and use their traditional language.

Back home on my Shinnecock Indian Reservation on Long Island, it has only been in the last 10 years that children have actively been taught pieces of the original Algonquian Shinnecock tongue, which is related to a broad range of Native languages, but which no one could speak fluently, not even elders. Now, as part of the nation’s cultural preservation program, toddlers at the recently opened Wuneechanunk Shinnecock Preschool are taught both language elements and cultural practices alongside a traditional Western curriculum.

Josephine Smith is the director of the language and cultural program on the Shinnecock reservation in Southampton, New York. Her office is in the basement of a tribal building that houses senior and after-school children’s programs, and her desk is piled high with research papers and curriculum aids — including a computer where she researches ways to convey the Shinnecock language.

Like many Native American tribes, the Shinnecock want to reclaim our language, much of which was taken away by invaders and missionaries and lost over subsequent generations. Smith says that the Shinnecock have several steps to take internally before we can share our language with others through technology. “We are still in the reclamation process,” she says. “I know many tribes are using social media to teach and otherwise share language, and we are going in that direction. First, though, we are working to restore what was taken from us and learn it internally.” As an initial step, Smith says she and Shinnecock language researcher Tina Tarrant purchased an internally developed software program called TRAILS, which teaches Shinnecock vocabulary derived from research. Other Algonquian-originating tribes, like the Cree and the Ojibway, already have fonts to proliferate language studies. With further research, patience and the aid of new technology, we Shinnecock may eventually have our own as well.


Building a better ‘burb: The race to design a sustainable suburbia is also making the suburbs kind of cool

Cite

CITE, New Mexico (Credit: Perkins+Will)

Suburbia is an easy target. But its problems run deeper than disillusioned housewives waking up from their American dreams in cookie-cutter colonials. Suburban living is not environmentally sustainable — at least, not yet. But a new crop of designers, architects and urban planners is envisioning a world in which minivans, manicured lawns and mail carriers are replaced by autonomous cars — aka, suburbia 2.0.

While cities conjure up images of smog clouds and smokestacks, studies show the carbon footprint of a typical suburban home is four times greater than that of an urban dwelling:  Larger households require more heating and cooling. Suburbanites are more likely to own cars than their urban counterparts, and sprawling streets reduce the habitat for animals and decrease biodiversity. All that asphalt — considered “underperforming” by urban planners — allows rainwater to absorb oil and sloppily applied fertilizer before making its way into the ground and eventually the drinking supply.

The majority of academics believe suburbia has a sustainability problem, and you can bet it’s about to get worse. Because 75 percent of American jobs exist outside urban centers, 69 percent of U.S. residents already live in suburbia, and population growth in these neighborhoods is on the cusp of a surge. Somehow, despite the major role these areas play — and their pervasiveness — suburbs have been neglected within the urban planning field.       

“My profession never wanted to look at them,” Ellen Dunham-Jones, director of the urban design program at Georgia Tech, told Salon. “The lack of academic training is shocking. There are architectural studios set in the heart of gritty cities so that students learn to relate buildings to an urban context, and studios set within incredibly pristine natural landscapes, so students learn to relate buildings to nature. But rarely do you see studios set in suburbia, even though this is where most building is taking place.”

So we’ve been left with the soulless and unsustainable monotony of the modern ‘burbs. Change, however, is underway. Architects and designers, thwarted by the restrictive zoning and code laws of high-density urban areas, are beginning to view the troubled suburbs as areas of opportunity, where land use is more flexible and therefore better for experimentation. In other words: Neighborhoods known for matchy-matchy mailboxes and carbon-copy cul-de-sac housing could soon become ground zero for sustainable innovation.

This groundswell has its roots in the new urbanism movement that developed in the early 1990s, when a small set of architects and planners felt frustration with suburban development patterns. They began finding ways to make these areas more walkable, incorporating a downtown dynamism typically reserved for city centers. This involved creating sidewalks and repurposing underperforming properties like defunct shopping malls into usable community spaces for art or theater or food.

Often, it meant increasing the density (and decreasing the sprawl) of a neighborhood by building on rooftops and parking lots. In 2009, Dunham-Jones co-authored “Retrofitting Suburbia: Urban Design Solutions for Redesigning Suburbs,” which brought attention to an idea that architectural magazines had been ignoring: It’s possible to raise our standards for suburban design.

Today, this type of reform is seeing success across the country, and sustainability is increasingly driving the change. Outside of Denver, Zeppelin Development has turned a former industrial area into a growing, mixed-use community with residential homes, a thriving art district and boldly designed office buildings that defy the cubicle-farm standard.

In North Bethesda, Maryland, a 30-minute drive outside Washington, D.C., new urbanist architectural planner Glatting Jackson has been hired by private property owners to transform the 400 acres of ugly arterial roads surrounding the town’s Metro station into a walkable street grid, while preserving nearby farmland.

And in Portland, Oregon’s new urbanist-inspired neighborhood of Orenco Station, half the residents have reported walking to a local business five times a week. 

All of this is boosting the suburbs’ appeal. As suburbs diversify, they are increasingly attracting not just nuclear families but single millennials who appreciate urban-style amenities and smart development.

Enter “The Future of Suburbia,” a conference hosted last spring by the Center for Advanced Urbanism at MIT. It was the culmination of a two-year-long research project that will yield the publication of the 1,200-page book “Infinite Suburbia” this fall, authored by 50 experts from around the globe. While new urbanist principles often dictate building vertically, to create smaller spaces with less surface area from which energy can escape, speakers at the conference presented a vision for suburbia that leaves the single-family home in place — albeit equipped with sleek photovoltaic battery packs like Tesla’s Powerwall that harness solar energy and connect to a renewable power grid. Imagine entire neighborhoods free of electrical wires, running entirely on stored sunlight.

In future suburbia, this technology might also fuel your autonomous car. Alan M. Berger, co-director of the Center for Advanced Urbanism, discussed the potential for driverless vehicles to reduce pavement in suburbia by 50 percent. And this isn’t just a Jetsons-esque fantasy: Ford, Nissan and Tesla plan on releasing driverless vehicles within the next five years, with Toyota not far behind. And Fiat Chrysler is experimenting with autonomous minivans. The Institute of Electrical and Electronics Engineers predicts these cars, which produce fewer carbon emissions, will make up 75 percent of vehicles on the road by 2040.

Another shift for future suburbia will be centered on the lawn. Gone are the days of perfectly manicured yards and rose bushes as status symbol; coming is the hot new trend of  “rewilding.” This landscaping approach incorporates native plants (think merrybells and woodland iris) that require less synthetic fertilizer and irrigation, while providing habitat for birds, insects and other pollinators necessary for protecting our global food supply. It’s likely there will also be greater attention paid to the edible estates initiative, which advocates replacing lawns with vegetable gardens.

Some green technologies have already been implemented — like the Natural Organic Recycling Machine, or NORM. One of the largest wastewater treatment and reuse systems in North America, this method for recycling black (toilet) and gray (sink) water is cutting usage by 50 percent for residents of Portland’s newest neighborhood, Hassalo on Eighth.

Other green technologies — like an “energy district” that generates electricity for a neighborhood through various renewable sources, including solar and wind power and the chemical element thorium — are being tested. They will get a dry run in the “ghost” suburb known as the Center for Innovation, Testing and Evaluation or CITE. This simulated suburb in the New Mexican desert could house 30,000 people but won’t hold any and instead will serve as a lifelike laboratory for researchers. It could be operational as early as 2018.

All the experimentation makes it pretty clear that no one new design standard for suburbia will take hold. Perhaps a heterogeneous approach is best for areas that were never as homogenous as they seemed. “Suburbia is complex — its production, persistence, and expansion can best be explained as a nonlinear set of interrelationships,” Berger told Salon. “We cannot talk about one aspect of suburbia without considering how it might affect many other social, economic, political, or ecological factors.”

But one thing is clear: Suburbia’s time as a pop-culture punch line might be coming to an end. Or as Art Weingartner warns in the cult classic “The ’Burbs”: “I think the message to psychos, fanatics, murderers, nut cases all over the world is, uh, Do not mess with suburbanites. Because, frankly, we’re just not gonna take it anymore. Ya know, we’re not gonna be content to look after our lawns and wax our cars and paint our houses.”


Don’t assume black voters are “with her”: Contempt for Trump doesn’t mean we love Hillary

Hillary Clinton; Donald Trump

(Credit: Reuters/Aaron P. Bernstein/Getty/Drew Angerer/Photo montage by Salon)

Donald Trump slander is being slung by every black writer I know from every inch of the Internet, as it should be. How can a person with so many obvious flaws become the nominee of a major political party? No, really — I’m still confused. Everything from his overtly racist rallies and the way he trashes women to his gross generalizations about immigrants — and his hair — offends us. But collectively, I’m worried that we African-Americans are sending the wrong message to the general public. Contempt for Trump doesn’t equal love for Hillary Clinton and her husband, or even basic support.

* * *

I was in one of those comfy Acela chairs on my way back to Baltimore from a recent New York trip. A middle-aged white dude in non-stylish frames took the chair directly across from me. We had barely pulled out of Penn Station before he asked me, “Did you watch the debate? What did you think of the debate?”

“It was everything I thought it would be,” I said. “She prepared, he didn’t. She came off as sharp, he didn’t.”

He sat up straight in his chair, closed his ears and opened his mouth.

“I’m so over this election,” he said with exhausted eyebrows. “It’s a joke.” He then took off his jacket and went on an “I hate Hillary” rant all the way from Manhattan to his stop in Philly — nonstop, everything from her days as a toddler up to our current conversation. It was the emails, it was Benghazi, it was Monica Lewinsky, it was her pantsuits, and on and on and on. I tuned him out, drifted off in my head — and came back for this: “America can’t take another four years of the Clintons! Sorry I trashed your candidate, man, but she’s so bad. Anyway, have a safe trip!”

I chucked a peace sign in his direction, put my earphones on, turned Coltrane up and gazed out the window. It really didn’t dawn on me until I passed Delaware: I never said anything to the dude about being a Clinton fan. I couldn’t be a Clinton fan if I wanted to — that racist crime bill from 1994 locked my whole family up. Not just my blood relatives, but all of the brothers and sisters in my community — the mentors, the role models, the coaches. The people responsible for grandfathering us into successful lives are gone, and we are bitter. So bitter — an impenetrable bitterness that’s harder than cold steel and so toxic, so acidic that if liquefied it’ll easily melt flesh and dissolve elephant bones. It’s 2016, and Hillary and Bill are still calling us super-predators.

* * *

Michelle Alexander penned a powerful article in The Nation a few months back on why the Clintons shouldn’t even think about the black vote. She wrote:

Clinton championed the idea of a federal “three strikes” law in his 1994 State of the Union address and, months later, signed a $30 billion crime bill that created dozens of new federal capital crimes, mandated life sentences for some three-time offenders and authorized more than $16 billion for state prison grants and the expansion of police forces. The legislation was hailed by mainstream-media outlets as a victory for the Democrats, who “were able to wrest the crime issue from the Republicans and make it their own.”

When Clinton left office in 2001, the United States had the highest rate of incarceration in the world. Human Rights Watch reported that in seven states, African Americans constituted 80 to 90 percent of all drug offenders sent to prison, even though they were no more likely than whites to use or sell illegal drugs. Prison admissions for drug offenses reached a level in 2000 for African Americans more than 26 times the level in 1983. All of the presidents since 1980 have contributed to mass incarceration, but as Equal Justice Initiative founder Bryan Stevenson recently observed, “President Clinton’s tenure was the worst.”

White people got a chance to explore rehab, while people who look like me explored prison cells. Many African-Americans lost their sanity and sometimes their lives behind those walls. And successfully reentering society has been impossible for many of the people lucky enough to make it out alive. I challenge anybody in the job market without a prison record to check that box sometime. You’ll find out why prison is a revolving door, and why so many black men have trouble finding good jobs. The Clintons didn’t invent this reality, but they definitely amplified it, and only backed away from the “super-predator” talk when it became politically toxic.

* * *

RIP to my friend Noodle. He was a super-predator, from the Clintons’ perspective. Noodle’s pops was murdered when we were kids, and his moms had a stroke — she received a check but that only covered basics. He had three little brothers and sisters who were growing every day, and he had to feed them.

Noodle fell into the crack game face-first, from lookout, to “hitter” to handling money. He bought some sneakers, but no jewelry. He bought some nice clothes, but no Lexus. That 15-year-old bossed up and fed his family. He wasn’t out there banging guns and being a predator, but he lost his life two weeks before his 18th birthday.

Noodle, like many of us young black people, drowned in a storm we couldn’t handle. Of course you have some people out there selling drugs who don’t have to, unnecessary troublemakers. But that isn’t always the case. For every cold-hearted gangster you name, I can give you three nice guys like Noodle who were dealt a bad hand from a deck of systemic racism, white supremacy and the same bullshit that pushes poor black kids into the streets.

African-Americans don’t have an option in this election. Jill Stein is a wonderful person, but she’ll never win. Even if Gary Johnson had a chance, which he clearly doesn’t, he has no grasp of the issues that affect us. Yes, he said “Black lives matter.” So what, congratulations — that’s a popular thing to say right now (if you’re not a flat-out racist Trump Republican), but it doesn’t mean he understands the complex issues that oppress us. He doesn’t even fully understand white issues, getting stumped on every news show every time he pops up. It’s like he doesn’t have cable or Wi-Fi.

I wish I could’ve had another 15 minutes with that dude from the train. I would’ve told him not to let social media, MSNBC, Bill Clinton, hip-hop artists or Trump jokes fool him. Black people are not on the Clinton train. We aren’t feeling Hillary. We’re not “with her.” We care more about Trump losing than her winning.
