Sansa Stark has always been a warrior: She’s been fierce all along — and it kept her alive


Sophie Turner in “Game of Thrones” (Credit: HBO)

Until recently, it’s been lonely on Team Sansa. We “Game of Thrones” fans who have seen great potential in the Northern princess have been shouted down by fans who exclusively preferred more conventionally plucky heroines: She’s a frivolous little girl, they said; her dithering devotion to being the belle of the court, the queen to end all queens, helped get Ned Stark killed, they said; she’s always such a victim, they said. Now, Twitter erupts in exuberant cries of “Yes, Queen!” at the young woman who smirks with a well-earned satisfaction after she’s released the man-eating hounds on her rapist. That woman is simultaneously worlds away from the girl who stood in silent rage as another tormentor displayed her father’s head on a spike—and yet, she is still, in some ways, that same girl, the girl who had an uncanny instinct for survival.

Though the remaining Starks have known unfathomable woe, arguably none has endured the breadth and intensity of abuses and humiliations that Sansa has: held hostage by the same wicked family who slaughtered hers; stripped, beaten, and humiliated by the boy-king Joffrey; married off to Tyrion (who does treat her with kindness and dignity—though, make no mistake, his decency is a gift to her, one that, had he been a different man, he could have revoked with impunity); fleeing for her life after she is framed for Joffrey’s murder—by the very man who takes her on as his pupil and ward, the man who kills her aunt (who, admittedly, was trying to kill her) and sells her into marriage with a man who gets his jollies flaying people alive (the scion of yet another family that killed hers), her rapist and most depraved torturer. Sansa’s capacity for endurance certainly rivals that of any of the more literally battle-hardened characters; indeed, her parley with Jon Snow prior to the “Battle of the Bastards” displays the parallel lines of each character’s power: He has a warrior’s brute strength and fiery heart, and she has the foresight and tactical reserve that comes from constantly anticipating the next blow.

Characters like Jon Snow, or Brienne of Tarth, or Sansa’s sister Arya are so appealing because they possess the traits—bravery, strength (in terms of physicality and fortitude alike), and cunning—that we want for ourselves; and they earn those traits through the feats of honor, derring-do, and unbridled badassery that we fantasize ourselves capable of. Sansa’s resourcefulness is different—she has learned to keenly read her aggressors in the service of saving her own skin. When Tyrion Lannister intervenes to stop another beating (administered by one of Joffrey’s heavy-hitting Kingsguard), and offers Sansa out of her betrothal, she parrots back a line she once said in fervent earnest, when her father tried to flee King’s Landing—that she loves Joffrey and wants to have his babies—only this time, her words are a refutation of the callow princess who believed in fairy tales and courtly lore, and a canny recognition that pretending to be that princess is the only way she has value, and having value is the only way she stays alive.

This is a sadder, more fragile way to be a survivor: it doesn’t uphold any veneer of virtue or valor; it simply gets her through another hard night into another cold day. Our culture has very resolute standards about who gets to be heroic. Victims make us uncomfortable. We can’t accept that our continued safety in the world is so largely a matter of happenstance—being born to sane, loving parents, not leaving the party at that particular moment, with that particular person, or getting in the car just a few moments later—so we find ways to blame these people for their own agonies, even if merely getting through those agonies requires a Herculean fortitude. Only now that Sansa has weaponized her resourcefulness—she brings about Ramsay Bolton’s downfall by understanding his modus operandi (and anticipating Jon’s likely reaction to it, charging in with a righteous, yet blind, fury), and by leveraging Littlefinger’s twisted affections for her to procure the cavalry—is she seen as a compelling heroine. The host of the usually delightfully catty video recap series “Gay of Thrones” used to refer to Sansa as “busted redhead” (an ignoble distinction when even a character as reviled as Cersei is dubbed “evil Cher”). He’s recently given her a more Beyoncé-inspired nomenclature: Sansa Fierce.

Though I would argue that Sansa has, in her own quieter, more subtle way, always been fierce, I am heartened to see her become more active in her ferocity. She drives the Stark resurgence, turning the violations against her house and her body into her rallying cry against the Boltons—if she must always remember, in her deepest, most tender places, what was done to her, then the North will remember as well. If Sansa’s previous plot threads focused on the loss of innocence, and the heroism inherent in keeping her head down, now, her story has pivoted into the aftermath of victimhood—what happens when the blood dries and the bruises recede into muscle memory. Sansa is the callused, and at times, callous, survivor.

In a gorgeously written essay that I absolutely disagree with, Sean T. Collins argues that “revenge under any circumstance—even one as justly deserved as Sansa’s over Ramsay’s—is a sword without a hilt … There’s no way to control which way it swings and how deep it cuts.” Not so: Sansa’s blade has a clear, clean arc—killing Ramsay obliterates an evil, and it directly avenges the litany of terrors he inflicted upon her on any given night. Collins makes a very adept observation about Sansa reclaiming her own bodily autonomy by literally dismantling Ramsay’s body—I’d add that this act of dismantling also reflects what Ramsay tried to do to her spirit. Yet he did not break her. And her revenge, vicious as it is, is her ultimate assertion of self. She didn’t deserve what happened to her, but she does deserve to see her captor punished—though Ramsay would likely have been executed for his crimes against the Northern houses, particularly for killing Rickon Stark, he would not have been called to answer for crimes that could sadly be par for the course on any Westerosi wedding night.

In his essay, Collins laments Sansa’s violence because it seems in opposition to her father’s decency. It’s true that Sansa is no longer her father’s daughter—she is someone new, and more her own woman than ever before. Jon’s cleaving to Ned’s kind of honor nearly gets him and his men killed on the battlefield (and in the season six finale, he admits this to Sansa, praising her for recruiting the Knights of the Vale, even if it meant dealing with the decidedly dishonorable Littlefinger). Still, Sansa’s wrath is worth far more than mere tactical acumen. It is shared by anyone who has ever been brutalized without recourse, who has seen the wicked walk free far too many times, and it adds a fresh dimension to the ways that survivors are portrayed on-screen.

Sansa is neither a martyr nor an avenging angel—which are, by and large, the most prevalent kinds of onscreen survivorship. She occupies a space between these opposing modes, a space she shares with so many of us in the real world—a space between our broken places. Her strength feels so satisfying to viewers who have watched her be so mercilessly abused—but it is a many-chambered strength, and some of those rooms hold a deep, unpitying anger. This anger emerges when she kills Ramsay, of course, and it is in her voice when she says that the Umbers who gave Rickon to the Boltons should be hanged; it is in her threat to have her protector, Brienne, kill Littlefinger where he stands (the only reason she spares him, arguably, is to keep the Knights of the Vale in play). If she has, as Littlefinger once told her, been a bystander to tragedy, she has also been a bystander to power—and observed firsthand how it can insulate even a worthless little peon like Joffrey, or make the will of a woman like Cersei the law of the land.

Cersei and Sansa were initially portrayed as opposites—the soft, daydreaming princess and the hard-bitten, bitter-hearted queen—but this season there has been a kind of alignment between the two. Both women have emerged from captivity determined to make the agents of their violations pay with blood. And both women crave the security and control that can come from a throne—though, admittedly, Sansa’s craving is more nascent. Littlefinger still knows that the way to pique her interest, however slightly, after she quite wisely rejects his offer of marriage, is to suggest that she could be the Queen in the North (and that he could make her that queen). She loves her half-brother and, in one of the sweeter scenes in the series, assures him that, bastard or no, he will always be a Stark to her—he will always share her blood. When he is crowned King in the North, she smiles at him with genuine joy. Then she looks out at Littlefinger, and the smile slips into a more arctic, inscrutable expression. There is a part of her, like a muscular tendril pushing up through unthawed earth, that demands to be honored and respected, that demands tribute. And to know that, with all of these things, she will never again be subject to anyone’s whims but her own.

As “Game of Thrones” enters its final episodes, Sansa Stark remains one of its most dynamic, complex female leads in a show rich with queens. She hasn’t been broken against the rock of her trauma. She is a knife-blade sharpened against the blunt horror of her experience. And she is uniquely positioned as a bracingly human exemplar of surviving trauma: able to love and hope in measured ways, hell-bent on protecting herself, and yes, capable of a fury to match the magnitude of the indignities she’s endured. I hope she doesn’t go full Cersei, though if she does, it will be undeniably compelling (and maybe, if you squint hard enough, just the tiniest bit deserved). If the show serves her well, it will keep her in that fertile ground between her broken places, between the quest for autonomy and the zeal for vengeance. For so long, her greatest feat in the great game has simply been drawing another breath, and then another one after that—in the next seasons, Sansa should do more than survive. Whether she becomes the Queen in the North, Jon Snow’s ruthless advisor, or simply her own woman, no more, no less—she should do what her long-time fans have always wanted for her: Thrive.


Secrets of “Blood Simple”: The devious neo-noir classic is more complicated than it looks


Dan Hedaya and M. Emmet Walsh in “Blood Simple” (Credit: Circle Films)

Here are some delicious nuggets of trivia I did not know about “Blood Simple,” the 1984 Texas neo-noir that marked the feature film debut of Joel and Ethan Coen. Usually I hate this kind of stuff, because most of the time it’s irrelevant. But each of these facts strikes me as integral to the success of the Coens’ epoch-shaping little movie. They are reasons why this devious and imaginative thriller worked as well as it did 32 summers ago, almost as much as the constantly shifting perspectives of Barry Sonnenfeld’s cinematography or Carter Burwell’s subtle, mood-controlling score. (Of the many Motown hits repurposed in ‘80s movies, I’m not sure any is as effective as the Four Tops’ “It’s the Same Old Song” in “Blood Simple.”) They’re reasons why it still works now, in a color-corrected digital restoration from Janus Films with a new 5.1 audio mix.

The phrase “Blood Simple” is never directly used in the film, although M. Emmet Walsh’s character, the sleazy, giggly and profoundly disturbing private detective Loren Visser, keeps hinting at it. (He calls people “money simple” a couple of times.) It comes from the Dashiell Hammett novel “Red Harvest,” when the nameless private detective known as the Continental Op complains about the town he’s supposedly cleaning up: “This damned burg’s getting to me. If I don’t get away soon, I’ll be going blood-simple, like the natives.” As the Coens have said, “Blood Simple” is closer to the sensibility of classic thriller writer James M. Cain than to Hammett, but that sentence comes close to summing up the movie.

Walsh’s ill-fitting leisure suit has peculiar bulges, as seen in the film, because he insisted that the Coens pay him in cash at the beginning of every week of shooting, and carried the money on him at all times. Whether that was method acting or a personality quirk is impossible to say. But it means that the scene when Loren stuffs thousands of dollars into his pockets, after double-crossing and killing Julian (Dan Hedaya), the man who has hired him to document his wife’s infidelity, is both something the character does and something the actor does. Ever since “Blood Simple,” there has been a strain of movie criticism arguing that the Coens are heartless technicians who don’t feel any empathy for their characters. That’s not without foundation, but this anecdote hints at the ways it’s overly simplistic.

Sticking with that theme, the Coens apparently had Loren drive what I believe is a 1968 Volkswagen Beetle in the film — a peculiar and deeply unlikely choice for a low-rent private investigator and/or hit man in central Texas — because they thought Walsh “looked like a bug.” Now, you can decide that’s Kafkaesque brilliance or you can decide that it drives you crazy because it’s manipulative and implausible. I’m not telling you what to think. But however you read it, the Beetle was an aesthetic choice, made while constructing an elaborate artifice.

The Coens denied any art-film intentions while making “Blood Simple,” which Ethan called “a no-bones-about-it entertainment” in a 1985 interview. “If you want something other than that, then you probably have a legitimate complaint,” he joked. OK, I get where he’s going with that, but the Coens aren’t as far away as they like to pretend from the worldview of a director like Michael Haneke, who frequently confronts us with the fact that a movie is a constructed illusion rather than a depiction of the “real world,” and beyond that a shared illusion whose narrative and meaning are created by the viewer as much as by its maker.

These guys went on to make a series of movies, after all, in which naturalism and artificiality are in a constant state of crisis and collision. That breaks through the surface of the story most obviously in “Barton Fink,” but is just as apparent in different ways in “The Big Lebowski” or “A Serious Man” or “Inside Llewyn Davis.” (I’m not going to get involved in defending the Coens’ most recent movie, “Hail, Caesar!” right now, but it’s a work of mischievous genius, however badly miscalculated in marketplace terms.)

In a 2015 interview with Guillermo del Toro, Ethan Coen describes Walsh’s opening monologue in “Blood Simple” this way: “Emmet’s monologue is, like, ‘Let me tell you about the real world,’ and the joke is it’s not exactly the real world, but that’s the proposition.” As del Toro then responds, “I think that’s basically most of the filmography.” If it bugs you too much that Walsh drives a Bug, the Coens would politely suggest that you are likely to find other kinds of movies more to your taste. I’m not saying they wouldn’t say mean stuff about you after you’d left the room.

So what kind of movie did the Coens make, exactly? It seems that Barry Sonnenfeld, the brilliant cinematographer who also made his debut with “Blood Simple,” and would move on to become an A-list Hollywood director with “The Addams Family” and the “Men in Black” series, suggested that the brothers watch three films before starting work on this one. They were Stanley Kubrick’s “Dr. Strangelove,” Bernardo Bertolucci’s “The Conformist” and Carol Reed’s “The Third Man.”

I mean, arguably anyone should see those three movies before doing anything, or at least before assuming that they understand anything about human motivation or politics or the relationship between cause and effect. But all of them are present in “Blood Simple”: the anarchic, apocalyptic black humor of “Strangelove,” the bleak and beautiful fatalism of Bertolucci’s Fascist-era thriller and the devious, cynical philosophy of Harry Lime, Orson Welles’ unforgettable character in “The Third Man.” How a couple of nerdy Jewish guys from suburban Minneapolis, by way of New York City, combined those ingredients into a low-budget B movie made in Texas that both is and is not about its archetypal American setting, and is and is not a series of literary and film-school references — that part defies easy explanation.

Do the Coens, in “Blood Simple” and numerous other films, depict a cruel universe whose moral entropy leads people (normal and decent people, or the other kind) into bad decisions that never go unpunished? Is that a philosophy of life, or just a narrative contrivance gleaned from too many books and movies? Well, the answers are pretty clearly “yes” and “yes,” but watching this beautiful restoration of “Blood Simple” I was actually struck by how non-bleak it is, how much the strength of the film lies in images of beauty, defiance and survival. Even the despicable Loren laughs at fate as he dies; it’s almost an Ingmar Bergman moment, staged under the bathroom sink in a desolate Texas apartment. As in so many subsequent Coen pictures, Frances McDormand (not yet married to Joel Coen) embodies a transcendent female energy in this disordered tale of betrayal and murder, a combination of shrewdness, toughness and intelligence that can absorb many terrible things without being defeated.

In fact, it almost seems relevant that the Coens raised crucial seed money for “Blood Simple” from local members of Hadassah, the international Jewish feminist organization, in St. Louis Park, the middle-class Minnesota suburb where they grew up. That was a mitzvah, for sure, but I can’t help wondering what the do-gooder women of St. Louis Park thought they were contributing to, and what they thought of the final product. Maybe they got paid back decades later in “A Serious Man,” the Coens’ most autobiographical and most overtly Jewish film. And maybe they didn’t need to get paid back. “There’s not a moral to the story, and we’re not moral preceptors,” Joel Coen told del Toro. Not that I claim to understand Talmudic philosophy, or any other kind, but isn’t that pretty close to a moral precept in itself?

The new 4K restoration of “Blood Simple” from Janus Films opens this week at Film Forum in New York and the Alamo New Mission in San Francisco, and opens July 29 at the Nuart Theatre in Los Angeles, with more cities and home video to follow.


Rob Sheffield: “What I love most about David Bowie is that he celebrated the romance of being a fan.”


Rob Sheffield, David Bowie (Credit: Marisa Bettencourt/AP/Brian Killigrew/Photo montage by Salon)

When Rob Sheffield learned that David Bowie died, he stayed up all night and the next day writing about his hero. The longtime critic and “Rolling Stone” columnist wrote with no real purpose or goal in mind, but simply tried to get ideas on paper. It was his own personal wake for the man who wrote “Space Oddity” and “Starman,” who embodied so many different roles, who inspired Sheffield and millions of other fans around the world. “Words were just pouring out of me,” he recalls. “It was impossible not to write about David Bowie and listen to David Bowie and mourn for David Bowie.”

In the morning, as others were just waking up to the sad news, Sheffield received a call from his editor, who wanted him to keep writing. He spent all of January 2016 and part of February reconsidering Bowie’s greatest hits and his greatest guises, and the result is “On Bowie,” a book-length eulogy for the mercurial rock star that debuts today. Sheffield takes us through the man’s life and career, from his first fluke hit through his glam ascendancy, from his forays into Philly soul to his Thin White Duke experimentalism, from his early ‘80s highs to his late ‘80s lows, from his disappearance in the 2000s to his comeback in the 2010s.

Bowie, of course, is one of the most popular and most written-about rock stars in the world, and there are scores of biographies that trace a similar trajectory. What makes “On Bowie” unique—what makes it something other than redundant—is its mix of biography and autobiography. In writing about Bowie’s career, Sheffield cannot help but write about his own life and his own experiences with Bowie’s music. As a teenager growing up in Boston, he found inspiration and consolation in songs like “Heroes” and “Let’s Dance” and “Boys Keep Swinging,” dissecting their lyrics and sequencing them on countless mixtapes for himself and his friends. This kind of close identification with an artist can be indulgent or critically irresponsible, but in “On Bowie,” it’s absolutely crucial in presenting the celebrity through the eyes of a fan and in humanizing an artist who barely appeared human onstage.

This is perhaps Sheffield’s greatest trait as a critic: He writes with the enthusiasm of a true fan, whether he’s extolling A Flock of Seagulls in the “Spin Record Guide” or explaining how music helped him grieve his wife in “Love Is a Mixtape” or examining the impact New Wave had on his life in “Talking to Girls About Duran Duran.” His prose burbles with joy and excitement, which may cloud his judgment (he perhaps overpraises Bowie’s ‘90s output) but makes his books refreshing and relatable. Anyone who has ever carefully planned out a mixtape or listened to the same song ten times in a row or driven hundreds of miles just to see a band perform will recognize something of themselves in Sheffield’s writing—just as Sheffield recognized something of himself in David Bowie.

Especially after so many quick eulogies and hot takes, the topic of David Bowie still seems inexhaustible. Why do you think that is?

David Bowie is somebody who exists on so many different planes. Every audience has their own David Bowie and every culture and really every fan has their own David Bowie. For me, part of grieving for David Bowie after his death was having so many conversations with my friends about our experiences with him and learning all the different things he meant to people. There are so many different ways to live your Bowie-ness and so many different ways to hear him. It’s funny that it was only after he died that I found out what a huge deal “Labyrinth” is in his canon. When that movie came out, I was already a teenager and was not interested in seeing David Bowie in a kids’ movie with Muppets, so I didn’t realize that for a huge number of his younger fans “Labyrinth” is the gateway drug that got them into Bowie. That’s an example of the things that I kept learning about David Bowie, who is someone I’ve loved all my life and someone I keep learning more and more about even after he died.

I’m constantly surprised by the love for “Labyrinth.” It definitely seems to be a movie that millennials have embraced.

There was a beautiful tribute at a punk DIY spot in Brooklyn called Silent Barn, where a bunch of bands covered the “Labyrinth” soundtrack all the way through. There were ten bands, and each one would play a different song. That was one of the strangest David Bowie tributes I witnessed, but it was a beautiful thing. It was this different side of David Bowie for people who were born in the ‘80s or ‘90s. For me, though, I barely remember that this movie ever happened. I was a New Wave kid in the ‘80s, so the David Bowie of “Let’s Dance” was huge for me. For a lot of other fans, though, that album is some aberration in his career, where he tried to do something that didn’t hold up very well in retrospect. I’m very much a “Let’s Dance” loyalist, and another thing I took away from the conversations I had with friends about David Bowie was that I was maybe in the minority about that. His New Wave period didn’t mean the same to everybody that it meant to me.

My first Bowie record was “Never Let Me Down,” so I have a weird affection for that era.

You’re definitely in the minority there.

No kidding. But it was the first record he released when I was of record-buying age. You have a completely different experience with Bowie depending on where you enter his story.

For me it was “Lodger.” That was the first one that came out when I was already a David Bowie fan. I had already been listening to FM rock radio and knew “Heroes,” and then “Lodger” came out. I have a massive affection for that one. I’m capable of very long and very tedious arguments in bars defending that record. I love “Lodger.” You can love David Bowie in all these different ways, so there’s always more to discover from him. My theater friends have a totally different Bowie than I do. My fashion friends have a completely different Bowie. My art friends, my film friends, everybody. He dabbled in so many different kinds of stuff and incorporated so many different types of expression into his overall statement, and it’s amazing that people live and die for entire corners of the Bowie universe that I barely even know. Part of the way he saw his journey as an artist was dabbling in everything that interested him, whether he had a deep-rooted talent for it or not. He figured he would do it and if it came out disastrously, he’d have a laugh at it and move on. That meant he was able to participate in so many different worlds and so many different artistic languages.

The book portrays him as a fan in his own right, someone who’s constantly listening to other artists and integrating their ideas into his own music. Sometimes he has great taste, as with Iggy Pop or Lou Reed. Sometimes not, as with the Polyphonic Spree.

It’s amazing and really inspiring how personal a fan he was. That’s something that he never lost even after he had been making music for fifty years. That last album he made—when he knew he was close to the end of his life—was so influenced by Kendrick Lamar and D’Angelo, artists who were so much younger than he was. But he listened to them and got different ideas about how to approach music. He was still so passionate about learning and absorbing new influences. There’s an awful lot of rock stars his age who do not share that trait.

It never sounded like he was retreating into the music of his own adolescence or trying to recapture an older sound. He didn’t seem like he was interested in looking backwards.

That’s a constant throughout his career: He’s always interested in the moment. David Bowie was less interested in nostalgia than any great rock-and-roll artist who ever existed, whether it’s 1965 and he’s doing John Lee Hooker and James Brown songs or it’s 1975 and he’s influenced by the O’Jays and the Stylistics or it’s 2015 and he’s listening to Kendrick and D’Angelo. He’s always interested in what’s going on right now. It’s really weird to think that he was listening to “Black Messiah” and “To Pimp a Butterfly” at the same time we were all listening to those albums and getting our minds blown by them. And David Bowie was also getting his mind blown by them. He was willing to learn from them. He was someone who was never satisfied with anything he did before and really reluctant to rest on his laurels. He was definitely not someone who was willing to settle for “legacy artist” status. He always wanted to do something new.

But he’s not always successful. “Blackstar” is amazing, but there were times when he was listening to something but not able to transform it into something that’s new or personal. I’m thinking of “Earthling” in particular, which tried to incorporate some drum ‘n’ bass elements.

Sometimes he’d try something and it would get the better of him. When he tried to do a reggae song [“Don’t Look Down,” off 1984’s universally reviled “Tonight”], to his credit he only tried to do one reggae song. He did that one and was like, okay, reggae requires a certain emotional skill set and a certain rhythmic skill set that I do not have. He didn’t make a lot more reggae songs. He tried it once and it didn’t fit his particular toolbox. It’s funny you mention his drum ‘n’ bass album, which is a good example of him hopping on a trend right at the time everybody else is really done with it. So the production was not impressive, but he still managed to write songs that were really impressive. He wasn’t necessarily trying to become a master at any of these styles, but just wanted to see what he could learn from them.

Have you ever read the speech he gave at the Berklee College of Music for its commencement in 1999? It’s a really beautiful and funny speech. He said it was funny to be a songwriter talking to real musicians. He said he figured out his place in music in the ‘60s when he tried playing saxophone. And he was a terrible saxophone player and knew he couldn’t express himself playing that instrument. But he also said something like, I can put together different elements of different music, and that was how I figured out the Englishman’s true place in rock and roll. Which is that there was no way he was going to approach anything from the context of virtuosity or authenticity. All he really had to bring to it was curiosity.

That attitude does so much to humanize him. Here’s a guy playing with all these masks and disguises, yet there does seem to be a curiosity driving every decision.

There’s a continuity through all the different phases. He’s a very musically curious and emotionally curious and sexually curious and artistically curious human presence through it all. There’s a tangible emotional aspect to all these experiments that he does. Some of them turned out to be disasters and some of them turned out to be hugely influential. But he was constantly willing to try things and see what happened.

And part of that was constantly referring back to himself and creating this personal mythology through these various songs. He spent decades exploring that outer space metaphor he introduced on “Space Oddity.”

There’s this great song on “Heathen” from 2002 called “Slip Away,” and it’s about being in Coney Island and riding the Ferris wheel with his daughter. At the top of the Ferris wheel he looks down and it feels like he’s looking down on his past and all these outer space adventures that he used to contextualize in whatever drug or sex or rock-and-roll extreme he was indulging at that time. There’s this beautiful line in the song where he sings, “Down in space it’s always 1982.” It cracks me up to think that this theme of space exploration is something he started strictly as a gimmick, but it became a lifelong metaphor for self-discovery and transformation. Part of what makes him such an inspiring figure is that he was committed to that transformation his entire life. How insane is it that he was able to actually have a happy marriage for the last couple decades of his life—and with a model! That seems like the ultimate transformation.

And it meant he was able to age with some dignity, as opposed to having to go back and play “Starman” on nostalgia tours. He’s one of those rare artists for whom the narrative didn’t stop. His story kept going as he kept evolving.

Absolutely. He was not willing to concede that the narrative had stopped, and I think he attracted the type of audience that didn’t want him to stop. Nobody wanted to go to a David Bowie show and hear all of his greatest hits from the early ‘70s. That would not have been satisfying. He attracted the kind of audience that wanted him to keep trying new things, whether he was good at them or not. I remember seeing him at Madison Square Garden in 2003 where he did three songs from the album “…Outside.” It’s a terrible album, but he did three of those songs. At one point, he said, I’d like to do another song from “… Outside,” and there were a few assorted whoo’s. And he said, All seventeen people who bought that album are clearly here in this room.

We knew we could trust him to do what seemed interesting to him at the moment. We knew he wasn’t phoning it in. If you went to a David Bowie show where it felt like he was trying to give the audience what they wanted, it would have been exactly what the audience didn’t want. He wanted his fans to feel hideously appalled and sometimes feel pity and sometimes feel shock and sometimes feel anger. He wanted that response from his fans and that’s what he got.

One of the things I’ve appreciated about your books is how you write with the enthusiasm of an unabashed fan. This book and especially “Talking to Girls About Duran Duran” are about what it means to be a fan and have such an intense connection with an artist. Is that what draws you to David Bowie, who shares that same sense of fandom?

When I was a kid, that was something that was easy to notice right away. He was a fan of the music around him, and he was someone who didn’t see it as a thing where artists were on this mountaintop and the audience was on the ground accepting whatever thunderbolts were thrown down from the clouds. He was someone who was very engaged with his audience. You remember that greatest hits album, “Changesonebowie,” that came out in ’76? I love that album. It’s a greatest hits album where every song sounds like a different guy who’s a fan of something completely different. Listen to a song like “Diamond Dogs” and it’s a guy saying, My god, the Stones are the best ever! Then you listen to “Fame,” and it’s a guy saying, The O’Jays are where it’s at! You listen to “Golden Years,” and this guy is saying, Disco is a galactic language of communication! Then you go back to the first side and listen to “Space Oddity,” and that guy is strumming an acoustic guitar and saying, Bob Dylan taught me so much!

That’s one thing about David Bowie: He was always upfront about being a fan in a way that was radical and unprecedented for rock stars in the ‘70s. He was very upfront about the fact that he was stealing ideas from everything he liked. In the book I talk about that interview with Dinah Shore on her morning talk show in 1976, where he says, I’m very flirty and very faddy and I get easily saturated by things. I’m a big fan of different artists and I just steal things from them. And Dinah Shore basically says, I can’t believe you’re admitting this on TV. His response was, I steal things. That’s what I do. That’s what it really means to be a fan. We steal pieces of our emotional selves from the music we love. There are aspects of my personality that I have modeled on Bowie and aspects that I based on Aretha Franklin and aspects that I modeled on Janet Jackson and Stevie Wonder and Paul McCartney and Lou Reed—all these different artists who have been so inspiring to me in different ways. That’s the thing about being a fan: You steal things from the artists you love. David Bowie saw that as a lifelong heroic, romantic quest, and he was not at all embarrassed about that aspect of being a fan. To me that’s very moving and inspiring. What I love most about David Bowie is that he celebrated the romance of being a fan.

And that elevates the fan. It allows you to create your own personal Ziggy Stardust out of all of these different people. Listening becomes a creative endeavor rather than something passive.

You don’t just listen to one kind of music and become one kind of persona. You listen to different kinds of music and explore different kinds of art and read different kinds of books and then you combine those ideas in ways that turn out to be who you are. That can be a threatening idea, especially for rock stars in the ‘70s who were more interested in establishing their roots or their authenticity. Whereas David Bowie presented being a fan as a creative adventure. It’s not about being a passive receptacle of pop culture. It’s more about being a creative participant in pop culture. As long as you’re passionate about music, something creative will come of it.


D.C. enacting $15 minimum wage indexed for inflation, in huge victory for labor rights


(Credit: AP/Seth Wenig)

The growing Fight for 15 movement had a huge victory this week.

Washington, D.C. passed legislation on Monday night that will increase the minimum wage to $15 by 2020 and index it for inflation.

Mayor Muriel Bowser signed the Fair Shot Minimum Wage Act in D.C.’s Columbia Heights neighborhood.

The current minimum wage in D.C. is $10.50 per hour, although it is slated to increase to $11.50 later this week, The Huffington Post reported. In 2017, it will be increased to $12.50.

D.C. is one of the most expensive cities in the U.S.

The new legislation will incrementally increase the minimum wage to $15 in 2020. The minimum wage for tipped workers will rise from $2.77 now to $5 per hour before tips, although employers must pay the difference if the worker does not make more than $15 per hour with tips.

In an even bigger victory for labor rights, the new law will also increase the minimum wage annually based on an inflation index, so the real wage of workers will not diminish over time.

In 2014, the D.C. Council voted to permanently tie the minimum wage to inflation, in a decision that was applauded by unions and labor rights groups throughout the country.

Earlier this month, the D.C. Council unanimously voted for the $15 minimum wage, although the measure was only just signed into law on June 27.

SEIU Local 1199 called the wage boost “a huge victory for DC workers.” It attributed the win to “the tireless fight waged by workers, activists and labor unions.”

The Economic Policy Institute, a left-leaning think tank, estimates the new minimum wage will benefit 114,000 workers, roughly 14 percent of all D.C. workers and more than one-fifth of the district’s private-sector workers.

“Far from the stereotype of low-wage workers being teenagers working to earn spending money,” the Economic Policy Institute emphasized, “those who would benefit are overwhelmingly adult workers, most of whom come from families of modest means, and many of whom are supporting families of their own.”

Kate Black, executive director of the women’s rights advocacy group American Women, applauded the wage increase as “a welcome relief for the thousands of minimum-wage workers in the District.”

She noted that the new increase will greatly help women. “Nationally, two-thirds of minimum-wage workers are women and more than half are 25 or older,” Black stressed.

“The reality is that our workforce has changed, with women now making up nearly half of the labor force, and they are primary or co-breadwinners in two-thirds of American households,” she continued.

The wage increase will also greatly benefit Americans of color, who are disproportionately employed in low-wage jobs.

Some of the U.S.’s biggest cities, including Los Angeles and San Francisco, have passed $15 minimum wage laws.

New York state and California will also be implementing $15 minimum wages in the next several years.

These victories have been the culmination of years of organizing by workers, labor unions and activists.

Seattle was the first major U.S. city to pass a $15 minimum wage, after a struggle launched by the group Socialist Alternative. Seattle City Councilmember Kshama Sawant, one of the only elected Marxists in the U.S., helped lead the campaign for a living wage.

Since then, the grassroots Fight for 15 movement has blossomed into a powerful national movement, with the active participation of thousands of low-wage workers.

The federal minimum wage is presently just $7.25 per hour.

Since 2013, 18 states and D.C., along with nearly 50 cities and counties, have raised the minimum wage, improving the lives of millions of American workers.

Democratic presidential candidate Hillary Clinton has been ambiguous on the issue. She has called for a $12 federal minimum wage, although she says she supports a $15 minimum wage in certain states.

Her opponent, self-declared democratic socialist Bernie Sanders, called for a $15 federal minimum wage as a national standard that can be increased in more expensive states.

At a hearing of the Democratic National Committee’s platform drafting committee on June 24, the representatives appointed by Clinton voted against a $15 minimum wage amendment that was supported by the representatives appointed by Sanders.

Clinton’s representatives argued the DNC platform already expresses support for a $15 minimum wage, but the language is weak and offers no specific mechanism for getting there. Sanders’ supporters, led by Rep. Keith Ellison, called for the explicit demand of an indexed $15 federal minimum wage to be written into the Democratic Party’s platform. Clinton’s surrogates opposed the measure.

Deborah Parker, a committee member appointed by Sanders, called the present federal minimum wage a “starvation wage” and emphasized that single parents cannot afford to work on the minimum wage and provide for their children.

Rep. Ellison stressed that, if the minimum wage in 1968 had been indexed for inflation, it would be at least $22 today.

“We are going through one of the worst periods of wage stagnation in our nation’s history,” he said. Ellison pointed out that Americans who are working full-time on the federal minimum wage are eligible for food stamps, section 8 housing and Medicaid.

“One of the problems in our economy, and the reason we’ve had slow growth, is because the average working American doesn’t have any money,” he continued. “You can’t spend money that you don’t have.”

Even small business owners are hurt by the low minimum wage, Ellison added, “because their customer base is broke.”


UPDATED: 2 explosions rock Istanbul airport, killing at least 28


Paramedics push a stretcher at Turkey’s largest airport, Istanbul Ataturk, Turkey, June 28, 2016. (Credit: Reuters/Osman Orsal)

Several suicide bombers have hit the international terminal of Istanbul’s Ataturk airport, killing at least 28 people and wounding some 60 others, Istanbul’s governor and other officials said Tuesday.

Turkey’s NTV television quoted Istanbul Governor Vasip Sahin as saying authorities believe three suicide bombers carried out the attack.

“According to initial assessments 28 people have lost their lives, some 60 people have been taken to hospitals. Our detailed inspections are continuing in all aspects,” Sahin said.

“The entry and exit of passengers are being returned to normal rapidly and planned flights will resume as soon as possible,” he added.

Roads around the airport were sealed off for regular traffic after the attack and several ambulances could be seen driving back and forth. Hundreds of passengers were flooding out of the airport and others were sitting on the grass, their bodies lit by the flashing lights of the emergency vehicles.

Twelve-year-old Hevin Zini had just arrived from Dusseldorf with her family and was in tears from the shock.

She told The Associated Press that there was blood on the ground and everything was blown up to bits.

South African Judy Favish, who spent two days in Istanbul as a layover on her way home from Dublin, had just checked in when she heard an explosion followed by gunfire and a loud bang.

She says she hid under the counter for some time.

Favish says passengers were ushered to a cafeteria at the basement level where they were kept for more than an hour before being allowed outside.

Turkish Justice Minister Bekir Bozdag earlier said that according to preliminary information, “a terrorist at the international terminal entrance first opened fire with a Kalashnikov and then blew himself up.”

Another official said attackers detonated explosives at the entrance of the international terminal after police fired at them.

The official, who spoke on condition of anonymity in line with government protocol, said the attackers blew themselves up before entering the x-ray security check at the airport entrance.

Turkish airports have security checks at both the entrance of terminal buildings and then later before entry to departure gates.

Two South African tourists, Paul and Susie Roos from Cape Town, were at the airport and due to fly home at the time of the explosions and were shaken by what they witnessed.

“We came up from the arrivals to the departures, up the escalator when we heard these shots going off,” Paul Roos said. “There was this guy going roaming around, he was dressed in black and he had a hand gun.”

The private DHA news agency said the wounded, among them police officers, were being transferred to Bakirkoy State Hospital.

Turkey has suffered several bombings in recent months linked to Kurdish or Islamic State group militants.

The bombings include two in Istanbul targeting tourists – which the authorities have blamed on the Islamic State group.

The attacks have increased in scale and frequency, scaring off tourists and hurting the economy, which relies heavily on tourism revenues.

Istanbul’s Ataturk Airport was the 11th busiest airport in the world last year, with 61.8 million passengers, according to Airports Council International. It is also one of the fastest-growing airports in the world, seeing 9.2 percent more passengers last year than in 2014.

The largest carrier at the airport is Turkish Airlines, which operates a major hub there. Low-cost Turkish carrier Onur Air is the second-largest airline there.


Yet another white lady in jeopardy: “The Shallows” and Hollywood’s empathy gap


Blake Lively in “The Shallows” (Credit: Columbia Pictures)

How many brown people have to die so that Blake Lively can live?

That question is explored in “The Shallows,” the new carnivorous shark B-movie from schlock maestro Jaume Collet-Serra (“Orphan”) that’s like “Open Water” with more posterior shots. Collet-Serra so favors Ms. Lively’s backside that he appears to be moonlighting as her proctologist. In “The Shallows,” the former “Gossip Girl” starlet dons a skimpy bikini to catch some waves in Mexico, where she quickly discovers that there’s trouble in these waters. A shark attacks her, severely injuring her leg. Resourceful protagonist Nancy (a medical student, of course) makes a suture out of her earring while she takes shelter on a rock and waits for rescue.

Collet-Serra, the Spaniard who also directed the Liam Neeson thrillers “Unknown” and “Non-Stop,” knows his way around this material. “The Shallows” is ingeniously constructed, projecting the image of Nancy’s stopwatch on screen at key moments to build suspense; Nancy calculates the distance to the nearest buoy to see how much time she has before the shark catches up to her. Collet-Serra often films in different styles, jumping between lush aerial panoramas of the breathtakingly clear waters and manic handheld shots through the use of a GoPro.

But while the film is elevated by its director’s skill and a surprisingly strong physical performance from Lively, “The Shallows” is yet another in a string of movies that amount to white survival porn—in which the trials of Caucasian tourists are treated as more important than the suffering of the people of color around them. Collet-Serra’s thriller foregrounds the suffering and the fragility of whiteness, while the brutal killings of local Latinos (many of whom attempt to save Nancy) go unmourned and are little remarked upon. While Nancy has agency and a backstory, none of the brown people who perish in the harsh sea are granted the same compassion. It’s yet another reminder that in Hollywood, only white lives matter.

There are five people of color in “The Shallows,” and for the purposes of this essay, we will assume they are Mexican, because we are given next to no information about their lives. Nancy is given a ride to a secluded beach by Carlos (Óscar Jaenada), a local who lives near the shore. “The Shallows,” which was written by Anthony Jaswinski, isn’t terribly interested in Carlos’ life. During their car ride, Nancy complains about the friend who got too drunk to accompany her on the excursion. Carlos’ only character trait is that he has a young son. (Spoiler: He will come to Nancy’s aid at a pivotal moment.)

When Latinos aren’t cast in a servile position—around to chauffeur wealthy white people—they are shark bait. After Nancy becomes trapped on the aforementioned rock, she notices a man who passed out on the beach overnight. He’s the town drunk, depicted with an empty liquor bottle seemingly fused to his hand (à la Ellen Barkin in “Drop Dead Gorgeous”). She awakens him by screaming for help. Rather than coming to her rescue, the inebriated fellow steals her stuff: her backpack, her wallet, and her phone. Not yet sated by looting her, he even tries to make off with her surfboard.

Because this is a horror movie, you can probably guess what happens: He is killed before he can get away. The encounter takes place offscreen, a wise choice on Collet-Serra’s part. But while there’s a nonjudgmental passivity to Nancy’s injuries, the shark reserves a special wrath for the looter, severing him in half. After he is attacked, the man still tries to claw his way to safety, even though he’s missing his legs and trailing his intestines behind him.

This gruesome treatment of Latinos is par for the course. When Nancy first arrives at the beach, she meets two Mexican surfers, who depart just as our heroine is first besieged by the shark. Nancy tries to stop them but to no avail. Luckily for her, the two come back again to surf the next day—but are killed trying to intervene. While Lively gets to play something resembling an actual human being, these men are nothing more than ciphers. They have little dialogue and no depth or dimensionality; you never learn their names. The film’s IMDb page is strangely unhelpful in this regard: The characters aren’t even listed.

There’s, of course, a reason for that. The lives of these Latinos are merely props in the story of a privileged white woman learning an Important Lesson. Nancy has recently decided to drop out of school following the death of her mother from cancer. “Some people just can’t be helped,” she explains to her father over the phone. He argues that Nancy’s mother was a fighter and would have wanted her daughter to keep, well, fighting. (The movie repeatedly stresses this point.) In facing down death, Nancy regains that scrappy spirit.

Such were also the lessons of Juan Antonio Bayona’s “The Impossible,” which tells the story of Maria Belón and her family, who survived the devastating tsunami of 2004 while vacationing in Thailand. Much attention was paid to the film’s casting: In real life, Belón and her family are Spanish. Naomi Watts and Ewan McGregor, who are Australian and Scottish, respectively, were cast to play the couple instead.

This was pointed out by some as yet another instance of Hollywood whitewashing—the tendency to erase non-white people from their own stories by casting Caucasian actors in place of people of color. Recent examples include the Scarlett Johansson-starring “Ghost in the Shell” (in which she plays Japanese) and Ridley Scott’s “Exodus: Gods and Kings,” featuring Joel Edgerton and Christian Bale as Egyptians. The Belóns are European, so the case is a bit different. But what received less scrutiny was who wasn’t represented in the film: The thousands of non-white victims who died in the horrific tragedy, nearly all of whom are reduced to background actors and props in their own story.

As The Guardian’s David Cox explains, few of those who actually died in the tsunami were white. “The Indian Ocean tsunami of 2004 killed at least 227,898 people,” Cox writes. “Around a third of these were children.” Just 10 percent of those who died in the tsunami were Caucasian, a far cry from Naomi Watts’ assertion that half the victims were tourists. “Holiday paradise Thailand, with its 5,400 deaths, was actually at the margins of the tragedy,” Cox continues. “Indonesia alone suffered 130,700 deaths, largely of low-income Acehnese people; the figure for the U.K., whence [the film’s] family appears to hail, is 149.”

The movie’s treatment of its non-white characters is distressingly similar to “The Shallows.” If the Latinos in Collet-Serra’s film exist to serve and save white people, “The Impossible” exhibits the same racial power dynamics. “Virtually everyone shown suffering after the tsunami is a European, Australian, or American tourist,” the New York Times’ A.O. Scott writes. “At one point Maria and Lucas are cared for by residents of a small village and later they are helped by Thai doctors, but these acts of selfless generosity are treated like services to which wealthy Western travelers are entitled.”

If, as Scott writes, the fact that “the vast majority of the dead, injured and displaced were Asian never really registers,” why do Hollywood movies keep making these mistakes? After all, these issues were replicated in John Erick Dowdle’s “No Escape,” in which Lake Bell and Owen Wilson play an American couple who relocate their family to an unnamed Southeast Asian country (hint: it’s probably Cambodia) on the eve of violent revolution and must flee the tumult. The rebels are depicted like zombies, a mass horde that craves the flesh of innocent Westerners. But instead of eating brains, they hack their victims to death.

The little humanity afforded to people of color is a product of a Hollywood that privileges the stories of white folks above all else; this is a system in which white actors are considered “bankable,” while even A-list black actors are treated like second-class citizens. Since nearly winning an Oscar for “The Help,” Viola Davis has struggled to find roles on film worthy of her, scoring thankless parts in “Extremely Loud and Incredibly Close” and “Ender’s Game.” Kerry Washington has yet to find a breakout role in cinema to match her work on “Scandal” and HBO’s “Confirmation.”

But if Hollywood devalues people of color, this is also indicative of how the lives, experiences, and even the pain of non-white people aren’t recognized in general. A groundbreaking 2013 study from the University of Milano-Bicocca described what researchers called an “empathy gap” along racial lines. When white people are shown images of both white folks and people of color being harmed, such as receiving a prick on the skin, the study found that respondents perceive non-white people as feeling less pain.

There have been a number of theories as to why that is. A separate study suggests that it has to do with privilege: Because respondents assume that people of color have experienced greater hardship, they unconsciously assume these subjects are accustomed to pain and can better deal with the occasional poke or pinch. But perhaps the more pertinent reason is that white people still struggle to relate to people of color at all; in Hollywood lingo, their struggles aren’t “universal.” A 2012 study from Indiana University found that the more black actors a movie stars, the less likely white viewers are to want to see it.

It’s telling that the costar who receives the most screen time in “The Shallows” isn’t one of the Latino actors eaten by a shark but a seagull with a broken wing who hides out on the rock with Nancy. She nurses him back to health, popping his dislocated shoulder back into place so he can fly away. He’s still too weak to escape, so Nancy pushes him to shore on some debris. No one would dare wish harm to an injured animal, but it would have been nice if “The Shallows” cared about the pain of people of color as much as it does a bird.


How Chicago lost the George Lucas museum: A cautionary tale


(Credit: AP/Denis Poroy)

“Star Wars” reflects several fundamental and even universal themes in politics, religion, philosophy, and technology. It even raises a controversial question: “Who shot first, Han or Greedo?” The creator of “Star Wars,” George Lucas, deserves credit for inspiring millions of people to explore profound topics, and his epic space opera will certainly be culturally significant for generations to come. Lucas wants to build a museum to solidify his legacy. However, after years of legal entanglements, there is a question of galactic proportions: Where will it be built? Chicago has been the primary candidate for two years, but legal proceedings have prevented the museum from being built. Chicago’s failure to get the museum is symptomatic of today’s political climate, and the entire affair involves as many political questions and philosophical themes as “Star Wars” itself; the controversy reflects its moment just as “Star Wars” reflected the era in which Lucas first wrote it.

For years Lucas has been trying to build a multimillion-dollar museum dedicated to his collection of art and movie memorabilia. The Lucas Museum of Narrative Art is conceived as a museum of visual storytelling, with a focus on narrative painting, photography, film, and digital art. It will house Lucas’ private art collection and, to the delight of dedicated fans, authentic “Star Wars” props. According to its website, the museum will feature “popular art from illustration to comics, an insider’s perspective on the cinematic creative process, and the boundless potential of the digital medium.” It will have three movie theaters, lecture halls, a library, a restaurant, and an education center. Lucas is 72 years old, and he wants to see his “passion project” completed within his lifetime.

There have been a few potential host cities, and the selection process is reminiscent of cities bidding to the International Olympic Committee to host the Olympic Games: The city that offers the best location gets the museum. Each contender has a connection to the “Star Wars” creator. Los Angeles wanted the museum near the Los Angeles Memorial Sports Arena and the University of Southern California, a site that made sense because Lucas graduated from USC and has donated millions of dollars to its film school; L.A. is also the filmmaking capital of the world. Oakland was a possibility because of its waterfront sites, but many see the city as a less glamorous destination. In 2010, Lucas approached San Francisco, home of his television production company, Lucasfilm, to be the site of his legacy museum. The museum was nearly built in the Presidio in 2014, but after four years of unsuccessful land negotiations, Lucas decided to find a different host city.

In June 2014, Lucas officially selected Chicago, where he and his wife, Mellody Hobson, live part time. Mayor Rahm Emanuel sought to accommodate Lucas with prime real estate: lakefront property. Chicago first proposed a site on Lake Michigan that is currently a parking lot just south of Soldier Field, where the Bears play their home games, and within walking distance of three other museums. The Chicago Park District promised to lease the property to the museum for $1 a year, and Lucas would have personally financed the project at over $740 million. The museum would have been a 300,000-square-foot building on 17 acres of lakefront property.

Accepting millions of dollars to turn a parking lot into a world-class museum seemed like a realistic goal, especially if George Lucas paid for it and it cost the city nothing. The mayor, the city council, the Chicago Park District, and many Chicagoans wanted the museum built. However, there was a major problem: Building on lakefront property violates a city ordinance designed to protect the land.

A small band of rebels, called Friends of the Parks, did everything in its power to prevent construction on Lake Michigan. The nonprofit organization, which seeks to preserve and promote the use of parks in Chicago, filed a lawsuit against the city in November 2014 to block construction on public land. The group asserted that the city of Chicago had overreached its authority by offering lakefront property to Lucas, and it suggested that Lucas build the museum anywhere in the city but the lakeshore. Building there, the group argued, would violate the public trust doctrine, which was created a long time ago (in a galaxy far, far away …).

The doctrine has existed since the 1800s, effectively preserving the land and its resources for public use. According to Friends of the Parks, building the Lucas Museum of Narrative Art on the lakeshore would have spoiled Chicago’s lakefront and ultimately benefited Lucas more than the citizens of Chicago.

On February 4, 2016, a judge ruled that the lawsuit could proceed, which kept construction on hold pending a decision. The process would take longer than Lucas was willing to wait, so he sought alternative host cities, including Los Angeles for a second time, as well as the original proposed city, San Francisco, now offering Treasure Island instead of the Presidio in its second serious attempt to win the project. If only a Death Star could misfire this many times.

Yet in April 2016, Mayor Emanuel made a “Hail Mary” play by proposing a second lakefront site where part of the city’s convention center, McCormick Place, currently stands. The plan involved demolishing a section of the complex, McCormick Place East, and replacing it with the museum. The cost of tearing down one building and constructing another would have been $1.17 billion and would have required various tax extensions and creative political maneuvering by the mayor. Emanuel also pushed for an accelerated timeline for the legal proceedings, which would have sent the federal court’s decision into light speed by tossing out the suit before Lucas found another city.

On June 17, 2016, after months of court proceedings that had put construction on hold, and after serious threats by Lucas and his wife to find another host city, Friends of the Parks offered to make a deal: It announced it would drop the lawsuit in exchange for promises that other park projects in the city would be funded in the future, among other concessions.

The concessions were apparently too much for Lucas. One week later, on June 24, 2016, he announced that Chicago was no longer a potential site for the museum; he would instead look to California. Unless a bounty hunter from Ord Mantell changes his mind, Chicago has lost the museum.

Imperial Corruption

The entire endeavor leaves us more confused than young Luke Skywalker learning the ways of the Force. Luke’s journey to become a hero was fairly cut and dried; the Lucas Museum’s journey is much more complicated. The inability of the museum, Friends of the Parks, and Chicago to reach an agreement is symptomatic of a larger political context: today’s climate of sharp ideological differences and refusal to compromise. The result is a political system more paralyzed than Han Solo frozen in carbonite, an establishment unable to accomplish anything at the national, state, or local level.

It starts with the federal government, where lawmakers are unwilling to compromise on even trivial matters, making vital business like passing budgets virtually impossible. At the state level, Illinois is mired in a budget crisis that a political stalemate in Springfield has done little to solve. Subsequent funding cuts have decimated social services and universities: Chicago State was forced to lay off over one-third of its staff, and Governor Bruce Rauner wants the state to take control of the Chicago Public Schools so the district can file for bankruptcy. The school district faces a $1 billion budget deficit and mass layoffs; teachers have been working without a contract and were forced to take unpaid furlough days. The Chicago Teachers Union threatened to strike many times during the 2015-2016 school year, and a possible strike looms for the fall.

Mayor Emanuel has also been under intense public scrutiny for his handling of high murder rates and police misconduct. Massive civil unrest erupted following the release of a dashcam video of the killing of Laquan McDonald, who was shot sixteen times by police. The mayor was criticized for keeping the video secret and for releasing it under suspicious circumstances; some suggested he had attempted a cover-up. The fallout led directly to the firing of then-Police Superintendent Garry McCarthy in December 2015, and activists demonstrated their electoral power by defeating Cook County State’s Attorney Anita Alvarez in the Illinois primary election in March 2016, due in part to grassroots efforts organized by Black Lives Matter.

Negotiation is Our Only Hope

Chicago has apparently lost one of the largest philanthropic gifts of the 21st century. But as of this writing, Lucas has not officially found another location for his museum. Negotiation is our only hope to bring it back to Chicago.

A difficult political climate makes it easier to build a Death Star than a museum. It comes down to this: A billionaire filmmaker wants to build a museum on Chicago’s most sacred land. What is the moral imperative? Should billionaires be allowed to do whatever they want, regardless of good intentions? Does preserving the environment mean preventing a world-class museum from being built on a parking lot? Should the mayor divert resources and political capital away from critical issues to focus on a museum? The answers are not black and white. “Star Wars” teaches us about the struggle between good and evil; it is a story of good guys and bad guys, but the real world is much different. We must make room for negotiation—and common sense.

First, while shoreline property deserves special protection, there should be room for compromise. This is not a fight over Chicago’s entire lakefront; it is a fight over a small piece of it. Building a museum does not automatically green-light development along the whole shoreline. No one wants private condominiums along the shore, but a small compromise would bring one building: a state-of-the-art museum.

The lakefront is a unique destination that offers more than just a natural setting. It is already home to some of Chicago’s world-class museums: the Shedd Aquarium, the Field Museum of Natural History, and the Adler Planetarium are all located on the Museum Campus. These institutions draw millions of visitors each year and add cultural and economic value to the city. The Lucas Museum is estimated to bring an additional $2 billion to $2.5 billion in tourist spending and to generate $120 million to $160 million in new taxes. The chief executives of Chicago’s top museums offered their support in an open letter, writing that “the museum [would be] a long-term investment in our city that will continue to pay returns for generations to come” and that it would make Chicago a “more creative, more prosperous and more dynamic city.”

A point of contention involves building the museum in another location in the city, perhaps in an underdeveloped neighborhood. Lucas is not interested in this; he wants the lakefront, close to the other museums. Critics say he is egotistical and does not truly care about Chicago. That criticism is unfair.

Indeed, building the museum somewhere else would make it more accessible to some people; however, it would also inadvertently make it inaccessible to others. Clustering museums together, as other major cities do, creates “museum campuses” that catalyze the tourist industry and increase the economic benefits to a city. Likewise, the educational impact of any museum depends on its location and accessibility to schools, which is why museums go to extraordinary lengths to provide bus scholarships and even free admission to students. Concentrating museums in one area thus widens and strengthens their influence in a city.

To be fair, Friends of the Parks, and anyone else who advocates for public parks and the environment, are not rebel scum. It is unwarranted to denounce a small advocacy group for filing a lawsuit based on the public trust doctrine. Chicago’s Lakefront Protection Ordinance forbids any development east of Lake Shore Drive; the doctrine is designed to provide special protection to lakefront land, and anyone can file such a suit. It is not fair to attack people who are dedicated to protecting the environment and beautiful spaces. Any resident or tourist will affirm that Lake Michigan’s shoreline is the city’s most sacred land. Most major cities do not have undeveloped waterfront property; this protection has helped preserve Chicago’s eighteen-mile lakefront trail and added unparalleled cultural, historical, and recreational value to the city.

However, while it may be wrong to criticize Friends of the Parks for trying to protect public land, it also seems unreasonable for the group to oppose any construction at all. One building does not destroy an entire shoreline.

One can speculate that the vast majority of people who enjoy the lakefront do so with no thought of environmental protection whatsoever; one does not have to be an environmentalist to enjoy Chicago’s lakefront. It is home to running paths, skate parks, Navy Pier, the Bears, and various other recreational amenities that have little to do with nature. Preserving open space is not categorically an environmental cause, and a museum would increase the lakefront’s recreational value.

The first proposed site in Chicago would have put the museum on what is now a parking lot used by Bears fans to tailgate, hardly an environmentally friendly space. The museum would have replaced tailgating spots and added greenspace. The second proposed site, where the convention center currently stands, would also have added greenspace, with an eco-friendly park that naturally filters stormwater.

Finally, it is easy to understand why people are suspicious when a wealthy individual wants to build on public land. After all, millionaires and billionaires do not necessarily serve the public when they secure private profit. Rising inequality and privatization leave us with fewer public spaces, and the new prospect of offering naming rights and donor recognition in national parks leaves people skeptical of how the government raises money. Furthermore, our entire political system is held hostage when the special interests of the super-rich undermine the government, leaving the rest of us with inefficient social services, budget cuts, and crippling debt. Should a billionaire be allowed to build an investment on public land?

The government has the power to seize land, within certain limitations; this is how it builds roads, sewer lines, and schools. But so-called eminent domain becomes problematic when land is seized for private profit, and building a museum is not a lucrative endeavor. Perhaps a toxic political environment blinds us to goodwill when it is clearly present. Lucas wants to give the city millions of dollars in the form of a museum, one with far-reaching economic and educational implications.

Lucas made $4 billion by selling “Star Wars” to Disney, but he donates much of his wealth to philanthropic causes. In 2010, he signed the Giving Pledge, a promise by billionaires to donate their wealth to good causes. He created the George Lucas Educational Foundation and Edutopia, and in Chicago he donated $25 million to the University of Chicago Laboratory Schools and another $25 million to After School Matters, a nonprofit that offers after-school and summer programs to Chicago teenagers and whose board is chaired by his wife, Mellody Hobson. He says, “The whole point of this museum is to stimulate the imagination…to open eyes to the possibilities of creating art.” Why can’t we take him at his word? Our lack of faith is disturbing.

Bringing the Lucas Museum to Chicago would undoubtedly have a positive impact on the educational and emotional lives of millions of people. I have worked at the Museum of Science and Industry in Chicago for almost nine years, developing and teaching hands-on science labs for school-aged students, so I can speak from first-hand experience about the educational impact museums have on teachers and students. I see it every single day. Museums expose students to history, science, art, culture, and a wide range of other topics they may not encounter at home or at school. They offer teacher professional development courses and after-school programs. It does not matter where one lives or how much money one has: museums are places everyone can go to learn and be inspired.

The controversy surrounding the museum reflects today’s toxic political environment, which leaves little room for compromise. The challenge is overcoming a political climate that makes it difficult to satisfy the needs of the people. A land ordinance that protects lakefront property benefits the city, but so too would a brand new museum. These should not be in conflict.

The purity of driven work: Bill Cunningham, model for a committed creative life

Bill Cunningham

Bill Cunningham (Credit: AP/Mark Lennihan)

A good friend of mine wears a fedora. He does so without irony, jaunting the hat on his head for runs to the bodega, gambols through the used record shop, or evenings on the sofa with a book. The fedora is his most enduring sartorial affectation. He is something of a Bohemian, and he was at his most Bohemian several years ago, when he lived in New York City. Blond, giraffe-limbed, laser-eyed, my friend is a poet, a painter, a photographer, and I can’t help thinking all that artistic energy must have been emanating from him when Bill Cunningham took his picture.

I’m not sure if my friend was wearing the fedora when the legendary New York Times photographer caught him on camera, though I like to think he was. Cunningham, who died yesterday at the age of 87, was a master of sighting the stories that daily, out-and-about style told. In a short video about his method, Cunningham said, “I let the street speak to me … You’ve got to stay out there and see what it is.”

“Stay out there and see what it is” might be a perfect epitaph for the man behind the Times’ columns On the Street and Evening Hours. A Harvard dropout who came to New York for a job in advertising, Cunningham had an almost biological response to the aesthetics of clothing, one that led him to open a millinery shop (William J) and work in fashion journalism before he started taking pictures in 1966, the moment he dubbed “the real beginning” in a 2002 recollection of his life.

That recollection is worth reading and Richard Press’s 2010 documentary “Bill Cunningham New York” is worth watching even if you’re not a fashion blogger (bow down) or a stylista, a New Yorker by zip code or a nascent journalist looking to take a page from Cunningham’s vulgate of decency. His singular eye aside, what makes the photographer a compelling character is not just the subject of his work but his unflagging commitment to it. The way Cunningham lived—industriously, ascetically, single-mindedly—gives us, the people he photographed, something to which we might aspire.

***

What makes driven, productive people so fascinating? They become for us poles of creative and artistic commitment. They inspire us. They are enviable and brilliant, responsible and ethical. There’s something pure and animal about the resolute, the dedicated. The desire to make and make, to do and do is akin to a long squirreling away of nuts for the inevitable winter.

And yet the worker—no matter how beautiful the work or how beloved the pursuit—can be antisocial or hermetic, a cousin to the obsessive doomed by repetition-compulsion to march on (in ugly, broad strokes, Freud’s notions of the death drive). A hoarder, a compiler, or as Cunningham self-identified, “a record keeper,” “a collector.”

The glint of madness in the mirror we hold up to compulsive workers’ methods is part of their appeal. The industrious are on their own course, often one to which we ourselves may have aspired. In his review of “Bill Cunningham New York,” the himself-prolific Roger Ebert wrote, “Ever since reading Thoreau’s Walden, I have been teased by the notion of leading a life with only the bare essentials and peacefulness. I lacked the nerve to find that little cottage and plant those rows of beans. Bill Cunningham lives a life as pure and idealistic as Thoreau’s, and he does it in the middle of Manhattan.”

Like any good film, Bill Cunningham’s life—as depicted in Press’s documentary, as memorialized by friends and subjects and colleagues and fans—seemed composed largely of training scenes. Add to Rocky pounding the Philadelphia pavement and Kevin McCallister booby-trapping his home with Micro Machines the octogenarian cycling about Manhattan, clad not in a gray sweat suit or a puffball cap but in his iconic French blue jacket.

The process of preparing for a fight or practicing one’s craft or even doing what one loves rarely feels so heroic in real life as it looks on film, set to music and careening by on the wings of montage. The worker’s tools become signatures, metonyms. As Oscar de la Renta admitted of Cunningham: “I don’t know anything about his life, except his bicycle.”

The industrious fascinate us because of their penchant for distillation. Later in his review of Press’s film, Ebert wrote: “[Cunningham] has invented an occupation he does better than anyone else ever has, he has simplified his life until nothing interferes with that vocation.”

That simplification coupled with an indelible artistic and editorial output tinges the most mundane habits with glamour. We love the drudgery upon which the art, the pleasure, the passion is built: “To make money, I worked at a corner drugstore,” Cunningham wrote in his 2002 reflection. “At lunchtime, I’d stop making hats and run out and deliver lunches to people. At night, I worked as a counterman at Howard Johnson’s. Both jobs provided my meals, and the dimes and nickels of my tips paid for millinery supplies.”

Chores become routines become traits. We cherish Cunningham’s preferred diner breakfast (egg, sausage, cheese) and his use of a shared bathroom (“Who the hell wants a kitchen and a bathroom?” he says in the documentary); we admire his monkish slumber amidst filing cabinets of negatives; we revere his abstention from movies and TV for how it completes his character and cements his commitment to “[standing] for two hours without knowing whether somebody [is] coming out … the surprise of finding someone.” We like that “most of [his] pictures are never published”—not for the onslaught of those images—but for what that trove represents about its collector. Think of Prince keeping safe the code for his Paisley Park vault, where rumor suggests there’s enough music to release a new album once a year for the next century.

***

“I do everything, really, for myself,” wrote Cunningham in 2002. Perhaps what our fondness for productive, creative people like Cunningham and Prince most expresses is our desire to not only live but thrive on a self-determined code, Thoreau at Walden, an Emersonian state of self-reliance.

Perhaps Cunningham’s unwillingness to accept champagne at galas and his reluctance to join the staff of The Times (he relented only for the health insurance), like Prince’s close grip on his catalogue, give us a glimpse into the economy of lives governed by principles. Cunningham, the man who didn’t recognize Farrah Fawcett (“I never bothered with celebrities unless they were wearing something interesting”), wasn’t concerned with fame but with artistic freedom: “That’s why my files wouldn’t be of value to anyone.”

In his book, “On Writing: A Memoir of the Craft,” Stephen King offers his own spin on the well-worn adage “write what you know.” “Write what you like,” says King, “then imbue it with life and make it unique by blending in your own personal knowledge of life, friendship, relationships, sex, and work. Especially work. People love to read about work. God knows why, but they do.”

I’m not sure it’s such a mystery. I believe we’ll continue to revere Cunningham and Prince and others who leave behind legacies not just of work but of working, for their lives’ power to model multifaceted, even competing, human ideals: beauty, creativity, responsibility, perseverance, independence, self-possession, integrity. Work ethic doesn’t require perfect pitch or an eye for patterns. When we see people prospering, blissed out by their duties, we can say: I want to be like them.

This week also brought the death of filmmaker Paul Cox. Cox’s films, like Cunningham’s photographs, focused not on labeling seismic trends but on showing, or seeing, the value in the small ruptures of our interpersonal plots. In its obituary, The Times quoted an interview with Cox, in which the director and writer said, “I’m not a filmmaker out of ambition … It’s pure compulsion. I have no option.”

Passion for creation and passion for process are what make the labors of the worker appealing, and passion is what makes the worker’s end—when it comes—such a wallop. If the compulsive worker had the option, the worker wouldn’t quit. It is painful to think of Cunningham having to stop. After all, it was only the other day, in a recent On the Street titled Duality, that his love of fashion was as understatedly gushing, as personable and joyous and succinct as always: “Hey, I never saw it as good as this in the 1950s.”

Judge in Brock Turner rape case handed Salvadoran immigrant 3 years for similar crimes, report says

Recall Persky Sign

Activists hold signs calling for the removal of Judge Aaron Persky from the bench, San Francisco, California, June 10, 2016. (Credit: Reuters/Stephen Lam)

Santa Clara County Superior Court Judge Aaron Persky — the judge who, on June 2, sentenced Stanford rapist Brock Turner to an unprecedentedly light six months in jail for sexually assaulting an unconscious woman — is facing due criticism for handing a three-year prison sentence to a Salvadoran immigrant convicted of similar crimes.

Raul Ramirez, 32, pleaded guilty in March to sexually assaulting his then-roommate in 2014. According to a Guardian report, Ramirez admitted to sexually assaulting his victim “for about five to 10 minutes against her will … and stopped only when she started crying.” (Turner similarly admitted to assaulting his victim, though said he “remembered consent” in his “drunken state.”)

Court documents detailed in the report indicate “Ramirez, like Turner, has no criminal record of convictions for serious or violent felonies.”

“Persky could have approved or helped negotiate a bargain in which Ramirez only pleaded guilty to the lesser of two charges he was facing – assault with intent to commit rape,” the Guardian said. “If the more serious charge was dropped – as was the case with Turner, who had two rape charges dropped – Ramirez could have potentially avoided prison.”

Read the full report over at The Guardian.

Scalia’s ghost haunts SCOTUS decision: Pro-choice victory prompts grave-dancing over his absent outraged dissent

Antonin Scalia

Antonin Scalia (Credit: Reuters)

If only Antonin Scalia were alive to see this day. Although this would probably kill him.

When the controversial Supreme Court justice died in February after thirty years on the bench, it set off a frenzied political battle over his now-vacant seat and one brief poetic moment when he almost got a law school named ASSOL in his honor. And now his absence, and his legacy in two areas near and dear to his heart, are reminding us this week of exactly how he will forever be remembered.

On Sunday, as Pride marches celebrated love all over the country, we also marked the first anniversary of the Supreme Court’s ruling in Obergefell v. Hodges, and marriage equality for all. It was a fitting day to also recall Scalia’s memorable dissent, in which he fumed that “One would think Freedom of Intimacy is abridged rather than expanded by marriage. Ask the nearest hippie,” and angrily vowed that if he ever agreed with the Court’s majority opinion, “I would hide my head in a bag.”

Then on Monday, the Court proved yet again that it’s now up to Samuel Alito and Clarence Thomas to be the designated angry foes of social progress, when it handed down a decisive victory for reproductive freedom. In a 5-3 decision, the Court struck down the 2013 Texas law that aimed to shut down most of the state’s abortion providers on the blatant lie that it was for the sake of women’s safety. In her concurring opinion, Justice Ruth Bader Ginsburg asserted firmly that “It is beyond rational belief that H.B. 2 could genuinely protect the health of women, and certain that the law ‘would simply make it more difficult for them to obtain abortions.’ When a State severely limits access to safe and legal procedures, women in desperate circumstances may resort to unlicensed rogue practitioners… at great risk to their health and safety.”

And it’s moments like this that really make a person sentimental thinking about the epic fit Scalia would have pitched over this one. Clarence Thomas, after all, could only croak out a grumpy, “As the Court applies whatever standard it likes to any given case, nothing but empty words separates our constitutional decisions from judicial fiat.” The ghost of Scalia must be so disappointed. He, after all, was a man who called the Affordable Care Act “jiggery-pokery” and once asked, with a straight face, “If we cannot have moral feelings against homosexuality, can we have it against murder? Can we have it against other things?” Imagine the word salad he’d have tossed on this auspicious day.

That loss was felt acutely on social media, where Scalia was widely remembered on Monday. Mentioning him in my own Twitter timeline quickly brought forth responses like, “I hope hell exists just so I can hope he is burning in it” and “He’s one zombie I hope to decapitate in the coming apocalypse.” Elsewhere, Tom Ceraulo mournfully observed that “It’s a shame Scalia wasn’t alive for today’s decision, since his vote wouldn’t change the outcome & his dissent would employ Dr. Seuss words,” while Parker Molloy similarly daydreamed, “Kind of wish Scalia was around so we could bask in what would have certainly been a colorful, angry dissent. #pureapplesauce #jiggerypokery.” Iron Circus Comics, meanwhile, wrote, “*Gently gyrates on the fresh earth of Scalia’s grave* STILL GLAD.” But looking on the bright side, Mark Harris noted, “With the death of Scalia, it’s been heartening to see Clarence Thomas come into his own as one of the worst justices in SCOTUS history,” while John Fugelsang reassured, “At least we know that right now Justice Scalia is in Heaven, telling the Virgin Mary what she can and can’t do with her body.”

He worked tirelessly to stand in the way of women’s reproductive rights, and he no doubt would never have stopped. He didn’t get the chance Monday in the Supreme Court. And with every step the Court takes that doesn’t drag women into back alleys, it’s clear that Scalia continues to cast a long shadow. For years to come, we’ll look at Supreme Court rulings that affect the health and well-being of women and know exactly how he’d have decided, and how frighteningly different women’s lives would have been because of it.
