You don’t know if that’s beef: The animals mixed into your meat might shock — and disgust — you

Species substitution is rampant in the seafood industry, but how often do our furry and feathered protein sources get swapped? Was Horsegate the exception or the rule?

In 1995, researchers from the Florida Department of Agriculture and Consumer Services analyzed over 900 meat samples collected from Florida retail markets; 806 were raw samples of meat and 96 were cooked. For their analysis they turned to proteomics (the large-scale study of proteins). One of the methods they used was ELISA (Enzyme-Linked ImmunoSorbent Assay), which is based on antibodies recognizing and binding to specific animal proteins. For this technique, antibodies are developed against specific proteins, such as a heat-tolerant protein found in the muscle of pigs. The sample of ‘all beef’ sausage is mixed with the antibody and, if the pig protein is present, it will bind to the antibody. The sample is then washed to remove any unbound material. A second antibody, linked to an enzyme, is then added; it binds to any captured pig protein. Finally, a labeling reagent is added, containing a substance that the enzyme converts into a colored product. This color change can simply indicate the presence of the protein of interest, or it can be measured against a set of standards to determine the concentration of protein present.
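That last quantification step – reading a color change against a set of standards – amounts to fitting a standard curve and interpolating the unknown. Here is a minimal sketch using entirely hypothetical concentration and absorbance values (real assays typically use a four-parameter logistic fit rather than a straight line):

```python
import numpy as np

# Hypothetical ELISA standard curve: known pork-protein concentrations (ng/ml)
# and the absorbance each standard produced in the plate reader.
standards_conc = np.array([0.0, 5.0, 10.0, 25.0, 50.0, 100.0])
standards_abs = np.array([0.05, 0.15, 0.27, 0.62, 1.18, 2.30])

# Fit a simple linear standard curve; a 4-parameter logistic fit is more
# common in practice, but a linear fit illustrates the idea.
slope, intercept = np.polyfit(standards_abs, standards_conc, 1)

def concentration(absorbance):
    """Interpolate an unknown sample's concentration from its absorbance."""
    return slope * absorbance + intercept

# An 'all beef' sausage sample producing an absorbance of 0.62 would read
# back as roughly 25 ng/ml of the pig protein.
sample_conc = concentration(0.62)
```

Against such a curve, any absorbance above the zero-concentration standard flags the presence of the foreign protein, and the fitted line turns absorbance into an amount.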

Of the 900 meat samples, the researchers found that 149 (16.6 percent) contained more than 1 percent of an undeclared meat. The substitution rate was higher among the cooked meats (22.9 percent) than the raw meats (15.9 percent). The undeclared species found in minced beef and veal products were sheep, pork and poultry. However, it must be noted that immunoassays will only recognize species for which antibodies have been developed and added. In other words, unless the researchers added antibodies for rat and dog, they wouldn’t have found them.

In 2006, a group of Turkish researchers used immunoassays to test processed meat products such as fermented sausages, salami, frankfurters, pastrami, bacon and canned goods. They found 22 percent were adulterated; 11 of the 28 sausages that were labeled as beef contained only chicken.

China has been riddled with meat substitution scandals. There have been reports of rat, mink and fox meat being transformed into mutton slices. Twenty thousand tons of meat were seized and more than 900 people were detained in association with the scandal. In early 2014, Walmart’s operations in China were recalling donkey meat because it had been adulterated with fox meat. Donkey is a very expensive meat and highly sought after for its tenderness and sweetness; fox, not so much.

Pork is swapped for beef, beef is swapped for buffalo, fat trimmings and offal (internal organs) are added to minced beef, chicken is sold as lamb, pork is sold as chicken, and beef and pork gristle and bones are injected into chicken. The list is long, and this is just substitutions between animal species. We haven’t yet mentioned the undeclared ingredients – such as added water, chickpea flour, rice flour and soy – that are added to meat to bulk it up.

Of course, there’s always the possibility that some of these undeclared species are the result of accidental cross-contamination. When a processor takes a carcass from a slaughterhouse and debones it and takes it down to smaller cuts, there are a number of leftover bits that aren’t particularly useful as a cut of meat, and these are called the trimmings. For beef, about 15 to 20 percent of the carcass will end up as trimmings, so this is a significant amount of meat that it would be shameful to waste. The trimmings are shipped to a processor that then mixes the extra fatty trimmings with the extra lean trimmings to get the desired fat-to-lean content for their customers. It can then be packaged up as mince and sold on to other manufacturers or retailers, or processed further into things like burgers. Processors work with several different types of meat and so there is a possibility that some minced pork remnants will be left in the pipeline and get pushed through when the beef goes through the machine. As a result, there is some forgiveness in levels of contamination. The European Food Safety Authority (EFSA) uses a 1 percent threshold – anything above this level of contamination is considered to be intentional adulteration.
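The EFSA threshold described above boils down to a simple decision rule, which can be sketched as follows (the function name and labels are illustrative, not EFSA terminology):

```python
# EFSA's rule of thumb for undeclared species in a meat product.
EFSA_THRESHOLD_PCT = 1.0  # percent undeclared meat

def classify_contamination(undeclared_pct):
    """Above 1 percent undeclared meat, contamination is treated as
    intentional adulteration; at or below it, it may be accidental
    carry-over left in the processing line."""
    if undeclared_pct > EFSA_THRESHOLD_PCT:
        return "intentional adulteration"
    return "possible cross-contamination"
```

So a batch of beef mince testing at 0.4 percent pork would be treated as plausible carry-over from the pipeline, while one testing at 5 percent would not.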

As with any food, the more processed the meat, the more difficult it is to tell by visual inspection alone whether it’s been tampered with. By definition, mince is a mixture of meat that’s ground beyond any hints as to its animal origins. The only distinguishing feature of the meat sitting in a plastic tray, bound by a thin layer of protective plastic wrapping, is its color. We can judge its animal origins based on its shade of red – ranging from pale poultry pink to vibrant venetian red venison. The fat content can be estimated from the relative proportion of red bits and white bits. Freshness is assessed by the saturation of the color – is it a dull grey or a bright red? It’s not a lot to go on, and even these attributes can be manipulated. While it’s easy to distinguish turkey mince from beef, things get more difficult between red meats such as horse and beef.

Products such as sausages are among the most prone to adulteration. While there is the possibility of cross-contamination, as we just mentioned, the more cynical (and, one could argue, realistic) viewpoint is that cheaper substitutions are easier to hide in a processed product. In 1991, researchers from the University of New South Wales in Australia bought samples from butchers and supermarkets of the most commonly consumed sausages – thick beef, thin beef and thick pork. The researchers were interested in nutritional quality, but as there had been an article in the media about adulteration, they decided to test the samples for other species using the ELISA method as well. Cow, sheep and pig meat were detected in all of the ‘all beef’ sausages, thick and thin. Of the 10 pork sausages tested, three contained only pork as labeled, three contained undeclared cow meat, and the remaining four contained both cow and sheep meat. Of the 30 sausages tested, only the three pork sausages were labeled correctly.

In 2012, researchers in South Africa examined a total of 139 processed meat products – from minced meat to deli meat – to look at what ingredients were not being declared on the label. They used the ELISA method to detect undeclared plant proteins, but also used DNA-based methods to look for a total of 14 animal species. They found undeclared plant and/or animal species in 95 (68 percent) of the samples. The highest rates of adulteration were in sausages; nearly half of the sausages contained undeclared pork. Altogether, the meat products tested contained undeclared soy, gluten, beef, water buffalo, sheep, goat, donkey and chicken. The majority of the products were not complying with labeling laws.
Adding or substituting meats and using vegetable fillers to bulk up the end product in sausages and other highly processed meats isn’t difficult. It’s a matter of adding another ingredient into the giant mixer as it blends together the meats and spices. Doner meat is similar – it’s mince and spices mixed together – which is probably why 70 percent of lamb kebabs from British takeaways tested in 2013 contained cheaper, undeclared meats.

Substitutions aren’t limited to highly processed meat

The techniques of the fraudsters are now sophisticated enough that substitutions can happen beyond the minced and processed meats. Let’s return to the example from China of fake mutton. Thinly shaved mutton slices are a popular hotpot ingredient. There have now been several scams unveiled in China that have involved the sale of fake mutton; one operation, raided in January 2013, had 40 tons of fake mutton and another 540 tons of materials to make more. Allegedly, rat, mink, fox and duck meat have all been used as the base for this fake product. These meats are apparently soaked in a cellulose gum (sodium carboxymethyl cellulose), which is commonly used in food manufacturing to extend shelf life, improve freeze/thaw stability and help bind water. In the making of fake mutton, this process allows the meat to take on more water and therefore increase its apparent weight. Food coloring is used to provide the ideal shade of mutton, and food adhesives (more on meat glue later) are used to bind the fake meat with real mutton fat. The end product is a passable, but not indistinguishable, version of mutton. What sets the fake apart is that the fat is not marbled throughout the meat as would be the case naturally. The fat and meat are quite separate and when it is thawed or cooked, the adhesive fails and one is left with fragments of fat and fragments of meat.

Despite what seems like an arduous process, making the fake mutton is worth the effort. The fraudsters can sell it wholesale to restaurants for about £2.12/kg (US$1.45/lb) less than the real thing, allowing them to undercut competitors selling real mutton. Forty tons of fake mutton would turn a profit of about £128,000 (US$192,000) – nearly 23 times the average annual salary in China for 2014.
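The quoted figures can be sanity-checked with a quick unit conversion. The ~1.5 dollars-per-pound-sterling exchange rate used below is an assumption based on the period’s rates:

```python
# Cross-check the quoted prices (£2.12/kg vs US$1.45/lb) and the margin
# implied by the £128,000 profit on 40 metric tons of fake mutton.
# The ~1.5 USD/GBP exchange rate is an assumption, not from the source.
KG_PER_LB = 0.45359237
USD_PER_GBP = 1.5

usd_per_kg = 1.45 / KG_PER_LB           # $1.45/lb is roughly $3.20 per kg
gbp_per_kg = usd_per_kg / USD_PER_GBP   # roughly £2.13/kg, matching £2.12

# £128,000 spread over 40,000 kg implies a margin of about £3.20 per kg.
implied_margin_gbp_per_kg = 128_000 / 40_000
```

The two quoted prices are consistent with each other under that exchange rate, and the profit figure implies a per-kilogram margin a little above the quoted wholesale discount.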

In September 2013, police confiscated 20,000kg (44,000lb) of pork masquerading as beef from a factory in north-west China. Not mince, not sausages, not even thinly sliced ‘mutton’, but whole cuts of pork that had been made to look like beef. One wouldn’t think it was possible. The pork is mixed with beef extract and a glazing agent and left to sit for ninety minutes. When cooked, the meat takes on a dark beef-like appearance rather than the characteristic white pork color. The beef extracts even give it the beefy aroma one would expect. Though it may be more difficult to swindle people over whole cuts of meat, this shows it’s not impossible.

If one end of the spectrum is to transform a pork chop into a steak, the other end is to change things at a microscopic scale. In 2001, the UK FSA released results of an investigation that was carried out jointly with 22 local authorities. They tested 68 samples of chicken breasts that were being sold to the catering trade and found that more than half of them were mislabeled, including some that contained undeclared hydrolysed protein. Hydrolysed protein is protein that’s been broken down into smaller segments known as peptides, usually using an enzyme. This can be a very useful process, as it can remove the allergenic properties of proteins and make them more easily digestible. Baby formula, for example, contains hydrolysed milk proteins (casein or whey). Collagen, the main structural protein derived from bone, connective tissue, skin and hide, forms the ideal water-retaining agent when it is hydrolysed – gelatin – and it was this that was being added to the chicken breasts.

The protein powder is purchased by processors and made up into a brine solution. This solution is then directly injected into the breasts using needles, or the chicken breasts are tumbled with the solution in a machine like a cement mixer. Either way, the breast meat takes up this solution and the hydrolysed protein helps retain water, even while cooking. The result can be a product that actually contains as little as 55 percent chicken; the rest is additives, including water. This is a perfectly legal process, but it must be labeled correctly as ‘chicken breast fillets with added hydrolysed chicken protein.’

The technique was developed by Dutch processors to introduce protein and water into salted chicken that they were importing from Brazil and Thailand. The processors were taking advantage of an EU tax loophole, as salted meat is subject to much lower import tariffs. By adding water to the chicken, they were making it more palatable but also effectively selling water for the price of chicken. Of the 68 samples taken by the FSA in 2001, 20 percent contained undeclared hydrolysed protein.

Shortly thereafter, it was revealed that some Dutch manufacturers were not only adding undeclared hydrolysed protein, but that the protein was being extracted from other animals. The FSA conducted DNA testing on 25 samples and found that almost half of them contained traces of DNA from pigs, though all but one of those samples were labeled as halal (meat that adheres to Islamic law and certainly would not include pork). The FSA suspected that beef protein was also being used, but their DNA-based methods weren’t picking up any beef DNA. The hydrolysed protein powders are extremely processed, making any DNA, if present at all, very difficult to detect – particularly when looking for a very small amount of beef or pork DNA in a lot of chicken. The proteins are also fragmented through processing, which ruled out the use of immunoassays, such as ELISA. The FSA needed a new test.

The FSA collaborated with researchers from the University of York who had developed new procedures to identify species of ancient bone fragments dug up in archaeological sites. Archaeological work suffers the same challenges faced by food forensics in that the proteins have decayed – though in the case of archaeology it is through time rather than processing. The researchers had discovered that the collagen protein found in bone has enough variation between species to be useful in fingerprinting collagen-based tissues (for example, bone, cartilage, skin, tendon, blood vessels). Luckily, the hydrolysed protein in the chicken breasts had been extracted from these types of tissues.

The technique the York researchers have developed is called ZooMS, short for ZooArchaeology by Mass Spectrometry. For the analysis, proteins in the tissue sample are cut up into peptide fragments using the enzyme trypsin. The mass of each peptide is then determined using time-of-flight mass spectrometry, which essentially shoots the peptides out using an electric field and uses a detector to see how fast they fly a particular distance. This provides a unique mass-to-charge ratio for the peptides. Certain peptides (fragments of the collagen protein) are species-specific and can be identified by comparing them with a library of collagen proteins developed for different species.
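The final comparison step – checking measured peptide masses against a library of species-specific collagen markers – can be sketched as matching within a mass tolerance. The marker masses below are hypothetical placeholders, not the real published ZooMS markers:

```python
# A minimal sketch of peptide-mass-fingerprint matching, as in ZooMS.
# The marker masses below are HYPOTHETICAL placeholders, not the actual
# published collagen marker peptides for these species.
REFERENCE_MARKERS = {
    "cow":     [1105.6, 1427.7, 2131.1],
    "pig":     [1105.6, 1453.8, 2145.1],
    "chicken": [1096.5, 1449.7, 2172.2],
}

TOLERANCE = 0.2  # daltons; ToF instruments resolve peptides well within this

def match_species(measured_masses):
    """Count how many of each species' marker peptides appear in a spectrum,
    then report the best-scoring species."""
    scores = {}
    for species, markers in REFERENCE_MARKERS.items():
        hits = sum(
            any(abs(m - ref) <= TOLERANCE for m in measured_masses)
            for ref in markers
        )
        scores[species] = hits
    return max(scores, key=scores.get), scores

# A spectrum containing two pig-specific markers plus one peptide shared
# between pig and cow collagen:
species, scores = match_species([1105.65, 1453.75, 2145.05])
```

Note that some collagen peptides are shared between related species (here, the first cow and pig markers), which is why identification rests on the pattern of several markers rather than any single peak.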

As the chicken processors don’t make the hydrolysed protein powders themselves, the FSA investigated the powders directly. They obtained five sample powders, four of which were made in the UK and all of which were labeled as containing only poultry-derived protein. They ran them through numerous tests, including the methods developed at York. They used real-time PCR to look for chicken DNA in three of the powders: two tested positive for chicken DNA only and one tested positive for chicken and pork DNA. Had the tests ended there, one might have suspected that only one of the powders had used species other than chicken. Luckily, they didn’t stop. Analysis of the collagen protein showed that none of the protein in any of the five powders had been derived from chicken. All of them contained bovine collagen-specific peptides, and two contained both bovine and porcine-specific peptides. Interestingly, two of the powders also contained unidentified non-food animal peptides. The powders had probably tested positive for chicken DNA because a small amount of chicken blood had been added, which would mask any pork or beef DNA that was likely to be highly degraded. It was the analysis of the collagen proteins themselves that revealed the true sources of the hydrolysed protein in the powders. Mislabeled powder means that some chicken processors may be unaware that they’re injecting protein from other species, and some have shifted to plant-based protein powders as a result. Yet this does not change the fact that the process introduces a lot of water that consumers are paying for, which is fine as long as it’s labeled as an added ingredient. Consumers can then make their own decisions about whether they want to pay for water. But when it’s undeclared … that’s fraud.

Excerpted from “Sorting the Beef From the Bull: The Science of Food Fraud Forensics” by
Richard Evershed and Nicola Temple. Published by Bloomsbury Sigma, a division of Bloomsbury Publishing. Copyright © 2016 Richard Evershed & Nicola Temple. Reprinted with permission of the publisher. All rights reserved.


Tiny Desk Domination: How NPR Music took over the tastes of the tastefully plugged-in

One of the biggest knocks against the Telecommunications Act of 1996 was that deregulating the media industry would lead to a wave of consolidation, stifling public discourse as a handful of corporate conglomerates bought up TV and radio stations and newspapers, and snuffed out local voices in the name of synergy. That’s basically what happened.

Though the Internet has since mitigated many of those effects, the strange combination of old-media consolidation and anarchic proliferation of blogs and podcasts can count among its unintended consequences the dominance of NPR Music. From First Listen album premieres to Tiny Desk performance sessions, from live-streaming concerts to assembling buzzy showcase lineups every year at the South by Southwest festival, along with blogs and podcasts and now a book from “All Songs Considered” host Bob Boilen, NPR Music has rushed to fill the vacuum left as smaller “legacy media” outlets around the country continue to pare back staff and punt on meaningful arts coverage.

That’s not good for music. Or, to be more precise, that’s not good for the huge amount of music that exists outside the narrow scope that NPR Music covers.

There are thoughtful people there doing good work, no question. The “All Songs Considered” show that Boilen created in 2000 and co-hosts with producer Robin Hilton has set a high standard for the innumerable music podcasts that have followed, and NPR Music publishes some excellent long-form stories online. But their coverage overall is safe, predictable and dismayingly shallow, despite sections online dedicated to hip-hop, classical, jazz, Latin, world, electronic/dance and R&B/soul. Maybe that matters less in an era when there are countless blogs that go deep on every conceivable musical taste, but those blogs don’t offer the valuable exposure associated with NPR, which expends it on, well, just imagine Andrew Bird and Bon Iver teaming up with St. Vincent and Alabama Shakes to cover the Carolina Chocolate Drops, featuring special guest Vampire Weekend and someone making a knowing reference to Earl Sweatshirt, and you’ll get a sense of the NPR Music wheelhouse.

It’s often just one big conventional-wisdom feedback loop. Nowhere is that more true than in Boilen’s new book “Your Song Changed My Life.” He interviewed 35 musicians, including Jackson Browne, Jimmy Page, Dave Grohl, Smokey Robinson, Jeff Tweedy and Lucinda Williams, about the songs that most inspired them. Despite the promising concept, it’s a glib and self-congratulatory book, full of pedantic commentary, egregious name-dropping (the second time he met David Byrne, Boilen writes, “I was an NPR journalist attending the White House Correspondents Dinner with Annie Clark,” a.k.a. St. Vincent) and Boilen’s own unchallenged assumptions. “I seldom listen to modern country music,” he declares in a chapter about the country traditionalist Sturgill Simpson, because he’s pretty sure it’s all “songs about women and booze.” Or, after letting the minimalist composer Philip Glass describe his life-changing song — Spike Jones’ comical version of “William Tell Overture,” performed on kitchen cookware — Boilen chimes in with his own melodramatic reaction: “I am flabbergasted. Seriously? Philip Glass likes funny things?”

Such interjections would mostly seem like a grating affectation, if Boilen weren’t also the one setting the tone for what happens at NPR Music — and by extension, all the smaller outlets that take their cues from NPR Music. What message does it send when the head guy, who has bragged about seeing more than 500 live performances in 2015, isn’t curious enough about a major genre to question his own premise and actually listen?

Maybe he’s too busy, what with attending all those live shows, and also running the Tiny Desk concerts. It’s a live video performance series featuring bands playing short, stripped-down sets in Boilen’s cubicle. NPR says there have been nearly 500 of them since Tiny Desk launched in 2008, with recent performers including Graham Nash, Wilco, Tedeschi Trucks Band and Ben Folds. What sounds again like a promising concept ends up coming off as twee, and even disingenuous, as if landing a band to play a Tiny Desk session is somehow a coup for NPR: “Look who we talked into performing at Bob’s desk, in the newsroom of a massive and influential international media organization! Aren’t we the lucky ones!”

NPR Music has lately begun holding contests to find Tiny Desk performers, which is even more cloying. Billed upon its launch in 2015 as “a search for a great undiscovered, unsigned musician,” the Tiny Desk Contest has garnered more than 6,000 entries in each of its two incarnations so far, from people hoping a panel of judges will pick them to perform. But it’s a gimmick that makes Tiny Desk the center of attention instead of the music it’s supposed to celebrate, just like the self-promotional contest Rolling Stone held in 2011 to put an unknown band on the cover of the magazine. Remember the Sheepdogs? Right, of course not. There’s no question that NPR Music is inundated with submissions from undiscovered, unsigned musicians. If Boilen & Co. want to highlight some of those acts, they can do it whenever they want, without throwing themselves a parade.

It’s no secret that public radio personalities can tend toward self-impressed, and that’s generally harmless enough. But when it’s more difficult than ever to be heard in the crowd, NPR Music has a megaphone that reaches a large audience that is also affluent, socially engaged and well-educated. There’s no doubting Boilen’s enthusiasm and passion for (at least some) music. But he often seems more interested in using that megaphone to tell NPR Music listeners what he likes, when offering a more expansive take on various music scenes and sounds might help them discover something they like — that doesn’t sound like the Lumineers all over again. There’s surely some overlap between the two, but the look-at-me! mode of coverage puts the emphasis on the storyteller instead of the story, and that’s the wrong place.


They should all be tried: George W. Bush, Dick Cheney and America’s overlooked war crimes

The allegations against the man were serious indeed.

* Donald Rumsfeld said he was “if not the number two, very close to the number two person” in al-Qaeda.

* The Central Intelligence Agency informed Assistant Attorney General Jay Bybee that he “served as Usama Bin Laden’s senior lieutenant. In that capacity, he has managed a network of training camps… He also acted as al-Qaeda’s coordinator of external contacts and foreign communications.”

* CIA Director Michael Hayden would tell the press in 2008 that 25% of all the information his agency had gathered about al-Qaeda from human sources “originated” with one other detainee and him.

* George W. Bush would use his case to justify the CIA’s “enhanced interrogation program,” claiming that “he had run a terrorist camp in Afghanistan where some of the 9/11 hijackers trained” and that “he helped smuggle al-Qaeda leaders out of Afghanistan” so they would not be captured by U.S. military forces.

None of it was true.

And even if it had been true, what the CIA did to Abu Zubaydah — with the knowledge and approval of the highest government officials — is a prime example of the kind of still-unpunished crimes that officials like Dick Cheney, George Bush, and Donald Rumsfeld committed in the so-called Global War on Terror.

So who was this infamous figure, and where is he now? His name is Zayn al-Abidin Muhammad Husayn, but he is better known by his Arabic nickname, Abu Zubaydah. And as far as we know, he is still in solitary detention in Guantánamo.

A Saudi national, in the 1980s Zubaydah helped run the Khaldan camp, a mujahedeen training facility set up in Afghanistan with CIA help during the Soviet occupation of that country. In other words, Zubaydah was then an American ally in the fight against the Soviets, one of President Ronald Reagan’s “freedom fighters.”  (But then again, so in effect was Osama bin Laden.)

Zubaydah’s later fate in the hands of the CIA was of a far grimmer nature.  He had the dubious luck to be the subject of a number of CIA “firsts”: the first post-9/11 prisoner to be waterboarded; the first to be experimented on by psychologists working as CIA contractors; one of the first of the Agency’s “ghost prisoners” (detainees hidden from the world, including the International Committee of the Red Cross which, under the Geneva Conventions, must be allowed access to every prisoner of war); and one of the first prisoners to be cited in a memo written by Jay Bybee for the Bush administration on what the CIA could “legally” do to a detainee without supposedly violating U.S. federal laws against torture.

Zubaydah’s story is — or at least should be — the iconic tale of the illegal extremes to which the Bush administration and the CIA went in the wake of the 9/11 attacks. And yet former officials, from CIA head Michael Hayden to Vice President Dick Cheney to George W. Bush himself, have presented it as a glowing example of the use of “enhanced interrogation techniques” to extract desperately needed information from the “evildoers” of that time.

Zubaydah was an early experiment in post-9/11 CIA practices and here’s the remarkable thing (though it has yet to become part of the mainstream media accounts of his case): it was all a big lie. Zubaydah wasn’t involved with al-Qaeda; he was the ringleader of nothing; he never took part in planning for the 9/11 attacks. He was brutally mistreated and, in another kind of world, would be exhibit one in the war crimes trials of America’s top leaders and its major intelligence agency.

Yet notorious as he once was, he’s been forgotten by all but his lawyers and a few tenacious reporters.  He shouldn’t have been.  He was the test case for the kind of torture that Donald Trump now wants the U.S. government to bring back, presumably because it “worked” so well the first time. With Republican presidential hopefuls promising future war crimes, it’s worth reconsidering his case and thinking about how to prevent it from happening again. After all, it’s only because no one has been held to account for the years of Bush administration torture practices that Trump and others feel free to promise even more and “yuger” war crimes in the future.

Experiments in Torture

In March 2002, a group of FBI agents, CIA agents, and Pakistani forces captured Zubaydah (along with about 50 other men) in Faisalabad, Pakistan. In the process, he was severely injured — shot in the thigh, testicle, and stomach. He might well have died, had the CIA not flown in an American surgeon to patch him up. The Agency’s interest in his health was, however, anything but humanitarian. Its officials wanted to interrogate him and, even after he had recovered sufficiently to be questioned, his captors occasionally withheld pain medication as a means of torture.

When he “lost” his left eye under mysterious circumstances while in CIA custody, the agency’s concern again was not for his health. The December 2014 torture report produced by the Senate Select Committee on Intelligence (despite CIA opposition that included hacking into the committee’s computers) described the situation this way: with his left eye gone, “[i]n October 2002, DETENTION SITE GREEN [now known to be Thailand] recommended that the vision in his right eye be tested, noting that ‘[w]e have a lot riding upon his ability to see, read, and write.’ DETENTION SITE GREEN stressed that ‘this request is driven by our intelligence needs [not] humanitarian concern for AZ.’”

The CIA then set to work interrogating Zubaydah with the help of two contractors, the psychologists Bruce Jessen and James Mitchell. Zubaydah would be the first human subject on whom those two, who were former instructors at the Air Force’s SERE (Survival, Evasion, Resistance, Escape) training center, could test their theories about using torture to induce what they called “learned helplessness,” meant to reduce a suspect’s resistance to interrogation. Their price? Only $81 million.

CIA records show that, using a plan drawn up by Jessen and Mitchell, Abu Zubaydah’s interrogators would waterboard him an almost unimaginable 83 times in the course of a single month; that is, they would strap him to a wooden board, place a cloth over his entire face, and gradually pour water through the cloth until he began to drown. At one point during this endlessly repeated ordeal, the Senate committee reported that Zubaydah became “completely unresponsive, with bubbles rising through his open, full mouth.”

Each of those 83 uses of what was called “the watering cycle” consisted of four steps:

“1) demands for information interspersed with the application of the water just short of blocking his airway 2) escalation of the amount of water applied until it blocked his airway and he started to have involuntary spasms 3) raising the water-board to clear subject’s airway 4) lowering of the water-board and return to demands for information.”

The CIA videotaped Zubaydah undergoing each of these “cycles,” only to destroy those tapes in 2005 when news of their existence surfaced and the embarrassment (and possible future culpability) of the Agency seemed increasingly to be at stake. CIA Director Michael Hayden would later assure CNN that the tapes had been destroyed only because “they no longer had ‘intelligence value’ and they posed a security risk.” Whose “security” was at risk if the tapes became public? Most likely, that of the Agency’s operatives and contractors who were breaking multiple national and international laws against torture, along with the high CIA and Bush administration officials who had directly approved their actions.

In addition to the waterboarding, the Senate torture report indicates that Zubaydah endured excruciating stress positions (which cause terrible pain without leaving a mark); sleep deprivation (for up to 180 hours, which generally induces hallucinations or psychosis); unrelenting exposure to loud noises (another psychosis-inducer); “walling” (the Agency’s term for repeatedly slamming the shoulder blades into a “flexible, false wall,” though Zubaydah told the International Committee of the Red Cross that when this was first done to him, “he was slammed directly against a hard concrete wall”); and confinement for hours in a box so cramped that he could not stand up inside it. All of these methods of torture had been given explicit approval in a memo written to the CIA’s head lawyer, John Rizzo, by Jay Bybee, who was then serving in the Justice Department’s Office of Legal Counsel. In that memo Bybee approved the use of 10 different “techniques” on Zubaydah.

It seems likely that, while the CIA was torturing Zubaydah at Jessen’s and Mitchell’s direction for whatever information he might have, it was also using him to test the “effectiveness” of waterboarding as a torture technique. If so, the agency and its contractors violated not only international law, but the U.S. War Crimes Act, which expressly forbids experimenting on prisoners.

What might lead us to think that Zubaydah’s treatment was, in part, an experiment? In a May 30, 2005, memo sent to Rizzo, Steven Bradbury, head of the Justice Department’s Office of Legal Counsel, discussed the CIA’s record keeping. There was, Bradbury commented, method to the CIA’s brutality. “Careful records are kept of each interrogation,” he wrote. This procedure, he continued, “allows for ongoing evaluation of the efficacy of each technique and its potential for any unintended or inappropriate results.” In other words, with the support of the Bush Justice Department, the CIA was keeping careful records of an experimental procedure designed to evaluate how well waterboarding worked.

This was Abu Zubaydah’s impression as well. “I was told during this period that I was one of the first to receive these interrogation techniques,” Zubaydah would later tell the International Committee of the Red Cross, “so no rules applied. It felt like they were experimenting and trying out techniques to be used later on other people.”

In addition to the videotaping, the CIA’s Office of Medical Services required a meticulous written record of every waterboarding session.  The details to be recorded were spelled out clearly:

“In order to best inform future medical judgments and recommendations, it is important that every application of the waterboard be thoroughly documented: how long each application (and the entire procedure) lasted, how much water was used in the process (realizing that much splashes off), how exactly the water was applied, if a seal was achieved, if the naso- or oropharynx was filled, what sort of volume was expelled, how long was the break between applications, and how the subject looked between each treatment.”

Again, these were clearly meant to be the records of an experimental procedure, focusing as they did on how much water was effective; whether a “seal” was achieved (so no air could enter the victim’s lungs); whether the naso- or oropharynx (that is, the nose and throat) were so full of water the victim could not breathe; and just how much the “subject” vomited up.

It was with Zubaydah that the CIA also began its post-9/11 practice of hiding detainees from the International Committee of the Red Cross by transferring them to its “black sites,” the secret prisons it was setting up in countries with complacent or complicit regimes around the world. Such unacknowledged detainees came to be known as “ghost prisoners,” because they had no official existence. As the Senate torture report noted, “In part to avoid declaring Abu Zubaydah to the International Committee of the Red Cross, which would be required if he were detained at a U.S. military base, the CIA decided to seek authorization to clandestinely detain Abu Zubaydah at a facility in Country _______ [now known to have been Thailand].”

Tortured and Circular Reasoning

As British investigative journalist Andy Worthington reported in 2009, the Bush administration used Abu Zubaydah’s “interrogation” results to help justify the greatest crime of that administration, the unprovoked, illegal invasion of Iraq. Officials leaked to the media that he had confessed to knowing about a secret agreement involving Osama bin Laden, Abu Musab al-Zarqawi (who later led al-Qaeda in Iraq), and Iraqi autocrat Saddam Hussein to work together “to destabilize the autonomous Kurdish region in northern Iraq.” Of course, it was all lies. Zubaydah couldn’t have known about such an arrangement, first because it was, as Worthington says, “absurd,” and second, because Zubaydah was not a member of al-Qaeda at all.

In fact, the evidence that Zubaydah had anything to do with al-Qaeda was beyond circumstantial — it was entirely circular. The administration’s reasoning went something like this: Zubaydah, a “senior al-Qaeda lieutenant,” ran the Khaldan camp in Afghanistan; therefore, Khaldan was an al-Qaeda camp; and if Khaldan was an al-Qaeda camp, then Zubaydah must have been a senior al-Qaeda official.

They then used their “enhanced techniques” to drag what they wanted to hear out of a man whose life bore no relation to the tortured lies he evidently finally told his captors. Not surprisingly, no aspect of the administration’s formula proved accurate.  It was true that, for several years, the Bush administration routinely referred to Khaldan as an al-Qaeda training camp, but the CIA was well aware that this wasn’t so.

The Senate Intelligence Committee’s torture report, for instance, made this crystal clear, quoting an August 16, 2006, CIA Intelligence Assessment, “Countering Misconceptions About Training Camps in Afghanistan, 1990-2001” this way:

“Khaldan Not Affiliated With Al-Qa’ida. A common misperception in outside articles is that Khaldan camp was run by al-Qa’ida. Pre-11 September 2001 reporting miscast Abu Zubaydah as a ‘senior al-Qa’ida lieutenant,’ which led to the inference that the Khaldan camp he was administering was tied to Usama bin Laden.”

Not only was Zubaydah not a senior al-Qaeda lieutenant, he had, according to the report, been turned down for membership in al-Qaeda as early as 1993 and the CIA knew it by at least 2006, if not far sooner. Nevertheless, the month after it privately clarified the nature of the Khaldan camp and Zubaydah’s lack of al-Qaeda connections, President Bush used the story of Zubaydah’s capture and interrogation in a speech to the nation justifying the CIA’s “enhanced interrogation” program. He then claimed that Zubaydah had “helped smuggle Al Qaida leaders out of Afghanistan.”

In the same speech, Bush told the nation, “Our intelligence community believes [Zubaydah] had run a terrorist camp in Afghanistan where some of the 9/11 hijackers trained” (a reference presumably to Khaldan). Perhaps the CIA should have been looking instead at some of the people who actually trained the hijackers — the operators of flight schools in the United States, where, according to a September 23, 2001 Washington Post story, the FBI already knew “terrorists” were learning to fly 747s.

In June 2007, the Bush administration doubled down on its claim that Zubaydah was involved with 9/11. At a hearing before the congressional Commission on Security and Cooperation in Europe, State Department Legal Adviser John Bellinger, discussing why the Guantánamo prison needed to remain open, explained that it “serves a very important purpose, to hold and detain individuals who are extremely dangerous… [like] Abu Zubaydah, people who have been planners of 9/11.”

Charges Withdrawn

In September 2009, the U.S. government quietly withdrew its many allegations against Abu Zubaydah. His attorneys had filed a habeas corpus petition on his behalf; that is, a petition to exercise the constitutional right of anyone in government custody to know on what charges they are being held. In that context, they were asking the government to supply certain documents to help substantiate their claim that his continued detention in Guantánamo was illegal. The new Obama administration replied with a 109-page brief filed in the U.S. District Court in the District of Columbia, which is legally designated to hear the habeas cases of Guantánamo detainees.

The bulk of that brief came down to a government argument that was curious indeed, given the years of bragging about Zubaydah’s central role in al-Qaeda’s activities.  It claimed that there was no reason to turn over any “exculpatory” documents demonstrating that he was not a member of al-Qaeda, or that he had no involvement in 9/11 or any other terrorist activity — because the government was no longer claiming that any of those things were true.

The government’s lawyers went on to claim, bizarrely enough, that the Bush administration had never “contended that [Zubaydah] had any personal involvement in planning or executing… the attacks of September 11, 2001.” They added that “the Government also has not contended in this proceeding that, at the time of his capture, [Zubaydah] had knowledge of any specific impending terrorist operations” — an especially curious claim, since the prevention of such future attacks was how the CIA justified its torture of Zubaydah in the first place. Far from believing that he was “if not the number two, very close to the number two person in” al-Qaeda, as Secretary of Defense Donald Rumsfeld had once claimed, “the Government has not contended in this proceeding that [Zubaydah] was a member of al-Qaida or otherwise formally identified with al-Qaida.”

And so, the case against the man who was waterboarded 83 times and contributed supposedly crucial information to the CIA on al-Qaeda plotting was oh-so-quietly withdrawn without either fuss or media attention.  Exhibit one was now exhibit none.

Seven years after the initial filing of Zubaydah’s habeas petition, the DC District Court has yet to rule on it. Given the court’s average 751-day turnaround time on such petitions, this is an extraordinary length of time. Here, justice delayed is truly justice denied.

Perhaps we should not be surprised, however. According to the Senate Intelligence Committee report, CIA headquarters assured those who were interrogating Zubaydah that he would “never be placed in a situation where he has any significant contact with others and/or has the opportunity to be released.” In fact, “all major players are in concurrence,” stated the agency, that he “should remain incommunicado for the remainder of his life.” And so far, that’s exactly what’s happened.

The capture, torture, and propaganda use of Abu Zubaydah is the perfect example of the U.S. government’s unique combination of willful law-breaking, ass-covering memo-writing, and what some Salvadorans I once worked with called “strategic incompetence.” The fact that no one — not George Bush or Dick Cheney, not Jessen or Mitchell, nor multiple directors of the CIA — has been held accountable means that, unless we are very lucky, we will see more of the same in the future.


Clintonism screwed the Democrats: How Bill, Hillary and the Democratic Leadership Council gutted progressivism

Hillary Clinton today promotes herself as a “reformer with results,” and she’s relied on a widespread impression that she and Bernie Sanders aren’t really that far apart on major issues. After the last round of primaries in the Northeast, she expressed it again:

“Because whether you support Senator Sanders or you support me, there’s much more that unites us than divides us. We all agree that wages are too low and inequality is too high, that Wall Street can never again be allowed to threaten Main Street, and we should expand Social Security, not cut or privatize it. We Democrats agree that college should be affordable to all, and student debt shouldn’t hold anyone back.”

Of course, it’s not just Democrats. The points she touched on have broad popular support, despite elite hostility, or at best neglect, which is a large part of why Sanders went from 3% support in the polls to near parity in some April polls [FOX, NBC/WSJ, IPSOS/REUTERS].

But Clinton is a skilled politician, so she’s artfully re-aligned herself to blur their differences, with overwhelming support from the elite punditocracy. When the dark side of the Clinton record from the 1990s is raised—NAFTA, the Defense of Marriage Act, “welfare reform,” mass incarceration, Wall Street deregulation, etc.—two defenses come readily to mind: “Hillary didn’t do it!/Bill was president” and “times change/you’re forgetting what it was like.”

These are both effective narratives in the establishment echo chamber, which is designed and intended for horse-race politics at the expense of political understanding (as well as factual accuracy). But Hillary Clinton wouldn’t be here today if she hadn’t been aligned with those policies—and with helping to create the environment in which they came to pass. Even before entering the White House with her husband, who had promised voters “two for the price of one” during the 1992 campaign, the pair had cast their lot with those who moved the party to the right, most notably when Bill Clinton became head of the DLC—the Democratic Leadership Council, or as Jesse Jackson called it, “Democrats for the Leisure Class.”

The DLC was crucial to the Clintons’ rise to power, so it’s absolutely essential to understand it, if one wants to understand their politics—and that of the party they’ve so profoundly reshaped—all the way up through Hillary Clinton’s most recent rearticulation of it.

An excellent starting point for understanding this comes via the much broader focus of Thomas Ferguson and Joel Rogers’s book, Right Turn: The Decline of the Democrats and the Future of American Politics. While the book makes references going back to the Carter era, it opens with a meeting of twenty top Democratic Party fund-raisers three weeks after Walter Mondale’s landslide loss in the 1984 election, where they discussed 1988 and how they could have more policy influence in that campaign, “how they might use their fund-raising skills to move the party toward their business-oriented, centrist viewpoints,” as the Washington Post reported the next day.

It goes on to describe how, two days later, a closely-related group, the Coalition for a Democratic Majority, sponsored a similarly-themed public forum that drew national press attention, dominated by speeches given by Arizona governor Bruce Babbitt and Virginia governor Charles Robb, who, in turn, were also prominent founding members of the Democratic Leadership Council in the following spring, along with Missouri Representative Richard Gephardt and Georgia Senator Sam Nunn:

“The moderate and conservative Democrats didn’t make it past the first round in its primaries in 1984 and we want to change that,” said Nunn, a major Democratic proponent of increased military spending who had backed John Glenn in the 1984 race.

Right Turn makes it abundantly clear that the DLC was just one facet of a much broader mosaic of elite political reorientation—a reorientation profoundly out of step with the American people, as the book also takes pains to point out. Salon contributor Corey Robin recently illuminated this broader elite shift in a blog post, “When Neoliberalism Was Young: A Lookback on Clintonism before Clinton,” citing in particular “A Neoliberal’s Manifesto” by Charles Peters, founder and editor of The Washington Monthly, in which “The basic orientation is announced in the opening paragraph,” Robin notes:

We still believe in liberty and justice for all, in mercy for the afflicted and help for the down and out. But we no longer automatically favor unions and big government or oppose the military and big business. Indeed, in our search for solutions that work, we have to distrust all automatic responses, liberal or conservative.

This captures neoliberalism in a nutshell: a disavowal of New Deal liberalism in the posture of open-mindedness, which (“Oops, I did it again!”) repeatedly lends itself to conservative cooptation. It quickly became a popular stance in the Democratic donor class, spread further by the publications they financed and other political infrastructure.

Still, the DLC emerged to play a much more central role than most of the other forces involved, specifically because of Bill Clinton. Al From tells the story like this:

A little after four o’clock on the afternoon of April 6, 1989, I walked into the office of Governor Bill Clinton on the second floor of the Arkansas State Capitol in Little Rock.

“I’ve got a deal for you,” I told Clinton after a few minutes of political chitchat. “If you agree to become chairman of the DLC, we’ll pay for your travel around the country, we’ll work together on an agenda, and I think you’ll be president one day and we’ll both be important.” With that proposition, Clinton agreed to become chairman of the Democratic Leadership Council, and our partnership was born.

Clinton was a natural fit for the DLC, From said. Both Clintons, in fact:

He was not afraid to challenge old orthodoxies. In the early 1980s, long before I knew him, he and Hillary Clinton pushed cutting-edge education reforms, like pay for performance and public-school choice, against the opposition of the powerful Arkansas Education Association.

Fighting teachers unions! Just like Bernie Sanders, I’m sure!

As far as the DLC was concerned, Joan Walsh put things a little more realistically here in 2003:

Clinton… took the DLC’s shelves of policy-wonk manifestoes and dark warnings about special-interest politics, and turned it into an agenda for winning elections and governing, with his own charm and his own brand of compromise and conciliation, not DLC founder Al From’s. The DLC thinks it made Bill Clinton, but in fact Clinton made the DLC. Without his charisma and political smarts, its earnest, castor-oil approach to politics and policy would never have won a national election.

The same, of course, is true of Hillary Clinton as well: however smart, educated, and otherwise well-qualified she may be—as much as anyone in her generation, arguably—she would never have been where she is today without her husband’s charisma and political smarts, which in turn undermines her retroactive efforts to disavow the path they blazed together. And that path was “progressive” because From decided to label it so—as push-back against journalists’ more accurate recognition that it represented a conservative force within the Democratic Party. As Paul Starr wrote in 2014:

In 1991, Clinton told a DLC conference in Cleveland: “Our New Choice plainly rejects the old ideologies and the false choices they impose. Our agenda isn’t liberal or conservative. It is both, and it is different.” This denial of labels was a way of getting people to listen. Eventually, though, needing a label, From settled on “progressive,” an ironic choice. During the Cold War, “progressive” had meant left of liberal (as in Henry Wallace’s Progressive Party), but it now came to refer vaguely to any viewpoint left of center. From says he called the DLC’s policy arm the Progressive Policy Institute because he was tired of his organization being described by journalists as conservative.

Even the claim of being ‘vaguely left of center’ is a questionable one, considering the vast differences between elite and mass opinion which have so shaken and confused elites this cycle. It’s arguably more instructive to recall that in 1896, running against the Populist/Democratic Party alliance headed by William Jennings Bryan, William McKinley’s big business Republicans successfully portrayed themselves as representing the forces of progress. It’s an extremely ambiguous term, to say the least. Clinton’s description of their agenda as neither liberal nor conservative, but “both” and “different” perfectly exemplifies this ambiguity.

While it’s true the DLC’s formation was born out of a widespread Democratic donor class revolt, and was intended to combat forces pushing the party to the left, that’s not the full story of its genesis, and it’s misleading to ignore that there were some genuinely progressive motivations involved. We need to understand that side of the story, too, if we’re to understand the limitations that live on today in Hillary Clinton’s continuing claims to be a progressive. And for that, we can turn to Mark Schmitt’s look back in 2011, “When the Democratic Leadership Council Mattered,” just after the DLC closed its doors. “The real DLC was far more complicated — though not necessarily more benign — than its caricature in the 2000s, when it became best known for blind support of the Iraq War and for founder Al From’s simmering anger at anti-Iraq War liberals like Howard Dean and Ned Lamont,” Schmitt wrote.

“To understand the real DLC, it’s useful to know the name Gillis Long,” the Louisiana congressman (cousin of the legendary Huey Long) who chaired the House Democratic Caucus after Reagan’s election. “Both DLC co-founder Will Marshall — who now runs the thriving and independent think tank the Progressive Policy Institute (PPI) — and From had worked for Long and remained devoted to him after his death, on the day of Reagan’s second inauguration.”

The DLC was, in significant ways, an effort to keep Long’s style of politics alive:

But chasing the chimera of a South that was going to elect more than the occasional Long or [Florida Governor Lawton] Chiles led the DLC into a cul-de-sac, in which the pursuit of white Southern votes became an end in itself, and so the fight to eliminate affirmative action and reform welfare (neither of which would much affect the economic well-being of the working middle class that was already losing ground) became the organization’s touchstone issues in the mid-1990s. Racial politics, not “corporatism,” was the more controversial aspect of the DLC at the time Jesse Jackson called it “Democrats for the Leisure Class.”

Which is why it’s so ironic to see Hillary Clinton depending so heavily on minority support (especially Southern blacks) not only to keep her candidacy alive, but also her reputation as a progressive. Schmitt goes on to say, “But at least the organization was thinking about how to construct a working majority with progressive ideas at the heart of it,” but there are three distinct problems here: First, how progressive were those ideas? Second, were they really at the heart of what the DLC was doing? And third, what working majority? The third problem is far less subject to obfuscation than the other two: The fact that, two years after Clinton’s election, Democrats lost the House in a landslide for the first time in 40 years, and that Republicans held it for the next 12 years, does not square at all with the notion that Clinton “saved the Democratic Party,” or that DLC politics constructed “a working majority with progressive ideas at the heart of it.”

In fact, they did the exact opposite: they destroyed the Democratic House majority, which had long been a bastion for progressive ideas and political leaders. That fact alone casts doubt on the whole thrust of the DLC’s progressive claims. After all, if their argument was—like Clinton’s today—that they are pragmatic progressives, then their failure to build an enduring political majority undermines the very core of their argument.

The fact that the same pattern of record-breaking Congressional losses (and state legislative ones as well) repeated itself with Barack Obama should tell us something. Obama had nothing to do with the DLC, directly. But he grew up politically in the world that the DLC did so much to create, and he espoused a similar desire to be neither liberal nor conservative, neither “blue state” nor “red state,” but “both” and “something different.” Bill Clinton and Barack Obama were both successful politicians individually, but neither was successful in constructing “a working majority with progressive ideas at the heart of it,” even if you don’t question how progressive their ideas really were. Perhaps the best way to understand their success, as well as the limits of this brand of “progressive ideas,” is through the analytic lens of Augustus Cochrane III’s 2001 book, Democracy Heading South: National Politics in the Shadow of Dixie.

Cochrane argued that the same sorts of maladies which afflicted the South circa 1950, diagnosed in V.O. Key’s classic, Southern Politics in State and Nation, had come to afflict the nation as a whole. The specific structures might differ—lungs vs. gills—but the functions, or dysfunctions, were strikingly similar, he argued, with political power held tight by wealthy elites while the majority of voters were confused, disengaged, or entirely absent, with politics serving them primarily as entertainment. In the 1950s-era South, its one-party system was functionally a no-party system, operating somewhat differently from state to state. In the country at large, the same result later came from a dealignment of politics—the White House controlled by one party, Congress by another—a frequent, but not dominant, pattern in American politics until 1968, after which it has become the normal state of affairs. The intensified role of money and media served to accelerate the breakdown of party bonds and to further entrepreneurial politics, in which individual politicians thrive by branding themselves, regardless of how their party allies may fare.

This is the environment in which Bill Clinton and Barack Obama proved so successful, even as their parties crumbled. Their branding worked first and foremost with the donor class, and then with the broader political elite which provides guidance to the mass public in ordinary times. But this system fails to really engage the public directly, or respond to their needs, which is why participation falls off so sharply during mid-term elections, leaving the possibility of a working majority—with a well-thought-out, reality-based policy agenda—increasingly out of reach.

The DLC brand of progressivism was perfectly crafted within this corrupt system of politics to enable certain individual politicians to succeed with their targeted messages and well-honed promises combining “responsibility” on the one hand and “compassion” on the other. The Clintons’ early-’80s fight against the Arkansas teachers union was a textbook example of how this worked, which is why From cited that example as showing that the Clintons were made for the DLC. The problem Cochrane described is not about corrupt individuals, necessarily, but it is about systems in which mass organizations like teachers unions are automatically labeled corrupt. In an upside-down world like that, things are bound to be confusing.

Which is why that world favors clear, crisp messaging more than almost anything else. “Reformer with results” is a powerful branding message, regardless of how meager those results may be, or even how toxic they are now seen to be two decades on down the road. In the end, the real problem with Bill and Hillary Clinton-style progressivism is not only what a mixed bag its results have proven to be. There’s also the further problem of how it muddles our vision of what a truly successful progressive politics might look like. Now, more than ever, we need to go back and ask ourselves, what were the roads not taken? Where could they have led us instead of here? And how can we create similar alternatives going forward? That’s a conversation we’ve barely even begun to have.


Our gun myths are all wrong: The real history behind the Second Amendment clichés that have sustained our lethal gun culture

An abridged history of the American gun culture, told from legend and popular memory, might go like this: We were born a gun culture. Americans have an exceptional, unique, and timeless relationship to guns, starting with the militias of the Revolutionary War, and it developed on its own from there. Some celebrate and some condemn this relationship, but it is in either case unique. Guns have long been a commonplace part of American life, which is why guns pretty much sell themselves. The Second Amendment, ubiquitous to contemporary gun politics, was a prominent presence historically and is a source of the gun’s unique stature, while the idea of gun control is more recent. The American gun story is about civilians and individual citizens, and they are its heroes or its villains—the frontiersman, the Daniel Boone “long hunter” who trekked far into the wilderness alone, the citizen-patriot militiaman, the guiltily valorized outlaw, and the gunslinger. The gun’s mystique was forged most vividly on the violent western frontier of the 1800s, and this mystique is about individualism: guns protect citizens against overzealous government infringement of liberties; they protect freedom and self-determination.

This book tells the story of American guns from the perspective of what the gun was—in essence, an object, produced by businesses, to be sold. The story that highlights the Second Amendment, frontiersmen, militias, and the desires and character of the American gun owner is not to be found in the pages of this book. Or, more accurately, my work deliberately skews the story of the gun in another direction: it focuses on the missing element of the gun culture rather than reworking the familiar themes. As such, it has different characters, motivations, plot twists, highlights, and timelines, and all of these elements call into question the gun clichés that animate contemporary politics.


Perhaps the most powerful cliché is gun exceptionalism. Many people on both sides of the debate about guns believe that America has a unique and special relationship to guns, and that this exceptional relationship—whether celebrated or condemned—is a foundation of American gun culture. Americans have always loved guns, common wisdom holds, or, “guns are part of the American identity.”

A main thesis of this book is a simple but important one. We became a gun culture not because the gun was symbolically intrinsic to Americans or special to our identity, or because the gun was something exceptional in our culture, but precisely because it was not. From the vantage point of business, the gun was a product of non-exceptionalism. Perhaps not in the earliest years of its manufacture, when the government construed the gun as an exceptional instrument of war and common defense, whose more efficient production merited guaranteed contracts and markets, generous funding, protective tariffs, and a freewheeling exchange of innovation across public armories to germinal private industry, but in the key years of its diffusion, and for many years thereafter, it was like a buckle or a pin, an unexceptional object of commerce. No pangs of conscience were attached to it, and no more special regulations, prohibitions, values, or mystique pertained to its manufacture, marketing, and sale than to a shovel. Indeed, there were no special rules concerning the international trade of guns until modest presidential embargo powers became effective in 1898. By that time, Winchester’s company sat at the center of its own web of gun commerce that radiated outward to six continents. No exceptional regulations existed when Winchester and his competitors were first “scattering the guns,” in his terms, to create US markets. Although the gun industry produced an exceptional product—designed to injure and kill—it followed the ordinary trends and practices of the corporate industrial economy in the nineteenth and twentieth centuries. In short: the gun was no exception.

Ironically, had the gun been perceived in its early commercial years as a unique and extraordinary thing in society, we might never have become a gun culture. Under those circumstances, politics, law, and other regulatory forces might well have stepped in early on to circumscribe or shape the gun’s manufacture and sale, as they did in some other places around the world. For the United States, the gun culture was forged in the image of commerce. It was stamped, perhaps indelibly, by what historian John Blum called the “amorality of business.” America has an estimated 300 million guns in circulation today, but the gunning of the country started extemporaneously, and it was etched strongly by the character, ambition, and will of gun capitalists rather than by diplomats, politicians, generals, and statesmen. Gun politics today are consumed by Second Amendment controversies, but the Second Amendment did not design, invent, patent, mass-produce, advertise, sell, market, and distribute guns. Yet the gun business, which did, and does, is largely invisible in today’s gun politics.

In the context of business amorality and unexceptionalism, Winchester cast his industrial lot and fortune on a faster and mechanically improved rifle, and he did so not as a gunsmith or even as a gun enthusiast, but as a nineteenth-century capitalist. Others later recalled that Winchester had never personally owned a gun, had never displayed guns in his home, and had never shot a gun before he built his family and corporate fortunes off of them. He was, at the beginning, a men’s shirt manufacturer. If he identified with any group, it would have been with the vanguard class of self-made men who scorned esoteric learning, but, with the help of enterprise, mechanization, and technology, took the world in hand, shattered it into discrete pieces, and then redesigned and reassembled it into more profitable versions of itself. Spellbound by the hows of industrial production, and indifferent to the whats—the industry’s object—Oliver Winchester went into the gun business the way his compatriots went into corsets or hammers.


A second cliché of American gun culture holds that with guns, “demand creates its own supply,” in the words of sociologist James Wright. In a nation of gun whisperers, so believers say, guns were commonplace, and their later industrial production was a reflection of pre-existing demand. The view from the ledger book is different, however. The creation, discovery, invention, and reinvention of gun markets— the visible hand of the gun industrialist at work—was a recurrent, bedrock project of the gun business. It is also a recurrent theme of this book.

To start at the beginning: Historians of the colonial period have stumbled into controversy when they have attempted to count guns; but studies find, in general, that guns, in various states of repair or disrepair, were neither ubiquitous nor rare. Most find a higher rate of gun ownership in the southern colonies—a few placing it around two-thirds of households—and lower rates in the northern ones—anywhere from one-third to just under a half.

In the craft phase, America had as many guns as were inherited, requested, or required. In the industrial phase, after Samuel Colt and Oliver Winchester founded private armories that made six-shot revolvers and rapid-firing repeater rifles—new and patented firearms—it had as many as could be mass-produced by machine. From the gun industrialist’s perspective, supply creates the need for demand: volume production required volume consumption. And the gun was no different from other commodities in this arithmetic of industry.

The US military was certainly the most convenient market. Colt wooed government support with champagne feasts at Washington, DC’s, Willard and National hotels. His cousin, the company treasurer, bristled, “I have no belief in undertaking to raise the character of your gun by old Madeira.” But both Colt and Winchester struggled to acquire government contracts from the 1840s up to the Civil War, and so they saw the wisdom of cultivating other markets. To woo a civilian customer, Colt demonstrated his arm at the Battery Park in New York, to little avail, since, as one gun expert noted, “multi-firing arms were not needed by the average man.” Winchester’s own compatriot New Haveners thought he had “lost his reason” when they learned that his new manufactory “was equipped to produce 200 rifles a day.” A letter book in the archives of Colt’s Patent Fire-Arms Manufacturing Company reveals detailed, fractious bickering between the company and its favored sales “allies” over how many guns the dealers should be expected to “push.” After World War I, saddled by massive wartime plant expansion and burdened by debt, the Winchester Repeating Arms Company (WRAC) had to push sales again, especially through what its executives shorthanded as an ambitious national “boy plan,” with a goal of reaching “3,363,537 boys” ages ten to sixteen. “When the boys and girls of your town arrive at the age of twelve years, they become your prospects,” the company’s internal sales letter explained. It was a new refrain in an old song. At this time the company announced the largest nationwide marketing campaign ever undertaken for guns “in the history of the world.” As it was in the beginning, so it was in 1922: gun markets and demand could never be taken for granted. It was the gun business’s business to create them.

This example leads to another key point. Company records puncture the compelling assumption that guns just “sell themselves” in America, or that the gun industry is fueled by a pristine demand unsullied by the need for promotion, salesmanship, or marketing. While it is true that some demand always existed, and will always exist, it is also true that the gun industry needed to sell at adequate volumes to support mass-production, and that selling in such volumes took work. Moreover, it took work recurrently, in different eras and under different conditions, as the gun industry grew.

This work of inventing and cultivating markets did not occur because gun titans were nefarious “Merchants of Death,” intent on the business of “Death, Inc.,” as a later generation of disarmament activists would proclaim. Rather, they were almost the opposite of that caricature: they were businessmen. Their dispassionate and ironclad grammar was the agnosticism of commerce.

The exploration of gun markets fractures the monolithic idea of an American gun culture into cultures, in the plural, because it reveals the very distinct market segments that gun industrialists courted, some of whom were barely on speaking terms. The WRAC itself was especially proud of its capacity to recognize and stimulate desire for the thing it made, even when that desire was dormant, insufficient, latent, or indiscernible.


When the gun industrialists could not find sufficient domestic civilian markets or secure US military contracts, they looked far and wide to find markets and customers elsewhere. Indeed, one of the most striking findings of this book is the degree to which all four of the major gun capitalists relied on international markets for their very survival in the mid-1800s. This finding points to another myth that the business records dispel: the idea that America’s gun culture is, in a word, American, with its geography confined to the national singularities of the Revolutionary War or the frontier. Today, gun-control advocates look longingly to Western Europe, which has lower homicide and gun violence rates than America, and Europeans occasionally puzzle disapprovingly over the American gun “obsession.” But the countries that today condemn the United States’ relationship with guns kept US guns in business in the 1800s. It was very much European bellicosity and imperial ambition by regimes and governments that provided viable markets for America’s mass-produced arms.

When viewed from the perspective of business, the American gun culture is better understood as an international, global phenomenon on the leading edge of the first wave of globalization in the 1860s and 1870s. Winchester survived initially by selling internationally—as did Colt’s, E. Remington & Sons, and Smith & Wesson. In the chapters that follow, I describe the WRAC’s globe-trotting “gun men” and the expatriate American gun community in Europe. Before it was—or could be—the “gun that won the West,” the Winchester repeater was the gun that armed the Ottoman Empire, and that traveled in an oxcart to the Juárez revolutionary forces in 1866 in Mexico. Before Remington armed sportsmen of the American outdoors, it armed everyone from the acting war minister for the papacy (5,000 sold) to various actors in Egypt (55,000) and Cuba (89,000). The Colt revolver, the “peacemaker” of the American frontier, was before that the gun of the king of Sardinia, and Smith & Wesson’s contract with the Russian Empire in the 1870s to supply Model 3’s kept the company afloat for five years.


Another powerful cliché in twenty-first-century politics is that gun love is timeless, or at least as old as American history. To be sure, gun mavens have existed as long as guns, and there will always be anecdotal, narrative evidence to corroborate a variety of feelings toward guns, from love to revulsion, across time. But in the Winchester company’s early ads, the gun comes across as closer to a plow than a culturally charged object, more on the tool side of the equation than the totem side. For the most part, although not exclusively, the ads emphasized functionality, and gun titans sought markets in places such as the American Agriculturist. Other evidence from the gun archive suggests that Americans purchased far more relatively inexpensive, secondhand, cast-off rifles that were unglamorous yet workable than they did the Winchesters or Colts prominent in legends of the frontier. The WRAC envisioned their primary customer as the “ordinary shooter”—a farmer or rural hunter.

In the early 1900s, the tone of the gun industry changed. The country was more urbanized. The martial phase of western conquest was over. Logically, sales should have dropped, but the WRAC did quite well from 1890 to 1914. The company added $15.5 million to its net worth from 1889 to 1914. Its annual gun sales were almost thirty times greater in 1914, at 292,400, than in 1875, at 9,800, and eleven times greater than the 1880 sales of 26,500. Although best known for its Model 1873 of legend, the company’s bread-and-butter model with much larger sales, antique gun experts will point out, was the Model 1894. Overall in the gun industry, twenty-seven gunmakers in 1910 produced over $8 million in guns and $25 million in ammunition. With less practical utility, the gun became—and to some extent had to become—an object with emotional value. One answer to the question “Why do Americans love guns?” is, simply, that we were invited to do so by those who made and sold them at the moment when their products had shed much of their more practical, utilitarian value. What was once needed now had to be loved. This observation suggests in turn that the notion of an emotional and political affinity for the gun was perhaps a post-frontier phenomenon of the twentieth century talking about the nineteenth.

Modern advertising fascinated the Winchester executives: again, the gun was no exception to the business trends of the day in a new consumer culture, whether the product was soap or a rifle. The WRAC’s internal bulletins instructed the sales force on how to seduce otherwise indifferent customers who had little need for rifles as tools. Winchester pushed the modern American gun in two seemingly opposite directions, aiming to make it an object of luxury—a nonessential but gratifying commodity like “Packards, or golf clubs, or diamonds”—as well as an object of natural or “subconscious” instinct—something all “real boys” wanted, in more modern terms of psychology. But, in both cases, the gun was an object of desire, and the customer was to be seduced to want the product. In a daily sales bulletin sent to distributors, for example, the company thought to emulate “the liquor people,” who “tried to perpetuate their business by ‘educating’ young men to the use of their products. Very immoral of them, of course, but mighty good business.”

In the late 1910s, in short, the targeted customer began to shift from the “ordinary shooter” to the “gun crank.” The latter, who emerged in company correspondence and the gun press, was a customer with a deep psychological bond with his gun. This was a transition from imagining a customer who needed guns but didn’t especially want them to a customer who wanted guns but didn’t especially need them.

Obviously, the gun industry’s sales efforts did not begin in a vacuum—no good advertising does—but rode on the weedy proliferation of gun legends across popular media. Therefore, this book does not just follow the money of the gun business; it also follows the trail of the gun legends. Beginning with a largely fabricated 1867 account of Wild Bill Hickok in Harper’s, for example, I trace forensically how this and other American gun legends took a reverse migration, beginning as fiction and hardening into fact and highbrow history as the decades progressed. On the way, they passed through scores of dime novels, movies, and fictionalized real histories. The “West that won the gun” is a collective legend of the American gunslinger that has consistently, across characters and decades, exaggerated both the quantity of gun violence in America—our ancestors were not actually as trigger-happy as twentieth-century moviegoers and readers of pulp fiction were led to believe—and the “quality” of that gun violence: in the legends, guns were tied to honor rather than intoxication, justice rather than impulsivity, and homicide rather than suicide, even though suicide for many decades has accounted for the majority of gun deaths.

Above all, the West that won the gun is almost always a narrative about American individualism. Paradoxically, an industry that first perfected interchangeability and machine production, and that mass-produced its products to within 1/1,000th of an inch of each other, created one of the most enduring twentieth-century icons of this American trait.

Excerpted from “The Gunning of America: Business and the Making of American Gun Culture” by Pamela Haag. Copyright © 2016. Available from Basic Books, an imprint of Perseus Books, a division of PBG Publishing, LLC, a subsidiary of Hachette Book Group, Inc. All rights reserved.


“Grief sedated by orgasm, orgasm heightened by grief”: Beyoncé, “Lemonade” and the new reality of infidelity

“Are you cheating on me?” Beyoncé asks in her visual album “Lemonade,” which premiered last weekend on HBO. She throws open a door, and water gushes forth—an apt metaphor for the flood of emotions that her question, and its implied answer, unleashes.

As a couples therapist, I’ve sat with hundreds of women, and men, in the turbulent aftermath of infidelity. For the past decade, I’ve been traveling the globe listening to tales of betrayal from every side. What struck me about Beyoncé’s album was both the universality of its themes and the unusual way in which it presented them. Whether autobiography or simply art, her multimedia treatise on unfaithful love represents a refreshing break with this country’s accepted narratives on the topic.

In the American backyard, adultery is sold with a mixture of condemnation and titillation. Magazine covers peddle smut while preaching sanctimony. While our society has become sexually open to the point of overflowing, when it comes to infidelity even the most liberal minds can remain intransigent. We may not be able to stop the fact that it happens, but we can all agree that it shouldn’t.

Another thing most Americans seem to agree on is that infidelity is among the worst things that can happen to a couple. The dialogue here is framed in terms borrowed from trauma, crime and religion: victims and perpetrators; injured parties and infidels; confession, repentance and redemption. As a European, I can testify that in other cultures, the betrayal is no less painful, but the response is more philosophical and pragmatic. Americans do not cheat any less than the supposedly lascivious French; they just feel more guilty about it, because the experience here is framed in moral terms.

As Brazilian couples therapist Michele Scheinkman has pointed out, the notion of trauma provides a legitimizing framework for the pain of betrayal, but it limits the avenues for recovery. This clinical approach denudes the pain of its romantic essence and its erotic energy—the very qualities that must be reignited if a relationship is to not only survive but thrive. Jealousy, rage, vengeance and lust are as central to the story as loss, pain and shattered trust—something European and Latin cultures will more readily admit than Americans. Infidelity is not just about broken contracts; it is about broken hearts.

These erotic aspects of the drama are unapologetically displayed in Beyoncé’s fierce performance. She does not present herself as victim, but as a woman invigorated and empowered by love. She even voices one of the great unspoken truths about the aftermath of affairs: the hot sex that often ensues. “Grief sedated by orgasm,” she intones, “orgasm heightened by grief.” Perhaps most strikingly, she is unashamed to announce to the world that she intends to remain Mrs. Carter. “If we’re gonna heal, let it be glorious.”

Once upon a time, divorce carried all the shame. Today, choosing to stay when you can leave is the new shame. That’s not to say we don’t do it—research indicates that most couples will stay together after an infidelity—but we do it stoically and silently. Betrayed women only get to sing songs of rage and retribution and wield baseball bats after they’ve walked out the door. Politicians’ wives stand mute beside their contrite husbands at press conferences, and they are judged for doing so. From nationally televised presidential debates to the privacy of the voting booth, Hillary Clinton continues to be held in contempt of the court of public opinion for choosing to stay when she was free to go.

There’s no question that the cultural conversation surrounding affairs reinforces some of America’s most deeply held values: love, honesty, commitment and responsibility—values that have been the cornerstones of our society. But the intensity of the reactions that the topic provokes can also generate narrowness, hypocrisy and hasty responses. The dilemmas of love and desire don’t always yield to simple answers of black and white, good and bad, victim and perpetrator.

Our current American bias is to privatize and pathologize infidelity, laying the blame on deficient couples or troubled individuals. Beyoncé counters this tendency to individualize sweeping social realities, placing her own story within the cultural legacy of black men, black women and the violence against both. Her lyrics are intensely personal, but she frames them with imagery and poetry that reminds us that historical forces shape our transgressions.

Adultery has existed since marriage was invented, and so too has the taboo against it. It has been legislated, debated, politicized and demonized throughout history. When marriage was an economic enterprise, infidelity threatened our economic security. Now that marriage is a romantic arrangement, infidelity threatens our emotional security—our quest to be someone’s one-and-only. Today, infidelity is the ultimate betrayal, for it shatters the grand ambition of love.

Yet infidelity has a tenacity that marriage can only envy. Whether we like it or not, it seems to be here to stay. It happens in good marriages and it happens in bad marriages. It happens to our friends and neighbors, and it happens to international sex symbols and superstars—to the “most bomb pussy,” as Beyoncé puts it. It happens in cultures where it’s punishable by death and it happens in open relationships where extramarital sex is carefully negotiated beforehand. Even the freedom to leave and to divorce has not made cheating obsolete.

Given this reality, it’s time for American culture to change the conversation we’re having about infidelity—why it happens, what it means and what should or should not happen after it is revealed. The subject of affairs has a lot to teach us about relationships—what we expect, what we think we want, and what we feel entitled to. It forces us to grapple with some of the most unsettling questions: How do we negotiate the elusive balance between our emotional and our erotic needs? Is possessiveness intrinsic to love or an arcane vestige of patriarchy? Are the adulterous motives of men and women really as different as we’ve been led to believe? How do we learn to trust again? Can love ever be plural?

These are uncomfortable dilemmas, but important ones. That’s what my work is dedicated to: generating conversations about things we don’t like to talk about. Infidelity is still such a taboo, but we need to create a safe space for productive dialogue, where the multiplicity of experiences can be explored with compassion. Ultimately, I believe, this will strengthen relationships by making them more honest and more resilient. I applaud Beyoncé for her courageous contribution to this conversation.

Betrayal runs deep but it can be healed. Some affairs are death knells for relationships; others will jolt people into new possibilities. When a couple comes to me in the aftermath of a newly disclosed affair I often tell them this: Today in the West, most of us are going to have two or three significant relationships or marriages. And some of us are going to do it with the same person. Your first marriage is over. Would you like to create a second one together?


Moms deserve better than this: The shameless pandering of “Mother’s Day” is an argument for the death of “hyperlink” films

From a marketing perspective, “Love Actually” was such an ingenious idea that it’s a wonder film executives didn’t think of it sooner.

The 2003 film brought the ’70s disaster movie model—in which films like “The Towering Inferno” and “The Poseidon Adventure” amassed a litany of notable celebrities to narrowly avoid death—to the romantic comedy. Instead of Faye Dunaway, Steve McQueen and Paul Newman fleeing a burning building, director Richard Curtis, previously known for writing the screenplays to “Bridget Jones’s Diary” and “Notting Hill,” offered audiences a montage of British actors falling in love just in time for Christmas. The cast list, featuring Emma Thompson, Alan Rickman, Hugh Grant and Colin Firth, was so massively expansive that the average moviegoer statistically had to like someone in it.

By its very design, “Love Actually” is made to appeal to everyone. The strategy appears to have worked: Upon the film’s release, it earned $246 million worldwide. It was the second-highest-grossing comedy of the year—behind Nancy Meyers’ latest ode to middle-aged white women standing in kitchens, “Something’s Gotta Give.” Since then, Curtis’ film has far surpassed its competition in cultural reach. “Love Actually” is one of the millennium’s more improbable cult phenomena, a yearly staple of cable programming around the holidays. (Whether or not the film is actually any good appears to be another matter.)

There’s an old saying in Hollywood that no one wants to be the first to something, but everyone wants to be the second. That means that when a movie makes money, audiences are burdened with a series of carbon copies designed to cash in on its success, a strategy subject to increasingly diminishing returns. After “The Hunger Games” popularized the teen dystopia genre, there were the young adult adaptations “The Host,” “The Giver,” and “The Maze Runner.” The last iteration of the “Divergent” series, starring Shailene Woodley as yet another reluctant rebel fighting to topple a totalitarian regime, was the series’ lowest-grossing yet. “Allegiant” made less than half of what its first installment earned in theaters.

The rush to cash in on the box office fortunes of “Love Actually” went in two directions. First, there was the “Christmas in bulk” genre, epitomized by ensemble comedies like “The Family Stone” and the later “Love the Coopers.” The most successful entry was the poorly received “Four Christmases,” which is a cast list in search of a movie. Amazingly, the movie starred four Oscar winners: Robert Duvall, Mary Steenburgen, Sissy Spacek, and Reese Witherspoon. For the kids, there was Vince Vaughn, coasting on the success of “Wedding Crashers,” and for Mom and Dad, two country superstars rounded out the impressive cast list: Dwight Yoakam and Tim McGraw. The movie took in $120 million domestically, more than twice what “Love Actually” made in the United States.

The holiday hyperlink rom-com, however, took the goal of packing as many A-listers as possible into one movie to its furthest possible extreme. Directed by Garry Marshall, these films span from 2010’s “Valentine’s Day” to this year’s “Mother’s Day,” starring Julia Roberts, Kate Hudson, Jason Sudeikis and Jennifer Aniston. Each of these films is more or less the same movie: A group of (mostly) white people must decide who they want to snuggle up with during this iteration’s festivity. In a memorable “30 Rock” parody, Emma Stone and Andy Samberg try to find a date for Martin Luther King Day. Samberg would ask his officemate out, but temporary circumstance would get in the way. “Too bad we’re just platonic friends,” he sighs.

This spoof follows an earlier send-up from “Saturday Night Live,” which reimagines 2011’s “New Year’s Eve” as “The Apocalypse,” in which famous people rush to find love before the world ends. If these films have become a punchline as the genre reaches its inevitable nadir, however, their ironic treatment masks how absurdly popular the holiday hyperlink movie was. Aside from the first “Sex and the City” movie, “Valentine’s Day” boasts the biggest opening weekend of any romantic comedy in history. Even the less successful “New Year’s Eve” still made back twice its budget.

The term “hyperlink cinema” was originally coined by Alissa Quart in an essay for Film Comment, describing Don Roos’ indie comedy “Happy Endings.” The term would be popularized by Roger Ebert in his review of Stephen Gaghan’s “Syriana,” to describe movies that feature interlocking stories woven together by a meditation on a central motif. In “Crash,” released the same year as these two features, writer/director Paul Haggis posits that Los Angelenos have to get into car accidents in order to “feel something”—and bring to the surface the city’s deep-seated racism. The use of parallel storylines was also explored in movies like “Nine Lives,” “Babel,” and most recently “Cloud Atlas,” although the trend has died out in recent years.

If an underlying morality is intended to unite these disparate plot threads, Curtis’ and Marshall’s movies only spell out their themes in the most basic of ways. In “Love Actually,” the film’s thesis is right on the label: The meaning of life is to love… actually. That’s admittedly difficult to square with the film’s more unsavory elements. One subplot follows Colin Frissell, a horny caterer who craves the flesh of nubile American girls. He procures a ticket to Wisconsin, where he meets a squad of bored, lonely supermodels (Elisha Cuthbert, Shannon Elizabeth and January Jones) and later absconds with them back to Britain. (They like his accent.) There’s very little question that when it comes to the fairer sex, romance is the last thing on Colin’s mind.

There’s even less of a connective thread in Marshall’s films. “Valentine’s Day,” the best reviewed of the director’s holiday trilogy, is like a collection of short stories that are less driven by narrative than by the inevitable end goal of finding happiness for each of its characters. For Sean (Eric Dane), a closeted pro football player, his quest for fulfillment entails coming out in order to keep his partner (Bradley Cooper) from leaving him. Liz (Anne Hathaway) is a phone sex operator who meets the perfect guy (Topher Grace), except for one catch: He doesn’t approve of her career. If the theme seems to be that love conquers all, that doesn’t quite extend to its director’s taste for conventionality. Like “New Year’s Eve,” the movie is so syrupy that it’s practically brought to you by Log Cabin.

As “Mother’s Day” proves, these movies don’t exist for the sake of art. They are an act of well-orchestrated corporate synergy, designed as a product to get butts in the theater on a given day each year. Marshall’s most recent effort makes this embarrassingly clear. In the film’s introduction, the ubiquitous Meghan Trainor sings “Mom,” a pandering ode to the mothers in the audience. Take these choice lyrics: “You might have a mom, she might be the bomb/ But ain’t nobody got a mom like mine/ Her love’s ’til the end, she’s my best friend/ Ain’t nobody got a mom like mine.” Never one to let sap go to waste, Marshall actually recycles the number for the closing credits.

Aside from its shameless mommy-pandering, “Mother’s Day” is a clever bit of marketing designed to appeal to every possible demographic. Throughout the film, Marshall shoehorns people of color into the background, giving the picture an air of diversity and inclusion. Loni Love, who you might remember from VH1’s “I Love the 90s” series, appears as the token black friend to Bradley (Jason Sudeikis), a handsome gym owner grieving the loss of his wife. There’s also Max (Cameron Esposito) and Gabi (Sarah Clarke), a lesbian couple who build a womb float for a Mother’s Day parade. “Where in the world has such a gathering ever existed?” one might ask. The movie might be quietly radical if it made even a lick of sense.

Marshall’s films have always struggled to transcend their holiday-themed premise, but “Mother’s Day” is perhaps his most desperate effort yet. The film’s lowest moment entails Sudeikis’ character getting injured in a “hip-hop related accident” in order to have a meet-cute in the hospital with Sandy (Jennifer Aniston), a frazzled super mom whose son is having an asthma attack. When the two lock eyes, Sandy has her arm trapped in a vending machine, struggling in vain to free herself from its clutches. “Mother’s Day” no longer needs “Saturday Night Live” to lampoon its formula, now well past the expiration date. The movie is practically its own parody, except that no one appears to be in on the joke.

“Mother’s Day” is getting savaged by critics, and tracking suggests it will become its director’s lowest-grossing effort in recent memory. But if its failure symbolizes the final nail in the coffin of the holiday hyperlink movie, Marshall’s film is nothing if not a perfect symbol for today’s cinema. In the 13 years since the release of “Love Actually,” movies have become increasingly driven by horizontal integration, corporate spectacles that appeal to both everyone and no one. The holiday hyperlink movie might be a punchline, but the template these films popularized is increasingly the norm. After all, “Mother’s Day” is nothing but “Batman v Superman: Dawn of Justice” for women.

Staring down another decade of superhero mashups (see: “Justice League”) driven by the same business model, “Mother’s Day” is the end of an era, but it’s also the terrifying beginning of a new one.


From “sob sister” to badass: Lois Lane’s long, amazing journey out of Superman’s shadow

Lois Lane has been around as long as Superman, but she hasn’t had as easy a time. While the Man of Steel has had his difficulties—including his current ridiculous portrayal in Zack Snyder’s movies—ol’ Kal-El hasn’t been defined by ludicrous love stories, treated like an annoying pest who should be spanked, and killed repeatedly just to make another character sad. As recounted in Tim Hanley’s excellent recent book “Investigating Lois Lane: The Turbulent History of the Daily Planet’s Ace Reporter,” Lane’s ups and downs have been numerous and cyclical, and they mirror the struggles of real women. Tellingly, the biggest strides for this character have taken place outside the boys club of comics entirely, especially in Gwenda Bond’s young adult novel “Fallout,” which focuses entirely on Lane and has a sequel, “Double Down,” coming out May 1.

The progress of Lois Lane as a character should sound familiar: one step forward, three steps sideways, two steps back, a giant leap now and then, etc. As Hanley describes in his extremely well-researched book, her journey is a zigzag through almost 80 years of American history, involving a parade of comic book editors and creators—nearly all male. Her first appearance back in Action Comics #1 (also the first appearance of Superman) showed ambition and intelligence, but she spent years as a damsel, buzzkill and plot device. While Lane started as a sob sister (an actual title for a writer who responded to letters from the “lovelorn”) and gradually made her way to the front page of “The Daily Planet,” it was much harder for the character to get her own spotlight. Even when there was a Lois-centric comic, it had a diminishing title, like “Superman’s Girl Friend Lois Lane.”

The most preposterous of many preposterous things involving Lane may have been when a slew of letter writers in the early 1960s called for Superman to spank the pesky reporter. Yes, you read that correctly: actual letters were written demanding a “super-spanking” for an adult woman. As Hanley points out, it’s not just that Superman readers were jerks who infantilized women: the stories themselves put her in a position where readers would be annoyed by her, mostly because of her repeated attempts to prove Superman was really Clark Kent. Lois did get spanked by Superman indirectly at one point—by one of his robots—but the real culprits were oblivious writers.

Many advances for the character seemed to happen by accident, as Hanley recounts. Strong portrayals on TV and in movies, by Phyllis Coates and Margot Kidder, respectively, made a difference. Since Lane was constantly putting herself in danger, often for a story, she became defined by her bravery. Also, as the Sixties progressed, DC began to embrace serialization, getting away from single-issue stories and building longer plots similar to those of rival Marvel. This led to some positive changes for Lois, since in a heavily serialized story there has to be some progress, even if the reset button gets hit in the end. Superman started treating Lois with more respect, and since he stopped infantilizing her, readers stopped calling for her to be super-spanked.

A perfect example of the continual progress/regression of Lois Lane was the brief tenure of Dorothy Woolfolk as editor of DC’s romance line in 1971. This included the Lois Lane solo comic, and Woolfolk pushed Lois and her world in the direction of feminism. Instead of being hamstrung by the boys club of the Daily Planet, Lane went freelance. She was surrounded by other strong female characters; this might be the first era of superhero comics that would pass the Bechdel test. Woolfolk was fired for supposedly being frequently late, but as Hanley explains, there’s no evidence of her being bad at her job, and it’s more likely she lost her job due to “her take-charge, staunchly feminist attitude not sitting well with the old boys’ club that ran DC.” The real-life DC office appears to have been even less women-friendly than that of the Daily Planet.

Lois Lane has survived World War II, the Vietnam War and 9/11, but it’s been harder to escape the shadow of Superman. While her journalistic career has gotten progressively more successful, she’s also had a tendency to die: By Hanley’s count, Lane died eight times during a six-year period in the early 2000s, making her a perfect example of Gail Simone’s “women in refrigerators” idea, in which female characters perish so male characters can suffer. Fittingly, Lane has made the most progress in a totally different medium: the young adult novel. Gwenda Bond’s 2015 novel “Fallout” reinvents Lane as an Army brat and Metropolis newbie who doesn’t have to stand in the shadow of any doofus in a cape.

While “Fallout” is in many senses a departure, it’s also consistent with the Lois Lane who’s been present (and repressed) since the beginning. As Hanley notes, “She was tough, she was ambitious, she was fearless, and she had very little respect for authority. Through every reboot and adaptation, these basic facts have remained the same.” In fact, Lane retains these qualities even when she’s been transformed into another character entirely.

One of the better recent Lois Lane stories in comics isn’t even about Lois Lane. Over the years, there have been many Superman analogues meant to pay homage to or satirize the character, including Hyperion, Apollo, Omni-Man and the Homelander. Another is Supreme, who wasn’t much of a character when he was created by Rob Liefeld, but became a celebration of everything great about Superman during a run by Alan Moore. Recently, in the series “Supreme Blue Rose” by Warren Ellis and Tula Lotay, Supreme is barely there; it’s all about Diana Dane, the Lois Lane analogue. This Lane-like character is a journalist and investigator, and everything is seen through her eyes (and the gorgeous art of Lotay). It’s a refreshing series that DC could learn from, and maybe it will; as part of the publisher’s umpteenth reboot, “Rebirth,” there will be a new “Superwoman” series starring Lane.

Long-running characters like Lois Lane have a life of their own. Comics writer Grant Morrison has made some points about Batman that apply to Lane as well: “I love the fact that you can delve into a fictional character like this and get so much depth and so much history. He’s kind of alive. He’s been around longer than me and he’ll be around when I’m long gone, so he’s kind of more real than me.” In the same way, Lois Lane is more real than you or me, which is oddly comforting. She’s had to deal with a lot of crap, but she’s a survivor who might do anything in the future.

E.U. forcing refugees into “concentration camps” as economic crisis fuels far-right, warns Greek ex-finance minister

Greece’s outspoken former finance minister says the European Union is putting refugees in what are essentially “concentration camps,” and warns that the festering economic crisis is exacerbating xenophobia and fueling the rise of far-right movements.

In the past two years, hundreds of thousands of refugees and migrants have landed in Greece, hoping to find asylum in Europe.

Yanis Varoufakis, the ex-minister and longtime economics professor, said in an interview on Democracy Now this week that the E.U. has been exerting “tremendous” pressure on Greece, forcing it “to, effectively, intern the refugees.”

Meanwhile, the far-right is on the rise “everywhere in Europe,” and there are chilling signs of the return of fascist ’30s-era politics, Varoufakis cautioned.

Many of the refugees and migrants trying to enter Europe are fleeing war, violence or repression in the Middle East and South Asia. Most are coming from Syria, Iraq and Afghanistan.

The E.U. has been overwhelmingly hostile to their arrival, even while it fuels and intervenes in some of the conflicts they are fleeing.

Europe has set up “hotspot” registration centers for refugees in Greece. Varoufakis says, “When you see the word ‘hotspots,’ just translate it to ‘concentration camps.’ It’s very simple.”

“George Orwell would be very, very proud of Europe and our capacity for doublespeak and creating new terms by which to hide the awful reality,” he added.

“Instead of treating them like human beings in need of support, in need of food, in need of medicine, in need of psychological assistance, they are going to be treated, according to Brussels, as illegals, aliens, that are going to be enclosed in those hotspots, concentration camps,” Varoufakis said in the interview.

“The Greek government, which is, of course, fiscally completely and utterly impecunious, is being told, ‘The only way you are going to get money is if you intern them. So if you let them free and loose, even within Greece, you’re getting not a penny in order to help feed them.’”

Varoufakis also blasted the “ponzi austerity” scheme he says the E.U. and International Monetary Fund are imposing on Greece, while banks rake in billions and the Greek people suffer.

The former finance minister says this “cruel, self-defeating, irrational, inefficient, mind-blowingly inane austerity” is also preventing the Greek government from helping refugees.

While he applauded the “magnificent array of NGOs and volunteers who are looking after the refugees,” Varoufakis noted the “Greek state is in a state of disarray, because it just can’t afford even to look after the Greeks, who are suffering a seven-year-long great depression.”

“The Greek state is trying to do something, but is being pushed by Europe to treat inhumanely those refugees,” he said.

Earlier this month, the E.U. began to deport refugees and migrants en masse to Turkey, in a plan that human rights experts say likely violates international law.

NATO also announced a new plan this week to impose a blockade on Libya in order to prevent refugees and migrants from entering Europe, five years after bombing the country and essentially destroying the government of the oil-rich North African nation. A Human Rights Watch official told Salon this plan also likely violates international law.

In the meantime, far-right groups like Greece’s neo-Nazi Golden Dawn party are growing in popularity.

And “it’s not just Golden Dawn,” Varoufakis warned. “It’s everywhere in Europe.”

“We have a neofascist government in Hungary. We have Marine Le Pen, who’s going to top the presidential race next year in France. I mean, you just have to state this to panic,” he said.

“You have UKIP, the United Kingdom Independence Party, in Britain. You’ve got Austria; in Vienna, the beautiful city of Vienna, 42 percent voted for a neofascist party in the last municipal election,” Varoufakis continued.

He said the reasons behind the rise are “very simple: Great Depression, national humiliation—put them together, like in the 1920s and ’30s in Germany, and you end up with the serpent’s egg hatching.”

Varoufakis argues there is an alternative to these detention camps, mass deportations and blockades, but Europe is not willing to take it.

“This should not be a problem. Europe is large enough. It is rich enough. We should be able to handle this refugee crisis humanely, efficiently, without this even being something we discuss,” he said on Democracy Now.

Varoufakis pointed out that after the Soviet Union dissolved in 1991, Greece, a country of 10 million people, accepted 1 million refugees within a few months.

“Do you know what happened?” he asked. “Nothing. It was all fine. They still live there. Their kids come to the university where I teach. They are amongst some of my best students.”

“Greece has become enriched. Our culture has become stronger. Our food has become even better,” Varoufakis added.

“And if a small, middle-, lower-middle-income country like Greece can accept a 10 percent influx of refugees in a few months and do quite well out of it and actually be stronger as a result of that, Europe, which is aging pathetically, should accept these refugees, like Angela Merkel initially said in September, October.”

Varoufakis concluded the interview warning that Europe is seeing the terrifying signs of a resurgence of fascism.

“The European Union is disintegrating under its postmodern 1930s,” he said. “This is what we’ve been experiencing the last 10 years due to the economic crisis.”

“Ponzi austerity” scheme imposed by E.U. and U.S. bleeds Greece dry on behalf of banks, says ex-finance minister

The former finance minister of Greece says the European Union and international financial institutions are imposing unjust “ponzi austerity” on his country, while banks rake in billions and the Greek people suffer.

For years, Greece has faced enormous economic hardship. In the wake of the 2008 Great Recession, the country plunged into a debt crisis. In return for large loans, the Troika — which consists of the European Commission, the European Central Bank and the International Monetary Fund — has demanded that Greece impose harsh austerity measures, cutting social services, slashing government programs and privatizing state assets.

Yanis Varoufakis, the former finance minister and longtime economics professor, says Europe is “confusing butchery for surgery” by continuing to demand crippling austerity policies, also known as structural reforms.

“Greece is being trampled upon,” he said in an interview on Democracy Now. Varoufakis accused the Troika of trying “to turn Greece into a desert” and condemned the past three bailouts as money-making opportunities for German and French banks.

He also slammed the Obama administration for continuing to support this “cruel, self-defeating, irrational, inefficient, mind-blowingly inane austerity.”

The effects of austerity on Greece have been nothing short of catastrophic.

From 2008 to 2013, Greeks became on average 40 percent poorer. More than one-third of Greeks, 36 percent of the population, are at risk of poverty and social exclusion. Poverty is worse in Greece than it is in Latvia, and among the worst in the eurozone, surpassed only by Bulgaria and Romania.

One out of every four Greeks is unemployed, the highest unemployment rate in the E.U. Among Greeks under age 25, the problem is even worse, with roughly one out of every two people unemployed.

Despite this dire situation, in negotiations this week, international creditors demanded that Greece impose further austerity measures. The International Monetary Fund is ordering the Greek government to cut pensions and eliminate income-tax exemptions.

“A great depression, with no end in sight”

Varoufakis argues this is all part of a “Ponzi austerity” scheme.

After the financial crash in 2008, “there was a cynical transfer of private sector, private bank losses onto the shoulders of the weakest of taxpayers, the Greeks,” he said on Democracy Now.

Varoufakis opposed the bailouts, which he described simply as “typical extend-and-pretend loans.”

He explained: “A second predatory loan is enforced upon the Greek government in order to pretend that it is making its payments for the first loan, and then a third one, and then a fourth one. And the worst aspect of it is that these loans, which were not loans to Greece, were given, extended, on condition of stringent austerity that shrunk our incomes.”

“So we entered a debt deflationary cycle, a great depression, with no end in sight.”

Varoufakis blasted the distortions “peddled by the mainstream media.” Many media reports claim the negotiations between Greece and its creditors have stalled because the Greek government is resisting structural reforms. Varoufakis said “nothing could be further from the truth.”

“The fact of the matter is that the Greek government last summer, in July of 2015, surrendered to the creditors,” Varoufakis explained. Since then, Greece “is simply told what to do. And it’s trying to do it. It is trying to implement it.”

“This is why I’m no longer the finance minister,” he added. Varoufakis stepped down in July when the Troika refused to budge on austerity.

The left-wing Syriza party was elected by a landslide in January 2015 on an anti-austerity platform. Varoufakis, who served as its first finance minister, says Europe and financial institutions refused to allow the Greek government economic autonomy, forcing it to continue implementing harsh austerity measures.

In July, Greeks held an historic referendum, voting overwhelmingly against austerity. Europe ignored this referendum, effectively overriding the democratic will of the Greek people.

In a statement explaining his resignation in July, Varoufakis noted that eurozone finance ministers were frustrated with his firm opposition, and heavily pressured him to leave.

“I shall wear the creditors’ loathing with pride,” Varoufakis wrote at the time.

Unsustainable debt

Today, Varoufakis says the reason there is another impasse in the negotiations is because the International Monetary Fund, or IMF, and the German government “can’t see eye to eye.”

The IMF is frustrated with Germany, Varoufakis says, because — although it frequently supports austerity programs — it at least acknowledges that Greece’s debt is ultimately unsustainable.

“They know that the numbers that the European Commission, on behalf of Germany, is imposing upon Greece are numbers that will explode, they will fail,” Varoufakis explained. “In six months’ time, we’ll have another failure of the program, like we’ve had in the last six years.”

The IMF wants Greece to devalue labor and pensions, to close down small businesses and replace them with large corporations, but understands that this would shrink Greece’s national income, making it impossible to repay the gargantuan debt.

“They’re like a quasi-numerate villain, who wants to turn Greece into a desert, call it peace,” Varoufakis remarked.

Germany, on the other hand, does not want to talk about the debt, because it would then have to admit that “the so-called bailout loans for Greece were not loans for Greece. They were loans, bailout loans, for German and French banks,” Varoufakis added.

The records of an internal meeting released by the whistleblowing journalism organization WikiLeaks show IMF officials asking themselves what it will take for Europe to recognize that Greece’s debt must be cut. They conclude that some kind of event, such as another crisis, is needed.

“The elephants in the room are tussling, and the little pipsqueak mouse, Greece, is being trampled upon,” Varoufakis said. “It’s got nothing to do with the Greek government stalling on structural reforms.”

“By the way, there are no structural reforms in question,” he added. “Cutting down pensions is not reform. It’s like confusing butchery for surgery. It’s not the same thing.”

U.S. support for “Ponzi austerity”

The U.S. has joined in pressuring Greece to continue to implement the crippling austerity measures.

After President Obama met with German Chancellor Angela Merkel this week, White House spokesperson Josh Earnest told reporters “we’re very supportive of the efforts that members of the E.U. have made to deal with the financial challenges posed by Greece’s finances.”

The White House insisted Greece must follow “through on a number of structural reforms,” stressing “Greece has a responsibility to do that.”

In his interview on Democracy Now, Varoufakis criticized President Obama.

“We have, yet again, this typical disconnect in the American administration,” he said. “It’s quite astonishing and saddening.”

Obama publicly claims he opposes harsh austerity, Varoufakis explained, yet, at the same moment, “his spokesman comes out and supports cruel, self-defeating, irrational, inefficient, mind-blowingly inane austerity.”

Varoufakis calls this “Ponzi austerity.”

A Ponzi scheme is a scam in which a business covers old obligations with freshly borrowed money, pretending to grow its income while taking on more and more unsustainable debt.

In a similar fashion, the Troika has continued to impose more and more debt on Greece, which can only be used “to pretend to repay the previous debts.”

Austerity then “continuously reduces national income, because when you reduce pensions, when you reduce investment, when you reduce all the determinants of aggregate demand, income of the nation shrinks.”

“And you keep tightening that belt through more pension cuts, more reductions in public health and so on, and public education, and you keep on taking new unsustainable loans in order to pretend that you’re not insolvent,” Varoufakis explained. “That’s Ponzi austerity for you.”

Bailout for the banks

Varoufakis still has hope that things can be turned around, but not from within. He emphasized that much of the German public stands with Greece in opposing harsh austerity measures. Rather, it is the German government that is so insistent on continuing down the same path.

In 2015, Varoufakis launched the Democracy in Europe Movement 2025, or DiEM25, a pan-European political movement that warns “The European Union will be democratised. Or it will disintegrate!”

The former finance minister insists the bailouts have not just been bad for Greece; they have also been bad for Germany.

“The problem is not that Germany has not paid enough. Germany has paid too much, in the case of the Greek bailout,” Varoufakis explained on Democracy Now. “We had the largest loan in human history. The question is, what happened to that money?”

“It wasn’t money for Greece. It was money for the banks. And the Greek people took on the largest loan in human history on behalf of German and French bankers, under conditions that guaranteed that their income, our income in Greece, would shrink by one-third.”

According to Varoufakis, 91 percent of the first bailout and 100 percent of the second bailout went to German and French banks. The money did not end up in taxpayers’ pockets; it ended up in bankers’ pockets.

“That is ‘Grapes of Wrath,’ John Steinbeck material,” Varoufakis quipped. “One-third of national income, poof, disappeared.”