Opinion

Jack Shafer

The battle over Benghazi

Jack Shafer
Nov 2, 2012 22:35 UTC

When Washington bureaucracies rumble, they often avoid directly savaging one another by using the press as proxies. By leaking selectively to news outlets they believe will give them the most sympathetic hearing, they hope to shape the news by making it. The strategy doesn’t always work. Sock puppetry revolts good reporters and some bad ones, too, because they know carrying tainted water for a source today may stain their reputations tomorrow.

The Benghazi story hasn’t turned any reporters into absolute dummies—yet—but as the tag-team match of blame being played by the White House, the State Department, a congressional committee, and the CIA escalates–and with the Romney campaign eager to pounce on anything that makes the administration look bad–don’t be surprised if unnamed sources start spinning the facts in a self-serving manner.

You shouldn’t feel bad if you’re confused about Benghazi and have no idea who should be sacked for not doing his job: Press accounts and comments from President Barack Obama and Secretary of State Hillary Clinton have all congealed into one murky, confusing stew. Now, clarity has arrived in a new ultra-narrative given yesterday to Washington Post columnist David Ignatius, the New York Times, Reuters, the Wall Street Journal, the Washington Post, and others, sourced to a “senior administration official,” “senior intelligence officials,” “a senior American intelligence official,” “a senior U.S. intelligence official,” and “U.S. intelligence officials,” respectively.

It’s hard to believe these officials are anyone other than CIA brass, something Ignatius’s column said quite literally when it was first posted with a headline reading, “CIA Offers Detailed Account of Attack in Libya.” Later, that headline was swapped out for the more generic “U.S. Offers Detailed Account of Attack in Libya,” as this screenshot tweeted by Jeremy Scahill attests.

In the time line put out by the CIA, the agency essentially takes the blame for the deaths of Ambassador Christopher Stevens and three other Americans in Benghazi on Sept. 11. But it also takes credit for the rescue of 30-plus other U.S. officials. The Journal reports:

The U.S. effort in Benghazi was at its heart a CIA operation, according to officials briefed on the intelligence. Of the more than 30 American officials evacuated from Benghazi following the deadly assault, only seven worked for the State Department. Nearly all the rest worked for the CIA, under diplomatic cover, which was a principal purpose of the consulate, these officials said.

The CIA version attempts to extinguish the flaming Fox News account of Oct. 26, which accused agency superiors of preventing the rescue of Americans in Benghazi. The piece cites anonymous “sources who were on the ground in Benghazi” who say the “CIA chain of command” allegedly told “CIA operators twice to ‘stand down’ rather than help” the besieged ambassador and his team.

“There were no orders to anybody to stand down in providing support,” an anonymous official tells the Times.

In hogging the blame and the credit for the Benghazi defense, the CIA version slightly deflates the Republican-led congressional investigation of the attack, which has criticized the White House and the State Department for poor security planning in Benghazi. In hindsight, the dueling narratives about exactly what happened in Benghazi can now be attributed to alleged secret arrangements that the State Department made with the CIA for security in Benghazi, where the CIA was operating an “annex” near the public diplomatic mission, and to the classified nature of the CIA’s work. (The CIA was chasing terrorists, as well as shoulder-fired missiles and other military gear liberated from Libyan Army arsenals.) This security arrangement wasn’t widely known outside of Libya, the Journal reports, with “many top officials at the State Department in Washington [not] initially aware that the annex had a security force that answered to the CIA and provided backup security for the consulate.”

By subtly accusing the State Department of its left hand not knowing what its right hand was doing, the CIA account invites a State Department counter-briefing of reporters that slags the CIA for its deficiencies. Will Secretary of State Clinton accept or decline? Another player in the Benghazi bureaucratic battle is the FBI, which is investigating the Benghazi attacks. The FBI has cause to be irate with the CIA because the agency delayed in turning over surveillance tapes for its investigation, the Journal reports. Meanwhile, the CIA is allegedly sore with the FBI for dallying in giving it copies of FBI witness interviews.

The White House itself comes out relatively unscathed—and unmentioned—in today’s news stories. If you were the charitable sort, you could read today’s accounts as justifying President Obama’s and his spokesmen’s disingenuous comments about the hows and whys of the attack as an attempt to provide cover for the CIA’s secret operations in Libya. But surely somebody in the bureaucracy has a score to settle with them, too.

Every bureaucracy despises the other bureaucracies for the usual Machiavellian reasons: They are impediments and competition. When the uneasy peace between bureaucracies is upset, as with the Benghazi attacks, scores both new and old get settled on Page One with briefings, leaks, and interviews. The media battle of Benghazi has only begun.

******

Leaks from all aggrieved Benghazi parties accepted at Shafer.Reuters@gmail.com and dispensed by my Twitter feed.

Is it ever okay for journalists to lie?

Jack Shafer
Nov 1, 2012 16:13 UTC

This article originally appeared in the September/October issue of the Columbia Journalism Review.

In 2007, investigative journalist Ken Silverstein went undercover to test Washington lobbyists’ taste for sleaze. Using an alias, Silverstein created a fictitious energy firm that ostensibly did business in Turkmenistan and approached professional lobbyists to see if they could help cleanse the regime’s neo-Stalinist reputation. The bill for services rendered—newspaper op-eds bylined by established think-tankers and academics, visits to Turkmenistan by congressional delegations, and other exercises in public relations—would have been about $1.5 million. (Disclosure: I consider Silverstein a friend.)

But when Silverstein’s piece, “Their Men in Washington: Undercover With DC’s Lobbyists for Hire,” was published in the July issue of Harper’s, the resulting uproar had less to do with craven lobbyists than with journalistic impropriety. Various critics assailed Silverstein for his charade: Washington Post media reporter Howard Kurtz, an ethics expert at the Poynter Institute, a CBS News blogger, an American Journalism Review writer, and other notables. Journalists shall not lie, the critics mainly agreed. Doing so diminishes their credibility and that of the entire profession.

But Silverstein’s subterfuge was no outlier, as Brooke Kroeger demonstrates in her comprehensive history and exercise in soul-searching, Undercover Reporting: The Truth About Deception. For more than 150 years, American journalists have been playing make-believe to get themselves thrown into jails and loony bins; conniving their way into punishing factory jobs; and posing as high school students, Ku Klux Klan members, and even pregnant women in search of abortionists.

Journalists have even fashioned Mission: Impossible scenarios to snare wrongdoers, as the Chicago Sun-Times did in 1978, when it acquired a downtown bar, named it The Mirage, and staffed it with reporters. The paper documented, in a 25-part series, payoffs to city health inspectors, shakedowns by state liquor inspectors, tax fraud, kickbacks, and other crimes. The series was regarded as both a sensation and an abomination—although a Pulitzer Prize jury tapped it for an award in the Local Investigative Specialized Reporting category, the Pulitzer board overturned the jury’s selection because it disapproved of the Sun-Times’s methods.

Kroeger approaches the genre as a fan and champion. Her goal, largely accomplished with this book, is to polish undercover’s tarnished image and restore it to the place of respect (or semi-respect) it once enjoyed. Kroeger aims to establish undercover reporting as a common technique, not just the work of a few rogue reporters—and to convince journalists that it ought to be used more often.

Her restoration project does not suffer for raw material. A reporter for Horace Greeley’s New York Tribune used a variety of cloaking strategies while covering the American South just before the Civil War, including lying to sources about where he was from and changing “names, places, and dates” in his dispatches to avoid detection. In 1887, New York World reporter Nellie Bly became famous for impersonating a lunatic to gain admittance to a madhouse so she could report on its awful conditions. Between the 1960s and the 1990s, Kroeger writes, a variety of mainstream newspapers exposed housing discrimination by having black and white reporters pose as home buyers or prospective tenants. Gloria Steinem scored a cultural exposé in the early 1960s when she used her grandmother’s name and Social Security number to get a job as a Playboy bunny and wrote about it for Show magazine. In 1992, ABC News exposed substandard meat-handling practices at Food Lion, but told a raft of lies to get its reporters inside as employees.

Kroeger’s point with all this historical research is to put undercover reporting on a continuum, from no-lie investigations on one end to Mirage-style theatrics on the other. But she gives what I consider to be an over-expansive definition of undercover reporting. Almost any project that has relied on any combination of deception and subterfuge to expose a story of public importance seems to qualify. For instance, her lead example is the 2007 Washington Post series by Anne Hull and Dana Priest about the deplorable treatment of patients at Walter Reed Army Medical Center. As Kroeger acknowledges, Hull and Priest never assumed false identities in their reporting. They told no lies and donned no disguises. Yes, they shunned personnel who might ask them nosy questions, and they didn’t ask officials for permission to report at the hospital. But if such evasive maneuvers equal “undercover reporting,” I would hypothesize that 75 percent of all working reporters have at one time or another in their careers gone “undercover,” too.

Kroeger juxtaposes the Hull and Priest investigation (and “no-lie” investigations like it) with the work of aggressive journalistic liars because she finds unity in the two techniques, and wants to explore “whether there is really a difference for a journalist between not ever telling a lie—emphasis on the word telling, because lies, to qualify as lies, are verbalized—and the deliberate projection of a false impression with the clear intention to mislead, to deceive.”

But placing the Walter Reed investigation inside the same journalistic genus as the Mirage series constitutes a grievous taxonomical error. The “deliberate projection of a false impression” is something reporters do almost daily. When an official inadvertently spills the beans during an interview, the smart reporter suppresses his excitement and caps his pen in hopes that the official will dig himself in deeper. Such deliberate projections, by the way, must rank among the most common human activities. Parents scowl at their children to make them behave when they’re really not angry; buyers feign nonchalance to convince sellers to lower their prices; and so on.

In my mind, there is a world of difference between the two undercover genres. Compare, for example, the famous workplace investigations conducted by Tony Horwitz, Charlie LeDuff, and Barbara Ehrenreich, which Kroeger neatly summarizes, with the works of Silverstein, the Chicago Sun-Times, and James O’Keefe and Hannah Giles, the BigGovernment.com contributors who pretended to be a pimp and a prostitute in order to embarrass the advocacy group ACORN.

When Horwitz and LeDuff went to work at slaughterhouses for The Wall Street Journal and The New York Times, respectively, and Ehrenreich took menial jobs for her book about the working poor (Nickel and Dimed), the lies they told were de minimis. Ehrenreich, for example, omitted from job applications parts of her distinguished academic record. They created no new scenarios by their actions—they merely slipstreamed themselves into a story already in progress. Once there, they did little or nothing to contaminate the reportorial soil.

Silverstein, the Sun-Times, and O’Keefe, meanwhile, didn’t just contaminate the soil, they created it by telling their fictions. They then invited people from K Street, the Chicago bureaucracy, and ACORN to join the casts of their improvisation. The difference is between writing about a world that already exists, and conjuring one to embarrass the potentially guilty, a distinction Kroeger seems not to want to accept.

I would be misrepresenting Kroeger if I implied that she defends every lying reporter ever to carry out a convoluted sting. In her preface she writes that “at its best, undercover reporting achieves most of the things great journalism means to achieve. At its worst, but no worse than bad journalism in any form, it is not only an embarrassment but can be downright destructive.” I wish she were more judgmental about some of the techniques some reporters use, but Undercover Reporting is not that kind of book. If you are the type who seeks a thumbs-up or thumbs-down on every controversy, Kroeger is not writing for you.

You can sense her disapproval when she compares James O’Keefe’s ACORN sting with Ken Silverstein’s, but she doesn’t come out and say what I think she’s thinking: that O’Keefe is a crackpot and Silverstein is a genius. Instead, she writes, “What is most important in these cases is the exercise of sound journalistic judgment: to establish first if the deception was important enough to perpetrate, and after that, if accepted journalism standards have been fully adhered to and met, and if that can be reliably verified.”

If you’ve ever reported a story, you automatically understand the appeal of telling lies to get to the truth—they can be a wonderful shortcut! Conventional shoe-leather reporting requires time, sources, and energy, and must produce genuine findings in order to get published. But lie-based journalism works like EPO on both journalists and readers—it permits journalists to write bigger and faster (as if they’re writing a review of their own improv drama!), and the cat-and-mouse quality of the deception gives readers an extra, entertaining thrill. Not that giving readers a thrill is a bad thing, but I draw the line at turning news stories into episodes of Punk’d. Kroeger appreciates this, writing that even the well-intended sting can backfire and “veer off into the ridiculous or purely sensationalistic.”

Undercover Reporting intends to provoke its readers, and it did me. If an editor thinks he should launch an investigation with overt lies because overt lies pave the quickest path to the hard-to-get truths, why should he stop there? Why not have his reporters pepper whole, hard-to-get stories with plausible lies and half-truths as long as they propel a story to the ultimate truth? Why not lie to readers, too? We journalists don’t trust sources who lie. Then why should we trust reporters who do the same? When I was an editor, I occasionally had trouble keeping my less-scrupulous writers on the straight and true. When I imagine giving them—or even my most conscientious reporter—a license to make things up in order to get a story, my mind derails. Kroeger’s thoughtful openness to telling direct lies has turned me full-force against the technique.

The strange allure of disaster porn

Jack Shafer
Oct 30, 2012 21:52 UTC

Like me, you’ve probably been flipping from the Weather Channel to CNN with one hand and raking the Web with the other, searching for scenes of maximum destruction from Hurricane Sandy. Long after satisfying your basic news needs about the horrific body counts, power outages, travel advisories, school closings, and surges of tidal and river water to come, you’ve likely been loitering around your screens for more. Somebody tweets about a live video feed of a construction crane gone limp in midtown Manhattan, and we go there. Emails from friends direct us to videos of vehicles floating through lower Manhattan like derelict bumper cars and the shattering of the Atlantic City boardwalk into toothpicks. Next up, toppled trees, washed-out rails, flooded streets, subways, and tunnels, and the sinking of HMS Bounty.

Oh, the horror! Pass the popcorn.

Advanced voyeurs (you know who you are) understand that shame, rather than being a deterrent, actually works to reinforce both the urge to look and the urge to share what we’ve seen. I’d have continued watching TV and scanning the Web until the early a.m., messaging my friends and family about what I’d seen, had not the pop and flash of a nearby transformer killed my electric power at 9:30 p.m. on Monday.

What impels us to watch, to hunger for more disaster and mayhem, and to keep on watching long after we’ve learned all there is to know? Wake Forest University English Professor Eric G. Wilson gathers some clues in his new book, Everyone Loves a Good Train Wreck: Why We Can’t Look Away. We never feel more alive than in times of distress, danger, and calamity, Wilson writes, whether we experience it directly or at a televised remove, watch it dramatized in a movie, or read it in a novel. He cites a psychologist to theorize that our morbid curiosity has an evolutionary function: Being well-informed about dangers and potential dangers helps us survive; finding points of empathy through which we can connect with those who have suffered allows us to build lasting bonds. Wilson discusses the cultural appeal of fairy tales, horror films, and “documentaries” like Faces of Death; he recycles the now-standard view that gruesome and graphic stories prepare the young for adulthood; and he reminds us of how Aristotle schooled us in the value of catharsis to explain our fascinations with the perverse.

Or is our connection with the macabre more about animal arousal than it is evolution? Wilson, backed by Kant and Burke, surmises that as long as we can watch from a safe vantage point—but the closer the better—we can “undergo a sublime experience” while observing the suffering of others or a catastrophe. I suspect that the sublime experience is a learned one—that the first time you rubberneck a car crash you don’t quite understand it but over time, by poking dead cats flattened on the highway and going to your grandmother’s open-casket funeral, you eventually get it. From there—at least for boys—emerge new horizons, the delights of setting off firecrackers taped to robin’s eggs and of breaking schoolroom windows after hours. As P.J. O’Rourke once put it, “making things and breaking things” brings the only true joy in life. When nature builds something as powerful as a hurricane that breaks things in new and inventive ways, how can we not gawk? We’re all transported back to the sandbox where we, young creator-destroyers, obliterated the cities of sand we’d carefully constructed.

Proximity to the action is essential for us to experience the sublime, Wilson argues, and I agree. Natural disasters in Asia made for dry reading back in the day when news was transmitted by telex and newsreel. But 24-hour-news satellites, cheap video cameras, and the Internet have made all disasters local, whether they be tsunamis in Thailand and Japan, earthquakes in New Zealand, or terrorist attacks in New York and London. Television and the Web place us in the comfortable zone between too-far-away-to-feel-the-rush and I’m-so-damned-close-I-got-splattered-with-blood. As I noted above, the media buzz I got last night from the Hurricane Sandy coverage could have kept me up for hours beyond my usual bedtime. Had my electric power been restored by morning, I don’t have to tell you what my first act would have been upon awakening.

Our appetite for destruction does know limits. One might be wise to decline the offer of viewing a “beheading” video, Wilson writes, unless the viewer can erect a sufficient psychological “buffer” between himself and the images or has another way to decipher the rawness of the havoc into something that has meaning. “[W]ithout this buffer, we risk the transformation of morbid curiosity into trauma.” (Wilson says he’s never watched one.) One way to tame indecipherable images of death is to experience them as a group. I doubt if many witnesses to public hangings, even first-timers, ever had trouble sleeping the next night—such are the comforts of being a part of a mob. Another way to suppress the direct power of the images is to add the element of a story to the action, something that nobody seems to have tried to do with beheadings. If we can tie “a horrific eruption to a coherent narrative, then [we] can understand the terror as part of a larger and purposeful structure,” he writes.

That’s one of the reasons we couldn’t stop watching the Trade Center towers burn and fall after 9/11. Each narration of the events, each new view, helped us integrate the disaster into something more containable than the first viewing, for who among us had ever seen a skyscraper filled with people collapse? There have been attempts to ease the trauma of 9/11 with memorials in Manhattan, at the Pentagon, and at a crash site in rural Pennsylvania, but I don’t know that they’re working. Maybe someday they’ll be like the Khmer Rouge’s killing fields, the site of the Chernobyl disaster, and the “preserved” devastation of Katrina of which Wilson writes, places where people bear witness, bury demons, and, yes, do a little shameful rubbernecking.

If you’ve seen more of the Hurricane Sandy disaster than you really need, you should be ashamed. But not too ashamed. The only thing worse than looking too much is not looking at all.

******

Tame me with email to Shafer.Reuters@gmail.com or witness the horror of my Twitter feed.

PHOTO: A large uprooted tree lies on a house following a night of high winds and rain from Hurricane Sandy in Bethesda, Maryland October 30, 2012. REUTERS/Gary Cameron

Mergers alone won’t save book industry

Jack Shafer
Oct 26, 2012 21:55 UTC

News of merger talks between book publishers Random House and Penguin has shaken loose alarmist responses from the book industry: howls from agents and authors that they’ll have fewer publishers to pitch to, and hence their incomes will fall; warnings that editors and marketers face huge layoffs; fears that reducing the number of big publishers from six to five will bestow upon the survivors unprecedented cultural hegemony.

Somewhere somebody must be describing the impending merger and the increased concentration of book power in fewer New York hands as an assault on democracy.

If the admonitions seem familiar, it’s because they’ve been sounded for a half century. The book industry has been consolidating steadily since the early 1960s, when independent publishers–many of them run by families–swarmed. A July 31, 1960, New York Times article (subscription required) chronicled that era’s merger-mania, as independent publishers Holt, Rinehart, and Winston had hooked up to create a new company—Holt, Rinehart, and Winston—that sounded like a law firm. In other transactions, Random House had acquired Knopf and Crowell-Collier had taken Macmillan, presaging the coming days when conglomerates would eventually swallow the industry’s major players.

“The mergers have, in some cases, meant consolidation of clerical and shipping staffs,” the Times article reported.

Yet the number of big publishers has remained fixed at six since the mid-1980s, suggesting that, for all the shouting, the consolidation of the industry has been exaggerated.

Some publishing executives are on record advocating greater consolidation. In 2009, for example, Arnaud Nourry, head of Hachette (then the world’s No. 2 publisher by sales), called consolidation the best way to compete with Barnes & Noble, Amazon.com, Apple, and Google, all of which have been vying to become the digital tails that wag the publishing dog. (See this recent Bloomberg Businessweek story about Amazon’s hardball tactics against publishers.) Nourry and other publishers regard bulking up with mergers and acquisitions as a business offensive. But from my vantage point, it looks like a defensive crouch that says We’ve got the books, Amazon, you got bupkis, and we’re going to set the terms for the digitization of the book industry, not you!

Like other industries, the big publishers feel the pull of the “consolidation curve,” the term of art devised in this 2002 Harvard Business Review article to describe the trend toward greater and greater consolidation of mature companies. If individual industries don’t consolidate, it’s because they’ve evolved into something new (from buggy-maker to automaker) or they’ve just disappeared.

Nobody thinks book publishing will disappear, especially given the ubiquity and ease of e-books. After all, even in a world of cheap self-publishing, somebody has to find and market books to the masses. Recognizing this business reality (and good for them) is Penguin, which recently purchased the self-publishing company Author Solutions for $116 million. But the long-term prognosis for books is still not super. A New Yorker writer was probably right to fret in 2007 that pleasure reading may “one day be the province of a special ‘reading class,’ much as it was before the arrival of mass literacy, in the second half of the nineteenth century.” Indeed, the industry has fallen into a stagnant funk. A year ago, the Association of American Publishers claimed as positive news that 4.1 percent more books were sold in 2010 than in 2008. 4.1 percent! That kind of growth kills.

According to the Harvard Business Review wizards, a company that possesses a terrific first-mover advantage, a technological edge, or a patent portfolio strong enough to protect it from the competition can build scale and dominate its sector. Penguin was that sort of revolutionary company when publisher Allen Lane founded it in the mid-1930s. Lane intuited correctly that there was a market for quality literature printed inexpensively in paperback form. But his success inspired so much instant competition that his first-mover advantage was fleeting, and he couldn’t patent the paperback.

The Penguin-Random House merger would theoretically give the new company more leverage in the pricing fights with Amazon et al. But as important as that struggle for control might be, it still leaves Penguin-Random House operating in a moribund and hidebound enterprise that looks and acts like something out of the 18th century. Book publishers are playing against a stacked deck. They don’t own the distribution channels, they don’t own the stores, they don’t control any proprietary technologies or patents, they’re terrible at inventing new products, and the market value of their brands is dwindling. Plus, their most valuable properties, their writers, are free agents who don’t really belong to them.

This merger—and other book industry consolidation to come—is less about winning than it is about losing more slowly.

******

“Random House & Penguin Publishers are negotiating a merger … please let the new company be called Random Penguin,” tweeted Shara Morris today. I can’t beat that! Send email to Shafer.Reuters@gmail.com. See also, the right hemisphere of my personality on Twitter.

PHOTO: Leona, 7, poses inside a labyrinth installation made up of 250,000 books titled “aMAZEme” by Marcos Saboya and Gualter Pupo at the Royal Festival Hall in central London July 31, 2012. REUTERS/Olivia Harris

The New York Times, the BBC and the Savile sex scandal

Jack Shafer
Oct 25, 2012 23:02 UTC

Before he has even had time to measure his office windows for draperies, incoming New York Times Co. CEO Mark Thompson is in the media crosshairs. No less a figure than the Times’s public editor, Margaret Sullivan, implored the paper this week to investigate what role, if any, Thompson had in a burgeoning scandal at the BBC, which he headed for eight years until late this summer.

The BBC scandal is so long-running, so multifaceted and so sordid that it could potentially injure everyone who has worked at the organization over the past 40 years—all the way up to Thompson, and down to the janitors who clean the BBC’s studio dressing rooms—even if they’re guilty of nothing.

The scandal’s center is Jimmy Savile, the longtime host of a variety of BBC radio and TV programs for kids and young people (including Top of the Pops), a celebrity fundraiser and friend to politicians and royalty. Late last year, shortly after Savile died, the BBC’s Newsnight program readied an investigative piece about Savile’s alleged sexual abuse of young girls. But just as the findings were about to be broadcast, Newsnight’s top editor gave it the spike.

A BBC competitor, ITV, picked up where the BBC left off, and at the beginning of this month broadcast its 50-minute exposé titled “The Other Side of Jimmy Savile.” Long rumored to have had a thing for young girls, Savile allegedly used his professional perch as a BBC TV host to ingratiate himself with and then sexually abuse young girls (some of them underage) in hospitals, in BBC dressing rooms, in his Rolls Royce and elsewhere. The ITV program got five accusers and three witnesses to give their accounts of sexual abuse, some of which date to 1968.

Although the spiking of the Newsnight segment had been rumored as early as February, the ITV program upended the BBC, making it look craven and self-protecting for not running its Savile investigation. Since the ITV program aired, Scotland Yard and other police forces have opened criminal investigations of the allegations; the BBC has commenced its own internal investigation; members of Parliament are threatening to lay siege to BBC Television Centre; newspapers are ripping the BBC for allegedly abetting Savile; and the editor who killed the BBC investigation, Peter Rippon, has been forced out.

Even the BBC is beating on the BBC for its shortcomings. On Monday its Panorama program broadcast a 60-minute documentary titled “Jimmy Savile: What the BBC Knew,” exploring Savile’s alleged sex crimes and the BBC’s abandonment of the original Newsnight segment about the story. Panorama‘s inconclusive findings suggest that pressure may have come from above—from the BBC’s editorial brass and maybe even management. Not since 2003, when the New York Times investigated its own institutional breakdown in the Jayson Blair scandal, has a major news organization performed such a painful and public act of self-analysis. The Telegraph‘s Neil Midgley got it right in his review, writing that the program’s existence is a “testament to the tenacity of the BBC’s journalists.”

(Confused? Here’s a who’s who of the scandal produced by the BBC. And here’s a time line of events from the Independent.)

Which brings us back to Mark Thompson, who left the top job of BBC director general in September. Although no one—not ITV or Panorama or the newspapers—has located a smoking gun, suspicions abound that top news executives at the BBC, and perhaps business-side executives such as Thompson, may have nudged Newsnight’s editor into dropping the Savile investigation. Based on Panorama’s findings, another BBC executive, George Entwistle, who replaced Thompson after leading the office that oversees BBC production and scheduling, may have played a part in forging the spike. But that’s circumstantial, based on a 10-second conversation he had with the BBC’s head of news. (For what it’s worth, Entwistle* has apologized to Savile’s alleged victims.)

Circumstantial or not, BBC bosses interested in self-preservation would have had several excellent corporate reasons to kill the Savile program because it would bring extended shame on the BBC. For one thing, the exposé arrived at an inconvenient moment: Savile had just died and the BBC had scheduled holiday tributes to the immensely popular broadcaster. Convicting him of statutory rape one night and revering his memory on another would have looked foolish. Also, no matter what the timing of the exposé, the Savile revelations are so pervasive—with many BBC employees knowing about or having heard about Savile’s predatory ways over the decades—that the BBC must have turned an institutional blind eye to his alleged crimes. According to a BBC report today, the number of Savile’s alleged victims has reached 330, including one of the entertainer’s great-nieces, who says he put his hands in her underwear when she was 12.

The chairman of the BBC Trust, Lord Patten, extended no cover to executives today when he said he would be “not surprised” if the Savile scandal and its fallout resulted in resignations.

For his part, Thompson vehemently denies having done anything wrong. A few hours after Public Editor Sullivan demanded that the paper investigate Thompson, the Times published a story in which its reporters questioned him. “I did not impede or stop the Newsnight investigation, nor have I done anything else that could be construed as untoward or unreasonable,” he told the Times. Times Co. Chairman Arthur Sulzberger Jr. continues to back Thompson, who starts his CEO job on Nov. 12.

It’s reasonable to assume that Thompson was so removed from BBC programming decisions that he bears no culpability for the killing of the Newsnight exposé. It may not seem fair, but Thompson won’t get off the hook by maintaining that he’s done nothing untoward or unreasonable in the immediate Savile episode, even if true. Shouldn’t he and his fellow executives have known about the Savile allegations before Newsnight started digging into them, some will ask. Shouldn’t he have known what Newsnight was up to and supported its investigation, others will chime in.

With the big investigative machine whirring, and New York Times investigative chief Matt Purdy now “in London covering BBC story,” according to Public Editor Sullivan’s tweet, Thompson is in for the scrutiny of his life. If the Times and other outlets don’t find Thompson’s fingerprints on the Savile spiking, they’ve got his entire eight-year tenure as BBC director general to examine. Given the nature of the allegations and the way scandal investigations billow, the press may ultimately make Thompson pay for what he didn’t do—not for what he did do.

*CORRECTION: This piece originally attributed the comment “tsunami of filth” to George Entwistle. It was Lord Patten who called the scandal a tsunami of filth.  The erroneous quotation has been deleted.

******

I wouldn’t want to be Thompson right now. I’d rather be Shafer.Reuters@gmail.com. See also, the right hemisphere of my personality on Twitter.

PHOTO: The late Jimmy Savile at the unveiling of a monument, commemorating the fighter pilots who fought in the Battle of Britain, Sept. 18, 2005. REUTERS/Paul Hackett

 

Why we vote for liars

Jack Shafer
Oct 9, 2012 20:43 UTC

The great fact-checking crusade of 2012 by FactCheck.org, PolitiFact, The Fact Checker, CNN Fact Check, AP Fact Check, etc. has told us something very important about the workings of democracy that we already knew: Candidates bend the truth, distort the facts, fudge the numbers, deceive, delude, hoodwink, equivocate, misrepresent, and, yes, lie, as a matter of course.

Both major-party presidential candidates and their campaigns routinely lie, as a Time magazine cover story recently documented, although the publication gave Mitt Romney’s campaign top honors for lying more frequently and more brazenly. Time is not alone in its assessment: Romney also leads Barack Obama in the Washington Post‘s Fact Checker “Pinocchio” sweepstakes. But the lies will continue until Nov. 6, after which the chief mission left to the checkers will be to determine whether the winner was a bigger liar than the loser.

The candidates lie about each other, they lie about themselves, they lie about issues they know intimately, and they lie about issues they barely understand. Of Romney, the Washington Post‘s Dana Milbank writes today that the candidate has changed, reversed and obliterated his views so many times that “Whatever Romney’s positions were, they are no longer.”

If either presidential candidate met you, he’d tell you a lie within 15 seconds of shaking your hand, and if he knew he were going to meet your mother, he’d invent a special set of lies for her. Politicians lie not because they’re wicked – though some are – but because they’ve learned that political markets rarely reward honest campaigners. Say what you will about Ralph Nader and H. Ross Perot, but they ran relatively honest campaigns on the issues, and the voters rejected them. The political market spoke many years ago and continues to speak: Telling the truth is not great for campaigns – and if it were, more people would be doing it.

The one presidential candidate in recent memory to win the White House posing as a truth teller was Jimmy Carter, who famously promised early in his campaign: “I’ll never tell a lie” and “I’ll never knowingly make a misstatement of fact” as president. These promises drew instant fire from the press, most notably Steven Brill, who flayed him in a March 1976 Harper’s piece titled “Jimmy Carter’s Pathetic Lies” (subscription required). Carter, who told no fewer lies than the average candidate, paid a political price for his promise, as everyone turned up their radar. “By saying that he would never tell a lie, Carter decided for himself that that’s going to be his standard,” said Alan Baron, George McGovern’s press secretary. “Well, fine, let’s hold him to it.” As soon as they could, voters replaced the non-lying liar with Ronald Reagan, a man so smooth even he didn’t know when he was lying.

Some of the lies the candidates tell are innocuous and are not held against them, as Kathleen Hall Jamieson and Paul Waldman write in their 2003 book, The Press Effect: Politicians, Journalists, and the Stories that Shape the Political World. For example, “It’s great to be in Kansas City” is a completely acceptable lie, as is the platitude, “Nothing is more important to me than the future of our children,” Jamieson and Waldman write. Nor do voters care much if candidates claim to have “led the fight” for a piece of legislation if all they did was vote for it or sign it. Moving up the ladder of lying, candidates rarely are forced to pay a political price when they butcher the truth, even in presidential debates. “You can say anything you want during a debate and 80 million people hear it,” said Vice President George H.W. Bush’s press secretary Peter Teeley in 1984, adding a “so what?” to the fact that reporters might document a candidate’s debate lies. “Maybe 200 people read it or 2,000 or 20,000.”

Campaigns can survive the most blatant political lies, but candidates must be careful not to lie about themselves – or even appear to lie about themselves, as Jamieson and Waldman demonstrate in a long chapter about Al Gore’s image problems. Gore never claimed to have invented the Internet or to have discovered Love Canal. He did, however, falsely claim during the 1988 presidential contest to have gotten “a bunch of people indicted and sent to jail” while working as a reporter. Voters demand authenticity in their presidential candidates, even if the authenticity is fake, as was George W. Bush’s just-folks manner. To lie about an issue is to be a politician. To lie about a corporation is to be a public relations executive. To lie about a legal matter is to be a lawyer. To lie about international power relations is to be a diplomat. But to lie about who you are is to be a hypocrite, and voters despise hypocrites.

The telling of durable, convincing lies signals to voters that a candidate possesses the political skills to run the Executive Branch. “In American politics today, the ability to lie convincingly has come to be considered an almost prima facie qualification for holding high office,” Eric Alterman writes in When Presidents Lie: A History of Official Deception and Its Consequences. More in sadness than in anger, Alterman beats up on Franklin Roosevelt, Harry Truman, John Kennedy, Lyndon Johnson and Ronald Reagan for their presidential lies. So much of governance is about deception, bluff, and double-dealing.

Voters especially don’t mind if their presidential candidate tells a lie that appears to repudiate the party’s most sacred principles. For instance, in the first of the 2012 presidential debates, Mitt Romney claimed to be for economic regulation. “Regulation is essential. You can’t have a free market work if you don’t have regulation,” said Romney. Few Romney supporters flinched at their man’s endorsement of government intervention into business, because they knew he knew his lie was designed to make himself look palatable to easily duped Democrats and independents. If they’ve hung with him this long, Romney supporters know that his presidential campaign has been one long lie – first to convince the Republican Party that he was an honest conservative and now to convince voters in the general election that he’s a devoted moderate.

The pervasiveness of campaign lies tells us something we’d rather not acknowledge, at least not publicly: On many issues, voters prefer lies to the truth. That’s because the truth about the economy, the future of Social Security and Medicare, immigration, the war in Afghanistan, taxes, the budget, the deficit, and the national debt is too dismal to contemplate. As long as voters cast their votes for candidates who make them feel better, candidates will continue to lie. And to win.

******

I’m an honest man only because my memory isn’t good enough to remember all of my lies. Send your lies to Shafer.Reuters@gmail.com and fact-check my Twitter feed at your own peril.

PHOTO: Pencils in the colors of the Italian flag with the head of Pinocchio are displayed for sale in Rome, July 23, 2010.  REUTERS/Alessandro Bianchi

The 0.3 percent hysteria

Jack Shafer
Oct 5, 2012 23:17 UTC

When was the last time the inhabitants of wonkville got so hot over a federal statistic dropping three-tenths of a percentage point?

This morning – after the Bureau of Labor Statistics released its monthly jobs report stating that the unemployment rate had fallen from 8.1 percent in August to 7.8 percent in September – everybody started shouting about the numbers. President Barack Obama used them as evidence of economic progress, challenger Mitt Romney swatted them aside and scoffed that this “is not what a real recovery looks like,” and Jack Welch, former CEO of General Electric (and current Reuters Opinion contributor), tweeted that Obama’s “Chicago guys” had fudged the encouraging numbers to make up for the poor performance of their boss in the Oct. 3 debate. This prompted the proprietors at @PuckBuddys to tweet, “Truthers, Birthers and now Welchers.”

Ezra Klein, the mayor of wonkville, rushed to defend the integrity of the numbers in his Washington Post blog, pointing to a Mar. 9, 2012, Post story about the secret-agent measures taken by the BLS statisticians to prevent tampering with the data or the results. Computers: encrypted and locked. Office windows: papered over. Confidentiality agreements: signed each morning. Emails and phone calls from unknowns: unanswered during the eight days of lockdown preceding the job report release. Visitors: none permitted without security clearance. Trash cans: not emptied by custodians during the period.

Helping Klein repel the doubters were Secretary of Labor Hilda Solis – “I’m insulted when I hear that, because we have a very professional civil service,” she said on CNBC – and Keith Hall, former Bureau of Labor Statistics chief under President George W. Bush, whose position was summarized in a Wall Street Journal blog item titled “Impossible to Manipulate Labor Survey Data – Former BLS Head.” Welch’s most prominent allies were Tea Party inspiration Rick Santelli, who implied on CNBC that the numbers were rigged, and Monica Crowley, who sarcastically tweeted: “the rate miraculously drops to 7.8%. Ahem.”

But as Megan McArdle pointed out today at the Daily Beast, even if you believed that the Bureau of Labor Statistics was capable of such a number-inventing conspiracy, the swing in the employment numbers is too slight to be worth building a conspiracy around. They neither vindicate Obama’s economic policies nor refute them. It would be like breaking into a bank and stealing just the pennies. On the other hand, just because Obama hasn’t played games with BLS numbers doesn’t mean it’s impossible for him or another president or politician to manipulate data to political advantage. Back in 2004, the New Yorker‘s James Surowiecki, no member of the tin-foil-hat crowd, accused President George W. Bush of futzing with hallowed government numbers. He wrote:

Statistical expediency and fiscal obfuscation have become hallmarks of this White House. In the past three years, the Bush Administration has had the Bureau of Labor Statistics stop reporting mass layoffs. It shortened the traditional span of budget projections from ten years to five, which allowed it to hide the long-term costs of its tax cuts. It commissioned a report on the aging of the baby boomers, then quashed it because it projected deficits as far as the eye could see. The Administration declined to offer cost estimates or to budget money for the wars in Afghanistan and Iraq. A recent report from the White House’s Council of Economic Advisers included an unaccountably optimistic job-growth forecast, evidently guided by the Administration’s desire to claim that it will have created jobs. And a few weeks ago the Treasury Department put civil servants to work—at Tom DeLay’s request—evaluating a tax proposal identical to John Kerry’s, then issued a press release saying that the proposal would raise taxes on “hardworking individuals.”

Such statistical shenanigans may fall just shy of the cooked-books charges being flung today, and Surowiecki doesn’t even claim that they are common, maintaining that White Houses have traditionally kept their thumbs off the scale, and “good economics has trumped politics.” (Disclosure: Surowiecki is a friend whom I edited a couple of times at Slate.) But Bush exceptionalism – if his behavior was genuinely exceptional – should make skeptics of every consumer of government data.

It’s worth noting that the Bureau of Labor Statistics owes its origin to “two decades of advocacy by labor organizations that wanted government help in publicizing and improving the growing industrial labor force.” That’s not Jack Welch howling. It’s from the opening page of The First Hundred Years of the Bureau of Labor Statistics by historians Joseph P. Goldberg and William T. Moye and published in 1985 by the BLS. (Here’s the PDF. It’s big.)

The first state to establish a bureau of labor statistics was Massachusetts in 1869. Organized labor agitated in the states and at the federal level for similar agencies. The labor movement’s leaders believed that the collection of federal statistics would provide them with a path to political power, and said so. Testifying before Congress in 1883, labor leader Samuel Gompers spelled out that sentiment. The bureau, he said, would educate members of Congress about “the condition of our industries, our production, and our consumption, and what could be done by law to improve both [sic].”

Thanks to labor’s prolonged politicking, the bureau was established in 1884 after overcoming opposition from Southern legislators. The statutory mission of the BLS was to “collect information upon the subject of labor, its relation to capital, the hours of labor and the earning of laboring men and women, and the means of promoting their material, social, intellectual and moral prosperity.” If you can’t smell the politics in that passage, you need sinus surgery.

What the bureau’s early proponents made overt, its current supporters make covert. Governments can pretend that they are like research institutions or universities, neutrally scouring the universe for valuable data that will lead to knowledge and enlightenment. But the data sweeps and number crunchings commissioned by government are almost always political in nature, intended to justify some government action or inaction. That goes for the numbers produced by the Bureau of Labor Statistics, the Pentagon, the Centers for Disease Control and Prevention, the Department of Transportation, and all the other federal, state and local appendages.

Rare is the government data set that results in a diminution of government power or does not start a political fight, as today’s job numbers did. If you didn’t get your fill of contention today, check back for the rematch on Friday, Nov. 2, four days before the election, when the Bureau of Labor Statistics’s next monthly job numbers come out.

******

Wouldn’t it be great if the plural of anecdote was data? Send anecdotes and data to Shafer.Reuters@gmail.com. I would so love Jack Welch to follow my Twitter feed.

PHOTO: A yardstick measures the depth of rising water in Butte LaRose, Louisiana, May 19, 2011.  REUTERS/Lee Celano

Why we can’t stop watching the stupid presidential debates

Jack Shafer
Sep 28, 2012 22:30 UTC

The 2012 Presidential (and Vice Presidential) Debates, a four-part miniseries, will debut on televisions and computer screens around the world on Oct. 3 and continue weekly through the month. The program will feature presidential candidates Barack Obama and Mitt Romney in three episodes, and their understudies, Joe Biden and Paul Ryan, in one.

I can’t promise excitement or even enlightenment: As viewers of The 2008 Presidential (and Vice Presidential) Debates and its antecedents will recall, the events resemble 90-minute quiz shows in which there are no correct answers, just strong opinions. We come to the debates expecting dramatic oratory and political persuasion, but don’t even get a spritz of hot air. That’s because the debates are primarily designed to unite, not divide.

Highly formatted to begin with, this year’s debates will be even more highly formatted, as Elizabeth Flock reported last week in U.S. News & World Report. The Commission on Presidential Debates – the cutout the two major parties have been using to run the debates since 1988 – has for the first time issued cheat sheets to the candidates listing what topics will be up for debate in their first meeting: the economy, healthcare, the role of government and governing. This will make the study and rehearsal sessions, in which the candidates spend hours practicing their debate sound bites, a lot easier.

As usual, the commission’s debate rules are limiting enough to be called stringent. Open Debates, a non-profit advocacy group that wants the debates released from the clutches of the Democrats and Republicans, complains of how “dreary” the events have become, comparing them at their worst to a joint press conference. In the first debate, each topic segment will run 15 minutes (there will be three “economy” segments). Moderator Jim Lehrer will begin each topic set with a question that the candidates get two minutes to answer, and at evening’s end both contestants will get two minutes for a closing statement.

Veteran debate moderator Gwen Ifill notes in the Washington Post that the debates don’t have much of an effect on the presidential election. “Gallup polls going back decades show precious little shift in established voter trends before and after debates,” she writes. Nor does anyone say much of enduring consequence, as Time magazine inadvertently showed with a recent video slideshow of “Top 10 Memorable Debate Moments.” None of the moments cited – Ford’s gaffe, Quayle’s Kennedy pandering, Gore’s body language, etc. – really changed anything.

Yet the debates still play a vital role in what anthropologist James R. McLeod liked to call the “ritual sociodrama” of the presidential campaign. During the primaries, candidates avoid acting presidential as they spew “powerful rhetoric of unity, disunity, order, anarchy, and chaos.” They don’t just throw mud – they irrigate and excavate whole new mud fields and construct new mud-delivery systems for the annihilation of their opponents in televised events that are called debates but more resemble rhetorical food fights. To pinch another of McLeod’s slick phrases, the high-sticking and peak emotion of the primaries render the nation “disarticulated politically.”

Then come the nominating conventions of the major parties, which exist nowadays mostly to rubber-stamp the primary rumbles. The conventions reduce the noise of the Republican and Democratic choruses to two soloists, which theoretically frees them to isolate their verbal firepower on each other. But by the time the debates arrive, the candidates generally refrain from howling at their opponent. Instead, they seek to appear more measured, more statesmanlike, more presidential. They take advantage of the conflict-averse debate rules to croon easy-listening music past one another and into the ears of the television audience. For them, the debates are twinned press conferences and twinned infomercials.

Whether by design or accident, the presidential debates commence the “reintegration” (a last hat-tip to McLeod) of the national political psyche. Rhetoric runs cooler as the parties creep toward the center. Although the television networks, newspapers and the Web obsess about what the two candidates say in the debates, the battle is largely visual, according to academics Mark Goodman, Mark Gring and Brian Anderson, co-authors of a 2007 paper about the visual style of “town hall” presidential debates. The debates are as much about what you don’t see as what you do. Goodman, Gring and Anderson write:

Naturally, campaign staffs and the growing ranks of for-hire media consultants have tried to maximize the candidate’s visual impression by: avoiding unconventional clothing and hairstyles; training the candidate to address camera lenses as well as the audience; featuring the candidate’s family in campaign appearances; and a host of other now-standard considerations. Understandably, the candidate’s visible actions in debates, that is, his or her non-verbal style, has become ever more vital in determining quality of performance in those important encounters.

None of these stylistic moves are designed to deepen the voters’ political understanding or convey anything substantive about the candidates’ characters and values, add the authors. Newscasts rely on similar visual tricks to hold and mold audiences, making TV anchors reluctant to critique the visual hocus-pocus of the debates. If Goodman, Gring and Anderson are right – and I think they are – the best, and in many ways the intended, way to watch the presidential debates is with the sound off.

The political reintegration continues on election night, as the network anchors deliberately soothe viewers and lend legitimacy to the vote-taking and -counting process. “Winners are larger than life heroes, losers are gallant and noble; democracy works in the United States as it does nowhere else in the world,” write Marc Howard Ross and Richard Joslyn about the election night telecast in a Winter 1988 paper in Polity. “Network commentators not only tell us who won and lost different contests, but also offer potent messages about political conflict, system legitimacy, regime norms, and citizen roles.”

The bunting-mad spectacle of the January inauguration seals the deal as solid as any Westminster Abbey coronation. At the risk of exhausting your anthropological patience, our election rituals cool even the hottest contests. By the end of the 1968, 1972, 1980 and 2004 election cycles, the nation seemed prepared to go to war with itself over the presidential election. But by Inauguration Day, the rituals have persuaded most – even the opposition – that the man taking the oath has a legitimate claim to the presidency. But disturb the ritual’s continuity – I’m thinking here of the Bush-Gore election interruptus in 2000 – and the magic vaporizes. A dozen years later Democrats are still howling about how Bush and the Supreme Court stole that election.

Part theatrical performance, part quiz show, part fencing match, part transition ceremony, the words and gestures of the 2012 debates will be picked over with tweezers by the TV commentariat as soon as the candidates’ microphones go dead. They’ll struggle to locate the momentum and import in the 90 minutes just passed, they’ll scrutinize the gaffes and they’ll even rate the moderator. Like all widely observed rituals and ceremonies – baptisms, weddings, funerals, the World Cup – the presidential debates open themselves to mockery. But the public craving for pageants and contests will not be stilled by our contempt and sarcasm. We can crack the debates’ code, but we can’t rewrite it.

******

All of my rituals – column writing, head scratching and animal sacrifice – begin with coffee drinking. How about you? Send your ritual rundown to Shafer.Reuters@gmail.com. My Twitter feed is a ritual bath for all who seek my wisdom. Sign up for email notifications of new Shafer columns (and other occasional announcements). Subscribe to this RSS feed for new Shafer columns.

PHOTO: U.S. Democratic presidential nominee Senator Barack Obama (D-Illinois) (L) and Republican presidential nominee Senator John McCain (R-Arizona) (R) interact during their presidential debate at Hofstra University in Hempstead, New York, October 15, 2008. REUTERS/Gary Hershorn

Banning quote approval sounds good, but can it work?

Jack Shafer
Sep 21, 2012 22:53 UTC

New York Times reporter Jeremy W. Peters rolled a stink bomb into the church of journalism in July with his Page One story revealing the widespread practice of “quote approval.” It turns out that reporters from many top news outlets covering the White House and the Obama and Romney campaigns – including the Times, Bloomberg, the Washington Post, Reuters, Vanity Fair, and others – regularly allow Obama and Romney staffers and strategists to dictate terms for interviews that permit them to rewrite or even spike things they’ve said.

Former CBS News anchor Dan Rather called quote approval “a jaw-dropping turn in journalism” and a “Faustian bargain,” warning that it could make reporters “an operative arm of the administration or campaign they are covering.” Edward Wasserman, incoming dean of the University of California at Berkeley journalism school, told NPR’s On the Media that it reduced an interview to “a press release.” Others compared the practice to “quote doctoring,” and editors at National Journal, the Associated Press, McClatchy Newspapers and the Washington Examiner promptly banned it from their pages.

Yesterday, after an influential column by David Carr, one of its own, and a prodding blog item by Margaret Sullivan, its new public editor, the Times issued its own prohibition against after-the-fact “quote approval.”

Erik Wemple spotted the very visible loophole in the Times policy shortly after it was promulgated and drove his Washington Post blog through it. All reporters need do, explained Wemple, is call White House sources to talk about an issue; wait for the sources to agree to a “background” interview; agree to attribute the quotations to a “White House official;” then ask the source for additional quotations on the record. As Wemple notes, this arrangement would not violate the new Times policy, which appears to ban quote approval only as a precondition for an interview.

Thus, quote approval is reborn!

As best as I can tell, quote approval thrives in the places where reporters vastly outnumber sources, creating a scarcity arrangement that sources can – and do – capitalize on. Scarce sources in such places as the White House, Capitol Hill, some federal agencies, Silicon Valley, Hollywood and the entertainment industry, and on Wall Street have the necessary leverage to extract concessions out of the reporters covering them. In recent years, with the rise of a zillion websites covering politics, business, entertainment and tech, reporters on these beats have become more plentiful, making sources ever more scarce.

Reporters who work on beats where sources outnumber them have the easiest time waving off ridiculous sourcing demands. When scarce sources leave their Washington cocoons for flyover country, they’re often shocked at the way outside-the-Beltway reporters treat them. My favorite anecdote dates to 2004, when Deputy Secretary of Defense Paul Wolfowitz traveled to the Plains states to observe a military ceremony and give a speech in Omaha, Nebraska. His office invited reporters from the Kansas City Star, the Des Moines Register, the Lincoln Journal Star and the Omaha World-Herald to chat with the deputy secretary, and his public affairs officer began the session by asking that Wolfowitz’s comments be attributed to a “senior Defense Department official.” The reporters rebelled. One explained that the interview would be of no professional value if he couldn’t name Wolfowitz. Another said there was no point to the charade of attributing the remarks to a senior Defense Department official as Wolfowitz was the only senior Defense Department official in the region. Wolfowitz folded, agreeing to stay on the record unless he felt pressed to say something on background, which he did a couple of times to no real consequence, according to the reporters.

I can’t recall having ever agreed to quote approval in order to win an interview – but mine is a narrow boast. I don’t think I’ve ever been asked. I’ve agreed to read back quotations – especially when reporting on technical, scientific, medical or legal topics where thin slices of fact loom fat in the greater argument. But that’s to serve accuracy, not to help a source disavow something he said on the record and wished he hadn’t. (I resist going off the record, but that’s another column.)

While many inside and outside the Times praised the development of a formal and public policy to repel control-freak sources, in practice it’s hard to imagine it making much difference. Besides, there are a dozen other ways sources can make reporters dance to their tune. Freeze them out. Give them kibble while giving other reporters sirloin. Talk ill of them to other potential sources. Sabotage them socially, which spells devastation for a certain breed of Washington reporter. Cooperate with the junior beat reporter to undermine the senior beat reporter on the same publication. Let other reporters know what scoops the offending reporter is working on.

Washington’s permanent government plays the long game and can discipline even the most valiant reporter. For example, over the past decade, editors at the nation’s top newspapers (Washington Post, New York Times, Los Angeles Times and the like) have been making all the right noises about reducing the number of anonymous sources on the page, and yet these anonymice continue to thrive, notably in Washington stories. But at least readers can independently count the number of anonymous sources being used. While the Times’s new policy on quote approval looks good on paper, readers will have no way to judge whether it’s being rigorously enforced. To my ears, it sounds as if the Times hasn’t solved the problem as much as put all of its reporters on double-secret probation.

******

No prior approval required to email me at Shafer.Reuters@gmail.com. My Twitter feed, on the other hand, can only be accessed through an Open Table invitation. Sign up for email notifications of new Shafer columns (and other occasional announcements). Subscribe to this RSS feed for new Shafer columns.

Willard Milhous Romney

Jack Shafer
Sep 19, 2012 21:06 UTC

Be careful about writing Mitt Romney’s political obituary before they fill him with formaldehyde and pour him into his mahogany condo. Like that other frequent Republican presidential candidate, Richard Nixon, Romney has a remarkable talent for stepping into it, sinking and soiling himself rotten as he extricates himself. Romney’s latest stumble — complaining to rich donors about the “47 percent,” which was webcast by Mother Jones yesterday — would bury a less tenacious candidate. But Romney’s talent for powering past his embarrassments ranks up there with that of Nixon, a champion of compartmentalization who believed that as long as he had a pulse he had a chance of winning the White House.

Like Nixon, Romney is at war not only with the Democrats but also with the base of his own party, which has never been convinced that he’s a true conservative. Both Nixon and Romney have dressed their pragmatist campaigns in conservative clothing, but with the exception of their cultural biases against sex, drugs and pornography — and their instinctual disrespect for disrespecters of authority — none of it has ever rung true. The stink of inauthenticity wafts so heavily from both that their early biographers have incorporated it into the titles of their books, as historian David Greenberg pointed out to me in an interview. The Real Romney, published this year, and 1960’s The Real Nixon, both posit that what you see is not what you get with these two men.

“Romney is the most patently phony presidential candidate since Nixon,” says Greenberg, author of Nixon’s Shadow: The History of an Image. “The most talented politicians express a natural ease, by backslapping or chit-chatting with people. Nixon and Romney don’t have that skill, but they try anyway.” The failures of Nixon and Romney to connect, to seem “real” or to appear likable have resulted in both doubling their efforts to be personable and human, making even the sympathetic cringe.

The camera hated Nixon, and it showed. In 1968, Roger Ailes, now head of Fox News Channel, worked on the Nixon campaign as a consultant and improved the candidate’s stagecraft. Yet the camera still magnified Nixon’s internal discontent. Romney, a more handsome version of Nixon, doesn’t sweat or glower when facing the lens, but press encounters tend to give him the yips, jamming his efforts to pave a communications groove with voters. Like Nixon, Romney reflexively despises the press, which he blamed for the disaster that was his July foreign policy trip.

Had either Nixon or Romney grounded himself in ideology — conservative or otherwise — realness wouldn’t be as conspicuous a problem. They’d be dull politicians, reciting from their catechisms like Rick Santorum, to cite a flesh-and-blood example. But say what you will, nobody ever doubted that Santorum had an anchor, and nobody will ever write a book titled The Real Santorum. Pragmatists like Nixon and Romney, who have few core beliefs beyond the personal, require staff pollsters and strategists to tell them where they should be on issues.

Liberal writers such as Paul Krugman and Jonathan Chait would have you believe that the Mother Jones video reveals the true, inner Romney, somebody who regards the poor, the sick and the retired as grifters. If only that were true. He doesn’t even have that conviction. As a pragmatist politician speaking to wealthy donors behind closed doors, Romney is content to say what they want to hear: that the 47 percent are parasites and the donors are exalted beings.

Romney owes much of his early campaign reputation as an unprincipled waffling weasel to his major accomplishment as governor of Massachusetts, Romneycare. Romney distanced himself from the measure during the primaries, as the Washington Post reported in early August, but once he secured the nomination, his campaign cited the legislation as a political plus, evidence that he had the skills to “reform” the healthcare industry. This sort of calculated duplicity brings us back to Nixon, who campaigned as a conservative but who once in the White House supported the creation of the Environmental Protection Agency, wage and price controls, Amtrak, affirmative action and other codicils to Lyndon Johnson’s Great Society.

Obviously, every politician over-flatters supporters, makes voodoo dolls out of reporters and reverses himself. Nixon had a pretty good excuse for his flip-floppery: He cared primarily about foreign policy and would do almost anything to avoid domestic policy battles. But what does Romney really care about? He’s been running for president non-stop since 2007, and I still haven’t a clue.

******

Here’s a good roundup of conservatives who denounced Romney’s 47 percent riff. If you understand the inner Romney, riff me at Shafer.Reuters@gmail.com or let me riff you at my Twitter feed. Sign up for email notifications of new Shafer columns (and other occasional announcements). Subscribe to this RSS feed for new Shafer columns.

PHOTO: This 1967 Lincoln-Continental convertible parade car, used by former U.S. President Richard Nixon, waits to be sold at the 34th annual Barrett-Jackson Classic Car Auction at Westworld in Scottsdale, Arizona, January 28, 2005. REUTERS/Jeff Topping
