Walter Pincus (Columbia Journalism Review, May/June ’09)
American journalism is in trouble, and the problem is not just financial. My profession is in distress because for more than a decade it has been chasing the false idols of fame and fortune. While engaged in those pursuits, it forgot its readers and the need to produce a commercial product that appealed to its mass audience, which in turn drew advertisers and thus paid for it all. While most corporate owners were seeking increased earnings, higher stock prices, and bigger salaries, editors and reporters focused more on winning prizes or making television appearances.
Some long-term reporting projects have been undertaken, and multiple-part series published, simply because they might win prizes. Over the past ten years, The Washington Post has won nineteen Pulitzer Prizes. But over that same period, we lost more than 120,000 readers. Why? My answer, unpopular among my colleagues, is that while many of these longer efforts were worthwhile, they took up space and resources that could have been used to give readers a wider selection of stories about what was going on, and that may have directly affected their lives. Readers have limited time to spend on newspapers. That number has held steady at about twenty-five minutes a day, on average, for more than thirty years. In short, we have left behind our readers in our chase after glory.
Editors have paid more attention to what gains them prestige among their journalistic peers than to subjects more related to the everyday lives of readers. For example, education affects everyone, yet I cannot name an outstanding American journalist on this subject. Food is an important subject, yet regular newspaper coverage of agriculture and the products we eat is almost nonexistent unless cases of food poisoning turn up. Did journalists adequately warn of the dangers of subprime mortgages? I don’t think so. (CJR’s answer to that question is on page 24.)
We have also failed our readers in the way we cover government. The First Amendment not only guaranteed freedom of the press from government interference, it also gave American journalists the opportunity—I believe the responsibility—to find and present facts on issues that require public attention. Our press is not protected in order to merely echo the views of government officials, opposition politicians, and so-called experts. Too often, though, that’s what occurs.
One of my basic concerns is that American journalism has turned away from its own hard-won expertise, and at the very time when readers are looking to us to explain the context of what is happening and what will happen next.
Most newspapers and the broadcast media have cut the number of reporters on beats. Meanwhile, young reporters are increasingly shifted from beat to beat, never having enough time to master complex subjects such as health care, public education, or environmental policies. As a result, more of their stories are based not on reportorial expertise, but on pronouncements by government sources or their critics.
Reporters are shifted around in part because of decreasing resources, and in part because within the profession, reporters are encouraged to become editors, editors to become publishers, and publishers of small papers pushed to manage bigger ones. This results in less expertise at the most important level—where reporters gather information.
Meanwhile, we have turned into a public-relations society. Much of the news Americans get each day was created to serve just that purpose—to be the news of the day. Many of our headlines come from events created by public relations—press conferences, speeches, press releases, canned reports, and, worst of all, snappy comments by “spokesmen” or “experts.” To serve as a counterpoint, we need reporters with expertise.
Consider the worst of recent examples. I believe the Bush administration sold the March 2003 invasion of Iraq to the American people beginning with a public-relations campaign that started in August 2002. Vice President Dick Cheney kicked it off with a series of speeches on the growing threat from Saddam Hussein, and it continued almost daily, with key members of the administration giving speeches, statements, or press conferences. The result was that the threat from Saddam Hussein—his alleged nuclear weapons, the idea that he would give chemical or biological weapons to terrorists—dominated news coverage right up to the time the first missiles hit Baghdad on March 19, 2003.
Manipulation of the media was taken to its highest form by George W. Bush’s administration. It built, however, on what went on before.
In 1922, Walter Lippmann, in his book Public Opinion, wrote:
The enormous discretion as to what facts and what impressions shall be reported is steadily convincing every organized group of people that whether it wishes to secure publicity or to avoid it, the exercise of discretion cannot be left to the reporter. It is safer to hire a press agent who stands between the group and the newspapers.
In 1968, Joe Alsop, discussing the Vietnam War, wrote that “facts influence events.” The increasing reports by war correspondents of U.S. failures in the war gradually undermined public support for the fighting. Five years later, facts presented by Bob Woodward and Carl Bernstein and published day after day in The Washington Post proved Alsop’s words in dramatic fashion. The Post’s newspaper stories led to the resignation of a president.
Watergate, I believe, was the high-water mark for newspapers as vehicles for bringing the public previously unknown information about serious matters. But I also think that, in many ways, it has been downhill ever since.
The celebrity of Woodward and Bernstein, along with financial rewards that accompanied Bob’s continued hard work, set new goals for others in the profession. At the same time, the impact an aroused press could have on government and politics was not missed by conservative supporters of the Nixon administration. Their response was twofold: demand more conservative columnists on newspaper op-ed pages and equal treatment in news columns for politicians and experts from “both sides” of issues. It was an informal way of applying the fairness doctrine, which was required of the electronic media, to print.
In 1981, at the beginning of the Reagan administration, Michael Deaver—one of the great public-relations men of our time—began to use early-morning “tech” sessions at the White House, which had been a way to help network producers plan the use of their camera crews each day, to shape the television news story for that evening. Deaver would say that President Reagan will appear in the Rose Garden to talk about his crime-prevention program and discuss it in terms of, say, Chicago and San Francisco. That would allow the networks to shoot B-roll. The president would appear in the Rose Garden as promised, make his statement, perhaps take a question or two, and vanish.
After a while, the network White House correspondents began to attend these sessions, and later print reporters began showing up, too. On days when the president went off to Camp David or his California ranch, Sam Donaldson, the ABC News White House correspondent, began his shouted questions to Reagan, and Reagan’s flip answers became the nightly news—and not just on television. The Washington Post, which prior to that time did not have a standing White House story each day (publishing one only when the president did something newsworthy), began to have similar daily coverage.
At the end of Reagan’s first year, David Broder, the Post’s political reporter, wrote a column about Reagan being among the least-involved presidents he had covered. In response, he got an onslaught of mail from people who said they saw Reagan every night on TV, working different issues. It was a triumph of public relations.
When President George H. W. Bush succeeded Reagan and occasionally drifted off the appointed subject, criticism began to appear that he “couldn’t stay on message.” When Bill Clinton did two, three, or four things in a day, critics went after him for “mixing up the daily message.” Being able to “stay on message” is now considered a presidential asset, perhaps even a requirement. Of course, the “message” is what the White House wants to present to the public.
These two elements on the editorial side of journalism—a move away from expertise and the growth of public relations in government—have been facilitated, in part, by the changing nature of newspaper ownership.
Newspapers across the U.S. were often begun by pamphleteers, political parties, or businessmen who wanted to get involved in local, state, or even national affairs. The founding editors of The New York Times started that newspaper as supporters of the Whig party and later switched to the Republican party. Adolph Ochs, who bought the Times in 1896, was helped in his negotiations by a letter from President Grover Cleveland, who wrote that Ochs’s management of The Chattanooga Times had “demonstrated such a faithful adherence to Democratic principles that I would be glad to see you in a larger sphere of usefulness.” The Washington Post’s publisher Phil Graham helped put Lyndon Johnson on the ticket with John F. Kennedy.
They used their presses to influence government, but that is what the founding fathers contemplated when they wrote the First Amendment. The idea was that citizens in a democracy were to read more than one paper or pamphlet, weigh all opinions and facts as presented, and make up their own minds.
Today, mainstream print and electronic media want to be neutral, presenting both or all sides as if they were refereeing a game in which only the players—the government and its opponents—can participate. They have increasingly become common carriers, transmitters of other people’s ideas and thoughts, irrespective of import, relevance, and at times even accuracy.
When is the last time you saw a major newspaper or television network set out its own agenda for candidates to take up? At a time when it is most needed, the media, and particularly newspapers, have lost their voices.
Beginning in the 1960s, papers large and small started being bought for large sums, first by newspaper chains, which in turn became controlled by outside financial interests. A few papers remained privately owned, but eventually almost all sold stock to the public. With that financial change came monopoly ownership, one newspaper per city or town, and the notion that the newspaper that survived should be neutral, presenting all points of view in each controversial story. As I said, the fairness doctrine has been transferred from radio and television to the newspaper. How ironic, then, that today there are dozens of competing electronic voices in almost every city, while most of those cities now have only one newspaper.
The Graham and Sulzberger families’ ownership of The Washington Post and The New York Times is, I believe, a major reason why these newspapers continue to provide quality journalism. But even they and their editors are nervous when accused of showing favoritism or antipathy toward one party or another.
My post-Korean War generation entered journalism because we wanted to change the governmental system. Our role models were James Reston of The New York Times, whose column I proofread during the five months I was a copyboy at the Times; Edward R. Murrow; Richard Rovere, then writing the Washington Letter for The New Yorker; and even playwright Arthur Miller. They were among the journalists and writers who led the challenge to Senator Joe McCarthy’s red-baiting at a time when most mainstream journalists were being “objective” and reporting, uncritically, his accusations about Communist infiltration of government and his unproven allegations about individuals.
As a copyboy in 1954, fresh out of college, I delivered mail to Hanson Baldwin, then the Times’s highly respected military correspondent. When Baldwin wrote a news story or a piece of analysis, it was read in the Pentagon and in Congress. They had to read him because his years of coverage and his insights made him as expert as top generals and civilian defense officials. I didn’t know it then, but those days had a major influence on my approach to journalism.
I am a Democrat, and everyone knows it. No one is more aware of it than I am as I write stories for The Washington Post. I worked for Senator J. William Fulbright twice in the 1960s, when I was lucky to run two eighteen-month Foreign Relations Committee investigations for him. The first grew out of magazine articles I had written about lobbying in the U.S. by foreign governments. The second focused on military involvement in foreign policy, and grew out of discussions I had with Fulbright during my initial time with him. Those two sabbaticals were among the most important and enlightening years of my life, and influenced my view of reporting on government. They showed me how little I knew as a reporter about how government really worked.
Part of the explanation for this lack of knowledge is the emergence of the idea, among reporters in Washington and perhaps elsewhere, that we should avoid socializing or developing friendships with public officials—even those who are our peers. As a result of this artificial separation, public figures remain one-dimensional to many journalists; they have no wives, children, or lives outside their professional positions.
Not to me. After fifty years of living and working in Washington, I’ve had personal friends in Congress, on federal court benches, in high government positions, even in the White House. We should be measured by our work, not by what we say or do elsewhere. I certainly hope that as witnesses to wars, civil-rights riots, peace marches, famines, and terrorist events these past decades, we all have developed opinions which at times we may discuss or even argue about—or we just are not human.
Such experiences make us better observers and thus better reporters. With more and more PR peddled as news, journalists need the experience to sort out what really is news, and to deliver it in context.
As we’ve seen, fewer national and local newspapers are in the hands of fewer companies, which in turn produce papers that are less appealing and relevant to people who have limited time to read them. And with the arrival of first the Internet and now the financial downturn, advertisers have panicked. The result is far less money to support serious journalism.
Although I have primarily been a reporter the past fifty years, I have also been a close observer of the financial side of the media business. In my college days, I bought stock in The New Yorker. I was an initial investor when Clay Felker started New York magazine, and when The New York Review of Books began. In 1970, I spent a couple of years unsuccessfully trying to start a national newspaper to be produced in Washington and printed on local presses in college cities across the country. From 1971 through 1975, I was executive editor of The New Republic, and put together a group that was to purchase the magazine after the 1976 presidential election, though we were outbid. I returned to The Washington Post in 1975 and today am a consultant to the company as well as a reporter for the newspaper.
I describe this background to justify talking about the finances of the newspaper business. In the early 1960s, Philip Graham, at the time the president of The Washington Post Company, told me that the Post had just begun to turn a profit, but that he and Eugene Meyer, his father-in-law, who had originally purchased the paper, considered it a business much like a public utility. And as such, they thought making a profit of 7 percent would be a more than fair return on investment. This philosophy has guided me ever since.
Family-owned newspapers were the foundation of American journalism in the 1960s. Like the Post, most were started by businessmen who wanted a voice in their communities. Few were begun as the way to make a fortune. That began to change with the arrival of radio, and then television. The electronic media involved government licenses, which carried with them the requirement for delivery of public-affairs programming, starting with news. Newspapers became the obvious applicants, and many publishers suddenly became owners of local broadcast stations, standing to make a lot of money as network affiliates.
In the 1950s and 1960s, when newspapers made single-digit profits, radio and television affiliates could make profits of 40 to 50 percent. Newspapers large and small started being swallowed by publicly owned corporations. With that trend came monopoly ownership. Gannett became the biggest. In 1977, as its purchasing of family papers moved into high gear, Gannett stock was around $8 a share. By 1990, it was at $75, and in 2004, it hit $90. At its height, Gannett produced earnings of more than 22 percent on its gross income, and set a standard that other newspaper corporations tried to emulate. When Knight-Ridder showed only a 14 percent profit, its major investors demanded it be sold.
I believe most corporate owners of newspapers made terrible business decisions over the past decade, thinking that the growing profits of the 1980s and early 1990s would continue. Chains paid excessive prices for family-owned papers and went deep into debt. The New York Times Company finds itself in trouble after paying $1 billion for The Boston Globe, over $2 billion to buy back its own stock at the height of its price, and another $600 million for a new building.
And now there is the economic downturn. In this environment, the Web has become both the threat and, to some, the savior. But I look at this differently than some in my profession. The Web has certainly taken an important chunk of classified advertising, but the broader threat seen by many is to me another sign of our own self-involvement. Journalists, probably more than any other group outside the financial community, are mesmerized by the Web. They closely watch it, so they believe others are doing the same.
Let me clarify that I am talking primarily about mass media—newspapers, television, and radio that traditionally have reached more than 80 percent of the American public. I am not talking about the thousands of Web sites and blogs that aggregate other people’s stories or present their own editorial material. They talk of thousands of unique visitors, but remember that these totals, often inferred rather than accurately measured, reflect monthly figures. When divided by thirty days in a month, they become smaller than individual newspaper circulations, which cumulatively sit at 110 million daily readers, even with recent losses.
Meanwhile, most consumers of online news do it from roughly 10 a.m. to 4:30 p.m. They are at work, and what they have time to see primarily are headlines. They don’t pay for what they see and probably won’t. And because the daily readership numbers are relatively small and the audience often geographically dispersed, the advertising hardly covers the cost of gathering the original stories. As Washington Post President Stephen P. Hills said recently, the Post newspaper is a $600 million business; its Web site is a $50 million business.
Nevertheless, there has been an outburst within the journalism community that the end is near. Serious people have proposed what in time will be considered absurd ideas—turn papers into nonprofit organizations; charge for each downloaded story; turn into Web-based publications; make Web aggregators, such as Google and Yahoo, pay for carrying newspaper stories.
NYTimes.com had some twenty million unique users for the month of October, making it the fifth-ranked news site on the Internet in terms of total visitors. The newspaper is sold to 800,000 readers a day, rising on Sunday to over 1 million. Without thinking, someone might say the Times Web site readership far exceeds the newspaper’s. But the definition of unique visitor is someone who within a month’s time visits the Web site more than once. It is not apples to apples, but by dividing the twenty million a month by thirty you get at best roughly 667,000 readers a day, which is short of the paper’s daily circulation.
I recognize that journalistically I am old-fashioned. I was going to say, an old fogey. But thanks to Microsoft Word, I have learned that a “fogey” is a reactionary. And Microsoft tells me its antonym is “activist,” which is a title I embrace. So I have to stand by Microsoft.
Like other industries caught up in today’s economic downturn, newspapers, which just a few years ago were rapidly expanding, have to reduce expenses, including staff. We also should look for other ways to use the materials we already collect and produce. The Post and other publications have taken first steps in joint ventures with network television news. I believe we will see a time when a major newspaper and a major television network jointly produce a daily news show.
But when it comes to editorial content, meaningful news about government, politics, and foreign policy is only one of the saleable elements. Good newspapers have to go back to delivering a daily product that our mass audiences want, and which provides to advertisers a unique means to reach consumers. Like supermarkets, newspapers must deliver quality in all departments.
Yet at the same time, owners, editors, and reporters should push issues they believe government is ignoring. They should do it factually and in articles short enough to read daily, but spread over time. That is how Americans absorb information—by repetition.
They should remember that “newsmakers” are intent on using the media to influence readers, listeners, and viewers to take up their ideas. The electronic and print media today probably have more power over public opinion—and thus government—than they had fifty years ago. But I fear they turn much of that power over to those who create news events to get coverage.
The press should play an activist role. That’s the reason a free press is important. Mine is a romantic and unfashionable view of journalism, but that is why many of us took up the profession in the first place.
John Walcott (Nieman Foundation I.F. Stone Award Speech, October 7, 2008)
John Walcott, now the Washington bureau chief for McClatchy Newspapers, was awarded the first I.F. Stone Medal for Journalistic Independence on Oct. 7, for his Knight-Ridder bureau’s coverage of the run-up to the war in Iraq. Following is the text of his acceptance speech. He was introduced by Bob Giles, curator of the Nieman Foundation.
Thank you, Bob. I must say that when you told me that the Nieman Foundation had decided to make me the first recipient of this extraordinary honor, my first reaction was something between shock and surprise. In fact, I haven’t been so surprised since my wife, Nancy, said, “yes” to my marriage proposal some 36 years ago. I can imagine no higher honor for anyone in our profession than to be compared to I.F. Stone. I do not believe that I deserve that honor, or that I’ve earned it, but I must confess that I’m grateful that someone, somewhere, for some reason, thinks that I do.
When you then told me that the price of this priceless medal was giving a speech, my second reaction was a troubling recollection of something that David Eisenhower, Richard Nixon’s son-in-law, once said: “Newspaper reporters aren’t as interesting as they think they are.” I hope that what I have to say today isn’t confirmation of that. There is confirmation enough on television.
My third reaction, when you told me that my selection was based on an anonymous nomination to a somewhat mysterious jury, was to wonder whether Dick Cheney and David Addington had somehow had a hand in it. It is, after all, at least vaguely reminiscent of the selection process for Guantanamo Bay.
That, of course, is the skeptical reporter that remains in me, even after 19 years as an editor. I’ll come back to that subject in a moment, because I think it was at the heart of who I.F. Stone was, what his legacy to us is and what’s been missing in American journalism in recent years, not just in the coverage of the Bush administration’s case for war in Iraq, but also in our coverage of the Wall Street machinations, congressional abdications and regulatory alterations that have brought our economy and the well-being of so many Americans to the present precipice.
First, though, I want to acknowledge one obvious difference between I.F. Stone and me. Stone wrote that he was “an independent capitalist, the owner of my own enterprise, subject to neither mortgager, factor or patron . . . beholden to no one but my good readers.”
He wrote that in 1963, and he added that at that time, “. . . young men, setting out in a career of journalism, must find their niche in some huge newspaper or magazine combine.” As you can see, I’m no longer young, but that pretty much describes my career: For more than a decade, I’ve worked for two publicly owned newspaper companies, Knight Ridder and McClatchy, and before that I worked for Newsweek, for The Wall Street Journal and for U.S. News & World Report.
Those associations, however, have been a blessing, not a curse, and I would like to take a moment to thank a few of the people who were foolish enough to hire me at those places and patient enough to teach me how to do my job. We would be here all day and all night if I thanked all of them, but they include the late Bob King at the Bergen Record; Mel Elfin, Henry Hubbard and Jim Doyle at Newsweek; Al Hunt, Ken Bacon and Walt Mossberg at the Journal; Mimi McLoughlin and Mike Ruby at U.S. News; Clark Hoyt, Kathleen Carroll and the late Gary Blonston at Knight Ridder; and David Westphal and Howard Weaver at McClatchy.
Stone also wrote that before he became an independent capitalist, he was “fortunate in my employers,” and I feel the same way. I want to single out two remarkable CEOs for whom I’ve been privileged to work. Tony Ridder and Gary Pruitt have come in for a lot of criticism in recent years, but let me say this as plainly as I can: First Tony and now Gary have never been anything but supportive of the journalism that we’ve tried to do in the Knight Ridder and McClatchy Washington Bureaus. Not once—not one time—has either of them second-guessed our editorial decisions or our stories, even when bigger and better known news organizations were reporting the opposite. Not once has either of them bent to any political or economic pressures or let any of the heat that they’ve taken trickle down to us. If you’ve seen Buying the War, Bill Moyers’ documentary on the run-up to the Iraq war, then you know that wasn’t true everywhere.
I also want to take a moment to tell you why this award is a little bit embarrassing to me. It’s because it rightly belongs to so many other people who worked long and hard to scrutinize the Bush administration’s pronouncements about Iraq and al Qaida, about Iraq’s nuclear program and about Saddam Hussein’s weapons of mass destruction. Day-in and day-out, Jonathan Landay, Warren Strobel, Joe Galloway and their editor, Renee Schoof, pried open doors, pried loose documents and pried out the truth. We withheld Joe’s byline from much of what we wrote to protect his sources, and this is the first time that any of us has talked publicly about the crucial role he played in this reporting. Sadly, he can’t be here today; happily, that’s because he’s in Canberra at an Australian Army conference.
This award belongs to Jon, Joe, Renee and Warren at least as much as it belongs to me, maybe more, and to me it confirms the virtues of old-fashioned “beat” reporting, of getting to know the people and the issues over a long period of time, rather than hopping up some career ladder from job to job as quickly as you can. In 2001, Joe, Jon, Warren and I had a combined 84 years of experience covering foreign affairs.
It also belongs in part to a number of officials in the Bush administration, some of whose names many of you know well, some of whose names you may never know. They worked in the CIA; in the DIA; in other intelligence agencies; in the uniformed military; in the State Department, in the Defense Department, in the Energy Department and elsewhere. It may be true, as a wonderful book about I.F. Stone is titled, that “all governments lie,” but it is not true that everyone in government lies.
In fact, one of the striking things about what in October 2002 we called “the Bush administration’s double-time march to war in Iraq” was that President Bush, Vice President Cheney, defense secretary Rumsfeld and the others who were calling the tune ignored the advice, the dissents and the warnings of so many of the experts in their own government.
Why, in a nutshell, was our reporting different from so much other reporting? One important reason was that we sought out the dissidents, and we listened to them, instead of serving as stenographers to high-ranking officials and Iraqi exiles. I’m afraid that much the same thing may have happened on Wall Street. Power and money and celebrity, in other words, can blind you. Somehow, the idea has taken hold in Washington journalism that the value of a source is directly proportional to his or her rank, when in my experience the relationship is more often inverse.
That brings up a larger point, and one that I think is another part of what went wrong back in 2002, and what may have gone wrong on Wall Street. Instead of being members of the Fourth Estate, too many Washington reporters have been itching to move up an estate or two, to become part of the Establishment or share in the good times. I.F. Stone, on the other hand, knew well that reporters, by definition, are outsiders. After Stone died, Pat Oliphant drew a marvelous cartoon of him standing at the gates of heaven, holding a pencil and a notebook. Like all great political cartoons, it says more than words ever could. St. Peter is on the phone to a Higher Authority, and he’s saying: “Yes, that I.F. Stone, Sir. He says he doesn’t want to come in — he’d rather hang around out here, and keep things honest.”
Being an outsider, a gadfly, a muckraker, isn’t always as much fun as being an insider, a celebrity journalist on TV and the lecture circuit. Worse, in these troubled economic times for the news media, it makes enemies, sometimes powerful ones, and it can offend readers, advertisers—and, as conditions in our business continue to worsen—potential employers in public relations and other industries.
But there were much bigger problems with the media after 9/11 than just too-cozy relationships with the wrong sources and timidity about challenging a popular president in the wake of an attack on all of us.
There was simple laziness: Much of what the Bush administration said, especially about Iraq and al Qaida, made no sense, yet very few reporters bothered to check it out. They were stenographers; they were not reporters. Ahmad Chalabi is an intelligent man, but many of the stories he spun made no sense. One of his sources, for example, was a Kurd who claimed he had the run of Saddam’s secret weapons facilities. As a test subject, perhaps. Another failed a polygraph test. Yet very few reporters checked out their stories, and too many just ran with what they were handed. Instead, they handed bullhorns to people who already had megaphones.
Some of my colleagues have cited the fact that the Democrats in Congress failed to challenge the administration on Iraq as a reason for their lapses, but the Democrats’ dereliction of duty doesn’t explain our own, much less excuse it.
Finally, the most highly regarded news organizations in the country, led by The New York Times and the editorial page of The Washington Post, were wrong about Iraq, and too many others simply followed them, like lemmings, over the cliff.
Nevertheless, concentration in the news business is getting worse, not better, especially when it comes to foreign and national news. Only a handful of print news organizations — The Times, the Post, the LA Times, the Chicago Tribune, the Christian Science Monitor, the Associated Press and McClatchy — still maintain significant foreign news operations. The Boston Globe, the Baltimore Sun, The Philadelphia Inquirer and Newsday have abandoned their foreign bureaus and are cutting back in Washington. So have the major television networks. Newhouse and Copley are eliminating their Washington bureaus, and Hearst and others are downsizing here.
We’ve reached a point, I fear, where the journalism of I.F. Stone is now very much at risk from a combination of economic, technological, political and philosophical developments. I’ve talked a little about the pressures on news companies to cut costs to compensate for falling revenues, and I know there are those out there who think that’s fine, that the traditional “Mainstream Media” have so discredited themselves on Iraq and other issues that we should all say “good riddance” to them and turn to the blogosphere for all our news and analysis.
There are two reasons why I think that’s foolish, at least for now.
The first is that the blogosphere, like cable television and talk radio, reflects the political polarization in our country today. People go to Fox News or MSNBC, to O’Reilly or Air America, to Daily Kos or to Redstate.org, to have their biases reaffirmed, not to have their assumptions challenged or unpleasant truths exposed. It’s been a good business model, but I don’t think that it’s good journalism, and I suspect that I.F. Stone, as hard to pin down ideologically as he was, might agree.
That same political polarization, coupled with the economic weakness of the news media, has both encouraged and enabled the government, and here I mean the Bush administration, to restrict the amount of information that’s available to the media, the kind of information that was such grist for I.F. Stone’s mill. Some of it’s been done using the time-honored “national security” excuse, but nowadays Freedom of Information requests frequently are answered either with, “It’ll take us a year to find that” or “Sue us,” and suing the federal government, of course, can get expensive, as the government well knows.
The second reason that I don’t think that even the best blogsites are a substitute for the mainstream media is that bloggers cannot do all of the things that mainstream media companies can do, at least for now. In June, for example, we completed an investigation of the Bush administration’s detainee program, which you may have read about in a piece by Tony Lewis in The New York Review of Books. I think I.F. Stone might have liked it, but it took the lead reporter, Tom Lasseter, other reporters and Travis Heying, a photographer for the Wichita Eagle, to 11 countries on three continents over the course of eight months. It’s hard for bloggers, no matter how good they are, to do that kind of work.
Technology and economics, I think, have taken another toll on the kind of reporting that Stone did, the kind that we need so desperately today. Virtually every reporter today, with a few precious exceptions, has become a wire service reporter, a slave to the old UPI motto of “a deadline every minute.” In the scramble to try to generate more revenue online, they’ve also become cameramen, soundmen, bloggers and producers. All of that producing leaves less time for digging, and worse, it leaves less time for thinking.
That brings me to my last point: Relying on The Times, or McClatchy or any other news source, for all the truth is dumb, but it’s infinitely preferable to the pernicious philosophical notions that there is no such thing as truth, that truth is relative, or that, as some journalists seem to believe, it can be found midway between the two opposing poles of any argument.
My father, who’s with us today, made his living designing navigational instruments for aircraft, missiles and submarines, and although my mathematical and engineering skills are, shall we say, less evident than his, I learned two important lessons from his work.
The first is that if you want to know where you are, it’s helpful to know where you started. The second is a concept that’s called “ground truth,” which in a nutshell means checking your calculations against information collected on the ground. In other words, reporting.
I know that I’m wading into deep and muddy water here, but I’m doing so in deference, or rather, in reverence, to the fact that I.F. Stone was a scholar as well as a journalist. He taught himself ancient Greek to write about the trial of Socrates, and I still struggle with modern French, but I’ll wade in nevertheless.
Does the truth lie halfway between say, slavery and abolition, or between segregation and civil rights, or between communism and democracy? If you quote Dietrich Bonhoeffer or Winston Churchill, in other words, must you then give equal time and credence to Hitler and Joseph Goebbels? If you write an article that’s critical of John McCain, are you then obligated to devote an identical number of words to criticism of Barack Obama, and vice versa?
The idea that truth is merely a social construct, that it’s subjective, in other words, first appeared in academia as a corruption of post-modernism, but it’s taken root in our culture without our really realizing it or understanding its implications.
It began with liberal academics arguing, for example, that some Southwestern Indians’ belief that humans are descended from a subterranean world of supernatural spirits is, as one archaeologist put it, “just as valid as archaeology.” As NYU philosophy professor Paul Boghossian puts it in a wonderful little book, “Fear of Knowledge”: “ . . . the idea that there are many equally valid ways of knowing the world, with science being just one of them, has taken very deep root.”
Although this kind of thinking, relativism and constructivism, started on the left, many conservatives now feel empowered by it, too, and some of them have embraced it with a vengeance on issues ranging from global warming and evolution to the war in Iraq.
“Journalists live in the reality-based world,” a White House official told Ron Suskind, writing for The New York Times Magazine back in the headier days of 2004. “The world doesn’t really work that way any more. We’re an empire now, and when we act, we create our own reality.”
I respectfully disagree.
The Church was wrong, and Copernicus and Galileo were right.
There is not one truth for Fox News and another for The Nation. Fair is not always balanced, and balanced is not always fair.
No matter how devoutly they may have believed their own propaganda, Kenneth Lay and Jeffrey Skilling were wrong about Enron, and a whole lot of very smart, very rich people were very wrong about mortgage-backed securities and credit default swaps.
President Bush was wrong to think that it would be a simple matter to make Iraq the mother of all Mideast democracy.
Or, as the French Prime Minister Georges Clemenceau said when he was asked what he thought historians might say about the First World War: “They will not say that Belgium invaded Germany.”
I’m not talking here about matters of taste or of partisan politics or, heaven help us, of faith: Whether Monet or Manet was a better painter or whether Jesus was the Messiah, a prophet or a fraud. Those are personal matters, beliefs, opinions and preferences of which we all must learn to be more tolerant.
Harry G. Frankfurt, an emeritus professor of philosophy at Princeton, puts it this way in a marvelous little book called, “On Truth” (which is the sequel to “On Bullshit”): “It seems ever more clear to me that higher levels of civilization must depend even more heavily on a conscientious respect for the importance of honesty and clarity in reporting the facts, and on a stubborn concern for accuracy in determining what the facts are.”
That is what I.F. Stone always sought to do, and I think it’s what journalists should always strive to do. If, in the short run, doing so seems costly, I think we’ve all seen, in Iraq, in Afghanistan and now on Wall Street and on Main Street, that the costs of not doing so are far greater.
Jay Rosen (PressThink, April 12, 2009)
He Said, She Said Journalism: Lame Formula in the Land of the Active User
Any good blogger, competing journalist or alert press critic can spot and publicize false balance and the lame acceptance of fact-free spin. Do users really want to be left helpless in sorting out who’s faking it more? The he said, she said form says they do, but I say decline has set in.
There I am, sitting at the breakfast table, with my coffee and a copy of the New York Times, in the classic newspaper reading position from before the Web. And I come to this article, headlined “Ex-Chairman of A.I.G. Says Bailout Has Failed.” I immediately recognize in it the signs of a he said, she said account.
Quick definition: “He said, she said” journalism means…
- There’s a public dispute.
- The dispute makes news.
- No real attempt is made to assess clashing truth claims in the story, even though they are in some sense the reason for the story. (Under the “conflict makes news” test.)
- The means for assessment do exist, so it’s possible to exert a factual check on some of the claims, but for whatever reason the report declines to make use of them.
- The symmetry of two sides making opposite claims puts the reporter in the middle between polarized extremes.
When these five conditions are met, the genre is in gear. The he said part might sound like this:
Mr. Greenberg asserted that he would have reduced or at least hedged A.I.G.’s exposure to credit-default swaps in 2005, when A.I.G.’s credit rating was reduced. “A.I.G.’s business model did not fail; its management did,” he asserted.
Followed by the “she” said…
That provoked another scornful counterattack from his former company, saying that Mr. Greenberg’s assertions were “implausible,” “not grounded in reality” and at odds with his track record of not hedging A.I.G.’s bets on credit-default swaps.
I had read enough of the Times coverage of Mr. Greenberg to wonder why the editors would run something so lame. Their business columnists have been (excuse the expression) kicking ass on meltdown coverage, including A.I.G. But here there was no attempt to assess clashing truth claims, even though Times journalism was available to do just that. Instead Hank Greenberg got to star in a game of “you say black, I say white.”
It seemed strange to me that in 2009 stories like that were still being waved on through. On Twitter I sometimes talk to Ryan Chittum, who writes The Audit column for Columbia Journalism Review. It’s a running critique of the business press after the banking meltdown. So I asked Ryan, “is this the best the Times can do?” because he knows a lot more about the coverage than I do. A few hours later he answered me at CJR.
This one’s easy: No. The Times’s story offers no analysis and forces readers—95 percent of whom know little or nothing about Greenberg’s tenure at AIG—to try to guess who’s right.
Which is why these stories are so frustrating: we’re left helpless by them. I want to quote the rest of his judgment because it helps nail down what is meant by he said, she said, not just at the New York Times, which has no special purchase on the form, but anywhere. The means are available to do better, but these are not employed. Chittum:
There’s no attempt to try to separate out who’s right here, even though everybody but Hank Greenberg knows he has major responsibility for driving AIG into the ground. Here’s some stuff that helps explain why. I just culled it from the excellent Washington Post three-parter on AIG in December (if you haven’t read that yet, make sure you do):
He created the Financial Products division in 1987 with traders from soon-to-be disgraced Drexel Burnham Lambert, approved its entry into the credit-default swap market in 1998, empowered Joseph Cassano, oversaw FP when it set up “sham” companies that resulted in tens of millions in fines, was an unindicted co-conspirator in a huge fraud at AIG, oversaw the company’s credit downgrade from AAA, was in charge when half of the company’s $80 billion in CDS on subprime CDOs were written. Apparently, Cassano and FP stopped issuing CDS within months of Greenberg’s exit in 2005.
How much more evidence do you need to tell your readers that this guy has significant responsibility for the disaster that came to his company and the entire economy—to not let him spin away?
“How much more evidence do you need?” is the kind of exasperation a lot of us have felt with what he calls “false balance,” which is another name for the pattern I’m describing.
So far so good. I told you what he said, she said is, and gave you an example. CJR chimed in, and told the New York Times it could do way better, showing how. Press criticism lives! (Twitter helps.) But this does not tell us why he said, she said reporting still exists, or ever existed. To understand that we have to cut deeper into news practice, American style.
Turn the question around for a moment: what are the advantages of the newswriting formula I have derisively labeled “he said, she said?” Rather than treat it as a problem, approach it as a kind of solution to quandaries common on the reporting trail. When, for example, a screaming fight breaks out at the city council meeting and you don’t know who’s right, but you have to report it, he said, she said makes the story instantly writable. Not a problem, but a solution to the reporter’s (deadline!) problem.
When you kinda sorta recall that Hank Greenberg is a guy who shouldn’t necessarily get the benefit of the doubt in a dispute like this, but you don’t know the history well enough to import it into your account without a high risk of error, and yet you have to produce an error-free account for tomorrow’s paper because your editor expects of you just that… he said, she said gets you there.
Or when the Congressional Budget Office issues a report on ethanol and what it’s costing us in higher food prices, the AP reporter to whom the story is given could just summarize the report, but that’s a little too much like stenography, isn’t it? So the AP adds reactions from organized groups that are primed to react.
This is a low cost way of going beyond the report itself. A familiar battle of interpretations follows, with critics of ethanol underlining the costs and supporters stressing the benefits. Of course, the AP could try to sort out those competing claims, but that would take more time and background knowledge than it probably has available for a simple “CBO report issued” story. “Supporters of ethanol disagreed, saying the report was good news…” gets the job done.
These are some of the strengths of the he said, she said genre, a newsroom workhorse for forty years. (Think it’s easy? You try making any dispute story in the world writable on deadline…)
The best description I’ve read of the problem to which devices like he said, she said are a solution comes from former Washington Post reporter Paul Taylor, who covered national politics. Here’s a comment about it that I left at the New York Times Opinionator blog. It was an attempt to explain a phrase I use to describe the kind of distortion that he said, she said can produce: “regression toward a phony mean.”
Journalists associate the middle with truth, when there may be no reason to. In his 1990 book, See How They Run, former Washington Post reporter Paul Taylor (once seen as heir to David Broder) explained why regression toward a phony mean is so common in journalism. It answers to a need for what he calls “refuge.” Here is what he said:
“Sometimes I worry that my squeamishness about making sharp judgments, pro or con, makes me unfit for the slam-bang world of daily journalism. Other times I conclude that it makes me ideally suited for newspapering– certainly for the rigors and conventions of modern ‘objective’ journalism. For I can dispose of my dilemmas by writing stories straight down the middle. I can search for the halfway point between the best and the worst that might be said about someone (or some policy or idea) and write my story in that fair-minded place. By aiming for the golden mean, I probably land near the best approximation of truth more often than if I were guided by any other set of compasses– partisan, ideological, psychological, whatever… Yes, I am seeking truth. But I’m also seeking refuge. I’m taking a pass on the toughest calls I face.”
Clearly, there can be something extreme about this squeamishness, too. Clearly, the desire for refuge can get out of hand. Writing the news so that it lands somewhere near the “halfway point between the best and the worst that might be said about someone” is not a truthtelling impulse at all, but a refuge-seeking one, and it’s possible that this ritual will distort a given story.
Like the “straight down the middle” impulse that Taylor writes about, he said, she said is not so much a truth-telling strategy as refuge-seeking behavior that fits well into newsroom production demands. “Taking a pass” on the tougher calls (like who’s blowing more smoke) is economical. It’s seen as risk-reduction, as well, because the account declines to explicitly endorse or actively mistrust any claim that is made in the account. Isn’t it safer to report, “Rumsfeld said…,” letting Democrats in Congress howl at him (and report that) than it would be to report, “Rumsfeld said, erroneously…” and try to debunk the claim yourself? The first strategy doesn’t put your own authority at risk, the second does, but for a reason.
We need journalists who understand that reason. And I think many do. But a lot don’t.
He said, she said reporting appears to be risk-reducing, but this is exactly what’s changing in the press. For a given report about, say, former counter-terrorism official Richard Clarke, “the halfway point between the best and the worst that might be said about someone” is no more likely to be accurate than the one-fifth mark, especially when you factor in the reality of the Overton Window and the general pattern we know as “working the refs.” The halfway point is a miserable guideline but it can still sound pretty good when you are trying to advertise to all that you have no skin in the game. This is how I think of he said, she said reporting. Besides being easy to operate, and requiring the fewest imports of knowledge, it’s a way of reporting the news that advertises the producer’s even-handedness. The ad counts as much as the info. We report, you decide.
“Ex-Chairman of A.I.G. Says Bailout Has Failed” was a text most likely intended for the print edition of the New York Times business pages. The newswriting formula that produced it dates from before the Web made all news and reference pages equidistant from the user. He said, she said might have been seen as good enough when it was difficult for others to check what had previously been reported about the ex-chairman of A.I.G., but that is simply not the case for Times reporter Edmund L. Andrews in April, 2009.
There has been a loss of refuge. And this is why he said, she said journalism is in decline, even though you still see plenty of it around. Today, any well informed blogger, competing journalist or alert press critic can easily find the materials to point out an instance of false balance or the lame acceptance of fact-free spin. Professional opinion has therefore shifted and among the better journalists, some of whom I know, it is no longer acceptable to defend he said, she said treatments when the materials are available to call out distortions and untruths. (That doesn’t mean the practice has halted; I’m talking about a shift in the terms of legitimacy among journalists, and about efforts like this.)
In fact, it’s taken a long time to get to this point. Back in 2004 setting a higher standard than he said, she said was still a novel idea. Chris Mooney wrote about it in the context of science coverage under Bush. (“How ‘Balanced’ Coverage Lets the Scientific Fringe Hijack Reality.”) As CJR’s Campaign Desk noted…
The candidate makes a statement. You write it down, then you call the other side for a response. It’s one of journalism’s fundamentals. Tell us what he said, tell us what she said, and you’re covered, right? Well, no. Given the amount of spin this election year, the old rules don’t apply any more. Campaign Desk herewith proposes a new ground rule: “He said/she said/we said.”
… With a variety of Internet research tools readily at hand, it has never been easier for reporters to draw an independent assessment on any given day of who is right, who is wrong, and in what way.
The tools are there to make an independent assessment of who is right: for journalists, that is the critical point. (See also my post from 2004, He Said, She Said, We Said and Rethinking Objectivity by Brent Cunningham from 2003.) Because of that—and because of working the refs, the Overton Window, the failures of the political press under Bush—he said, she said no longer has the acceptance rates it once did. Which is why it was so easy to get Ryan Chittum to answer my question, “is this the best the Times can do?”
It wasn’t. And it’s easier than ever to show that. More people are involved in showing it, too. This raises the question of whether a he said, she said treatment loses you more in user disgust with your lameness than any informational gain in having fresh news to report about Hank Greenberg trading barbs with A.I.G. Do people want to feel helpless in sorting out who’s bullshitting them more? Is that the news media’s role, to increase that feeling? Is such a practice even sustainable in the Web era?
That it may not be (and the industry knows it) is shown by what The Politico called a “high-stakes experiment” at the AP’s Washington bureau. The plan was to move “from its signature neutral and detached tone” to a more aggressive style of newswriting that bureau chief Ron Fournier calls “cutting through the clutter.”
In the stories the new boss is encouraging, first-person writing and emotive language are okay. So is scrapping the stonefaced approach to journalism that accepts politicians’ statements at face value and offers equal treatment to all sides of an argument. Instead, reporters are encouraged to throw away the weasel words and call it like they see it when they think public officials have revealed themselves as phonies or flip-floppers.
In other words, we can’t skate by on he said, she said any more. “Call it like they see it” is, in fact, a successor principle, but this means that AP reporters are now involved in acts of political judgment that can easily go awry, and their own politics can be at issue.
Time to wrap this up.
Part of the problem is that American journalism as an occupational scene has never gone for the candor Paul Taylor showed in his comments on searching for the halfway point between the best and the worst that might be said. The pro system talks about the reporting of news as a truth-telling enterprise, but not a difference-splitting or dilemma-disposing one. It says: we’re the source of “the most authoritative news coverage,” as the AP recently put it. But it rarely mentions the refuge-seeking part, which subtly undermines that authority.
As I tried to explain in Why Campaign Coverage Sucks (published at TomDispatch.com and Salon, January 2008) there is an “innocence agenda” at work in the mainstream press. It favors certain practices:
Who’s-gonna-win is portable, reusable from cycle to cycle, and easily learned by newcomers to the press pack. Journalists believe it brings readers to the page and eyeballs to the screen. It [plays] well on television, because it generates an endless series of puzzles toward which journalists can gesture as they display their savviness, which is the unofficial religion of the mainstream press. But the biggest advantage of horse-race journalism is that it permits reporters and pundits to play up their detachment. Focusing on the race advertises the political innocence of the press because “who’s gonna win?” is not an ideological question. By asking it you reaffirm that yours is not an ideological profession.
In its heyday he said, she said was like a stamping plant in the factory of news. It recognized that production demands trumped truthtelling requirements. But these were the production demands of a beast that is now changing. Refusing to serve as a check on Hank Greenberg’s power to distort the news when the means for such a check are available — this too can have a cost, just as importing the knowledge to do the check has a cost. At a certain point in this dynamic, he said, she said journalism loses its utility and becomes one of the things dragging the news business down. But as the industry sheds people and newsrooms thin out, there could be greater reliance on a more and more bankrupt and trust-rotting practice. That’s a downward spiral.
Criticism of he said, she said practices and the flippancy that comes with them should therefore continue. The other day, Paul Kane of the Washington Post said it was too much to expect him to import into his account the background knowledge that a Republican Senator warning about the dangers to Senate comity of proceeding with only 50 votes had voted to do the same thing when her party held the majority but not 60 votes. (Matthew Yglesias picked up on it.)
Kane said he was astonished by this demand; he couldn’t figure out where it was coming from. “We reported what Olympia Snowe said. That’s what she said. That’s what Republicans are saying. I really don’t know what you want of us.”
If he’s not just blowing smoke, and he really doesn’t know— that is a problem for the Washington Post.
This whole ‘On the Media’ segment is a bitter struggle over the legacy of he said, she said reporting, and the mistrust it has engendered. Seriously. It is…
Now that is an interesting summary by Michael Scherer, political reporter for Time magazine, at the Swampland blog: “Jay Rosen, new media deep-thinker, scourge, scold and provocateur, makes a substantial argument for reporters making more of an effort to take sides in public disputes when facts can be ascertained.”
Not to be picky—though you have to be with this subject!—but I did not say additional effort should be expended in “taking sides” (a signal to journalists to freak out) but in calling out lies and distortions. But… If “calling out lies and distortions” equals “taking sides” to Scherer, that might help explain why it’s an infrequent practice. For then refusing to call out lies and distortions means refusing to take sides, and that’s a good thing in journalism… right?
Scott Rosenberg, who has written a really good book on blogging that’s out in July, in the comments:
A great value he said/she said used to have for the working journalist — and I think this is real value — was as a check against unfairness. It forced you, the reporter, to give at least a little space to a point of view you disagreed with. In the days when you, the reporter, controlled the mike (along with your colleagues and rivals at other publications), this was an important safeguard. If you didn’t give some space to “the other side of the argument” that you were making (either explicitly or, more often under the “objectivity” standard, covertly), it might well not have been heard at all.
Today there’s less need for it, he says.
John Walcott, McClatchy’s Washington bureau chief, in the comments. “This is a topic that deserves more attention, along with beat-sweeteners, access journalism (an oxymoron) and other afflictions of modern life.” He left a link to his excellent I.F. Stone award lecture:
Does the truth lie halfway between say, slavery and abolition, or between segregation and civil rights, or between communism and democracy? If you quote Dietrich Bonhoeffer or Winston Churchill, in other words, must you then give equal time and credence to Hitler and Joseph Goebbels? If you write an article that’s critical of John McCain, are you then obligated to devote an identical number of words to criticism of Barack Obama, and vice versa?
Pulitzer Prize winning reporter John McQuaid responds to this post: “The problem with [he said, she said] is that it implicitly assumes what everyone now knows to be wrong: that public figures make statements that can be taken at face value, and the truth can be ascertained by juxtaposing contradictory statements. It’s been obvious for some time that this is unworkable because the public ‘conversation’ is too splintered, its participants too practiced and manipulative.”
Eric Alterman and Danielle Ivory write on The George Will global cooling controversy and the reactions of Washington Post editorial page editor Fred Hiatt.
By letting Will express himself Bush-style, without being inconvenienced by any actual science, The Post was saying, yes, opinion writers are not merely entitled to their own opinions, but also their own “facts.” (Though Hiatt preferred to call these “inferences”): “It may well be that he is drawing inferences from data that most scientists reject—so, you know, fine, I welcome anyone to make that point. But don’t make it by suggesting that George Will shouldn’t be allowed to make the contrary point. Debate him.”
Hiatt’s argument that George Will ought to be able to make a dissenting point, regardless of its basis in reality, is an argument for false balance, he-said-she-said journalism, in lieu of real analysis. On March 23, Chris Mooney asked a smart follow-up question in the Post: “Can we ever know, on any contentious or politicized topic, how to recognize the real conclusions of science and how to distinguish them from scientific-sounding spin or misinformation?”
Cheryl Rofer: He said, she said and the usual suspects. Some testimony from New Mexico.
Chris Mooney at his Discover Magazine blog responds to this post. “Is ‘He Said, She Said, We’re Clueless’ Coverage Dying?” Mooney does not think so.
Matt Yglesias, All That Informs Is Not Good Journalism, includes this:
A lot of journalists have a kind of contempt for people who do PR and communications work, a contempt that I think is related to the fact that journalists are generally men and PR is generally done by women. There’s little appreciation for the fact that people doing communications are typically just as smart as the journalists covering them, but they’re also better-informed about the subject at hand, better-connected to centers of power, and more committed to having the impact they want to have on the world. Under the circumstances, it’s basically inevitable that press coverage of the powerful will mainly be driven by competition between manipulators-of-journalists rather than by bold truth-telling. This is, however, little recognized inside the profession.
If you’re interested in where this is all headed, then you are definitely a customer for Rosen’s Flying Seminar in the Future of News, an earlier post at PressThink.
So, sure, maybe sometimes it’s embarrassing to pass the buck of analysis. It reveals a lack of intellectual seriousness. But maybe other times it’s just what works best, allocating responsibilities to the parties most able and interested in bearing them. “This news has got to get out the door!” a newspaper editor might say. “Let’s revisit our he said, she said simplicity in a minute or an hour,” he might say. “We can post a link from the original to the update later. Maybe that thoughtful update will be ours, written not by an intern but by a wise veteran. But maybe it won’t….”
In January, Harvard’s Shorenstein Center published a study by Eric Pooley, former managing editor of Fortune. He shows that he said, she said “stenography” is the pathetic norm in climate change reporting. The right role is to be an active referee, calling fouls when there are fouls. But it is rarely done. Here’s the Environmental Defense Fund blogger on it, with a link to the PDF.
Avedon Carol (The Sideshow, April 11, 2009)
It’s important to underline the fact that the newspaper model that is collapsing is not the model that kept newspapers healthy for hundreds of years, but the “modern” model that has been destroying more than just the newspaper industry – it’s the high-flyer model of newspapers as just another “business” whose sole purpose is to make money.
Making some money is of course a necessity for any business, but there’s a pretty huge gulf between the ethos of making enough money to keep the business healthy, on the one hand, and on the other the priority of generating immoral levels of wealth for just a tiny number of people at the very top. It is because that second view of business has consumed American industry that newspapers are really being squeezed.
Newspapers still run at a pretty good profit. They just don’t generate the enormous profit margins that they used to. The people who’ve messed things up would like to blame this on the internet, but Kos and Atrios are right that a constellation of bad decisions at the top are what’s really to blame.
But I’m always amused at the idea that people are buying fewer newspapers solely because they can read them for free on the web. How many people ever read newspapers from outside of their city? How many people subscribe to more than three daily papers and maybe a few weekly magazines? How many people got up every day and checked the headlines from papers that were not delivered to their doorstep, or at least left on the seat of the commuter train or bus?
(Oh, maybe you did – by turning on your television. Which, of course, you’ve been able to do since long before there was an “internet” thingy. But you couldn’t actually read the articles that way.)
The fact of the matter is that without the internet, I would be reading exactly the same number of American news pages on paper that I read now, and that would mean those articles from the previous day’s New York Times that are published locally in The International Herald Tribune. If I still lived in the DC area, assuming I could still bring myself to give them my money, I might have continued to take The Washington Post on my doorstep, and would have read the local papers. If I were still working at The Baltimore Sun, I would have read the paper in the newsroom, too. But in any case, I wouldn’t have daily access on paper to most of the material I can now read on the net. I would never be able to see those articles from The Los Angeles Times, The New York Times, The Columbus Dispatch, or the good stuff written by some guy named Gene Lyons at some otherwise right-wing paper in Arkansas. Nor could I afford to subscribe to all of the magazines I now access on the web – I might take The Nation, but I never subscribed to The New Republic, The New Yorker, or Vanity Fair, and that would likely not have changed. I might read a friend’s copy of something I don’t subscribe to, but I wouldn’t be paying for it myself.
But, as I’ve mentioned before, my dad cancelled his subscription to The Washington Post even though he didn’t have a computer or the nerve to try to use one, because it had started to piss him off so much he couldn’t bring himself to give them his money any more; from then on he relied entirely on the local papers.
The decline in the quality of the product has a lot to do with why many people no longer are willing to fork over their money for papers. And the decline in the quality of product is a direct result of the fact that the big papers are owned by people who are heavily tied to big corporations that don’t actually want to be the purveyors of real news. The Washington Post, for example, is now part of a corporation that makes an awful lot of its money doing things that are somewhat antithetical to having a truly educated and engaged populace. That’s why they never just came right out and told you that Bush’s “education” initiative was just a giant project to rip taxpayers off, line the pockets of his friends and relatives, and also try to destroy the public school system. Because they’re in the “education” business.
One reason it’s been so easy for the right wing to attack the media as “liberal elites” is because the elite part rings so true when the news media spends so much time talking up concerns and goals that are common to no one you know, and tells you so little of what you need to know to prevent it from destroying your world. They’re not going to make “If you have a boss, you need a union” a headline on their front page.
It’s not “the newspaper” that’s failing, it’s the business that’s been built around media that used to be newspapers, and it’s failing for a number of reasons that have more to do with bad business decisions than anything else. Like Kos says:
Newspapers have refused to adapt, or they’ve pissed away money buying baseball teams, or they’ve squeezed the value out of their product by demanding 30 percent profit margins, or they’ve expanded at unsustainable rates, or all of the above.
Or they’ve swung so far to the right that everyone has started to twig that their content is part of the problem.