What, Exactly, Is Facebook?

In the days leading up to and following this election, Facebook has been called lots of things — “a website,” “an internet company,” “a major player in the media universe,” “a strange new class of media outlet,” a “tech behemoth,” a “cesspool of nonsense.” Vox cut to the chase, calling on Facebook to “admit that it is, in fact, a media company” observing “that the design of its news feed inherently involves making editorial decisions, and that it has a responsibility to make those decisions responsibly.”

Even though Facebook continues to deny its role as part of the media — “News and media are not the primary things people do on Facebook,” Mark Zuckerberg wrote in a Facebook response on Monday, “so I find it odd when people insist we call ourselves a news or media company in order to acknowledge its importance” — some 44 percent of Americans now get news from Facebook, according to the Pew Research Center. In a series of tweets the day after the election, New York Times columnist Zeynep Tufekci wrote, linking to journalist Joshua Benton at Nieman Lab, “Facebook's algorithm is central to how news & information is consumed in the world today, and no historian will write about 2016 without it.”

Whether or not we deem it a media organization, Facebook will not, in the foreseeable future, wear that badge. But as a “new source of journalism” (a term that's recently cropped up), should it be expected to meet Fourth Estate obligations? And, if so, how does it do so responsibly? And if it refuses, should we tax Facebook and other platforms to fund quality journalism?

The foundations of the Fourth Estate, fortified by the First Amendment, rest, in large part, on the idea of checks and balances. In brief, the press is, in theory, watchdog, civic forum, and agenda-setter, holding elected officials to account and bound by longstanding liability laws. In the words of Joseph Pulitzer, the press “should always fight for progress and reform; never tolerate injustice or corruption; always fight demagogues of all parties…always oppose privileged classes and public plunderers; never lack sympathy with the poor; always remain devoted to the public welfare…”

By contrast, the foundations of Facebook, and other new sources of journalism, rest in large part on the ideals of what John G. Palfrey, the former executive director of the Berkman Klein Center for Internet & Society at Harvard University, called the Open Internet. The early Open Internet, imagined as a place apart from the law and real life, was reinforced by Section 230(c) of 1996's Communications Decency Act, the so-called Good Samaritan provision. As writer and activist Soraya Chemaly and I have previously chronicled, Section 230 is widely cherished as the “most important law on the Internet,” credited with making possible a “trillion or so dollars of value,” according to David Post, legal scholar and fellow at the Center for Democracy and Technology.

The provision reads: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” Those 26 words put free speech decisions into private hands, effectively immunizing platforms from legal liability for nearly all user-posted content; the exceptions are materials, such as child pornography, that violate federal law. In other words, Facebook and Google enjoy the benefits (and ad revenue) of being members of the media without any of the risk. Asking them to voluntarily declare themselves media companies seems more and more like a fool's errand, unlikely to inspire substantive change.

Nicco Mele is a technologist, former deputy publisher of the Los Angeles Times, and now director of the Shorenstein Center on Media, Politics and Public Policy at the Harvard Kennedy School — an organization founded to study the media and their role in a democratic society. Mele believes that even if tech companies eventually label themselves media entities, it'll be a decade or so from now.

“These companies want to say they're tech companies,” he said, “and they are. But they're also media companies.” First, the bulk of their revenue is generated from ads and user attention. Second, they have a disproportionate ability to shape the public sphere. Mele argues that these companies, as a result, should be judged against other media companies. “They are not building Roombas,” he said. “If you shape public opinion, you do have special responsibilities. It's why we have the Pulitzer Prize, to motivate better behavior, and why, ultimately, we had the Fairness Doctrine.”

There are good reasons why Facebook doesn't want to be a media company, and those reasons, he said, are not simply legal. It's a matter of brand management, talent, revenue, and regulation, in that order. “It's cooler to be a tech company,” he said. And consumers want cool. Tech status also helps attract talent: “Legacy media have reputations as bad places to work.” As for the business model, if Facebook is defined as media, its ad revenue will be scrutinized, and, Mele predicted, “regulation will force them to address how they meet the public sphere, which will increase [employee] headcount and increase rates.”

In his 2016 book Free Speech, Timothy Garton Ash calls Facebook and Google superpowers, built exclusively on a profit model, absent the moral and legal mechanisms of accountability that exist for traditional media. They control vast privately owned public spaces. They lack the formal lawmaking authority of sovereign states, and their leaders are not accountable to their users. “Yet their capacity to enable or limit freedom of information and expression is greater than most states.”

“New media,” he writes, “live in constant tension between public service they offer” — freedom of expression and information — “and the private profit they pursue.”

And the tension has never been higher. News is only one subset of Facebook's content. In his Saturday night Facebook post, Zuckerberg himself seemed to conflate a self-serving impetus to keep users on the platform with the company's public service. “Our goal is to show people the content they will find most meaningful,” he wrote, “and people want accurate news.”

So how should platforms engage with the critical role that journalism serves within a democracy short of bearing the mantle of a media organization?

Among the proposed answers: the British Media Reform Coalition (MRC) and the National Union of Journalists are pushing the British Parliament to amend the Digital Economy Bill now under consideration to include a 1 percent levy on “large digital intermediaries” — Facebook and Google in particular — to fund nonprofit sources of investigative reporting like the Bureau of Investigative Journalism, ProPublica, or the BBC. In other words, even if platforms can't be forced to take on the accountability of acting as media, we have the power to make them fund journalism.

Des Freedman, London-based former chair of MRC and author of The Contradictions of Media Power, is among the many arguing that Facebook and Google are media companies, “even if they deny the case.” “What we see in Facebook and Google are utterly decentralized technologies organized in the most unbelievably centralized commercial structures,” he said earlier this week. “Some of the oil brands of the last century would be quite jealous of their position.” It is, he concludes, “productive and legitimate for the public to demand that they make a contribution.”

Indeed, it appears the public already understands the value of contributing to support journalism. On Monday, Nieman Lab reported that donations and subscriptions spiked post-election at The Atlantic, ProPublica, The New York Times, and The Washington Post. ProPublica saw donations jump as election results rolled in, and a tenfold increase in the days following. What contribution looks like from the platforms' perspective — what form it takes, legal or nonlegal — is another question.

“A levy is a very European thing, a welfarist redistribution system,” Freedman was quick to acknowledge, and, as such, perhaps unlikely to gain traction in the US. And even if a journalism tax were imposed, taxing platforms to fund media under a Trump presidency opens the door to a host of new concerns. Yet, he emphasized, in a Bernie Sanders-like echo, asking corporations to contribute publicly for the privilege of profit is “a traditional form of seeking to equalize power and speech rights.”

“Google,” he noted, “has regularly made contributions.” In 2015 the company launched its Digital News Initiative (DNI) after European regulators accused it, in the words of one Guardian report, of “distorting internet search results and acting anti-competitively.” Google has committed some 150 million euros in Europe to date. The initiative is billed as “a collaboration between Google and news publishers in Europe to support high quality journalism and encourage a more sustainable news ecosystem through technology and innovation,” and Freedman considers it a start in the right direction.

Maggie Shiels, formerly of the BBC and currently in corporate communications at Google, said, “We are striving to be better partners with publishers across the board, and recognize the value that quality journalism plays in the world today.” Though she declined to comment on the specifics of the levy proposal, she did say that Google is supporting quality journalism “on multiple fronts.” “Our News Lab team,” 10 or so in-house employees, some former journalists like herself, “trains tens of thousands of journalists every year from around the world on our tools — all free.” In a summary of programs she forwarded, Shiels reported that Google is collaborating with news organizations, through its News Lab program, on using data to tell stories through Google's Trends tool. It is a founding member of the First Draft News coalition, “set up to raise awareness and find solutions in all aspects of social news gathering and verification.” Google is also working with the industry and others on the Trust Project, which, in her words, “explores how to make trustworthy journalism stand out.”

Like Google, Facebook declined to comment on the specifics of the levy proposal. Facebook's journalism-related developments include participation in the First Draft News Coalition, a group of 20 news organizations, including The Telegraph, The New York Times, The Washington Post, and Agence France-Presse, designed to improve reporting from social media and to address fake news. Facebook plans to work with First Draft to develop a training program for journalists worldwide. In the last few weeks, on its Facebook for Journalists site, Facebook introduced online training, available through Blueprint and drawing on a bank of reporting case studies. In October, Facebook updated its Signal offerings to include Live video for journalists, and is beginning to see live content generated as a result.

It's early, but it's worth noting that Facebook does not yet appear to be contributing discrete funds to support outside sources of investigative journalism the way Google does through its DNI initiative. Instead, it appears to be using its platform to increase use and traffic, educate journalists, and produce more content. MRC's Des Freedman hopes Facebook will one day adopt Google's approach.

He also hopes the approach will evolve toward something more sustainable, like a permanent levy. “It's not about protecting [existing media],” Freedman said, “but nurturing new forms of journalism,” forms that reach local areas where reporting has all but dried up and that represent issues important to vast, currently underreported swaths of the population. And these new forms need to be transparent, with transparent processes for distribution of funds. “We don't want to replace one form of unaccountability with an equally opaque form of journalism. Otherwise, you're just making the same mistakes.”

Whether or not private-public journalism partnerships ever take root, a growing number of experts, academics, pundits, and policymakers (and there were already many) are forcefully calling for transparency and accountability, no matter how tech companies move forward.

Some, including Jeff Jarvis, journalism professor at the City University of New York, and Edward Wasserman, dean of the University of California, Berkeley journalism school, have suggested Facebook hire more editors and journalists to help curate and manage its news feeds and algorithms. This, of course, raises even more questions. As Freedman asked: On what basis will these journalists and editors verify stories? What editorial guidelines will underpin verification? It is yet more proof that Google and Facebook are not neutral intermediaries but increasingly important media players with major responsibilities in the emerging news environment.

Others, including Safiya U. Noble, Sarah T. Roberts, and Zeynep Tufekci, call for significant AI reform to address outcomes like the one reported in Monday's Washington Post, headlined: “Google's top news link for 'final election results' goes to a fake news site with false numbers.” Facebook struggles with a similar challenge. Writes Tufekci, “Facebook could tweak its algorithm so that it does less to reinforce users' existing beliefs, and more to present factual information… Facebook should also allow truly independent researchers to collaborate with its data team to understand and mitigate these problems.” And if the company employs human decisions around news, it could explain those decisions publicly as well. Garton Ash recommends kitemarking all media providers, akin to food labeling, covering such information as editorial process, standards applied, and ownership, and also paying close attention to competition policy.

In any case, as Des Freedman said, “These are early days.” He continued, “Let's not kid ourselves. Even if this [levy] model is successful, it's still really important to stress that recent events concerning, for example, Brexit and Trump, are political crises that have journalistic implications.” As The New York Times' Tufekci observed here in the US, “Mass media trivialized the election, social media inflamed it. But underlying it all: elite failure in responding to global turbulence.”

It's conceivable that Zuckerberg is in the midst of responding to turbulence of his own. As of press time, he had yet to publicly respond to his renegade employees' charges on Monday that fear fuels Facebook's news operation. “I am confident we can find ways for our community to tell us what content is most meaningful,” he wrote in Saturday's Facebook post, “but I believe we must be extremely cautious about becoming arbiters of truth ourselves.”

“Identifying the 'truth,'” he wrote, “is complicated.”

It's easier when pushed. On Monday, six days after Trump was elected the 45th president of the United States, after six days of media investigation and damning headlines, both Google and Facebook announced plans to start confronting the problem of fake and malicious news on their platforms. The first step? They're following the money, and restricting advertising on sites that publish hoaxes and lies and call them news.

Soraya Chemaly contributed to this story.

This post originally appeared at The Verge and is posted here with permission.

About the reporter

Catherine Buni


Catherine Buni is a freelance writer and editor focusing on technology, health, and gender.