The doctored video spreading on social media that falsely depicts House Speaker Nancy Pelosi as ill or drunk is a wake-up call: even as the 2020 presidential campaign heats up, there is still no playbook for how social media companies should respond to disinformation campaigns. While YouTube removed the video, Facebook did not, deciding only to reduce its ranking and attach a notice that additional reporting is available.
This episode will only buttress the calls of Facebook co-founder Chris Hughes and Sen. Elizabeth Warren to break up the company using re-invigorated U.S. antitrust law. It will fortify the insistence on tougher privacy laws, especially if the Federal Trade Commission’s settlement of Facebook’s ongoing privacy violations is perceived to be an insufficient deterrent.
More competition and privacy controls are necessary, but they won’t fix the problem of disinformation spread by foreign actors and political operatives. Instead, we should be looking to another policy tradition for more tailored remedies: media policy.
While social media companies and digital networks are relatively new, the problems of information laundering and manipulation are not. Before Facebook and Twitter, newspapers and broadcasters posed similar threats—and new norms and policy solutions were developed and adopted to advance the public interest over hidden influence and concentrated power. While the rules of analog media cannot be grafted onto cyberspace, the concerns and principles behind them can and should inform how we address the challenges of today.
In every period of communications and political upheaval, from the Communications Act of 1934 to the 1947 Hutchins Commission on Freedom of the Press to the 1967 Public Broadcasting Act, the challenge was to make media serve democracy rather than undermine it. Newspapers, and then broadcast and cable news programs, imposed disclosures on themselves, including the separation of news from opinion, mastheads that revealed ownership and management, editorial codes and standards, and rules on conflicts of interest. But as the years went on and political debate, advertising and news consumption began to move online, what remained of these protections was never extended to the new digital ecosystem. It grew up, and remains to this day, largely unregulated.
An understanding of the First Amendment that considered listener/viewer interests as well as broadcaster interests allowed regulators to impose additional requirements on broadcasters. These included programming and ownership reports—one purpose of which was to build firewalls against the spread of any single ideology or point of view. Broadcasters were required to reveal sponsors of content and maintain public files of political ads so that opponents could buy equivalent time at the same price. Broadcast stations got licenses in order to serve local communities and were expected to produce news as a “public interest” obligation. For a time, they even had to cover both sides of political issues under the Fairness Doctrine. Public broadcasting and the Corporation for Public Broadcasting were an attempt to fill the information and education gap that was arising as broadcasting bent to the commercial bottom line. In the 1990s, the Children’s Television Act limited advertising targeted at children, and also required broadcasters to air programming that “meets the educational and informational needs of the child audience.”
Today there is no question that the Internet is our new media gatekeeper: almost as many people get their news from the Internet as from television, according to Pew Research, and 40 percent of Americans think the Internet plays an integral role in American politics, per the USC Annenberg School for Communication and Journalism. And social media suffers from the very vulnerabilities the architects of 20th-century media policy feared: an opaque information ecosystem with centralized control, enfeebling local journalism, amplifying propaganda, and weakening our capacities for self-government.
Digital information platforms are vulnerable to influence laundering. Bots, fake accounts and click farms pretend to be people they’re not and create a false sense of consensus. Shallow fakes, like the edited Pelosi video, and deep fakes that use artificial intelligence to create media moments that never were trick people into false realities. The platforms, which are designed to keep users online to be served ads, end up privileging engagement over truth or the public interest. What drives engagement is often outrage and disgust, so this is what the algorithm rewards.
Meanwhile, the platforms have absorbed the ad revenue that once supported journalism. Google and Facebook together capture 60 percent of the digital advertising market. As a result of ad revenue losses and other factors, the Pew Research Center found that the number of people employed in newsrooms fell by 45 percent across the United States between 2004 and 2017, with hundreds of newspapers folding, including many dailies. What news organizations invest in credibility is quickly washed away in a sea of look-alike content that borrows the signals of credible journalism without paying the price. Now, the conspiracy blog looks just like the proven news organization in the online monoculture of content presentation. The consequences of this lack of transparency and de-emphasis of credible (especially local) news would not have been hard to forecast for either those who developed journalistic standards or those who wrote U.S. media laws in the middle of the last century.
Of course, verbatim application of 20th-century media policy won’t work for today’s digital environment; some of it didn’t work very well last century either. But its core concerns should be taken seriously and its principles—especially transparency, responsibility and structural design to promote news investment—can be adapted for the 21st century.
First, the platforms should provide more information on the supply chain of content, labeling fake audio and video as well as bots and fake accounts. Users should have options for how to structure the algorithms that recommend content to them. And third-party researchers should have access to engagement and advertising data, so that users can understand who is manipulating public discourse.
Second, platforms should develop more specific, transparent policies for taking down or dampening the distribution of widely spread and demonstrably false content—as well as incitements to violence, online harassment, and terrorist content—assuring rights of appeal. If the companies won’t take more aggressive action, Congress might encourage them. It could narrow the provision of law that today exempts the digital platforms from liability for most content by converting that exemption to a safe harbor; the safe harbor would be available if they develop more robust and transparent methods to deal with toxic viral content. This is how the law works to secure the rights of copyright holders—certainly, rights to be free from predatory disinformation, incitement and harassment are no less vital than property rights.
Third, platforms should provide at least the same degree of transparency for online political ads as broadcasters do. Congress should enact the bipartisan Honest Ads Act and enhance it by requiring that donors to the groups running the ads be revealed as well. Users should also be informed if their political views are being collected to micro-target them with ads (as Europe's privacy law requires), and we should consider going further and simply forbidding the micro-targeting of users with different political ads.
Fourth, we should imagine what the new PBS and CPB of the Internet would look like. Surely it would include a fund for local journalism, provisions for data and scientific research, and mechanisms to support public access to civic information and interaction with government. Another crucial component of the old public media—rarely noted in public discourse—was the satellite capacity and reserved broadcast channels that freed public media from relying on commercial infrastructure. Investment in alternative public infrastructure like servers, broadband and even independent social media would bring to the online world the independence and local control 20th-century media policy considered essential for true freedom.
And fifth, to implement and enforce evidence-based policies with both the transparency needed to assure public accountability and the flexibility to change with technology, a new agency is necessary. Congress clearly lacks the resources and sufficient technical expertise. The FTC lacks not only staff and resources but also the authority and a mandate to focus on democratic debate, not just consumers. This agency could also foster competition as the old electronic media ownership caps did—including by empowering users to port content and have conversations across different social media platforms—and implement a new comprehensive privacy law as well.
As we open the book on addressing digital information platforms' effects on our polity, it would behoove us to pick up the last century's discussions of media policy and attempt to ensure citizens have the information they need to participate in their democracy.
Ambassador Karen Kornbluh is senior fellow and director of the German Marshall Fund of the United States’ Digital Innovation Democracy Initiative. Ellen Goodman is director of the Rutgers Institute for Information Policy & Law.