Press "Enter" to skip to content

Don’t leave developers behind in the Section 230 debate

Last week marked the first time the U.S. Supreme Court reviewed Section 230 of the Communications Decency Act of 1996. Oral arguments in the Gonzalez v. Google case raised important questions about platform responsibility and the risks of viral content.

As the court grapples with these questions, we have an opportunity to reflect on why 230 was created in the first place, how it fosters innovation and what we all stand to lose if the protections embedded within 230 are narrowed.

Nicknamed the “26 words that created the internet” by Jeff Kosseff, Section 230 established a liability shield for platforms that host third-party content. In the nascent days of the internet, 230 created favorable legal conditions for startups and entrepreneurs to flourish, cementing the United States as a world leader in software.

While today’s tech landscape is dramatically different from the fledgling internet of the ’90s, the reasoning behind Section 230 still holds true. The architecture of the law can create the conditions for innovation; it can also chill it.

Arguments taking aim at the outsized influence of large social media platforms seem to lose sight of how Section 230 supports the broader online ecosystem, especially software developers. Developers are at the heart of our online world and at the forefront of creating solutions to global challenges, working to make the software that underpins our digital infrastructure more secure, reliable and safe.

Developers rely on 230 to collaborate on platforms like GitHub and to build and operate new platforms that rethink social media. Narrowing 230 protections could have far-reaching implications, introducing legal uncertainty into the important work of software developers, startups and the platforms that provide them with the tools to realize their vision. As policymakers consider how to address new frontiers of intermediary liability, it’s essential to center developers in decisions that will shape the future of the internet.

Software developers contribute significantly to the United States’ economic competitiveness and innovation, and they are important stakeholders in platform policy. GitHub counts 17 million American developers on our platform — more than from any other country. Their open source activity alone contributes more than $100 billion to the U.S. economy annually.

These developers maintain the invisible but essential software infrastructure that powers our daily lives. Nearly all software — 97% — contains open source components, which are often developed and maintained on GitHub.

As the chief legal officer at GitHub, a global community of over 100 million software developers collaborating on code, I know firsthand the importance of keeping 230 intact. While GitHub is a far cry from a general-purpose social media platform, it depends on 230 protections to both host third-party content and engage in good-faith content moderation.

That’s especially important for a platform that hosts over 330 million software repositories. Intermediary liability protections have allowed GitHub to grow while maintaining platform health. We take a robust, developer-first approach to content moderation that keeps our platform safe, healthy and inclusive while accounting for the unique environment of code collaboration, where the takedown of a single project can have significant downstream effects for thousands of other software projects.

When it comes to the specifics of Gonzalez v. Google, which asks the court to consider whether Section 230’s liability protections extend to third-party content recommended by algorithms, a ruling in favor of the petitioners could have unintended consequences for developers. Recommendation algorithms are used throughout software development in myriad ways that are distinct from their use on general-purpose social media platforms.

GitHub’s contributions to Microsoft’s amicus brief in the case outline our concerns: Algorithmic recommendations on GitHub connect users with similar interests, help them find relevant software projects and even suggest ways to improve code and fix software vulnerabilities. One such example is GitHub’s CodeQL, a semantic code analysis engine that allows developers to discover vulnerabilities and errors in open source code.

Developers are using GitHub to maintain open source projects that employ algorithmic recommendations to block hate speech and remove malicious code. A decision by the court to narrow 230 to exclude protection for recommendation algorithms could quickly ensnare a variety of societally valuable services, including tools that maintain the quality and security of the software supply chain.

A ruling in Gonzalez v. Google that pulls back protections benefiting social media platforms has the potential to affect a much broader community. In the lead-up to oral arguments, a host of amicus briefs emphasized the case’s far-reaching implications: from nonprofits (the Wikimedia Foundation) to community content moderators (Reddit and its moderators) to small businesses and startups (Engine).

While calls to narrow 230 focus mainly on putting Big Tech in check, doing so would unintentionally curb competition and innovation while creating additional barriers to entry for the next generation of developers and emerging providers.

These concerns are not hyperbole: In “How Law Made Silicon Valley,” Anupam Chander examines how the U.S. legal system created favorable conditions for internet entrepreneurship in contrast to Europe, where “concerns about copyright violations and strict privacy protections hobbled internet startups,” and Asia, where “Asian web enterprises faced not only copyright and privacy constraints, but also strict intermediary liability rules.”

Narrowing 230 wouldn’t just harm the United States’ global competitiveness; it would impede tech progress within the U.S. While GitHub has come a long way from our startup beginnings, we’re committed to leveling the playing field so anyone, anywhere, can be a developer.

As we await the court’s decision in Gonzalez v. Google, it’s worth noting that whatever the result, there will surely be more efforts to narrow 230, whether they take aim at algorithmic recommendations, AI or other innovations. While these new technologies raise important questions about the future of intermediary liability, policymakers must chart a path forward that creates a legal environment supporting the developers, startups, small businesses and nonprofits that power so many socially beneficial parts of the internet.

Policymakers concerned about reducing harmful content can look to how developers are leading the way in content moderation. Developers use GitHub to build valuable software projects, including open source content moderation algorithms that answer policymakers’ calls for algorithmic transparency on platforms, such as the Algorithmic Accountability Act of 2022 and the Algorithmic Justice and Online Platform Transparency Act.

Platforms including Twitter, Bumble and Wikimedia have used GitHub to share the source code for algorithms that flag misinformation, filter lewd imagery and block spam, respectively. Open source is spurring innovation in content moderation while offering new models for community participation, oversight and transparency.

As we encounter new frontiers in intermediary liability, policymakers should recognize the critical role of developers and work to support — not stifle — innovation.
