The Department of Justice has asked an important question about Section 230: Is the rule that helped shape the internet as we know it due for an update?

Deputy Attorney General Jeffrey A. Rosen gave a speech about possible changes to Section 230 of the Communications Decency Act at the Free State Foundation's 12th Annual Telecom Policy Conference on March 10, 2020. “After 25 years, it seems that the time has come for Congress to assess what changes to Section 230 are now needed,” he proclaimed.

Rosen noted that Big Tech platforms have been given “carte blanche to selectively remove anything from websites for other reasons and still claim immunity” without oversight, a situation that may change if Congress takes up the reassessment he proposed.

What’s his solution? Focus on further defining the terms. “Perhaps there needs to be a more clear definition of ‘good faith,’ and the vague term ‘otherwise objectionable’ should be reconsidered.” Rosen went on to ask an important question that gets to the crux of the Big Tech bias problem:

Of course, platforms can choose whether or not to remove any content on their websites. But should they automatically be granted full statutory immunity for removing lawful speech and given carte blanche as a censor if the content is not ‘obscene, lewd, lascivious, filthy, excessively violent, or harassing’ under the statute?

Rosen briefly summarized some of the law’s key provisions: “Under Section 230, the social media site is not liable for what the user says, although the user themselves may still be liable. Section 230 also immunizes a website from some liability for ‘in good faith’ removing illicit user-generated content that is ‘obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable.’”

Rosen explained how and why Section 230 shaped the internet as we know it: “When Section 230 was enacted in 1996, it enabled the growth of platforms that hosted user-generated content without fear that doing so would expose the platform to massive civil liability as publishers or speakers of that content.” He emphasized that “Without Section 230, some say, the potential civil liability and cost of litigation could have forced companies to significantly curtail their user-generated content – or even to cease to exist altogether.”

As Bloomberg similarly summarized in its reporting, “Immunity from lawsuits over third-party content is crucial to their business models because they would otherwise need to hire armies of content checkers and could face burdensome legal challenges.”

Rosen also observed that “Under the Good Samaritan provision, platforms have the ability to remove content that they have a ‘good faith’ belief is ‘obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable.’”

The problem, however, is that these platforms have been given freedom without oversight, or as he described it: “a blank check, ignoring the ‘good faith’ requirement and relying on the broad term ‘otherwise objectionable’ as carte blanche to selectively remove anything from websites for other reasons and still claim immunity.”
