Barr says they will encourage “responsible” moderation of content
Calling the immunity “outdated,” the Justice Department has released a set of proposed reforms for Sec. 230 of the Communications Decency Act, which currently shields web sites from civil liability for their moderation, or non-moderation, of most third-party content. DOJ said the section was “ripe for reform.”
The reforms would essentially prevent web sites from removing anything but illegal content by changing the language that currently allows a site to remove “otherwise objectionable” material without civil liability, so that immunity would cover only the removal of “unlawful” material or material that “promotes terrorism.”
Currently Sec. 230 immunity applies to “any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene…or otherwise objectionable.”
The “otherwise objectionable” language in the section has allowed for content moderation according to a site’s own standards for its online community, which can vary from site to site, as it does between Twitter, which moderates political content, and Facebook, which does not.
The President targeted the section in an Executive Order last month, enlisting the Commerce and Justice Departments, and, if it agrees, the FCC, in an effort to regulate online content to prevent what the President says is censorship of conservative speech.
But Justice said its recommendations come after a 10-month review, respond to bipartisan concerns about the scope of the immunity, and are meant to get at “illicit material.” The President’s order, by contrast, was targeted at political speech.
But the order does get at all forms of speech by proposing to read “otherwise objectionable” out of the protection, since obscenity is already illegal (though itself a judgment call), as is, by definition, “unlawful” conduct.
The reforms include a clarification of “good faith,” one of the central issues in the President’s Executive Order. Trump wants the FCC to come up with “the conditions under which an action restricting access to or availability of material is not ‘taken in good faith,’” and so would violate a web site’s terms of service.
Among the proposed reforms are: 1) a carve-out for “bad actors who purposefully facilitate or solicit content that violates federal criminal law or are willfully blind to criminal content on their own services,” which would include child abuse, terrorism, and cyber-stalking; 2) “a statutory definition of ‘good faith’ to clarify its original purpose. The new statutory definition would limit immunity for content moderation decisions to those done in accordance with plain and particular terms of service and consistent with public representations. These measures would encourage platforms to be more transparent and accountable to their users”; and 3) a “clarification” that Sec. 230 does not apply to civil actions brought by the government.
It would also “clarify” that Sec. 230 is not immunity from antitrust claims and that removal of content consistent with terms of service does not render an online platform a publisher or speaker for the other content on its service.
There are definitely bipartisan concerns about issues like sex trafficking and the fomenting of terrorism.
“These reforms are targeted at platforms to make certain they are appropriately addressing illegal and exploitative content while continuing to preserve a vibrant, open, and competitive internet,” said Attorney General William Barr. “These twin objectives of giving online platforms the freedom to grow and innovate while encouraging them to moderate content responsibly were the core objectives of Section 230 at the outset.”
Barr said the government can’t delegate the protection of Americans’ safety “purely to the judgment of profit-seeking private firms.”