A law professor serving on the EFF’s board of directors (and advisory boards for the Electronic Privacy Information Center and the Center for Democracy and Technology) offers this analysis of “the push for stricter rules for internet platforms,” reviewing proposed changes to the liability-limiting Section 230 of the Communications Decency Act — and speculating about what the changes would accomplish:
Short of repeal, several initiatives aim to change section 230. Eleven bills have been introduced in the Senate and nine in the House of Representatives to amend section 230 in various ways… Some would widen the categories of harmful conduct for which section 230 immunity is unavailable. At present, section 230 does not apply to user-posted content that violates federal criminal law, infringes intellectual property rights, or facilitates sex trafficking. One proposal would add to this list violations of federal civil laws.
Some bills would condition section 230 immunity on compliance with certain conditions or make it unavailable if the platforms engage in behavioral advertising. Others would require platforms to spell out their content moderation policies with particularity in their terms of service (TOS) and would limit section 230 immunity to TOS violations. Still others would allow users whose content was taken down in “bad faith” to bring a lawsuit challenging the removal and be awarded $5,000 if the challenge was successful. Some bills would impose due process requirements on platforms concerning removal of user-posted content. Other bills seek to regulate platform algorithms in the hope of stopping the spread of extremist content or in the hope of eliminating biases…
Neither legislation nor an FCC rule-making may be necessary to significantly curtail section 230 as a shield from liability. Conservative Justice Thomas has recently suggested a reinterpretation of section 230 that would support imposing liability on Internet platforms as “distributors” of harmful content… Section 230, after all, shields these services from liability as “speakers” and “publishers,” but is silent about possible “distributor” liability. Endorsing this interpretation would be akin to adopting the notice-and-takedown rules that apply when platforms host user-uploaded files that infringe copyrights.
Thanks to Slashdot reader Beeftopia for sharing the article, which ultimately concludes: Notice-and-takedown regimes have long been problematic because false or mistaken notices are common and platforms often quickly take down challenged content, even if it is lawful, to avoid liability…
For the most part, these platforms promote free speech interests of their users in a responsible way. Startup and small nonprofit platforms would be adversely affected by some of the proposed changes insofar as the changes would enable more lawsuits against platforms for third-party content. Fighting lawsuits is costly, even if one wins on the merits.
Much of the fuel for the proposed changes to section 230 has come from conservative politicians who are no longer in control of the Senate.
The next Congress will have a lot of work to do. Section 230 reform is unlikely to be a high priority in the near term. Yet, some adjustments to that law seem quite likely over time because platforms are widely viewed as having too much power over users’ speech and are not transparent or consistent about their policies and practices.