Section 230 is finally getting the clear-eyed attention that it deserves. No longer is it naive to suggest that we revisit the law that immunizes online platforms from liability for illegality that they enable. Today, the harm wrought by the current approach is undeniable. Time and practice have made clear that tech companies do not have enough incentive to remove harmful content, especially when it generates likes, clicks, and shares. They earn a fortune in advertising fees from illegality like nonconsensual pornography with little risk to their reputations, and victims cannot sue the entities that have enabled and profited from their suffering. The question is how to fix Section 230. The legal shield enjoyed by online platforms needs preconditions. This essay proposes a reasonable steps approach born of more than twelve years of working with tech companies on content moderation policies and with victims of intimate privacy violations. I lay out concrete suggestions for that approach, one that has synergies with international efforts.

Citation
Danielle Citron, How To Fix Section 230, 103 Boston University Law Review 713–761 (2023).