Attorney General William Barr appears before the House Judiciary Committee on Capitol Hill, in Washington, U.S., July 28, 2020.
Matt McClain | Reuters
The Department of Justice proposed new legislation Wednesday to reform a key legal liability shield for the tech industry known as Section 230.
The draft legislation, which would need to be passed by Congress, focuses on two areas of reform. First, it aims to narrow the criteria online platforms must meet to earn the liability protections granted by Section 230. Second, it would carve out the statute’s immunity for certain cases, like offenses involving child sexual abuse.
Section 230 of the Communications Decency Act protects online platforms from liability for their users’ posts. But it also allows them to moderate and remove harmful content without being penalized.
The statute’s protections helped tech platforms grow from the early days of the internet but have come under scrutiny in recent years as lawmakers and regulators more broadly question the tech industry’s power.
Several lawmakers have proposed reforms to Section 230 in recent months and President Donald Trump signed an executive order in May targeting the law, claiming to crack down on alleged “censorship” by tech platforms. Trump introduced the order shortly after Twitter slapped fact-check labels on his tweets for the first time.
The Justice Department has been examining Section 230 for the better part of a year. Attorney General William Barr said at a conference in December 2019 that the department was “thinking critically” about Section 230, and it hosted experts in February to debate the merits of the law and discuss how it could be reformed.
The DOJ’s proposed reforms echo legislation already introduced by lawmakers. For example, the draft would tighten the standard tech companies must meet when removing content considered “obscene, lewd, lascivious, filthy, excessively violent,” raising it from a subjective good-faith belief to an “objectively reasonable belief.” A bill introduced by three powerful Republicans earlier this month includes the same standard and similarly narrows the types of content platforms would be protected for removing, such as content promoting self-harm or terrorism.
This story is developing. Check back for updates.