Trump Signs Controversial Law Targeting Nonconsensual Sexual Content
The Take It Down Act requires platforms to remove instances of “intimate visual depiction” within two days. Free speech advocates warn it could be weaponized to fuel censorship.
US President Donald Trump signs an executive order. Photograph: Chris Kleponis

US President Donald Trump on Monday signed into law legislation nicknamed the Take It Down Act, which requires platforms to remove nonconsensual instances of “intimate visual depiction” within 48 hours of receiving a request. Companies that take longer, or that don’t comply at all, could face penalties of roughly $50,000 per violation.

The law received support from tech firms like Google, Meta, and Microsoft and will go into effect within the next year. Enforcement will be left up to the Federal Trade Commission, which has the power to penalize companies for what it deems unfair and deceptive business practices. Other countries, including India, have enacted similar regulations requiring swift removals of sexually explicit photos or deepfakes. Delays can lead to content spreading uncontrollably across the web; Microsoft, for example, took months to act in one high-profile case.

But free speech advocates are concerned that a lack of guardrails in the Take It Down Act could allow bad actors to weaponize the policy to force tech companies to unjustly censor online content. The new law is modeled on the Digital Millennium Copyright Act, which requires internet service providers to expeditiously remove material that someone claims is infringing on their copyright. Companies can be held financially liable for ignoring valid requests, which has motivated many firms to err on the side of caution and preemptively remove content before a copyright dispute has been resolved.

For years, fraudsters have abused the DMCA takedown process to get content censored for reasons that have nothing to do with copyright infringement. In some cases, the targeted content is unflattering or belongs to industry competitors the requestors want to harm. The DMCA does include provisions that allow fraudsters to be held financially liable when they make false claims. Last year, for example, Google secured a default judgment against two individuals accused of orchestrating a scheme to suppress competitors in the T-shirt industry by filing frivolous requests to remove hundreds of thousands of search results.

Fraudsters who may have feared the penalties for abusing the DMCA could find Take It Down a less risky pathway. The Take It Down Act doesn’t include a robust deterrence provision, requiring only that takedown requestors exercise “good faith,” without specifying penalties for acting in bad faith. Unlike the DMCA, the new law also doesn’t outline an appeals process for alleged perpetrators to challenge what they consider erroneous removals. Critics of the regulation say it should have exempted certain content, including material that can be viewed as being in the public’s interest to remain online.

Another concern is that the 48-hour deadline specified in the Take It Down Act may limit how much companies can vet requests before deciding whether to approve them. Free speech groups contend that could lead to the erasure of content well beyond nonconsensual “intimate visual depictions,” and invite abuse by the same kinds of fraudsters who took advantage of the DMCA.

Since it receives millions of DMCA takedown requests annually, Google has said in court papers that it “often must rely” on the “accuracy of the statements submitted by copyright claimants.” It’s difficult to imagine that the process would be any different for Take It Down, says Becca Branum, deputy director of the free expression project at the Center for Democracy and Technology. (CDT receives a minority of its funding from Google and other tech companies.)

“Platforms have no incentive or requirement to make sure what comes through the system is nonconsensual intimate imagery,” Branum says. Because it’s often cheaper and easier for companies to comply with requests than to investigate them, she says, more content may be removed from the internet than deserves to be. Branum points to another set of laws, passed by Congress about seven years ago to address sex trafficking content, which she argues also led to the removal of unrelated information from the web.

Under their existing takedown processes for nonconsensual intimate imagery, some tech companies require requestors to show government-issued identification confirming that they are the person being depicted. But advocates for victims say the rules unfairly burden legitimate requestors and jeopardize their privacy.

Take It Down doesn’t require identity verification, and it’s possible that burdensome processes for requesting removals could trigger FTC scrutiny. Critics of Take It Down may likewise call on the FTC—which is typically aligned with the president’s political party—to investigate companies that allow bogus requests to sail through.

Ted Cruz and Amy Klobuchar, the bipartisan pair of senators who helped shepherd Take It Down through Congress with little opposition, didn’t respond to requests for comment about issues that have been raised with the legislation. For members of Congress, passing the bill was imperative to protecting people, like the teenagers whose experiences helped shape it. The hope is that future victims will get their privacy back without delay.
