The newly signed Take It Down Act makes it illegal to publish nonconsensual explicit images – real or AI-generated – and gives platforms just 48 hours to comply with a victim’s takedown request or face liability. While widely praised as a long-overdue win for victims, experts warn its vague language, lax standards for verifying claims, and tight compliance window could pave the way for overreach, censorship of legitimate content, and even surveillance.
Blog
- Naukri exposed recruiter email addresses, researcher says
The recruitment platform fixed the email address exposure earlier this week.
- After Klarna, Zoom’s CEO also uses an AI avatar on quarterly call
Following Klarna’s CEO, Zoom’s CEO also opted to use an AI avatar for his initial comments during the company’s earnings call.
- Signal’s new Windows update prevents the system from capturing screenshots of chats
Signal said today that it is updating its Windows app to prevent the system from capturing screenshots, protecting whatever content is on display. The new “screen security” setting is enabled by default on Windows 11, and the company said it is designed to protect users’ privacy from Microsoft’s […]
- Fortnite returns to the U.S. App Store after a five-year gap
Popular battle royale game Fortnite has finally returned to the U.S. App Store amid game maker Epic Games’ lengthy legal battle with Apple. As of Tuesday, Fortnite is also available on the Epic Games Store and AltStore in the EU. It’ll show up in App Store searches soon, Epic said in a post on X. […]