Grok Supercharges the Nonconsensual Pornography Epidemic
This time last year, I conducted a study analyzing the technical features and marketing strategies of more than two dozen “undressing apps”—generative AI-powered tools that advertise their ability to transform images of fully clothed women into pornography.
I discovered tools that enable users to strip, pose, and digitally force victims into sexually explicit scenes “within moments” of uploading a single photo of the desired subject’s face. The most sophisticated apps also encouraged users to modify a victim’s physical appearance by altering the size of her breasts or adding tattoos to her skin. I found that for a few extra dollars, users could access premium features like video generation, allowing them to create explicit clips “shot from” various angles.
None of the tools I studied were as powerful, versatile, or easily accessible as Elon Musk’s Grok.
The controversial AI model that once described itself as “MechaHitler,” developed by xAI and integrated into X.com, is now at the center of a new Ofcom investigation and various other regulatory inquiries around the world. The scrutiny began shortly after the new year, when journalists raised flags that the model was on a “mass digital undressing spree.” X has since disabled Grok’s image-generation features for most free users, limiting them to paying subscribers. Two countries—Indonesia and Malaysia—have also restricted access to Grok.