How X is responding to the sexual deepfake controversy
X users have been using Grok's image generation feature to 'undress' photos of ordinary women (New Year's selfies, cosplayers, etc.), turning them into bikini shots or borderline pornography. In the process there have even been serious cases of CSAM (Child Sexual Abuse Material) being generated, such as photos of minors being altered into micro-bikini shots. The victims (the original posters) only became aware of the situation when the altered images were brought to their attention, sparking a major controversy.
(Details: https://bbs.ruliweb.com/community/board/300143/read/73626220)

https://www.asiae.co.kr/article/2026011011132244802#summarynews_popup
X's response: image generation and editing have been restricted to $8 paid subscribers only. This applies solely to Grok within the X platform; the standalone Grok app is reportedly still free.
"Users are absolutely savage toward Elon Musk's 'solution,' accusing him of essentially monetizing digital sex crimes and acting like a rich version of a far-right troll."