Apple largely stayed silent during the Grok controversy earlier this year, but a new report suggests it was far more involved behind the scenes than it let on.
According to a report from NBC News, Apple informed U.S. senators that both Grok and the X app were found to be in violation of App Store guidelines after a surge of complaints.
Apple pushed for fixes, rejected early updates
The controversy began when users discovered that Grok could generate sexualized deepfakes by editing photos, including images involving women and, in some cases, minors. The issue quickly escalated, with growing pressure on Apple to remove the apps from the App Store.
Instead of taking immediate action, Apple reportedly contacted the developers and asked them to outline a clear plan to fix content moderation issues. At the same time, it warned that failure to act could result in removal from the App Store.
An updated version of Grok was later submitted for review but rejected by Apple, which said the proposed fixes did not go far enough. A revised version of the X app was approved, however, while Grok required further changes before it met Apple’s standards.
Apple eventually cleared a later version of Grok after the company made additional improvements to its content moderation systems, according to the letter cited in the report.
Concerns remain despite improvements
Despite those updates, the issue does not appear fully resolved.
A separate NBC News report claims that, in certain cases, Grok can still generate sexualized images of individuals without their consent. While such output is reportedly less frequent than it was earlier this year, some users are still finding ways to bypass the restrictions.
The report also highlights that images can still be altered to make individuals appear in more revealing clothing, raising ongoing concerns around safety and enforcement.
These details offer a clearer picture of why Grok’s features changed rapidly during the backlash, even as Apple chose not to comment publicly at the time.