Grok's nonconsensual pornographic deepfakes almost led to an App Store ban


Apple reportedly threatened Grok owner xAI with an App Store ban if the deepfake nude generation issues weren’t addressed. In spite of ongoing problems with the chatbot, the app was never removed.

For several horrific days in January, social media platform X was flooded with AI-generated pornographic images involving non-consenting adults and minors. Many wondered why legal entities were slow to respond, but above all, why Apple was completely silent on the matter.

According to a new report from CNBC, shared by 9to5Mac, Apple did threaten to remove Grok from the App Store. While Elon Musk did eventually change moderation rules on X, after having monetized the illegal porn, the Grok app itself changed very little.

Apple rejected an app update for Grok on these grounds and shared its reasoning for the rejection. The letter was revealed for the first time on Tuesday.

“Apple reviewed the next submissions made by the developers and determined that X had substantially resolved its violations, but the Grok app remained out of compliance. As a result, we rejected the Grok submission and notified the developer that additional changes to remedy the violation would be required, or the app could be removed from the App Store.”

After some changes and back-and-forth between App Review and xAI developers, Apple determined that the app update could be approved. While the details of what exactly changed weren't shared, Apple called the app "substantially improved."

Shortly after Elon Musk challenged everyone to try to force Grok to make deepfake nudes of real people, they did just that. To this day, while it is more difficult, people are still finding ways around the blocks put in place to stem the flood of illegal deepfakes.

Apple’s guidelines aren’t always easy to interpret, but there is a rule that requires moderation of user-generated content. There was virtually zero moderation before, and there is at least some now.

Perhaps that was enough for Apple.

Apple’s management of the App Store isn’t always equal

If any other developer's app had been flooded with generated nudes of minors and nonconsensual porn, Apple would have booted it without question. In fact, Apple has removed apps for far lesser offenses without much deliberation.

Shortly after the X and Grok debacle, Apple was made aware of other deepfake porn apps on the App Store. At least 28 of them were quietly removed after a report surfaced about the problem.

However, the more powerful the entity behind an app, the harder it is to deal with.

ICEBlock was removed at the demand of the US government

Kicking ICEBlock after not-so-subtle demands from the US government was an easy decision. Kicking Elon Musk's pet apps over a flood of illegal content comes with far more red tape.

Not only is there a lot of money behind Musk, but a lot of government influence too. There's also the problem that many people incorrectly believe X is some kind of "free speech platform" protected by the First Amendment. (It isn't.)

While we'll never know exactly what decisions were made behind closed doors, what happened with X and Grok doesn't seem accidental. Apple made a conscious choice to let those apps keep operating in spite of the horrific content being shown.

Governments around the world have considered forcing Apple to enable App Review to operate as an external entity. Examples like these make that eventuality seem all the more likely.

Source: www.appleinsider.com