X’s deepfake porn feature clearly violates app store guidelines. Why won’t Apple and Google pull it?

Eric


Since X’s users started using Grok to generate deepfake images that undress women and children, I have been waiting for what I assumed would be inevitable: X getting booted from Apple’s and Google’s app stores. The fact that it hasn’t happened yet tells me something serious about Silicon Valley’s leadership: Tim Cook and Sundar Pichai are spineless cowards who are terrified of Elon Musk.

Here’s the relevant Apple App Store developer guideline: “Apps should not include content that is offensive, insensitive, upsetting, intended to disgust, in exceptionally poor taste, or just plain creepy.” Huh! How about that.
 
Good old Tim … Sweeney, equating free speech with platforms being asked to enforce their own policies on revenge porn and CSAM.



I guess his half-a-billion-dollar fine for violating children’s privacy and using dark patterns to trick them into buying microtransactions was also government censorship.

Also, not for nothing, but this isn’t a case of a failed guardrail; the guardrails were deliberately removed, and the people in charge of them responded with “lol, lmao” to those asking what happened.

Also, there’s a deepfake nude ban sponsored by … Ted Cruz, which went into effect last May:



 