
As pressure mounts for Apple to pull the X app, xAI says Grok will stop undressing people

By Marcus Mendes | 9to5Mac

Hours after a coalition of digital rights, child safety, and women’s rights organizations asked Apple to “take immediate action” against X and Grok AI, xAI confirmed that Grok will no longer edit “images of real people in revealing clothing such as bikinis,” with significant carve-outs. Here are the details.

Apple faces renewed pressure to remove X and Grok from the App Store

In recent days, countless X and Grok users have been asking xAI’s chatbot to undress women and even underage girls, based on photos posted to X. While xAI initially described the situation as “lapses in safeguards,” Grok kept complying with multiple requests to edit images in such a way. This, in turn, led X to be blocked in several countries, and xAI to become the target of investigations in others.

In the meantime, Apple has been facing renewed pressure, from both senators and users, to remove the X and Grok apps from the App Store. Earlier today, a coalition of 28 digital rights, child safety, and women’s rights organizations submitted open letters to Apple and to Google, asking both companies “to take immediate action to ban Grok, the large language model (LLM) powered by xAI,” from their app stores.

From the open letter:

We, the undersigned organizations, write to urge Apple leadership to take immediate action to ban Grok, the large language model (LLM) powered by xAI, from Apple’s app store. Grok is being used to create mass amounts of nonconsensual intimate images (NCII), including child sexual abuse material (CSAM) - content that is both a criminal offense and in direct violation of Apple’s App Review Guidelines. Because Grok is available on the Grok app and directly integrated into X, we call on Apple leadership to immediately remove access to both apps.

And:

As it stands, Apple is not just enabling NCII and CSAM, but profiting off of it. As a coalition of organizations committed to the online safety and wellbeing of all - particularly women and children - as well as the ethical application of artificial intelligence (AI), we demand that Apple leadership urgently remove Grok and X from the App Store to prevent further abuse and criminal activity.

xAI says Grok will stop editing images, sort of

While Apple and Google have remained mostly silent since the issue began, prompting harsh criticism as well as speculation that they feared angering Elon Musk and even President Trump, xAI confirmed today that it will update the Grok account on X to address the problem, at least in part:

Updates to @Grok Account

We have implemented technological measures to prevent the Grok account from allowing the editing of images of real people in revealing clothing such as bikinis. This restriction applies to all users, including paid subscribers. Additionally, image creation and the ability to edit images via the Grok account on the X platform are now only available to paid subscribers. This adds an extra layer of protection by helping to ensure that individuals who attempt to abuse the Grok account to violate the...

Continue reading at 9to5Mac.

