Until iOS 17, parents could remove the dangerous iMessage app called #images, but the latest release eliminates this control, exposing millions of kids to graphic sexual content.
Apple's texting app "Messages" comes with a feature that allows you to search for animated GIFs to insert into texts, including many images that are extremely inappropriate for children.
If you've never tried it, the #images iMessage app lets you search for animated GIFs. The image below shows the result of a search for "goats," but a less innocent search would fill the screen with images not suitable for children.
While the #images GIF search seems to block full nudity and certain explicit searches, it still grants access to thousands of sexually graphic images that most parents would never permit young children to see. Try searching for "lingerie" if you're skeptical, and remember that millions of children have iPhones.
iOS 17 does allow removing third-party iMessage apps through a new Settings area, but Apple's own apps, including #images, do not appear as options to disable.
What Should Parents Do?
- The simplest and most effective thing parents can do is not update, or let their kids update, to iOS 17 until (hopefully) Apple fixes this issue.
- If you've already updated, we recommend temporarily crippling the Messages app by using Screen Time's "App Limits" feature to allow it for only 1 minute each day.
- Help raise awareness of this issue by linking to or posting this article on social media, opening support requests with Apple, or anything else you can think of to make Apple aware of how big a deal this is for parents.
The Gertrude Mac app helps you protect your kids online with strict internet filtering that you can manage from your own computer or phone, plus remote monitoring of screenshots and keylogging. $5/mo, with a 60-day free trial. Start free trial →