Last week, messaging app Telegram was suddenly pulled from the App Store for several hours without explanation, leaving many wondering what had happened to the highly popular messaging service. Today, Phil Schiller, Apple’s senior vice president of Worldwide Marketing, confirmed that the app was pulled from the store over child pornography fears.
Schiller reported that the App Store team was alerted to illegal content, specifically child pornography, within the Telegram app. Apple verified the claims through its own investigation and pulled the app from sale until the content had been removed.
Telegram has over 100 million active users and offers end-to-end encryption for its Secret Chats. It’s understood the illegal content was found within a third-party plug-in used by Telegram.
As shared by 9to5Mac, Schiller says Apple notified the NCMEC (National Center for Missing and Exploited Children) after verifying the existence of the illegal content…
The Telegram apps were taken down off the App Store because the App Store team was alerted to illegal content, specifically child pornography, in the apps. After verifying the existence of the illegal content the team took the apps down from the store, alerted the developer, and notified the proper authorities, including the NCMEC (National Center for Missing and Exploited Children).
Telegram CEO Pavel Durov issued the following statement…
We were alerted by Apple that inappropriate content was made available to our users and both apps were taken off the App Store. Once we have protections in place we expect the apps to be back on the App Store.