You may not have noticed, but last week Apple pulled the instant messaging app Telegram from its App Store for a few hours. The move came after ‘inappropriate content’ appeared on the app – in this case, child pornography that was being circulated among the app’s users.
Naturally, the discovery of the sickening crime led both Apple and Telegram to act fast; the latter implemented new safeguards against such content before the app was finally allowed back onto the App Store shelves.
Phil Schiller, who oversees Apple’s App Store, responded via email to 9to5Mac (which uncovered the reason behind the app’s takedown):
“The Telegram apps were taken down off the App Store because the App Store team was alerted to illegal content, specifically child pornography, in the apps. After verifying the existence of the illegal content the team took the apps down from the store, alerted the developer, and notified the proper authorities, including the NCMEC (National Center for Missing and Exploited Children).”
It’s not an exaggeration to say that Telegram is one of the most popular instant messaging apps. Its end-to-end encryption is designed to protect the privacy of messages between users. Unfortunately, that encryption can also be a double-edged sword, allowing shady users to pass around illicit content, as in this case. To date, the app has been downloaded more than 100 million times on mobile devices, and it can even be used independently on the desktop.
(Sources: 9to5Mac, Engadget, The Verge)