Apple and 500px app commingle in naked controversy

The 500px photo app is reportedly removed from the App Store, allegedly because its photo-sharing capabilities mean that you might search for and find naked bodies.

Chris Matyszczyk
Screenshot by Chris Matyszczyk/CNET

The naked body lobby is the most enthusiastic and powerful political movement in America.

It's clear that people in possession of naked bodies can harm society in a multitude of ways, warping the minds of children and adults and inciting acts of unspeakable cruelty.

Since the days of Steve Jobs, Apple has always been keen to ensure that naked bodies are strictly controlled, even though it's still quite hard to stop anyone from searching for them using, say, Safari.

Cupertino's latest step against naked bodies and the people who promote them is, purportedly, the removal of the 500px app, a Canadian creation that allows people to share photos that might include naked bodies.

As TechCrunch disrobes it, a new version of the app fell into the hands of an Apple reviewer, who tut-tutted at the alleged ease with which one (you) can search for carnal images.

500px's COO, Evgeny Tchebotarev, told TechCrunch that it's actually quite hard.

He claims it's far easier to casually search for the clothing-optional images on Tumblr and Instagram. With his app, you have to go to your desktop and explicitly opt for explicitness. The default is safe search.

He also casually mentioned that pornography is not allowed on the app and that he'd offered to make a quick fix to create an even more impenetrable shield.

He seems frustrated that the app had been in the App Store for 16 months and that this new version was neither more nor less "safe" than the previous one.

500px's Alex Flint seems equally frustrated. On Twitter he offered: "Just a reminder, while @500px's iOS app has been removed by Apple, our Android app is still readily available :-)"

It seems that sometimes Apple's app reviewers opt for twisting their undergarments rather than turning their minds toward a little common sense.

"The app was removed from the App Store for featuring pornographic images and material, a clear violation of our guidelines," Apple spokesman Tom Neumayr told CNET. "We also received customer complaints about possible child pornography. We've asked the developer to put safeguards in place to prevent pornographic images and material in their app."

One can never allow oneself to drop one's pants -- I'm sorry, I mean drop one's guard -- in the face of so much naked body proliferation. One simply never knows where it might lead.

Updated 2:56pm PT with comment from Apple.