A new Twitter-owned app named Vine was recently released in Apple’s App Store, and it has raised many eyebrows in the short time it has been live. Vine is a video-sharing app that lets you shoot up to six seconds of footage (similar to an animated GIF) and share it with followers on Twitter and Facebook.
The issue is that Vine allows pornographic clips to be shared, and one such video was accidentally placed in the Editor’s Picks section of the mobile app.
In a statement to CNN about the porn appearing in Editor’s Picks, Vine said, “A human error resulted in a video with adult content becoming one of the videos in Editor’s Pick, and upon realizing this mistake we removed the video immediately.”
Vine is only available to iOS users, and given the content that has been displayed in the app, questions have been raised as to why Apple hasn’t pulled the plug on it. Just recently, Apple banned the app 500px, which featured artistic images of nudity; because it gave users access to sexual content, it got canned.
It’s surprising that Apple would apply a double standard in a situation like this; compared to the 500px app, I would take art any day over pornographic clips. It’s also shocking that the app hasn’t been pulled, as Vine is rated 12+ on iTunes for infrequent/mild sexual content or nudity. I didn’t realize it’s okay for a 12-year-old to be exposed to pornographic videos before they’ve even had the talk about the birds and the bees.
Vine is community-moderated: a clip isn’t marked as “adult” content until enough people report it. That means many people, including children, may be exposed to the content before enough users flag it.
So Apple, can you please explain this double standard? Like many others, I am at a loss as to how Apple makes these decisions.
Update: Vine has banned some hashtag searches to help with the problem, but even with this new filter in place, it took us five seconds to run across porn.