Last week, Vine updated its terms of service to prohibit sexually explicit material. While that might appease nervous parents, the decision merely placed a PR rug over a glaring stain on the company’s reputation: A Daily Dot investigation into Vine’s content revealed thousands of sexually explicit videos posted by minors, who in some cases may be younger than 12.
The social app owes a lot of its early success to teens. The service’s short-looped videos—locked in at 6 seconds—are a perfect avenue for the type of ephemeral self-expression and instant attention that many young people crave.
Vine’s social backbone, a stream of videos that can be re-shared with one click, is a launching pad for virality and online fame. Some of Vine’s most popular teen stars accumulate followings that dwarf those of legitimate celebrities: 16-year-old Lauren Geraldo, for instance, has 2.4 million followers and makes about $2,000 every time she revines a sponsor’s post.
Teens use Vine to make jokes, to post short confessionals, to capture the briefest moment of their lives. Some, however, are abusing the platform to post child porn.
One user, for instance, identified as an 11-year-old girl, posted a series of nude videos in which she performs sexual acts alone. Another, who appeared to be between 9 and 12 years old, repeatedly exposed herself while describing sexual intercourse in exchanges with someone identifying himself as a 32-year-old man. A third profile, dedicated to aggregating sexual and nonsexual videos of children, contained over 1,700 vines and was followed by 964 users.
The comments sections for the videos were equally troubling, loaded with explicit sexual language and frequented by Vine users who were clearly adults. Many attempted to lure the minors off Vine and into private chat rooms. Here’s just one of countless examples of the types of conversations taking place on the platform.
Making things worse, Vine provides no simple method for anonymously reporting the profiles.
That’s what prompted our tipster to contact the Daily Dot on Feb. 24.
“I’ve tried to report it,” he or she wrote, “but it’s not possible to do anonymously, so I decided against it.”
Reporting a single offensive post, meaning one of the videos itself, can’t be done anonymously, which is troubling given the content’s possible legal implications. It is otherwise easy, though: users just need to tap two buttons on their phone. Reporting an entire user, however, is another matter entirely.
Vine is a mobile app, but reporting a profile requires you to fill out a form on the company’s website. Moreover, the form asks for both your full name—not something most people want to provide when reporting child pornography—and the full Vine ID of the offending profile. That ID is not easy to find. On the app, the only way to see it is, unbelievably, to tap the “share profile” button from the menu. That loads up an email on your phone containing a link with the Vine ID. The profile isn’t actually shared until you hit “send,” of course. But most mobile email apps will save that draft on their servers, so someone who reports a number of profiles could end up with a collection of links to child pornography sitting in their email, unless they remember to delete the drafts.
And that’s almost beside the point. It is surely counterintuitive that, to report a profile posting child porn, users must first come within one step of actually sharing it. And the process is essentially useless for anyone attempting to file a complaint from a computer: the “Share this profile” option appears nowhere in Vine’s Web app.
Twitter, which bought Vine for $30 million in 2013, did not respond to a request for comment on this story. Instead, on March 6, after multiple Daily Dot inquiries, the company adopted a new set of rules regarding NSFW content. Pornography, as defined by the “Vine explicit sexual content FAQ,” means sexually provocative nudity, or nudity in a sexual context or setting, “like a strip club.”
In addition to amending Vine’s guidelines, Twitter is adopting a hands-on approach to content review. According to ZDNet, reported sexual content on Vine will undergo review by “teams that Twitter has been training specifically for consistency.” It’s unclear, however, what effect adding in-house moderators will have without significant modifications to Vine’s existing report system.
Even with the outright ban on sexual content, child porn hasn’t gone away. Some of the content the Daily Dot discovered, which in some cases dates back more than a year, remains on the site. And that’s at least partially because of the service’s dependence on users to report child porn, and the clunky system it provides to do so.
When we scrolled through the main #teens tag on Vine earlier this week, it wasn’t long until a flaccid penis appeared on the screen. It’s not clear if that user was a minor or an adult, though in addition to the #teens tag, the user had added #h0rbyt33n to the post. That user’s profile page, in turn, was little more than a stream of re-vined pornographic content from other accounts identifying themselves as teens.
Vine actually blacked out some of those clips, admonishing users with a warning: “This post may contain sensitive content. Click to view.”
The Daily Dot reported the content to the National Center for Missing and Exploited Children.
Since the posts weren’t removed outright, the blackout and warning were likely the result of automated software triggered after enough users flagged the post. That also suggests Twitter’s teams of content police are struggling to keep up with the service’s sheer volume of porn.
And their hands-on approach is missing the most obvious and urgent cases: Self-identified minors posting their own porn.
Illustration by Dell Cameron