






















YouNow is the weirdest, most fascinating video streaming site. Unlike Periscope or Meerkat, you can pay a teen while he sings, dances, or even sleeps.
As Meerkat and now Periscope are being touted as a possible future of news, YouNow is the livestreaming video app where teens are flocking. If Meerkat and Periscope are competing for the eyeballs of news junkie adults on Twitter, YouNow has already won with the hordes of young people who just want to hang out with each other.
Lately, I've been enjoying a deeply creepy yet technically totally innocent new activity: lying in bed at night and watching random teens sleep. I've been doing it on YouNow, a mobile app and web live-streaming app that's a hit with teens. On its popular #sleepingsquad hashtag, I can see about 20 sleeping teens at any given time. (It usually seems around 50-plus people are broadcasting in the hashtag, but a lot of them are in complete darkness, so you can't actually see anything. Because, you know, they're sleeping.)
Some teens sleep with light music on. Some are completely silent. And some, eerily, have the distinctive soft breathing sounds of sleep.
I don't know exactly why a teen would broadcast themselves sleeping. I can't ask them.
I have asked other teens (or younger — I talked with kids as young as 10) why they use YouNow, a real-time video broadcasting app. The problem with asking a 13-year-old why they do anything is that it's quite difficult to get anything past "I dunno/I'm bored." But that's also the wrong question to ask. Why climb Mount Everest? Why tweet? Do adults really ever have a better answer than "I was bored" for anything we do? The aching desire to cut through the tedium of daily life with human interaction is the driving force of everything on the internet. In fact, boredom is such an integral raison d'être of teen life that #bored is one of the top channels on YouNow.
I chatted with the other people watching in the #sleepingsquad: Why? One girl watching a sleeping teen boy with me gave a reasonable response: "He's my boyfriend." Others had elliptical reasoning: "I think it's more that the people doing it want to get likes and fans."
Adi Sideman, the founder of YouNow, told me his theory on #sleepingsquad: "It's the addiction to the internet, it's the addiction to social media, it's not wanting to leave it behind even when you're sleeping." Andy Weissman of Union Square Ventures, who is invested in the app, described it as "an online slumber party" in an email to BuzzFeed News. "I also think part of the human condition is to look for connection with others. And this is probably more acute with younger people."
The app is sort of like Vine meets Chat Roulette meets The Gong Show. You can watch people live-streaming in different channels like "Musicians," "Dancing," or "Girls" and chat feedback or questions to them. If you really like a performer, you can tip them with points purchased with real money through the app, and the performer receives the tip as real money. YouNow's revenue model is built entirely around this tipping system; the company takes a cut of the in-app purchases when fans buy points to tip the performers.
It's basically like an open-mic night where the hat is passed around: Some people will watch for free, some will toss a dollar in, and the house takes a cut at the end of the night. Currently, there are no plans to introduce ads. "We're happy with our current revenue model," said Sideman.
Fandom doesn't have a price on other platforms, like Vine or YouTube, where teen stars are made — ad-supported videos eliminate the need for financial transactions between the watchers and the watched. I asked Sideman why these mostly young users (70% are under 24, according to Sideman) would actually pony up cash to enjoy someone playing an Ed Sheeran cover instead of just enjoying an Ed Sheeran cover for free.
"Most of the fans just enjoy and chat and interact. Some of the fans want to stand out and want to participate more in, really, the content creation," said Sideman. "Because think about it — from a theoretical standpoint this thing is as much about the audience as it is about the broadcast. And that's really our focus — to let everybody participate and create content together. So if I tip or if I send a message and he incorporates it into what he's doing, we're collaborating."
The chat section for these popular YouNow stars moves fast — paying to pin your message to the top gets their attention.
During the day, the #sleepingsquad disappears. Musicians, performers, and cute, charming teens dominate. I checked out the kids in the #truthordare channel. This is where a distinct knot in my stomach kicked in. These were often young girls, seemingly ages 10–15, playing a sexually suggestive game with strangers. Coming up with harmless dares and G-rated truths is tough. So I did a few would-you-rathers instead:
For a dare, I dared the teen girls to lip-synch to a Taylor Swift song of their choosing (A+ dare, FYI. Feel free to use that one). One of them rolled her eyes and said she didn't like Taylor Swift — you could see the teen embarrassment of not wanting to like the thing that her peers liked — and offered to lip-synch instead to a parody of "Blank Space" by the YouTube star Shane Dawson.
Teens, let me give you a word of advice from a cool adult: Liking Shane Dawson is way more embarrassing than liking Taylor Swift.
After I had dared the third girl into singing a T. Swift song, I realized… this is really fun . It was a nostalgic rush to watch these girls lip-synch along to a pop star from inside their bedrooms — an activity that I have done not infrequently myself. It didn't feel creepy or wrong; it reminded me of a fun slumber party, exactly as the venture capitalist Andy Weissman described (though I maintain I am more qualified than him to judge similarities to a teen girl sleepover).
I've been thinking hard about my teen self and whether this would have appealed to me. I was shy, and I don't think I would have liked the performative nature of it, but it's hard to compare, given how normalized this technology is for kids now (for perspective, Myspace didn't exist until I was out of college). The kids on YouNow seem to represent the full social map of the lunchroom: theater kids, hot popular girls, nerds, randos, short show-off-y boys in snapbacks. The difference between YouNow and the real lunchroom is that you can pay to sit at the popular kids' table if you want.
A girl in the "dance" category receives 50 "thumbs up" points from a fan, while a guy streams in the "guys" category.
Undoubtedly, there is something extremely worrisome about the vulnerability of children on the site. Sideman has his own knowledge of the dangers of adult predators: While in NYU film school in the mid-1990s, he produced Chicken Hawk, a documentary about the notorious NAMBLA (North American Man/Boy Love Association). It was shown at the New York Underground Film Festival, and a write-up in the Los Angeles Times called it "coldly objective" (the film is not at all supportive of NAMBLA). A 2001 article in New York magazine about the New York tech scene mentions him in not entirely flattering terms (the article is an amazing read as a time capsule of the tech bubble; I can't recommend it enough). The author, Steve Fishman, chronicles his year of trying to get a karaoke website off the ground, and Sideman was involved as a business partner.
"I didn't speak to Steve, who is now a friend, for a few years after that. I was upset he wrote that my loft smelled like beer," he told me, chuckling. Adi, a former Israeli military paratrooper in his forties, wears a tight T-shirt over a henley and jeans and has funky glasses. He's likable and animated and offered me a cocktail at the office. He does not seem like someone whose loft would smell like beer.
Admittedly, as nervous as I felt for these kids' safety, I never saw anything weird or overtly sexual or harmful on YouNow. No one was exploiting the tipping system for stripping, and I didn't observe anyone behaving inappropriately in the chat feature that runs along the side. YouNow employs a team of both in-house and outsourced content moderators.
"We have a large responsibility because it's live and because it's very popular with teens," said Sideman. "We invest a lot in our community management. We invest a lot in trust and safety in multiple languages to make sure that this is a safe place, and I'm very happy to say it is."
The broadcasters themselves didn't seem to worry either.
"Do you worry if there's creepy people on here?" I asked an 11-year-old girl.
"Do your parents know you use this app?"
"Does that matter? No. It doesn't matter. They don't know."
Katie Notopoulos is a senior technology reporter for BuzzFeed News and is based in New York. Contact this reporter at katie@buzzfeed.com.









Photo illustration by Lisa Larson-Walker. Photo by Clem Onojeghuo/Unsplash.
Every social media network has its underbelly, and the one on Periscope, Twitter’s live-video app, might be uglier than most: On any given day, users appear to flock to broadcasts from minors and encourage them to engage in sexual and inappropriate behavior. Worried Periscope users have been ringing the alarm for more than a year, and Twitter has reaffirmed its zero-tolerance policy against child exploitation after reporters have followed up. But if the company has been working any harder to enforce that policy, its efforts don’t appear to have scrubbed out the grime.
Last month, a tipster described to me how some Periscope users were routinely pursuing children who had logged on to the platform to play games like truth or dare with others. It took pseudonym-cloaked commenters less than six minutes to persuade a girl, broadcasting recently with a friend and playing truth or dare on a public forum, to lift her shirt and show her breast. “Fully out,” typed one user, right before the girl revealed herself. “But with shirt up…” instructed another, before the girl did it again. The girls, both of whom had braces and appeared to be younger than 18, said they loved to roller-skate, mentioned their homeroom class, and said they didn’t know what an “underboob” was after being asked to show some. It’s not clear whether the users directing the girls were also minors or were adults. But whatever the age of the commenters, their behavior was in violation of Periscope’s standards, which bar users from engaging in sexual acts and “directing inappropriate comments to minors in a broadcast.”
In another alarming video, a pair of girls who described themselves as sisters (one said she was 14, and the other appeared to be several years younger) were asked to show their bras and their underwear and pressured by multiple commenters to continue to strip. “Dare y’all to play rock, paper, scissors, and loser has to flash,” said one viewer, after both girls had already shown their underwear.
Launched in 2015, Periscope makes it easy for anyone to start a broadcast that others can watch live and send comments to the broadcaster while he or she is filming. Commenters can also send broadcasters hearts to show that they’re enjoying the live content. As you Periscope, you can see the comments and hearts in response to your stream. There is also a private stream function, which is only available to users who follow each other. In incidents like the ones described above, commenters routinely ask the young broadcaster to follow them, perhaps hoping to engage in a private video stream.
Although concerned Periscope users have been alerting the company that some people were using its app to coax children into inappropriate behavior for more than a year—and in July, the BBC even aired an investigation into how users on Periscope were pressuring children with sexually explicit messages—children and teenagers can still be swamped with requests from viewers to do things like take off their shirts and pants, show their underwear, show their feet, kiss other kids, do handstands, and answer lewd questions. In other words, it’s clear the company hasn’t figured out how to solve the problem. In response to the BBC’s reporting, Periscope said, “We have a strong content moderation policy and encourage viewers to report comments they feel are abusive. We have zero tolerance for any form of child sexual exploitation.”
It’s not that Periscope hasn’t done anything. On Nov. 27, about five months after the BBC report, Periscope rolled out an update to its reporting tool that allows users to flag potentially inappropriate content. The updated tool includes a category for “child safety,” as well as a way to flag “sexually inappropriate” comments by users talking to broadcasters on livestreams. In that announcement, Periscope said that since “the beginning of 2017, we have banned more than 36,000 accounts for engaging or attempting to engage inappropriately with minors.” This announcement, however, came in the form of a post on Medium (where Periscope only has 116 followers), which the company tweeted out five days after publishing it, after updating it to add details on the new reporting tools. In the app itself, there was no announcement or indication that the new feature existed that I’ve been able to find, suggesting that many Periscope users might be unaware of the updated reporting tool.
I contacted Periscope on Nov. 30 to ask about explicit interactions with minors on the platform and what the company is doing to solve the problem. In response, Periscope encouraged me to report any problematic videos found in the future and said that it has “a team that reviews each and every report and works as quickly as possible to remove content that violates our Community Guidelines.” I then asked about the size of the team, which Periscope said in its recent Medium post is expanding, and asked for more information about what else the company is doing about this kind of content. I haven’t heard back but will update this piece if I do. I also asked the Department of Justice if it was aware of and had taken any actions regarding this activity on Periscope. A spokeswoman said, “As a matter of policy, the U.S. Department of Justice generally neither confirms nor denies the existence of an investigation.”
In its Medium post Periscope did say that it’s “working to implement new technology” that is supposed to help detect accounts that are potentially violating the company’s policy and improve the reporting process—though at the moment, it’s not clear whether that software is running or the company is relying on user reporting alone. (When pressed on that question, Periscope did not respond.) Due to the live nature of the videos, it’s probably hard for Periscope to know exactly when a new one pops up that features a minor and attracts predatory commenters, though the platform has, in the past, removed live broadcasts while they were happening. “Unless they’ve got keywords down really tightly to know what constitutes a grooming message, … automated detection may be a little harder to do just via existing algorithmic tools,” Thomas Holt, a criminal justice professor at Michigan State University who specializes in computer crimes, told me. That means that having a reporting feature to help target accounts for removal is critically important, as is having staff to review the user reports. But, according to Holt, the efficacy of those reporting tools depends on how much users are even aware they exist. Kids might not even know when a pedophile is attempting to lure them into sexual acts, or even that it’s wrong and should be reported. And again, even a strong reporting regime clearly isn’t enough.
Videos of children being lured into sexual or inappropriate behavior on Periscope can rack up more than 1,000 views. The videos tend to follow a pattern: Once the stream starts, dozens of Periscope users flock into the comments, as if they had been alerted either on Periscope or via a separate forum outside of Periscope, suggesting some level of coordination. This type of swarming is common, according to Holt: “Multiple people will often start to send sexual requests, questions, or content in an attempt to exert a degree of social pressure on the person to respond to a request.” This makes the request seem more normal, Holt says, and can manipulate a child to respond to a sexual request to please the group.
One place within Periscope that had become a hive for this kind of misbehavior was the “First Scope” channel, which curated streams from people using the platform for the first time, according to Geoff Golberg, a former active Periscope user who has been vocal in calling attention to the problem of inappropriate behavior directed toward minors on the app. That channel was removed in November, months after Golberg sent emails to the company (which he tweeted out) about the potential of minors being sexually exploited in the channel.
While it’s good that Periscope is taking some degree of action, Holt says that the risk posed by virtually every social media platform—particularly ones that are more reliant on images than text, since text is easier to patrol with software—means it’s critically important for parents to understand what their kids are doing when they’re online, and to have conversations with them about what apps they use, what constitutes bad behavior, and how to report it. Periscope isn’t the only popular social media site struggling to moderate how kids use the app. Last month, the New York Times reported how the YouTube Kids app hosted and recommended videos with disturbing animations of characters killing themselves and committing other violent acts. On Periscope, though, the dangers are heightened because of the live, instant nature of the broadcasts, which can put a mob of predators in conversation with children before there’s time to intervene.