I AM GRANDO
Aug 20, 2006

Parakeet vs. Phone posted:

Yep, I'm struggling to find the article right now but the guy that directed Tickled found groups of people tricking kids into making fetish porn for Youtube at least a year ago and reported it all. It was around the time that the other weird and creepy kids videos got big.

Edit: Found it, bit of :nms: since it is talking about a pedophile network, but really similar stuff apparently happening 2 years ago. https://thespinoff.co.nz/society/21-11-2016/hello-my-name-is-ally-how-children-are-being-exploited-by-youtube-predators/.

The gross mastermind behind the tickling stuff who died while the movie was being made had a bunch of twitter accounts with videos of tickling stuff, and the videos are still up years later.


MonsieurChoc
Oct 12, 2013

Every species can smell its own extinction.
This just reminds me of Call me lucky.

https://www.youtube.com/watch?v=3FChmOC-Qjw

Plus ça change, plus c'est pareil.

Edit: gently caress, Crimmins died last year. I didn't know that. He didn't even get to close down the Catholic Church.

MonsieurChoc fucked around with this message at 01:11 on Feb 19, 2019

Skippy McPants
Mar 19, 2009

Hiro Protagonist posted:

Quick question: what could actually be done about this? Not defending Google at all, I'm just amazed at pedophiles' ability to find ways to skirt around each and every system put in place to prevent them from being the horrible people they are. It reminds me of all those videos of kids doing ASMR, where it becomes hard to distinguish between a dumb kid loving around on YouTube and purposeful child exploitation. Also, will admit I didn't see the full extent of the video, as it was too much.

What can be done is to hire a robust moderation staff who can take appropriate action to bust up these kinds of communities whenever they coagulate on a platform. What will happen is nothing, because hiring those people costs money, and unless this stuff gets signal-boosted into national attention, Google won't bother. The tragically ironic thing is that the very same algorithm that makes it so easy to collate all this poo poo also makes it super easy to trace for anyone actually looking, but nobody is looking.

Edit: and let's be clear here. The pedophiles, Nazis, rapists, and other extreme transgressives are not clever. They're parasites. Cultural nomads who will migrate to wherever moderation is lax. When one place that's long harbored them finally gets around to purging—reddit, 4chan, and hell even SA if you go back far enough—they simply pick up stakes and relocate.

Skippy McPants fucked around with this message at 02:31 on Feb 19, 2019

AnEdgelord
Dec 12, 2016

Antifa Turkeesian posted:

The gross mastermind behind the tickling stuff who died while the movie was being made had a bunch of twitter accounts with videos of tickling stuff, and the videos are still up years later.

I think there was an episode of The Dollop on this guy

Max Wilco
Jan 23, 2012

I'm just trying to go through life without looking stupid.

It's not working out too well...

Archer666 posted:

https://www.youtube.com/watch?v=3rCTpwLdYQA

Civvie engages in an old-time tradition that all boomer gamers engage in every so often: blitzing through the original DOOM.

If he wanted a really authentic emulation of the classic Doom experience, he should have used the Chocolate Doom sourceport. :colbert:

I didn't know the Berserk pack effects persisted through the whole level, or that there was that shortcut in E1M6. I believe the pistol accuracy trick also applies to the chaingun.

Mr.Radar
Nov 5, 2005

You guys aren't going to believe this, but that guy is our games teacher.
Summoning Salt has a new video on blindfolded runs of Mike Tyson's Punch-Out:

https://www.youtube.com/watch?v=iZT6JEOC3D8

Terrible Opinions
Oct 18, 2013



Skippy McPants posted:

What can be done is to hire a robust moderation staff who can take appropriate action to bust up these kinds of communities whenever they coagulate on a platform. What will happen is nothing, because hiring those people costs money, and unless this stuff gets signal-boosted into national attention, Google won't bother. The tragically ironic thing is that the very same algorithm that makes it so easy to collate all this poo poo also makes it super easy to trace for anyone actually looking, but nobody is looking.

Edit: and let's be clear here. The pedophiles, Nazis, rapists, and other extreme transgressives are not clever. They're parasites. Cultural nomads who will migrate to wherever moderation is lax. When one place that's long harbored them finally gets around to purging—reddit, 4chan, and hell even SA if you go back far enough—they simply pick up stakes and relocate.
It would only take a few hundred techs without college degrees. At most you'd want A+ certification and maybe some experience showing they can deal with customers.

tracecomplete
Feb 26, 2017

I would be a little wary of saying it would "only" take a few hundred people. Maybe if you could isolate the problem to the English-speaking world, but probably not even then--YouTube's so big it's hard to wrap your head around the scope of poo poo that needs to be reviewed.

There's also the problem of what happens to these people. Facebook's content moderation team, as gong-show awful as it is, ends up breaking a lot of people's brains. More are probably needed just to spread it out.

(edit: to be clear, it is very doable and they need to do it, this is not an excuse)

Terrible Opinions
Oct 18, 2013



I'm not suggesting a few hundred people to review every single video, but to refer anything the algorithm flags for obvious red-flag words, plus anything reported by users. I work in a very similar industry, though one requiring significantly more technical knowledge, and it isn't hard to automate large portions of the review process and get the cases to your techs. The issue is that youtube doesn't want to pay for full-time positions, and doesn't really want to remove the child porn videos, or any videos for that matter, because that reduces their number of ad impressions.
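The triage being described here, machine flags on cheap signals, humans make the actual call, can be sketched roughly like this. This is purely illustrative: the term list, the report threshold, and the field names are all hypothetical, not anything YouTube actually uses.

```python
# Illustrative triage sketch: flag items on keyword or user-report signals,
# then queue only the flagged items for human review.
# RED_FLAG_TERMS and REPORT_THRESHOLD are hypothetical placeholders.
RED_FLAG_TERMS = {"example_flagged_term", "another_term"}
REPORT_THRESHOLD = 3  # hypothetical: user reports before auto-queueing

def needs_human_review(title: str, comments: list[str], report_count: int) -> bool:
    """True if either signal fires: a red-flag term or enough user reports."""
    text = " ".join([title, *comments]).lower()
    keyword_hit = any(term in text for term in RED_FLAG_TERMS)
    return keyword_hit or report_count >= REPORT_THRESHOLD

def triage(videos: list[dict]) -> list[dict]:
    """Return the subset of videos a human reviewer should look at."""
    return [v for v in videos
            if needs_human_review(v["title"], v.get("comments", []), v.get("reports", 0))]

queue = triage([
    {"title": "innocuous video", "reports": 0},
    {"title": "video with example_flagged_term", "reports": 0},
    {"title": "reported video", "reports": 5},
])
print(len(queue))  # 2 of the 3 sample items get queued
```

The point of the sketch is the shape of the pipeline: the automation only decides *what humans see*, not the final outcome, which is why the staffing cost is unavoidable.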

Skippy McPants
Mar 19, 2009

AFashionableHat posted:

(edit: to be clear, it is very doable and they need to do it, this is not an excuse)

Yeah, I don't think anyone pretends that it's an easy problem to solve, but it is chilling the degree to which all social media and content platforms simply ignore the problem for lack of any incentive to do otherwise.

Terrible Opinions posted:

The issue is that youtube doesn't want to pay for full-time positions and doesn't really want to remove the child porn videos or any videos for that matter because that reduces their number of ad impressions.

I'm pretty sure they want to get rid of it, and they probably have algorithms which are quite good at scraping the really heinous stuff out of the platform. The issue is that when things get even a little bit grey, the automated system falls apart. Like, in that linked video most of the stuff inside the "wormhole" might be innocuous considered in isolation and removed from context. Random videos of little kids playing around aren't inherently suspect. But when they're grouped into massive interconnected archives of those videos, time-stamped at suggestive poses and filled with comments which disclose children's personal information or even link out to literal child pornography, then you have something utterly vile and exploitative, and also something that requires a human to parse and formulate a response.

Edit: I think we're all in agreement that Google, Amazon, Facebook, et al. want moderation to be fully bot-driven—'cause it's cheap—and won't hire the necessary staff unless forced by laws or social pressure.

Skippy McPants fucked around with this message at 05:16 on Feb 19, 2019

sexpig by night
Sep 8, 2011

by Azathoth

DEEP STATE PLOT posted:

hopefully the fbi actually gets involved because entirely gently caress this poo poo

yea just off the thumbnail I thought this would be more a 'well see there are loving freaks who COULD jerk off to basically anything with a child in it who isn't covered head to toe in a bedsheet, so we need to be better about talking to our kids about how 'youtube fame' isn't really a good goal to have and to be careful about who they interact with and all'.

Noooope this is just 'well see there are loving freaks who figured out if they VERY CAREFULLY curate their child porn to line up with EXTREMELY NARROW fetishes they can just have the biggest web platform in the world cater to them and they can just tell little girls how beautiful their feet are and for some reason that's ok???'

Terrible Opinions
Oct 18, 2013



Skippy McPants posted:

Yeah, I don't think anyone pretends that it's an easy problem to solve, but it is chilling the degree to which all social media and content platforms simply ignore the problem for lack of any incentive to do otherwise.


I'm pretty sure they want to get rid of it, and they probably have algorithms which are quite good at scraping the really heinous stuff out of the platform. The issue is that when things get even a little bit grey, the automated system falls apart. Like, in that linked video most of the stuff inside the "wormhole" might be innocuous considered in isolation and removed from context. Random videos of little kids playing around aren't inherently suspect. But when they're grouped into massive interconnected archives of those videos, time-stamped at suggestive poses and filled with comments which disclose children's personal information or even link out to literal child pornography, then you have something utterly vile and exploitative, and also something that requires a human to parse and formulate a response.

Edit: I think we're all in agreement that Google, Amazon, Facebook, et al. want moderation to be fully bot-driven—'cause it's cheap—and won't hire the necessary staff unless forced by laws or social pressure.
I genuinely think that youtube resists this because they straight up want to keep even the most heinous horrible poo poo so long as advertisers don't pull out. They might want to get rid of it now that it's causing a ruckus but only because of that ruckus. Youtube would keep up ISIS execution videos if it got ad impressions.

Acute Grill
Dec 9, 2011

Chomp
If you want to get heinous poo poo removed from Youtube (well, demonetized) then report the video to the corporation whose ads played before it. Google's not gonna suddenly grow a conscience and stop profiting off offensive content, but image-obsessed corporate brands will absolutely pull their ad support if they're associated with something that hurts their general appeal.

Skippy McPants
Mar 19, 2009

Terrible Opinions posted:

I genuinely think that youtube resists this because they straight up want to keep even the most heinous horrible poo poo so long as advertisers don't pull out. They might want to get rid of it now that it's causing a ruckus but only because of that ruckus. Youtube would keep up ISIS execution videos if it got ad impressions.

I dunno, this feels like a different phenomenon than, say, those content farms that churn out nonsensical procedurally generated cartoons. That is entirely about ads, but this seems more like a question of opportunity costs. Again going back to the linked video, most of what he found wasn't monetized. He did find some ads, but the majority are just taking up space while generating zero revenue.

Youtube would probably be happy to see them gone but figuring out which videos, users, and communities need purging is well beyond the capabilities of their automated system. An actual solution requires people, people who need payment. So it's cheaper to let the videos remain, even though they're fueling something repugnant.

Skippy McPants fucked around with this message at 06:26 on Feb 19, 2019

Puppy Time
Mar 1, 2005


"Youtube" is a platform run by multiple people with multiple levels of power, so it's silly to talk about "what Youtube wants," since it's either incapable of wanting anything, or it's a group that probably don't all agree on anything.

ArfJason
Sep 5, 2011
Yeah, and while im willing to believe the guy who posted earlier about how he works in a similar industry and how youd only need to check for algorithmically detected red flags, i cant imagine how many people itd take to go through that much video and text

Skippy McPants
Mar 19, 2009

Puppy Time posted:

"Youtube" is a platform run by multiple people with multiple levels of power, so it's silly to talk about "what Youtube wants," since it's either incapable of wanting anything, or it's a group that probably don't all agree on anything.

You're being pedantic. Obviously, I'm using it as shorthand for "the people who own, administrate, and direct the platform's content restrictions and terms of service." I just don't think it's reaching to say that, broadly, corporate social media and content providers would rather pedophiles don't congregate on their sites. That said, they are also quite willing to turn a blind eye if policing those elements cuts into their bottom line.

ArfJason posted:

Yeah, and while im willing to believe the guy who posted earlier about how he works in a similar industry and how youd only need to check for algorithmically detected red flags, i cant imagine how many people itd take to go through that much video and text

Probably more than you'd think, and however many it takes, they're obviously not interested in retaining them. Like, maybe there hasn't been a meeting at Google HQ where someone sits the Board of Directors down and says, "okay, this is how much money we have to spend to keep this one group of awful people off our platform," but I guarantee they've had lots of meetings about how awesome algorithmic moderation is, how much money they can save by not needing live mod teams, about how any gaps are minor considerations, and whatever other self-serving platitudes they have to come up with to keep shoving money into their portfolios while avoiding anything resembling responsibility.

Terrible Opinions
Oct 18, 2013



Skippy McPants posted:

You're being pedantic. Obviously, I'm using it as shorthand for "the people who own, administrate, and direct the platform's content restrictions and terms of service." I just don't think it's reaching to say that, broadly, corporate social media and content providers would rather pedophiles don't congregate on their sites. That said, they are also quite willing to turn a blind eye if policing those elements cuts into their bottom line.
In fairness to Puppy Time, I was referring to the aggregate of shareholders that those administrators report to. So we were talking past each other.

Skippy McPants posted:

Probably more than you'd think, and however many it takes, they're obviously not interested in retaining them. Like, maybe there hasn't been a meeting at Google HQ where someone sits the Board of Directors down and says, "okay, this is how much money we have to spend to keep this one group of awful people off our platform," but I guarantee they've had lots of meetings about how awesome algorithmic moderation is, how much money they can save by not needing live mod teams, about how any gaps are minor considerations, and whatever other self-serving platitudes they have to come up with to keep shoving money into their portfolios while avoiding anything resembling responsibility.
It's pretty sad that we can't hold one of the biggest companies on earth to the same standards as pornhub.

Parakeet vs. Phone
Nov 6, 2009
The potentially damning bit from the video is that Youtube is supposed to have an algorithm to catch predators in the comments and auto-disable the comment section if it gets predatory. A few of the videos had comments auto-disabled, which means that something probably triggered that alert, but the videos were still up.

It's not even that Youtube needs human moderators for every video. They don't seem to have anyone to catch when poo poo like this is taking root. It has to be a big story before they even look.

I AM GRANDO
Aug 20, 2006

A friend of mine was a human content monitor for facebook. She lasted a month before burning out.

Ghostlight
Sep 25, 2009

maybe for one second you can pause; try to step into another person's perspective, and understand that a watermelon is cursing me



Puppy Time posted:

"Youtube" is a platform run by multiple people with multiple levels of power, so it's silly to talk about "what Youtube wants," since it's either incapable of wanting anything, or it's a group that probably don't all agree on anything.

youtube is a legally recognised person and it is as valid to discuss the conglomerate decisions of its constituent human composition as a singular personal desire of that entity as it is to discuss the wants of a human being working under the direction of their constituent cellular composition. the negotiated desires of the parts form the enacted desires of the whole.

Psychepath
Apr 30, 2003
I don't think it's a stretch to say that a video of 8 year olds eating popsicles that gets a million views should be looked at by a human for any of a hundred red flags instead of an algorithm. None of the videos the dude clicked on had a single bit of value to anyone who should be choosing what to watch on youtube, so above like 60 views on a child sitting on their bed should be an automatic flag.

Parakeet vs. Phone
Nov 6, 2009
Also, the algorithm had no trouble stringing all those videos together in a recommendation chain. Theoretically, if a human had looked at just one, the whole thing would have been cracked open right away and they could have worked on chopping it down. But it's easier/cheaper to just ban anything with "cp" as a tag and pray that works.
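That "one video cracks open the whole thing" observation is just graph traversal: if recommendation links connect the cluster, a breadth-first walk from a single flagged video enumerates the rest. A toy sketch, with a made-up stand-in graph rather than real recommendation data:

```python
# Sketch: trace a cluster through recommendation links by breadth-first
# search from one seed video. `recommended` maps video id -> recommended ids
# and is a hypothetical stand-in for real recommendation data.
from collections import deque

def trace_cluster(seed: str, recommended: dict[str, list[str]]) -> set[str]:
    """Return every video reachable from `seed` via recommendation links."""
    seen = {seed}
    frontier = deque([seed])
    while frontier:
        vid = frontier.popleft()
        for nxt in recommended.get(vid, []):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return seen

# Toy graph: one flagged video leads to the rest of its cluster.
graph = {"a": ["b", "c"], "b": ["d"], "c": ["d"], "d": []}
print(sorted(trace_cluster("a", graph)))  # ['a', 'b', 'c', 'd']
```

Which is the irony the posts above keep circling: the platform already computes these links to serve recommendations, so following them is cheap for anyone who bothers to look.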

Psychepath
Apr 30, 2003
I'm super curious how many clicks of "not interested" it would take to get back out of the hole and what kind of secondary nightmare you'd eventually end up in, like how many degrees of child porn do you get separated from before you're safe for work.

RareAcumen
Dec 28, 2012




I can't wait for Youtube to axe all the videos of not-pedos and White supremacists, just like tumblr when this got brought up.

Absurd Alhazred
Mar 27, 2010

by Athanatos

RareAcumen posted:

I can't wait for Youtube to axe all the videos of not-pedos and White supremacists, just like tumblr when this got brought up.

Honestly, I'm surprised they haven't called the FBI on this guy for bringing it up.

Fil5000
Jun 23, 2003

HOLD ON GUYS I'M POSTING ABOUT INTERNET ROBOTS
For fans of Hbomb's charity stream, Mermaids has just had it confirmed that the National Lottery grant funding (that Graham Linehan campaigned to get withdrawn and which inspired the whole stream in the first place) is going to continue. So that's nice.

RareAcumen
Dec 28, 2012




Fil5000 posted:

For fans of Hbomb's charity stream, Mermaids has just had it confirmed that the National Lottery grant funding (that Graham Linehan campaigned to get withdrawn and which inspired the whole stream in the first place) is going to continue. So that's nice.

Yep!

https://twitter.com/Mermaids_Gender/status/1097867934750396418

https://www.youtube.com/watch?v=-YCN-a0NsNk

sexpig by night
Sep 8, 2011

by Azathoth

Fil5000 posted:

For fans of Hbomb's charity stream, Mermaids has just had it confirmed that the National Lottery grant funding (that Graham Linehan campaigned to get withdrawn and which inspired the whole stream in the first place) is going to continue. So that's nice.

Nice, so they get the money they were gonna get and all the donations?

Good work, Graham, really stuck it to 'em.

JordanKai
Aug 19, 2011

Get high and think of me.


Fil5000 posted:

For fans of Hbomb's charity stream, Mermaids has just had it confirmed that the National Lottery grant funding (that Graham Linehan campaigned to get withdrawn and which inspired the whole stream in the first place) is going to continue. So that's nice.

Awesome! :yayclod:

ManlyGrunting
May 29, 2014
I'm over the moon; that stream meant so, SO much to me personally. More than a year after discovering I was trans, it was the first time I genuinely felt proud of it.

Dan, since I've seen you post here: thank you so, so much for helping facilitate the stream (and Casey of course), as well as all the trans voices that were on that stream (a good amount of which would fit easily in this thread come to think of it).

DEEP STATE PLOT
Aug 13, 2008

Yes...Ha ha ha...YES!




ahahahahahaha get hosed linehan

DoubleCakes
Jan 14, 2015

Thanks to Dan, Casey and the rest of Teeth Gang! And a big thank you to Graham! Without him, none of this would have happened! :cheers:

Junpei Hyde
Mar 15, 2013




We did it.

We beat beaver bother

Thompsons
Aug 28, 2008

Ask me about onklunk extraction.
laffo graham and all those other shithead terfs have got to be loving apoplectic right about now

Fil5000
Jun 23, 2003

HOLD ON GUYS I'M POSTING ABOUT INTERNET ROBOTS

Thompsons posted:

laffo graham and all those other shithead terfs have got to be loving apoplectic right about now

Predictably, his timeline is even more of a sewer than usual. Between this and Don Cheadle's t-shirt on SNL, he's having a great time.

Kim Justice
Jan 29, 2007

Get hosed Glinner, you loving oval office.

Fabricated
Apr 9, 2007

Living the Dream
lol, good

x1o
Aug 5, 2005

My focus is UNPARALLELED!
I can't believe I watched a 54 minute video about punting in the NFL, but god drat does Jon Bois make great videos

https://www.youtube.com/watch?v=F9H9LwGmc-0


rujasu
Dec 19, 2013

TheHeadSage posted:

I can't believe I watched a 54 minute video about punting in the NFL, but god drat does Jon Bois make great videos

Same. It's not one of my favorite Bois videos, but it's still good.
