In a move that’s baffling at best and appalling at worst, Facebook has been busted asking users whether they think it’s alright for adults to solicit “sexual pictures” from minors on its platform. While this may sound ridiculous on the surface (because it is), it nevertheless happened.
On Sunday, the social media behemoth sent surveys out to a group of its users with questions on the issue of child grooming, the process of adults befriending children for the purposes of sexual abuse or other nefarious ends like trafficking and prostitution.
https://twitter.com/JonathanHaynes/status/970235172355477505
“There are a wide range of topics and behaviours that appear on Facebook,” began one of the questions. “In thinking about an ideal world where you could set Facebook’s policies, how would you handle the following: a private message in which an adult man asks a 14-year-old girl for sexual pictures.”
Respondents’ answer options ranged from “this content should be allowed on Facebook, and I would not mind seeing it” to “this content should not be allowed on Facebook, and no one should be able to see it.” Survey takers were also allowed to select that they had “no preference” on the subject.
In a follow-up question, the tech company asked users who the arbiter of such content and behavior should be. Answer options ranged from “Facebook decides the rules on its own” to “Facebook users decide the rules by voting and tell Facebook.” Others involved getting input from outside experts.
Strangely, neither of the two questions gave survey takers the choice to suggest that law enforcement should be alerted to the situation.
It didn’t take long for the media to catch on. The digital editor for the Guardian, Jonathan Haynes, flagged the issue on Twitter. He got a response from Facebook’s VP of Product, Guy Rosen, who called the inclusion of such questions a “mistake” that shouldn’t have happened:
“We run surveys to understand how the community thinks about how we set policies. But this kind of activity is and will always be completely unacceptable on FB. We regularly work with authorities if identified. It shouldn’t have been part of this survey. That was a mistake.”
A statement from Facebook shared with the media struck a similarly apologetic tone but also contained some defensiveness:
“We sometimes ask for feedback from people about our community standards and the types of content they would find most concerning on Facebook. We understand this survey refers to offensive content that is already prohibited on Facebook and that we have no intention of allowing so have stopped the survey. We have prohibited child grooming on Facebook since our earliest days; we have no intention of changing this and we regularly work with the police to ensure that anyone found acting in such a way is brought to justice.”
Speaking to the Guardian, British Parliament member Yvette Cooper, chair of the Home Affairs Select Committee, roundly condemned Facebook’s move:
“This is a stupid and irresponsible survey. Adult men asking 14-year-olds to send sexual images is not only against the law, it is completely wrong and an appalling abuse and exploitation of children. I cannot imagine that Facebook executives ever want it on their platform but they also should not send out surveys that suggest they might tolerate it or suggest to Facebook users that this might ever be acceptable.”
Andy Burrows, associate head of child safety for the National Society for the Prevention of Cruelty to Children, told Newsweek that “Facebook’s decision to crowdsource views on how to deal with a criminal offence is hugely concerning.”
The move, and the backlash, come as social media companies face increased pressure to moderate the content on their platforms. Given that context, TechCrunch notes that it’s “hard to fathom” what Facebook was thinking with such a survey.
Further, the outlet highlights, the incident suggests the company would much rather lay the responsibility for content moderation on its users:
“The approach also reinforces the notion that Facebook is much more comfortable trying to engineer a moral compass (via crowdsourcing views and thus offloading responsibility for potentially controversial positions onto its users) than operating with any innate sense of ethics and/or civic mission of its own.”
So, WHO ultimately at FB included these questions, and how many checkpoints along the way gave their approval to them?
Those who really see through this crap want to know.
“… how would you handle the following: a private message in which an adult man asks a 14-year-old girl for sexual pictures.”
I’d set up a 3 way meeting… myself, the scumbag pervert, and Louisville Slugger.
“TechCrunch notes that it’s “hard to fathom” what Facebook was thinking with such a survey.”
Not really.
PEDOPHILE PUKES!!!
Oh how slickly they frame the narrative. It’s a “let me let you think about this and consider its okayness, its normality.” So casually asking, while all the while planting seeds in young minds of possibility and curiosity. No sense of warning or even caution for young readers. I hope the whole enterprise crumbles and disappears FOREVER!!
Hey #1, I’m so glad there is an endless supply of Louisville Sluggers.
🙂
Those would be tough to ban, at least. 🙂
Until some spotted owl or jewish squirrel was found living in ash trees.
Les Paul made one helluvan ax using a 4×4 post. Different kind of ax, but still utilizing wood to its best.
Back in my day, we would have said, sure they can ask.
As long as they provide a home address….
And a selfie.
🙂
Another Facebook issue:
“Facebook is complicit in (Israel’s) crimes … Facebook favors the (Israeli) occupation.”
https://www.yenisafak.com/en/world/palestinians-in-gaza-protest-facebook-censorship-3136095
The next steps in legalizing….for money of course.
When this is legalized then the ‘jews’ will assume they have full control of the populace.
-flek
I see. They’re certain that anyone who waves a confederate flag should die, but they’re just not sure where they stand on pedophilia.
If this doesn’t tell you all you need to know
Then the rock above you is way too big