Imagine police knocking on your door because you posted a ‘troubling comment’ on a social media website.
Imagine a judge ordering you jailed, sorry, I meant hospitalized, because a computer program found your comments ‘troubling’.
You can stop imagining: this is really happening.
A recent TechCrunch article warns that Facebook’s “Proactive Detection” artificial intelligence (A.I.) will use pattern recognition to scan posts and contact first responders if it deems a person’s comments to express troubling suicidal thoughts.
“Facebook also will use AI to prioritize particularly risky or urgent user reports so they’re more quickly addressed by moderators, and tools to instantly surface local language resources and first-responder contact info.”
A private corporation deciding who goes to jail? What could possibly go wrong?
Facebook is using pattern recognition and moderators to contact law enforcement.
Facebook is ‘using pattern recognition to detect posts or live videos where someone might be expressing thoughts of suicide, and to help respond to reports faster.’
“Dedicating more reviewers from our Community Operations team to review reports of suicide or self harm.”
Facebook admits that it has asked the police to conduct more than ONE HUNDRED wellness checks on people.
“Over the last month, we’ve worked with first responders on over 100 wellness checks based on reports we received via our proactive detection efforts. This is in addition to reports we received from people in the Facebook community.”
Why are police conducting wellness checks for Facebook? Are private corporations running police departments?
Not only do social media users have to worry about a spying A.I., but now they have to worry about thousands of spying Facebook ‘Community Operations’ people who are all too willing to call the police.
Should we trust pattern recognition to determine who gets hospitalized or arrested?
A 2010 CBS News article warns that using pattern recognition to predict human behavior is junk science. The article shows how companies use nine rules to convince law enforcement that pattern recognition is accurate.
A 2016 Forbes article used words like ‘nonsense, far-fetched, contrived and smoke and mirrors’ to describe applying pattern recognition to human behavior.
“Cookie-cutter ratios, even if scientifically derived, do more harm than good. Every person is different. Engagement is an individual and unique phenomenon. We are not widgets, nor do we conform to widget formulas.”
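The “cookie-cutter” criticism is easy to see in miniature. The following is a deliberately crude, purely illustrative keyword-matching sketch, in no way Facebook’s actual (undisclosed) system, showing how one-size-fits-all pattern matching flags harmless hyperbole right alongside genuine distress:

```python
# Toy "troubling comment" detector: a crude keyword matcher, used here
# only to illustrate why cookie-cutter pattern rules misfire on people.
TROUBLING_PATTERNS = ["end it all", "can't go on", "kill"]

def flag_comment(text: str) -> bool:
    """Return True if any hard-coded pattern appears in the comment."""
    lowered = text.lower()
    return any(pattern in lowered for pattern in TROUBLING_PATTERNS)

# A genuinely worrying post gets flagged...
print(flag_comment("I just want to end it all"))         # True
# ...but so does harmless hyperbole about a video game:
print(flag_comment("This boss fight will KILL me lol"))  # True
```

Both comments trip the same rule, yet only one describes a person in crisis. Every person is different; the widget formula cannot tell them apart.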
Who cares if pattern recognition is junk science, right? At least Facebook is trying to save lives.
Just imagine, in the not-too-distant future, you are hanging out with your friends at a bar and someone says, “Do you remember Susan?” One of your friends responds, “Yes, and it’s too bad she posted a troubling comment on social media which got her arrested.”
That is our future, if we let private corporations decide who police should arrest or hospitalize.
Letting corporations decide who needs help can and will be abused.