WEF wants to use AI to automate censorship of “disinformation”

The Counter Signal

The World Economic Forum (WEF) has revealed its latest scheme: they want to use AI to automate the censorship of whatever they deem “hate speech” and “disinformation.” 

According to an article published by the WEF, there is an urgent need to incorporate “human-curated, multi-language, off-platform intelligence into learning sets” so that AI can detect online abuse before it ever reaches mainstream platforms.

The Forum says this is necessary to stop the proliferation of everything from child abuse to extremism, disinformation, hate speech, and fraud.

“Supplementing this smarter automated detection with human expertise to review edge cases and identify false positives and negatives and then feeding those findings back into training sets will allow us to create AI with human intelligence baked in,” the WEF article reads. “This more intelligent AI gets more sophisticated with each moderation decision, eventually allowing near-perfect detection, at scale.”

The article continues, with the author noting that there is a lag between when novel abuse tactics emerge and when artificial intelligence can actually detect them. Thus, trust and safety teams should move toward incorporating human-curated AI into the content moderation process so that such abuses cannot even be posted.

While such AI would undoubtedly be valuable in preventing the spread of child abuse material, it’s alarming that the WEF is promoting the use of AI to stop supposedly harmful “disinformation” from ever being posted in the first place.

The Forum has increasingly come under scrutiny over the last two years, with many waking up to just how influential the organization has become regarding global and regional governance.

Such AI-driven censorship would, of course, be highly effective in reducing such scrutiny.
