YouTube CEO Susan Wojcicki tells Lesley Stahl what the video platform is doing about hate speech in an interview Sunday on the CBS newsmagazine program “60 Minutes.”
Wojcicki told “60 Minutes” that Google employs 10,000 people to focus on “controversial content.” She described their schedule, which includes time for therapy. Stahl also said there are reports that the “monitors” are “beginning to buy the conspiracy theories.”
“What we really had to do was tighten our enforcement of that to make sure we were catching everything and we use a combination of people and machines,” Wojcicki explained. “So Google as a whole has about 10,000 people that are focused on controversial content.”
Lesley Stahl: I’m told that it is very stressful to be looking at these questionable videos all the time. And that there’s actually counselors to make sure that there aren’t mental problems with the people who are doing this work. Is that true?
Susan Wojcicki: It’s a very important area for us. We try to do everything we can to make sure that this is a good work environment. Our reviewers work five hours of the eight hours reviewing videos. They have the opportunity to take a break whenever they want.
Lesley Stahl: I also heard that these monitors, reviewers, sometimes, they’re beginning to buy the conspiracy theories.
Susan Wojcicki: I’ve definitely heard about that. And we work really hard with all of our reviewers to make sure that, you know, we’re providing the right services for them.
Wojcicki on Section 230, stopping 70% of controversial content:
Lesley Stahl: Once you watch one of these, YouTube’s algorithms might recommend you watch similar content. But no matter how harmful or untruthful, YouTube can’t be held liable for any content, due to a legal protection called Section 230.
Lesley Stahl: The law under 230 does not hold you responsible for user-generated content. But in that you recommend things, sometimes 1,000 times, sometimes 5,000 times, shouldn’t you be held responsible for that material, because you recommend it?
Susan Wojcicki: Well, our systems wouldn’t work without recommending. And so if–
Lesley Stahl: I’m not saying don’t recommend. I’m just saying be responsible for when you recommend so many times.
Susan Wojcicki: If we were held liable for every single piece of content that we recommended, we would have to review it. That would mean there’d be a much smaller set of information that people would be finding. Much, much smaller.
Lesley Stahl: She told us that earlier this year, YouTube started re-programming its algorithms in the U.S. to recommend questionable videos much less and to point users who search for that kind of material to authoritative sources, like news clips. With these changes, Wojcicki says, they have cut the amount of time Americans spend watching controversial content by 70%.