Google is looking at rolling out yet another revision to its search engine algorithm, curtailing your results to the sites it considers most “trustworthy.”
New Scientist explains:
Google’s search engine currently uses the number of incoming links to a web page as a proxy for quality, determining where it appears in search results. So pages that many other sites link to are ranked higher. This system has brought us the search engine as we know it today, but the downside is that websites full of misinformation can rise up the rankings if enough people link to them.
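The downside described above is easy to see in a toy sketch. This is not Google's actual ranking (PageRank-style link analysis also weights each link by the linker's own rank); it simply counts incoming links, with a hypothetical link graph made up for illustration:

```python
from collections import Counter

# Hypothetical link graph: source page -> pages it links to.
# "hoax.com" is a made-up misinformation site that many pages link to.
links = {
    "a.com": ["news.com", "hoax.com"],
    "b.com": ["hoax.com"],
    "c.com": ["hoax.com", "news.com"],
}

# Rank pages purely by incoming-link count -- accuracy never enters.
incoming = Counter(target for targets in links.values() for target in targets)
ranking = [page for page, _ in incoming.most_common()]
print(ranking)  # ['hoax.com', 'news.com'] -- misinformation outranks news
```

With three incoming links to two, the hoax site tops the ranking, which is exactly the failure mode the trust-based approach is meant to address.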
A Google research team is adapting that model to measure the trustworthiness of a page, rather than its reputation across the web. Instead of counting incoming links, the system – which is not yet live – counts the number of incorrect facts within a page. “A source that has few false facts is considered to be trustworthy,” says the team (arxiv.org/abs/1502.03519v1). The score they compute for each page is its Knowledge-Based Trust score.
The software works by tapping into the Knowledge Vault, the vast store of facts that Google has pulled off the internet. Facts the web unanimously agrees on are considered a reasonable proxy for truth. Web pages that contain contradictory information are bumped down the rankings.
LazyTruth developer Matt Stempeck, now the director of civic media at Microsoft New York, wants to develop software that exports the knowledge found in fact-checking services such as Snopes, PolitiFact and FactCheck.org so that everyone has easy access to them. He says tools like LazyTruth are useful online, but challenging the erroneous beliefs underpinning that information is harder. “How do you correct people’s misconceptions? People get very defensive,” Stempeck says. “If they’re searching for the answer on Google they might be in a much more receptive state.”