Google is making changes to its search engine designed to counter the spread of fake news.
The technology giant has revealed new guidelines for its human testers, known as Raters, along with algorithm changes designed to identify more authoritative and recognised web pages, in a bid to improve the quality of its search rankings.
The firm has been criticised in recent months over “low-quality content” on its platform, including a high-profile incident in which a page denying the events of the Holocaust appeared at the top of search results on the subject.
New in-depth feedback tools are also being added to the search engine's autocomplete and featured snippets so users can report offensive or inaccurate content.
Earlier this year, the search engine had to remove a featured snippet that suggested former US president Barack Obama was planning a coup with China.
“Search can always be improved,” Google Search’s engineering vice president Ben Gomes said.
“Today, in a world where tens of thousands of pages are coming online every minute of every day, there are new ways that people try to game the system.
“The most high profile of these issues is the phenomenon of ‘fake news’, where content on the web has contributed to the spread of blatantly misleading, low-quality, offensive or downright false information.
“While this problem is different from issues in the past, our goal remains the same – to provide people with access to relevant information from the most reliable sources available.
“And while we may not always get it right, we’re making good progress in tackling the problem. But in order to have long-term and impactful changes, more structural changes in Search are needed.”
The updated guidelines, introduced last month, "explicitly provide" examples of low-quality pages that Raters should flag; combined with the changes to Google's algorithms, flagged pages will be demoted in results.
Google uses what it calls Quality Raters – around 10,000 testers based around the world – who analyse and rate web pages to help Google’s algorithm better identify which pages to promote in a search.
The firm says it uses testers from all segments of its user base to avoid making political or biased decisions when determining ratings.
“We’ve adjusted our signals to help surface more authoritative pages and demote low-quality content, so that issues similar to the Holocaust denial results that we saw back in December are less likely to appear,” Gomes said.
The direct feedback tools will enable users to give more detailed reasons why they flagged a web page as inaccurate or offensive, Google said.
Users will be able to choose different categories to explain why they feel content is unhelpful, which will help Google improve its algorithms.
Gomes said the firm was determined to stay “one step ahead” and the changes put it “on a path to addressing this problem”.