Google search knowledge panels have a misinformation problem


Hristo Georgiev recently received a disturbing message from a friend: Google says he’s a serial killer. If you searched for his name on Google, the search engine would serve you Georgiev’s professional portrait next to a Wikipedia article about a Bulgarian serial killer of the same name who died in 1980. It’s an unfortunate mistake, but it’s also not the first time Google’s algorithms have done something like this.

The Wikipedia article that surfaced in Georgiev’s results did not actually include his photo, and if you read carefully, you’d quickly learn that the eponymous serial killer was executed by firing squad decades ago. Yet Google’s automated systems had made Georgiev, a Switzerland-based software engineer, appear to be someone he was not. The company’s algorithms had deposited the information into one of its “knowledge panels,” the boxes that appear at the top of search results and are supposed to provide a quick and authoritative answer to search queries so that you don’t have to click through to the results. But since Google launched these panels in 2012, they have repeatedly promoted misinformation.

It took Google a few hours to resolve the issue after several people reported it and it drew attention on the Hacker News web forum, Georgiev told Recode. Still, he says the bad result, which may have been in place for weeks, made him feel at least “a little uncomfortable.” He recalled that one person who searched his name had a momentary “mini heart attack.”

The problem that led to Georgiev’s incorrect results can be attributed to Google’s Knowledge Graph, which the search engine describes as a giant virtual encyclopedia of facts. “Organizing billions of entities in our Knowledge Graph requires that we use automated systems, which often work well but are not perfect,” a Google spokesperson told Recode. “We are sorry for the problems caused by this confusion.”

Google has a formal process for reporting and removing inaccurate information in its knowledge panels, but it relies heavily on users to spot any issues. This leaves it up to users to notice when Google shows incorrect information about them at the top of its search results, and then to report the error to the search platform. The company has launched a system that allows organizations and individuals to verify their identity with Google so that they can more easily give direct feedback on the accuracy of the knowledge panel about them.

Yet in the past, people have complained that removing false information from these panels is a tedious process, and others have said it can take months or even years. The Google spokesperson told Recode that the company regularly reviews feedback on its Knowledge Graph results, but did not say how often the company receives requests for changes.

Ultimately, the issue is part of Google’s larger problem: relying on algorithms to identify and deliver correct information doesn’t always work, and can actually risk amplifying misinformation.

Google says its Knowledge Graph works by connecting pieces of information from across the web about a particular person, place, or thing, especially notable ones. It’s more advanced and specific than showing results based solely on keywords, as the company explained when it launched the tool in 2012. Google uses the information collected by this system, which the company says includes 500 billion pieces of information about 5 billion entities, to assemble special sections of its search results that it calls knowledge panels. These boxes encourage people to stay on the Google results page rather than clicking through to other websites.

Google’s Knowledge Graph focuses on surfacing different pieces of information from around the web about a particular topic or thing, rather than finding pages that merely include the same keywords.

Sometimes these panels make getting answers on Google quicker and easier. But Georgiev is just one of many people who have been frustrated when information about murderers appears in knowledge panels when people search for their names. Other times, these results have incorrectly reported that a person is married or dead. Even more disturbingly, these panels have surfaced hateful content, as they did two years ago when a panel associated with the term “self-hating Jew” – an anti-Semitic slur – attached an image of the Jewish actress Sarah Silverman.

Eni Mustafaraj, a professor of computer science at Wellesley, told Recode that these problems often arise when a computer system incorrectly matches information from two different sources – in Georgiev’s case, an image and a Wikipedia page.

At the same time, the incident shows just how much Google’s knowledge panels depend on information edited by users on Wikipedia. “This kind of story is just a reminder of the general reliance of search engines on what is essentially the unpaid volunteer work of a huge group of people around the world,” Nicholas Vincent, a doctoral student at Northwestern who studies search engines, told Recode.

Google says errors are rare in these panels, but it still seems to be up to humans to spot and correct its mistakes. “You work with massive amounts of information, don’t you?” Georgiev told Recode. “There are bound to be mistakes. It’s just that when you’re Google, you have to be really careful with that stuff.”

In the meantime, he says, you might want to start Googling yourself.
