
Search Atlas Shows How Google Results Differ Around the Globe

The tool shows how results for sensitive topics, political conflicts, or even just the word "God" can differ significantly across regions.

The Google logo seen on a computer screen in Washington, DC in July 2019.
Photo: Alastair Pike / AFP (Getty Images)

How does a search engine like Google quantify, analyze, and rank information? What factors does it take into account, and how are they weighted? The algorithms that handle queries may be opaque, but the end results are clearly visible.

That’s the idea behind Search Atlas, a new tool developed by academics that aims to show how Google would display search results if the same query were entered in different locales around the world. It’s an experimental interface for Google Search that returns three columns of results instead of one, drawn from the more than 100 geographically localized versions of the search engine. So, for example, a search for Tiananmen Square may prioritize coverage of the infamous 1989 massacre of protesters there or directions for tourists, depending on the locale; in the U.S., certain results may be removed due to Digital Millennium Copyright Act complaints; and in France and Germany, certain Holocaust denial sites may be blocked from results.
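Search Atlas itself is in private beta, but for a rough sense of what querying localized editions of Google looks like, here is a minimal, hypothetical sketch (not the Search Atlas code) that compares result titles across a few regions by varying Google’s public gl (country) and hl (language) URL parameters. The locale list, the helper name search_titles, and the assumption that result titles appear in h3 tags are all illustrative, and automated querying like this is rate-limited and subject to Google’s terms of service.

```python
# Minimal sketch (NOT the Search Atlas implementation): compare Google result
# titles across locales by varying the `gl` (country) and `hl` (language)
# URL parameters. Assumes result titles appear in <h3> tags, which Google can
# change without notice; heavy automated querying may be blocked or served a
# consent page, and is subject to Google's terms of service.
import requests
from bs4 import BeautifulSoup

LOCALES = {
    "US": {"gl": "us", "hl": "en"},
    "DE": {"gl": "de", "hl": "de"},
    "SG": {"gl": "sg", "hl": "en"},
}

def search_titles(query: str, gl: str, hl: str, limit: int = 5) -> list[str]:
    """Fetch one page of Google results for `query`, localized via gl/hl."""
    resp = requests.get(
        "https://www.google.com/search",
        params={"q": query, "gl": gl, "hl": hl, "num": limit},
        headers={"User-Agent": "Mozilla/5.0"},  # empty agents tend to be rejected
        timeout=10,
    )
    resp.raise_for_status()
    soup = BeautifulSoup(resp.text, "html.parser")
    # Collect visible <h3> texts, which usually correspond to result titles.
    return [h3.get_text(strip=True) for h3 in soup.select("h3")][:limit]

if __name__ == "__main__":
    # Print one "column" of results per locale for the same query.
    for name, loc in LOCALES.items():
        print(f"--- {name} ---")
        for title in search_titles("Tiananmen Square", **loc):
            print(" ", title)
```

The point is only that the same query string can return visibly different result lists per locale; Search Atlas’s three-column interface makes that comparison side by side.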

Wired reports that the creators of Search Atlas first presented their results at the Designing Interactive Systems conference in June. The tool remains in private beta, but they have released a paper and other preview materials on the project’s website. It is already turning up interesting results: using Search Atlas to search for images of “God,” for example, turns up Christian imagery in the U.S. and Europe, images of the Buddha in parts of Asia, and Arabic script in the Persian Gulf and North Africa.

In the UK and Singapore, a search for Tiananmen Square turned up images related to the massacre, while a search tuned to China (where Google has been blocked since 2010) turned up “recent, sunny images of the square, smattered with tourists,” according to Wired. Results for “how to combat climate change” emphasized policy solutions in Germany, while island nations like Mauritius and the Philippines received results emphasizing the immediate, dire nature of the threat, such as the sea-level rise that stands to hit them disproportionately and much sooner.

Similarly, Wired wrote that queries about the war in Ethiopia’s Tigray region, when localized to within the country, turned up “Facebook pages and blogs that criticized Western diplomatic pressure to deescalate the conflict, suggesting that the US and others were trying to weaken Ethiopia,” whereas searches set to Kenya or the U.S. “more prominently featured explanatory news coverage from sources such as the BBC and The New York Times.”

Rodrigo Ochigame, a Search Atlas co-creator and a Ph.D. student in science, technology, and society at MIT, told Wired that the project aims to dispel the persistent notion that search engines like Google are neutral arbiters of information: “Any attempt to quantify relevance necessarily encodes moral and political priorities.”

Project co-creator Katherine Ye, a computer science Ph.D. student at Carnegie Mellon University and a research fellow at the nonprofit Center for Arts, Design, and Social Research, told Wired: “People ask search engines things they would never ask a person, and the things they happen to see in Google’s results can change their lives. It could be ‘How do I get an abortion?’ restaurants near you, or how you vote, or get a vaccine.”

For example, Ye tweeted that Google results for “Crimean annexation” were framed in Russia around the impact on the Russian Federation, in Ukraine around “occupation,” and in the Netherlands around European Union sanctions on Russia.

These disparate results aren’t necessarily the product of any intent to suppress information; they stem from factors like Google’s efforts to localize results so they are more relevant to people in specific geographic regions, commercial interests, local laws, and what Ochigame and Ye told Wired are “information borders” that create “partial perspectives.” These supposedly apolitical adjustments nonetheless inevitably bleed over into politics. While the difference in results for Tiananmen Square appears to reflect the Chinese government’s desire to cover up the incident, a Google spokesperson told Wired that the search engine turns up the tourist-friendly images when it infers an intent to travel. The differences in searches for “God,” the spokesperson told the site, were due to the way the term is translated into different languages.

The end result is a partial slice of reality predicated upon Google’s assumptions about the world and influenced by a desire to maximize revenue, according to the researchers.

“Even the earliest studies, based on anecdotal observations, already suggested that search engines systematically suppress some sites in favor of others, in line with financial interests,” the researchers wrote in the paper. “More recent studies have argued that commercial search engines deploy algorithms that reinforce existing social structures, particularly racist and sexist patterns of exposure, invisibility, and marginalization. Thus, it is vital to expose the partial perspective of search engines.”