Google Tightens Grip on Research into ‘Sensitive Topics’
Google is currently under fire for apparently pushing out a researcher whose work warned of bias in AI, and now a report from Reuters says others doing such work at the company have been asked to “strike a positive tone” and undergo additional reviews for research touching on “sensitive topics.”
Reuters, citing researchers at the company and internal documents, reports that Google has implemented new controls in the last year, including an extra round of inspection for papers on certain topics and seemingly an increase in executive interference at later stages of research.
Among the sensitive topics, according to an internal document, are: “the oil industry, China, Iran, Israel, COVID-19, home security, insurance, location data, religion, self-driving vehicles, telecoms and systems that recommend or personalize web content.”
A senior Google manager reportedly told researchers to “take great care to strike a positive tone” in their report on content-recommendation technology, which has recently come under fire for pulling internet users down rabbit holes of increasingly radical content. The process is designed to provide an additional layer of scrutiny on top of Google’s already rigorous review standards that identify and prevent disclosing trade secrets.
It’s clear that many of these issues are indeed sensitive, though advising researchers to take care when addressing them seems superfluous given the existence of ethics boards, peer review, and other ordinary controls on research. One researcher warned that this sort of top-down interference from Google could soon become a serious censorship problem.
This is in addition to the fundamental issue of vital research being conducted under the sponsorship of a company whose interests may or may not favor publication. Naturally, large private research institutions have existed for nearly as long as organized scientific endeavor, but companies like Facebook, Google, Apple and Microsoft exert enormous influence over fields like AI, and they have good reason to avoid criticism of lucrative technologies while shouting their usefulness from every rooftop.