The site, designed by Verily, the life sciences arm of Google parent Alphabet, was first made public Friday at a White House press conference by President Donald Trump, who said the search giant was working with the US government to provide preliminary screening and information about coronavirus testing.
Verily's online screening test was developed so people could determine if they should be tested for COVID-19, the disease caused by the novel coronavirus, based on their symptoms. The website is launching despite a shortage of coronavirus test kits and as authorities advise the public to avoid swamping emergency rooms.
The service also requires a Google account, making the tech giant's platform a gatekeeper to the health resource. Visitors who don't have a Google account must create one to use the service.
A Google account is required for authentication, as well as for contacting people during the screening and testing process, a Verily representative said. The representative didn't say why Verily specifically needs a Google account to accomplish these tasks.
That requirement is raising privacy concerns among experts wary of Google's data collection empire. It has also drawn criticism that Google is using a public health crisis like the coronavirus outbreak to gather health data on people.
"COVID-19 testing is a vital public necessity right now -- a core imperative for slowing this disease," said Jake Snow, a technology and civil rights attorney with the American Civil Liberties Union of Northern California. "Access to critical testing should not depend on creating an account and sharing information with what is, essentially, an advertising company."
A Verily FAQ notes that data collected through the screening service is only linked to a person's Google account with explicit permission. But Verily requires consent to share your data just to use the COVID-19 screening service, effectively locking privacy-conscious people out of the health resource.
"Authorization is required to collect, use and share information and must be provided before screening begins," a Verily spokeswoman said in a statement. "The services the Baseline COVID-19 Program is providing inherently require the limited and responsible sharing of information with other groups."
A Verily spokeswoman said this wasn't the same authorization needed to link its medical data to a person's Google account. The company said that would require a separate consent request, which it doesn't currently make.
The statement gave examples of data that would be shared, such as medical information with companies performing the physical coronavirus tests. On Verily's FAQ, the company noted that with that permission, data could still be shared with "certain service providers," including Google.
Requiring consent to data policies in exchange for a technology service is considered "forced consent" by privacy regulators in the European Union. In 2018, Facebook, Instagram, WhatsApp and Google's Android faced four complaints over "forced consent," each alleging that the tech giants simply cut off access to the service if a user didn't give permission for data collection.
"What's most chilling is that most states have no prohibition on this sort of coercion, forcing people to sign away their privacy to access vital government services," said Albert Fox Cahn, executive director of the Surveillance Technology Oversight Project. "If profit-driven companies are going to play a central role in our response to the COVID-19 pandemic, we must take steps to ensure that they are serving the public, not just their bottom line."
Verily said that it complies with applicable laws and regulations. Forced consent could violate the EU's General Data Protection Regulation, but Verily noted that the coronavirus screening isn't intended for residents protected by Europe's sweeping data privacy law.
"The Baseline COVID-19 Program is currently only intended for people in the US, specifically the Bay Area pilot launch," a spokeswoman said. "GDPR, however, is focused on personal data from EU data subjects."
While health data is protected by the Health Insurance Portability and Accountability Act, the regulation only applies to covered entities, which include healthcare providers, health plans and insurers. It can also extend to business associates contracted with these healthcare providers, but Verily did not respond to questions about whether it needs to comply with HIPAA.
On its own, Verily is not a healthcare provider that would need to meet HIPAA guidelines -- it's a private company providing health services to consumers.
"Generally speaking, a lot of technology providers, if they have a relationship with the consumer, are not a covered entity," said Charlotte Tschider, a visiting assistant professor of law at the University of Nebraska who studies health and data privacy laws, said.
That distinction draws a line in how the law protects your data. Private health tech companies tend to over-regulate themselves to avoid government scrutiny, said Hined Rafeh, a PhD candidate at Rensselaer Polytechnic Institute who researches HIPAA.
She noted that Verily clearly laid out how it manages data privacy and permissions, even though no law requires it to. Rafeh added that in some cases, Verily operates at a level similar to what HIPAA requires. But because it isn't a covered entity, it isn't mandated to.
"When you deal with a consumer, you're not dealing with a patient. HIPAA protects patients, not consumers," Rafeh said.
When you give permission to Verily to conduct the COVID-19 screener, it also allows the company to share that data with third parties, including Salesforce. Verily said that this was so its customer service team could contact people "with emails or calls as appropriate."
The company didn't specify what other third parties have access to that data.
"This is how privacy invasions have the potential to disproportionately harm the vulnerable," Snow said. "Google should release this tool without those limits, so testing can proceed as quickly as possible."
Verily said sharing data from its screening process was "fundamental to the coordination of services," and third parties include the California Department of Health and the clinical laboratory that's running the tests.
The company added that third-party access to data was limited and protected by technical security measures to prevent unauthorized access. Still, sharing that data for any purpose beyond what's necessary to perform a coronavirus screening opens up privacy concerns.
"I think the more entities you're sharing sensitive health information with, the more vectors there are for both abuse and screw-ups," said Lindsey Barrett, a staff attorney at Georgetown Law's Institute for Public Representation Communications and Technology Clinic.