SSL and internet security news

tracking


Global Surveillance in the Wake of COVID-19

OneZero is tracking thirty countries around the world that are implementing surveillance programs in the wake of COVID-19:

The most common form of surveillance implemented to battle the pandemic is the use of smartphone location data, which can track population-level movement down to enforcing individual quarantines. Some governments are making apps that offer coronavirus health information, while also sharing location information with authorities for a period of time. For instance, in early March, the Iranian government released an app that it pitched as a self-diagnostic tool. While the tool’s efficacy was likely low, given reports of asymptomatic carriers of the virus, the app saved location data of millions of Iranians, according to a Vice report.

One of the most alarming measures being implemented is in Argentina, where those who are caught breaking quarantine are being forced to download an app that tracks their location. In Hong Kong, those arriving in the airport are given electronic tracking bracelets that must be synced to their home location through their smartphone’s GPS signal.
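Strip away the apps and bracelets, and the quarantine-enforcement schemes above reduce to a geofence check: compare each reported GPS fix against a registered home location and flag anything outside an allowed radius. Here is a minimal sketch of that check; the coordinates, radius, and names are illustrative, not taken from any actual government app.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two GPS fixes, in meters."""
    r = 6371000  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def quarantine_violation(home, fix, radius_m=100):
    """Flag a GPS fix that falls outside the allowed radius around home."""
    return haversine_m(home[0], home[1], fix[0], fix[1]) > radius_m

# Example: a registered home location and a fix reported roughly 2 km away.
home = (22.3193, 114.1694)
print(quarantine_violation(home, (22.3362, 114.1750)))  # True
```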


Contact Tracing COVID-19 Infections via Smartphone Apps

Google and Apple have announced a joint project to create a privacy-preserving COVID-19 contact tracing app. (Details, such as we have them, are here.) It’s similar to the app being developed at MIT, and similar to others being described and developed elsewhere. It’s nice seeing the privacy protections; they’re well thought out.
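In rough outline, these designs all work the same way: each phone broadcasts short-lived random identifiers over Bluetooth and remembers the identifiers it hears nearby; when someone is diagnosed, the keys that generated their recent identifiers are published, and every other phone re-derives those identifiers locally and checks for a match without uploading anything it heard. The sketch below shows only that matching logic; it is a simplified illustration under those assumptions, not the actual Apple/Google protocol.

```python
import os
import hashlib

def daily_identifiers(daily_key: bytes, intervals: int = 144) -> set:
    """Derive the rotating identifiers a phone would broadcast from one daily key."""
    return {hashlib.sha256(daily_key + i.to_bytes(2, "big")).digest()[:16]
            for i in range(intervals)}

# Phone A generates a daily key and broadcasts identifiers derived from it.
key_a = os.urandom(16)
broadcast_a = daily_identifiers(key_a)

# Phone B records whatever identifiers it happens to hear over Bluetooth.
heard_by_b = set(list(broadcast_a)[:3])  # pretend B was near A for a few intervals

# A is later diagnosed and publishes key_a; B re-derives the identifiers
# locally and checks for overlap without revealing what it heard to anyone.
exposed = bool(daily_identifiers(key_a) & heard_by_b)
print(exposed)  # True
```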

I was going to write a long essay about the security and privacy concerns, but Ross Anderson beat me to it. (Note that some of his comments are UK-specific.)

First, it isn’t anonymous. Covid-19 is a notifiable disease so a doctor who diagnoses you must inform the public health authorities, and if they have the bandwidth they call you and ask who you’ve been in contact with. They then call your contacts in turn. It’s not about consent or anonymity, so much as being persuasive and having a good bedside manner.

I’m relaxed about doing all this under emergency public-health powers, since this will make it harder for intrusive systems to persist after the pandemic than if they have some privacy theater that can be used to argue that the whizzy new medi-panopticon is legal enough to be kept running.

Second, contact tracers have access to all sorts of other data such as public transport ticketing and credit-card records. This is how a contact tracer in Singapore is able to phone you and tell you that the taxi driver who took you yesterday from Orchard Road to Raffles has reported sick, so please put on a mask right now and go straight home. This must be controlled; Taiwan lets public-health staff access such material in emergencies only.

Third, you can’t wait for diagnoses. In the UK, you only get a test if you’re a VIP or if you get admitted to hospital. Even so the results take 1-3 days to come back. While the VIPs share their status on twitter or facebook, the other diagnosed patients are often too sick to operate their phones.

Fourth, the public health authorities need geographical data for purposes other than contact tracing – such as to tell the army where to build more field hospitals, and to plan shipments of scarce personal protective equipment. There are already apps that do symptom tracking but more would be better. So the UK app will ask for the first three characters of your postcode, which is about enough to locate which hospital you’d end up in.

Fifth, although the cryptographers – and now Google and Apple – are discussing more anonymous variants of the Singapore app, that’s not the problem. Anyone who’s worked on abuse will instantly realise that a voluntary app operated by anonymous actors is wide open to trolling. The performance art people will tie a phone to a dog and let it run around the park; the Russians will use the app to run service-denial attacks and spread panic; and little Johnny will self-report symptoms to get the whole school sent home.

I recommend reading his essay in full. Also worth reading are this EFF essay and this ACLU white paper.

To me, the real problems aren’t around privacy and security. The efficacy of any app-based contact tracing is still unproven. A “contact” from the point of view of an app isn’t the same as an epidemiological contact. And the ratio of contacts to infections is high. We would have to deal with the false positives (being close to someone else, but separated by a partition or other barrier) and the false negatives (not being close to someone else, but contracting the disease through a mutually touched object). And without cheap, fast, and accurate testing, the information from any of these apps isn’t very useful. So I agree with Ross that this is primarily an exercise in that false syllogism: Something must be done. This is something. Therefore, we must do it. It’s techies proposing tech solutions to what is primarily a social problem.

EDITED TO ADD: Susan Landau on contact tracing apps and how they’re being oversold. And Farzad Mostashari, former coordinator for health IT at the Department of Health and Human Services, on contact tracing apps.

As long as 1) every contact does not result in an infection, and 2) a large percentage of people with the disease are asymptomatic and don’t realize they have it, I can’t see how this sort of app is valuable. If we had cheap, fast, and accurate testing for everyone on demand…maybe. But I still don’t think so.
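The arithmetic behind that skepticism is simple: if only a fraction of app-flagged “contacts” are real epidemiological exposures, and only a fraction of those exposures become infections, then most notifications are false alarms. A toy calculation, with made-up rates chosen purely for illustration:

```python
# Toy base-rate calculation: what fraction of app notifications correspond
# to an actual infection? All of these numbers are illustrative, not measured.
flagged_contacts = 1000         # proximity events the app flags
true_exposure_rate = 0.3        # fraction of flags that are real epidemiological contacts
infection_given_exposure = 0.1  # fraction of real contacts that become infections

expected_infections = flagged_contacts * true_exposure_rate * infection_given_exposure
print(expected_infections)                     # 30.0 infections...
print(expected_infections / flagged_contacts)  # ...i.e. 0.03: 3% of notifications
```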


Privacy vs. Surveillance in the Age of COVID-19

The trade-offs are changing:

As countries around the world race to contain the pandemic, many are deploying digital surveillance tools as a means to exert social control, even turning security agency technologies on their own civilians. Health and law enforcement authorities are understandably eager to employ every tool at their disposal to try to hinder the virus – even as the surveillance efforts threaten to alter the precarious balance between public safety and personal privacy on a global scale.

Yet ratcheting up surveillance to combat the pandemic now could permanently open the doors to more invasive forms of snooping later.

I think the effects of COVID-19 will be more drastic than the effects of the terrorist attacks of 9/11: not only with respect to surveillance, but across many aspects of our society. And while many things that would never be acceptable during normal times are reasonable things to do right now, we need to make sure we can ratchet them back once the current pandemic is over.

Cindy Cohn at EFF wrote:

We know that this virus requires us to take steps that would be unthinkable in normal times. Staying inside, limiting public gatherings, and cooperating with medically needed attempts to track the virus are, when approached properly, reasonable and responsible things to do. But we must be as vigilant as we are thoughtful. We must be sure that measures taken in the name of responding to COVID-19 are, in the language of international human rights law, “necessary and proportionate” to the needs of society in fighting the virus. Above all, we must make sure that these measures end and that the data collected for these purposes is not re-purposed for either governmental or commercial ends.

I worry that in our haste and fear, we will fail to do any of that.

More from EFF.


Apple’s Tracking-Prevention Feature in Safari Has a Privacy Bug

Last month, engineers at Google published details of a very curious privacy bug in Apple’s Safari web browser. Apple’s Intelligent Tracking Prevention, a feature designed to reduce user tracking, has vulnerabilities that themselves allow user tracking. Some details:

ITP detects and blocks tracking on the web. When you visit a few websites that happen to load the same third-party resource, ITP detects the domain hosting the resource as a potential tracker and from then on sanitizes web requests to that domain to limit tracking. Tracker domains are added to Safari’s internal, on-device ITP list. When future third-party requests are made to a domain on the ITP list, Safari will modify them to remove some information it believes may allow tracking the user (such as cookies).

[…]

The details should come as a surprise to everyone because it turns out that ITP could effectively be used for:

  • information leaks: detecting websites visited by the user (web browsing history hijacking, stealing a list of visited sites)
  • tracking the user with ITP, making the mechanism function like a cookie
  • fingerprinting the user: in ways similar to the HSTS fingerprint, but perhaps a bit better

I am sure we all agree that we would not expect a privacy feature meant to protect from tracking to effectively enable tracking, and also accidentally allowing any website out there to steal its visitors’ web browsing history. But web architecture is complex, and the consequence is that this is exactly the case.
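The mechanism described in the excerpt amounts to an on-device counter: track how many distinct first-party sites load each third-party domain, and once that count crosses a threshold, put the domain on the tracker list and strip identifying state from future requests to it. Here is a simplified sketch of that classification logic, with an illustrative threshold; it is not Apple’s actual implementation.

```python
from collections import defaultdict

TRACKER_THRESHOLD = 3  # illustrative: distinct first-party sites before classification

sightings = defaultdict(set)  # third-party domain -> first-party sites that loaded it
tracker_list = set()          # the on-device list of classified tracker domains

def record_request(first_party: str, third_party: str) -> None:
    """Note that a page on first_party loaded a resource from third_party."""
    sightings[third_party].add(first_party)
    if len(sightings[third_party]) >= TRACKER_THRESHOLD:
        tracker_list.add(third_party)

def sanitize(third_party: str, headers: dict) -> dict:
    """Strip identifying state (here, just cookies) from requests to classified trackers."""
    if third_party in tracker_list:
        return {k: v for k, v in headers.items() if k.lower() != "cookie"}
    return headers

for site in ("news.example", "shop.example", "blog.example"):
    record_request(site, "ads.tracker.example")

print(sanitize("ads.tracker.example", {"Cookie": "uid=123", "Accept": "*/*"}))
# {'Accept': '*/*'}
```

The bug Google found is, in essence, that this list is itself observable state: a website can probe which domains have been classified, turning the protection into a signal about the user’s browsing.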

Apple fixed this vulnerability in December, a month before Google published its findings.

If there’s any lesson here, it’s that privacy is hard — and that privacy engineering is even harder. It’s not that we shouldn’t try, but we should recognize that it’s easy to get it wrong.


Customer Tracking at Ralphs Grocery Store

To comply with California’s new data privacy law, companies that collect information on consumers and users are forced to be more transparent about it. Sometimes the results are creepy. Here’s an article about Ralphs, a California supermarket chain owned by Kroger:

…the form proceeds to state that, as part of signing up for a rewards card, Ralphs “may collect” information such as “your level of education, type of employment, information about your health and information about insurance coverage you might carry.”

It says Ralphs may pry into “financial and payment information like your bank account, credit and debit card numbers, and your credit history.”

Wait, it gets even better.

Ralphs says it’s gathering “behavioral information” such as “your purchase and transaction histories” and “geolocation data,” which could mean the specific Ralphs aisles you browse or could mean the places you go when not shopping for groceries, thanks to the tracking capability of your smartphone.

Ralphs also reserves the right to go after “information about what you do online” and says it will make “inferences” about your interests “based on analysis of other information we have collected.”

Other information? This can include files from “consumer research firms” – read: professional data brokers – and “public databases,” such as property records and bankruptcy filings.

The reaction from John Votava, a Ralphs spokesman:

“I can understand why it raises eyebrows,” he said. “We may need to change the wording on the form.”

That’s the company’s solution. Don’t spy on people less, just change the wording so they don’t realize it.

More consumer protection laws will be required.


Google Receives Geofence Warrants

Sometimes it’s hard to tell the corporate surveillance operations from the government ones:

Google reportedly has a database called Sensorvault in which it stores location data for millions of devices going back almost a decade.

The article is about geofence warrants, where the police go to companies like Google and ask for information about every device in a particular geographic area at a particular time. In 2013, we learned from Edward Snowden that the NSA does this worldwide. Its program is called CO-TRAVELLER. The NSA claims it stopped doing that in 2014 — probably just stopped doing it in the US — but why should it bother when the government can just get the data from Google?
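A geofence warrant is, at bottom, a range query over a location-history table: return every device that reported a position inside a bounding box during a time window. Here is a minimal sketch of that query shape; the schema and names are illustrative, not Google’s.

```python
from dataclasses import dataclass

@dataclass
class LocationRecord:
    device_id: str
    lat: float
    lon: float
    timestamp: int  # Unix epoch seconds

def geofence_query(records, lat_min, lat_max, lon_min, lon_max, t_start, t_end):
    """Return the device IDs seen inside the bounding box during the time window."""
    return {r.device_id for r in records
            if lat_min <= r.lat <= lat_max
            and lon_min <= r.lon <= lon_max
            and t_start <= r.timestamp <= t_end}

records = [
    LocationRecord("device-a", 37.4221, -122.0841, 1_550_000_100),
    LocationRecord("device-b", 37.4300, -122.0900, 1_550_000_200),
    LocationRecord("device-c", 40.7128, -74.0060, 1_550_000_150),
]
print(geofence_query(records, 37.40, 37.45, -122.10, -122.05,
                     1_550_000_000, 1_550_001_000))
# {'device-a', 'device-b'} (set order may vary)
```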

Both the New York Times and EFF have written about Sensorvault.


Wi-Fi Hotspot Tracking

Free Wi-Fi hotspots can track your location, even if you don’t connect to them. This is because your phone or computer broadcasts a unique MAC address.

What distinguishes location-based marketing hotspot providers like Zenreach and Euclid is that the personal information you enter in the captive portal — like your email address, phone number, or social media profile — can be linked to your laptop or smartphone’s Media Access Control (MAC) address. That’s the unique alphanumeric ID that devices broadcast when Wi-Fi is switched on.

As Euclid explains in its privacy policy, “…if you bring your mobile device to your favorite clothing store today that is a Location — and then a popular local restaurant a few days later that is also a Location — we may know that a mobile device was in both locations based on seeing the same MAC Address.”

MAC addresses alone don’t contain identifying information besides the make of a device, such as whether a smartphone is an iPhone or a Samsung Galaxy. But as long as a device’s MAC address is linked to someone’s profile, and the device’s Wi-Fi is turned on, the movements of its owner can be followed by any hotspot from the same provider.

“After a user signs up, we associate their email address and other personal information with their device’s MAC address and with any location history we may previously have gathered (or later gather) for that device’s MAC address,” according to Zenreach’s privacy policy.
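What those policies describe is a simple join: the captive portal ties a MAC address to an identity once, and every later probe-request sighting of that MAC at any of the provider’s hotspots extends the person’s location history. A minimal sketch of that linkage, with illustrative names and structures rather than any vendor’s actual system:

```python
from collections import defaultdict

profiles = {}                         # MAC address -> identity from the captive portal
location_history = defaultdict(list)  # MAC address -> [(hotspot, timestamp), ...]

def captive_portal_signup(mac: str, email: str) -> None:
    """Link a device's MAC address to the identity entered at the portal."""
    profiles[mac] = email

def record_probe(mac: str, hotspot: str, timestamp: int) -> None:
    """Log a Wi-Fi probe-request sighting, whether or not the device connects."""
    location_history[mac].append((hotspot, timestamp))

captive_portal_signup("aa:bb:cc:dd:ee:ff", "alice@example.com")
record_probe("aa:bb:cc:dd:ee:ff", "clothing-store-downtown", 1_590_000_000)
record_probe("aa:bb:cc:dd:ee:ff", "restaurant-uptown", 1_590_300_000)

# The provider can now replay a named person's movements across all of its hotspots.
print(profiles["aa:bb:cc:dd:ee:ff"], location_history["aa:bb:cc:dd:ee:ff"])
```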

The defense is to turn Wi-Fi off on your phone when you’re not using it.

EDITED TO ADD: Note that the article is from 2018. Not that I think anything is different today….
