SSL and internet security news


Obfuscation as a Privacy Tool

This essay discusses the futility of opting out of surveillance, and suggests data obfuscation as an alternative.

We can apply obfuscation in our own lives by using practices and technologies that make use of it, including:

  • The secure browser Tor, which (among other anti-surveillance technologies) muddles our Internet activity with that of other Tor users, concealing our trail in that of many others.
  • The browser plugins TrackMeNot and AdNauseam, which explore obfuscation techniques by issuing many fake search requests and loading and clicking every ad, respectively.
  • The browser extension Go Rando, which randomly chooses your emotional “reactions” on Facebook, interfering with their emotional profiling and analysis.
  • Playful experiments like Adam Harvey’s “HyperFace” project, finding patterns on textiles that fool facial recognition systems — not by hiding your face, but by creating the illusion of many faces.

I am generally skeptical about obfuscation tools. I think of this basically as a signal-to-noise problem, and that adding random noise doesn’t do much to obfuscate the signal. But against broad systems of financially motivated corporate surveillance, it might be enough.
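
TrackMeNot-style query obfuscation is easy to sketch. The following is a minimal illustration of the noise-adding idea, not TrackMeNot’s actual code: the decoy vocabulary, the search.example.com endpoint, and the timing are all made-up placeholders.

    import random
    import time
    import urllib.parse
    import urllib.request

    # Hypothetical decoy vocabulary; TrackMeNot itself draws its queries from
    # RSS feeds and public word lists.
    DECOY_TERMS = ["weather", "recipes", "soccer scores", "train schedule", "movie times"]

    # Placeholder search endpoint, for illustration only.
    SEARCH_URL = "https://search.example.com/search?q="

    def send_decoy_query():
        """Issue one throwaway search so real queries are buried in noise."""
        query = " ".join(random.sample(DECOY_TERMS, k=2))
        try:
            urllib.request.urlopen(SEARCH_URL + urllib.parse.quote(query), timeout=5)
        except OSError:
            pass  # decoy traffic is best-effort; failures don't matter

    if __name__ == "__main__":
        while True:
            send_decoy_query()
            time.sleep(random.uniform(30, 300))  # randomized pacing so the noise isn't obviously periodic

Whether this buys much is exactly the signal-to-noise question above: a profiler that can tell decoy queries from organic ones simply filters the noise back out.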

Mapping Security and Privacy Research across the Decades

This is really interesting: “A Data-Driven Reflection on 36 Years of Security and Privacy Research,” by Aniqua Baset and Tamara Denning:

Abstract: Meta-research—research about research—allows us, as a community, to examine trends in our research and make informed decisions regarding the course of our future research activities. Additionally, overviews of past research are particularly useful for researchers or conferences new to the field. In this work we use topic modeling to identify topics within the field of security and privacy research using the publications of the IEEE Symposium on Security & Privacy (1980-2015), the ACM Conference on Computer and Communications Security (1993-2015), the USENIX Security Symposium (1993-2015), and the Network and Distributed System Security Symposium (1997-2015). We analyze and present data via the perspective of topic trends and authorship. We believe our work serves to contextualize the academic field of computer security and privacy research via one of the first data-driven analyses. An interactive visualization of the topics and corresponding publications is available at https://secprivmeta.net.
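
For readers new to the technique, here is a minimal sketch of topic modeling using scikit-learn’s LDA implementation. It is an illustration only, run on a few invented abstract snippets with two topics, not the authors’ actual corpus, preprocessing, or parameters.

    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import LatentDirichletAllocation

    # Toy stand-ins for paper abstracts; the study used publications from
    # IEEE S&P, CCS, USENIX Security, and NDSS.
    abstracts = [
        "a new defense against memory corruption exploits in binaries",
        "an anonymous communication protocol resistant to traffic analysis",
        "measuring third-party tracking scripts across popular websites",
        "fuzzing kernel drivers to find memory safety vulnerabilities",
    ]

    vectorizer = CountVectorizer(stop_words="english")
    counts = vectorizer.fit_transform(abstracts)
    lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(counts)

    # Show the top words for each inferred topic.
    terms = vectorizer.get_feature_names_out()
    for i, weights in enumerate(lda.components_):
        top = [terms[j] for j in weights.argsort()[::-1][:5]]
        print(f"topic {i}: {', '.join(top)}")

The real analysis runs over thousands of papers and many more topics; tracking each topic’s share of publications per year is what lets the authors chart how the field’s emphasis shifts.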

I like seeing how our field has morphed over the years.

Public Voice Launches Petition for an International Moratorium on Using Facial Recognition for Mass Surveillance

Coming out of the Privacy Commissioners’ Conference in Albania, Public Voice is launching a petition for an international moratorium on using facial recognition software for mass surveillance.

You can sign on as an individual or an organization. I did. You should as well. No, I don’t think that countries will magically adopt this moratorium. But it’s important for us all to register our dissent.

Wi-Fi Hotspot Tracking

Free Wi-Fi hotspots can track your location, even if you don’t connect to them. This is because your phone or computer broadcasts a unique MAC address.

What distinguishes location-based marketing hotspot providers like Zenreach and Euclid is that the personal information you enter in the captive portal — like your email address, phone number, or social media profile — can be linked to your laptop or smartphone’s Media Access Control (MAC) address. That’s the unique alphanumeric ID that devices broadcast when Wi-Fi is switched on.

As Euclid explains in its privacy policy, “…if you bring your mobile device to your favorite clothing store today — that is a Location — and then a popular local restaurant a few days later — that is also a Location — we may know that a mobile device was in both locations based on seeing the same MAC Address.”

MAC addresses alone don’t contain identifying information besides the make of a device, such as whether a smartphone is an iPhone or a Samsung Galaxy. But as long as a device’s MAC address is linked to someone’s profile, and the device’s Wi-Fi is turned on, the movements of its owner can be followed by any hotspot from the same provider.

“After a user signs up, we associate their email address and other personal information with their device’s MAC address and with any location history we may previously have gathered (or later gather) for that device’s MAC address,” according to Zenreach’s privacy policy.
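
To make the mechanism concrete, here is a hedged sketch of both halves: watching the MAC addresses that nearby devices broadcast in 802.11 probe requests, and looking them up in a hypothetical captive-portal signup table. It uses Scapy and assumes a wireless interface already in monitor mode named wlan0mon; it is an illustration of the technique, not any provider’s actual system.

    from scapy.all import sniff
    from scapy.layers.dot11 import Dot11, Dot11ProbeReq

    # Hypothetical captive-portal signups: MAC address -> identity entered in the portal.
    signups = {"aa:bb:cc:dd:ee:ff": "jane@example.com"}

    def handle(pkt):
        if pkt.haslayer(Dot11ProbeReq):
            mac = pkt[Dot11].addr2          # transmitter MAC, sent in the clear
            who = signups.get(mac, "unknown device")
            print(f"probe request from {mac} ({who})")

    # Requires root privileges and an interface put into monitor mode
    # beforehand (e.g., with a tool such as airmon-ng).
    sniff(iface="wlan0mon", prn=handle, store=False)

Turning Wi-Fi off, as suggested below, defeats this because the device stops sending probe requests at all.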

The defense is to turn Wi-Fi off on your phone when you’re not using it.

EDITED TO ADD: Note that the article is from 2018. Not that I think anything is different today….

On Chinese “Spy Trains”

The trade war with China has reached a new industry: subway cars. Congress is considering legislation that would prevent the world’s largest train maker, the Chinese-owned CRRC Corporation, from competing on new contracts in the United States.

Part of the reasoning behind this legislation is economic, and stems from worries about Chinese industries undercutting the competition and dominating key global industries. But another part involves fears about national security. News articles talk about “spy trains,” and the possibility that the train cars might surreptitiously monitor their passengers’ faces, movements, conversations or phone calls.

This is a complicated topic. There is definitely a national security risk in buying computer infrastructure from a country you don’t trust. That’s why there is so much worry about Chinese-made equipment for the new 5G wireless networks.

It’s also why the United States has blocked the cybersecurity company Kaspersky from selling its Russian-made antivirus products to US government agencies. Meanwhile, the chairman of China’s technology giant Huawei has pointed to NSA spying disclosed by Edward Snowden as a reason to mistrust US technology companies.

The reason these threats are so real is that it’s not difficult to hide surveillance or control infrastructure in computer components, and if they’re not turned on, they’re very difficult to find.

Like every other piece of modern machinery, modern train cars are filled with computers, and while it’s certainly possible to produce a subway car with enough surveillance apparatus to turn it into a “spy train,” in practice it doesn’t make much sense. The risk of discovery is too great, and the payoff would be too low. Like the United States, China is more likely to try to get data from the US communications infrastructure, or from the large Internet companies that already collect data on our every move as part of their business model.

While it’s unlikely that China would bother spying on commuters using subway cars, it would be much less surprising if a tech company offered free Internet on subways in exchange for surveillance and data collection. Or if the NSA used those corporate systems for their own surveillance purposes (just as the agency has spied on in-flight cell phone calls, according to an investigation by the Intercept and Le Monde, citing documents provided by Edward Snowden). That’s an easier, and more fruitful, attack path.

We have credible reports that the Chinese hacked Gmail around 2010, and there are ongoing concerns about both censorship and surveillance by the Chinese social-networking company TikTok. (TikTok’s parent company has told the Washington Post that the app doesn’t send American users’ info back to Beijing, and that the Chinese government does not influence the app’s use in the United States.)

Even so, these examples illustrate an important point: there’s no escaping the technology of inevitable surveillance. You have little choice but to rely on the companies that build your computers and write your software, whether in your smartphones, your 5G wireless infrastructure, or your subway cars. And those systems are so complicated that they can be secretly programmed to operate against your interests.

Last year, Le Monde reported that the Chinese government bugged the computer network of the headquarters of the African Union in Addis Ababa. China had built and outfitted the organization’s new headquarters as a foreign aid gift, reportedly secretly configuring the network to send copies of confidential data to Shanghai every night between 2012 and 2017. China denied having done so, of course.

If there’s any lesson from all of this, it’s that everybody spies using the Internet. The United States does it. Our allies do it. Our enemies do it. Many countries do it to each other, with their success largely dependent on how sophisticated their tech industries are.

China dominates the subway car manufacturing industry because of its low prices — the same reason it dominates the 5G hardware industry. Whether these low prices are because the companies are more efficient than their competitors or because they’re being unfairly subsidized by the Chinese government is a matter to be determined at trade negotiations.

Finally, Americans must understand that higher prices are an inevitable result of banning cheaper tech products from China.

We might willingly pay the higher prices because we want domestic control of our telecommunications infrastructure. We might willingly pay more because of some protectionist belief that global trade is somehow bad. But we need to make these decisions to protect ourselves deliberately and rationally, recognizing both the risks and the costs. And while I’m worried about our 5G infrastructure built using Chinese hardware, I’m not worried about our subway cars.

This essay originally appeared on CNN.com.

EDITED TO ADD: I had a lot of trouble with CNN’s legal department with this essay. They were very reluctant to call out the US and its allies for similar behavior, and spent a lot more time adding caveats to statements that I didn’t think needed them. They wouldn’t let me link to this Intercept article talking about US, French, and German infiltration of supply chains, or even the NSA document from the Snowden archives that proved the statements.

A Feminist Take on Information Privacy

Maria Farrell has a really interesting framing of information/device privacy:

What our smartphones and relationship abusers share is that they both exert power over us in a world shaped to tip the balance in their favour, and they both work really, really hard to obscure this fact and keep us confused and blaming ourselves. Here are some of the ways our unequal relationship with our smartphones is like an abusive relationship:

  • They isolate us from deeper, competing relationships in favour of superficial contact — ‘user engagement’ — that keeps their hold on us strong. Working with social media, they insidiously curate our social lives, manipulating us emotionally with dark patterns to keep us scrolling.

  • They tell us the onus is on us to manage their behavior. It’s our job to tiptoe around them and limit their harms. Spending too much time on a literally-designed-to-be-behaviorally-addictive phone? They send company-approved messages about our online time, but ban from their stores the apps that would really cut our use. We just need to use willpower. We just need to be good enough to deserve them.

  • They betray us, leaking data / spreading secrets. What we shared privately with them is suddenly public. Sometimes this destroys lives, but hey, we only have ourselves to blame. They fight nasty and under-handed, and are so, so sorry when they get caught that we’re meant to feel bad for them. But they never truly change, and each time we take them back, we grow weaker.

  • They love-bomb us when we try to break away, piling on the free data or device upgrades, making us click through page after page of dark pattern, telling us no one understands us like they do, no one else sees everything we really are, no one else will want us.

  • It’s impossible to just cut them off. They’ve wormed themselves into every part of our lives, making life without them unimaginable. And anyway, the relationship is complicated. There is love in it, or there once was. Surely we can get back to that if we just manage them the way they want us to?

Nope. Our devices are basically gaslighting us. They tell us they work for and care about us, and if we just treat them right then we can learn to trust them. But all the evidence shows the opposite is true.

Modifying a Tesla to Become a Surveillance Platform

From DefCon:

At the Defcon hacker conference today, security researcher Truman Kain debuted what he calls the Surveillance Detection Scout. The DIY computer fits into the middle console of a Tesla Model S or Model 3, plugs into its dashboard USB port, and turns the car’s built-in cameras — the same dash and rearview cameras providing a 360-degree view used for Tesla’s Autopilot and Sentry features — into a system that spots, tracks, and stores license plates and faces over time. The tool uses open source image recognition software to automatically put an alert on the Tesla’s display and the user’s phone if it repeatedly sees the same license plate. When the car is parked, it can track nearby faces to see which ones repeatedly appear. Kain says the intent is to offer a warning that someone might be preparing to steal the car, tamper with it, or break into the driver’s nearby home.
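
The repeat-sighting logic itself is simple to sketch. The following is a hedged illustration, not Kain’s actual Surveillance Detection Scout code: recognize_plates is a hypothetical stand-in for whatever open-source recognition library does the work, and the alert threshold is invented.

    from collections import Counter
    from datetime import datetime

    ALERT_THRESHOLD = 3      # sightings before a plate is flagged as recurring
    sightings = Counter()    # plate string -> number of times seen

    def recognize_plates(frame):
        """Hypothetical stand-in: return the plate strings found in a camera frame."""
        return []  # replace with a call into a real ALPR library

    def process_frame(frame):
        for plate in recognize_plates(frame):
            sightings[plate] += 1
            if sightings[plate] == ALERT_THRESHOLD:
                # The real tool pushes this alert to the Tesla's display and the user's phone.
                print(f"{datetime.now():%H:%M:%S} alert: {plate} seen {ALERT_THRESHOLD} times")

The same counting approach extends to faces while the car is parked; the hard part in practice is the recognition itself and keeping false matches low, not the bookkeeping.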

Surveillance as a Condition for Humanitarian Aid

Excellent op-ed on the growing trend to tie humanitarian aid to surveillance.

Despite the best intentions, the decision to deploy technology like biometrics is built on a number of unproven assumptions, such as that technology solutions can fix deeply embedded political problems. And that auditing for fraud requires entire populations to be tracked using their personal data. And that experimental technologies will work as planned in a chaotic conflict setting. And last, that the ethics of consent don’t apply for people who are starving.

Exploiting GDPR to Get Private Information

A researcher abused the GDPR to get information on his fiancée:

It is one of the first tests of its kind to exploit the EU’s General Data Protection Regulation (GDPR), which came into force in May 2018. The law shortened the time organisations had to respond to data requests, added new types of information they have to provide, and increased the potential penalty for non-compliance.

“Generally if it was an extremely large company — especially tech ones — they tended to do really well,” he told the BBC.

“Small companies tended to ignore me.

“But the kind of mid-sized businesses that knew about GDPR, but maybe didn’t have much of a specialised process [to handle requests], failed.”

He declined to identify the organisations that had mishandled the requests, but said they had included:

  • a UK hotel chain that shared a complete record of his partner’s overnight stays

  • two UK rail companies that provided records of all the journeys she had taken with them over several years

  • a US-based educational company that handed over her high school grades, mother’s maiden name and the results of a criminal background check survey.
