SSL and internet security news


Alternatives to Government-Mandated Encryption Backdoors

Policy essay: “Encryption Substitutes,” by Andrew Keane Woods:

In this short essay, I make a few simple assumptions that bear mentioning at the outset. First, I assume that governments have good and legitimate reasons for getting access to personal data. These include things like controlling crime, fighting terrorism, and regulating territorial borders. Second, I assume that people have a right to expect privacy in their personal data. Therefore, policymakers should seek to satisfy both law enforcement and privacy concerns without unduly burdening one or the other.

Of course, much of the debate over government access to data is about how to respect both of these assumptions. Different actors will make different trade-offs. My aim in this short essay is merely to show that regardless of where one draws this line — whether one is more concerned with ensuring privacy of personal information or ensuring that the government has access to crucial evidence — it would be shortsighted and counterproductive to draw that line with regard to one particular privacy technique and without regard to possible substitutes.

The first part of the paper briefly characterizes the encryption debate two ways: first, as it is typically discussed, in stark, uncompromising terms; and second, as a subset of a broader problem. The second part summarizes several avenues available to law enforcement and intelligence agencies seeking access to data. The third part outlines the alternative avenues available to privacy-seekers. The availability of substitutes is relevant to the regulators but also to the regulated. If encryption is one tool in a game of cat and mouse, the cat has other tools at his disposal to catch the mouse — and the mouse has other tools to evade the cat. The fourth part offers some initial thoughts on implications for the privacy debate.

Blog post.


Law Enforcement Access to IoT Data

In the first of what will undoubtedly be a large number of battles between companies that make IoT devices and the police, Amazon is refusing to comply with a warrant demanding data on what its Echo device heard at a crime scene.

The particulars of the case are weird. Amazon’s Echo does not constantly record; it only listens for its name. So it’s unclear that there is any evidence to be turned over. But this general issue isn’t going away. We are all under ubiquitous surveillance, but it is surveillance by the companies that control the Internet-connected devices in our lives. The rules by which police and intelligence agencies get access to that data will come under increasing pressure for change.
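
The technical claim here, that the device listens locally and only ships audio after it hears its wake word, is easy to illustrate. Below is a minimal, hypothetical Python sketch of the general wake-word-gating pattern; the names and the toy detector are illustrative stand-ins, not Amazon's implementation or API.

```python
import random
from collections import deque

BUFFER_FRAMES = 50  # roughly one second of recent audio, held only on the device

def read_audio_frame():
    # Stand-in for microphone capture: here a "frame" is just a random energy value.
    return random.random()

def detect_wake_word(recent_frames):
    # Stand-in for a compact on-device keyword model; a loud frame plays the
    # role of the wake word in this toy.
    return recent_frames[-1] > 0.99

def stream_to_cloud(frame):
    # Only called after the detector fires: the first point at which audio
    # would leave the device at all.
    print(f"streaming frame upstream: {frame:.3f}")

def run(num_frames=2_000):
    ring_buffer = deque(maxlen=BUFFER_FRAMES)  # older frames silently roll off
    streaming = False
    for _ in range(num_frames):
        frame = read_audio_frame()
        ring_buffer.append(frame)
        if not streaming and detect_wake_word(ring_buffer):
            streaming = True  # wake word heard: start forwarding audio
        if streaming:
            stream_to_cloud(frame)
            # a real device would also detect end-of-utterance and stop streaming

if __name__ == "__main__":
    run()
```

Under this pattern, "what the Echo heard at a crime scene" would amount to whatever was streamed after a wake-word trigger, which is why it is unclear how much evidence exists to turn over.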

Related: A newscaster discussed Amazon’s Echo on the news, causing devices in the same room as tuned-in televisions to order unwanted products. This year, the same technology is coming to LG appliances such as refrigerators.


Encryption Working Group Annual Report from the US House of Representatives

The Encryption Working Group of the House Judiciary Committee and the House Energy and Commerce Committee has released its annual report.

Observation #1: Any measure that weakens encryption works against the national interest.

Observation #2: Encryption technology is a global technology that is widely and increasingly available around the world.

Observation #3: The variety of stakeholders, technologies, and other factors create different and divergent challenges with respect to encryption and the “going dark” phenomenon, and therefore there is no one-size-fits-all solution to the encryption challenge.

Observation #4: Congress should foster cooperation between the law enforcement community and technology companies.


Securing Communications in a Trump Administration

Susan Landau has an excellent essay on why it’s more important than ever to have backdoor-free encryption on our computer and communications systems.

Protecting the privacy of speech is crucial for preserving our democracy. We live at a time when tracking an individual — a journalist, a member of the political opposition, a citizen engaged in peaceful protest — or listening to their communications is far easier than at any time in human history. Political leaders on both sides now have a responsibility to work for securing communications and devices. This means supporting not only the laws protecting free speech and the accompanying communications, but also the technologies to do so: end-to-end encryption and secured devices; it also means soundly rejecting all proposals for front-door exceptional access. Prior to the election there were strong, sound security arguments for rejecting such proposals. The privacy arguments have now, suddenly, become critically important as well. Threatened authoritarianism means that we need technological protections for our private communications every bit as much as we need the legal ones we presently have.

Unfortunately, the trend is moving in the other direction. The UK just passed the Investigatory Powers Act, giving police and intelligence agencies incredibly broad surveillance powers with very little oversight. And Bits of Freedom just reported that “Croatia, Italy, Latvia, Poland and Hungary all want an EU law to be created to help their law enforcement authorities access encrypted information and share data with investigators in other countries.”


Apple's Cloud Key Vault

Ever since Ivan Krstić, Apple’s Head of Security Engineering and Architecture, presented the company’s key backup technology at Black Hat 2016, people have been pointing to it as evidence that the company can create a secure backdoor for law enforcement.

It’s not. Matthew Green and Steve Bellovin have both explained why not. And the same group of us that wrote the “Keys Under Doormats” paper on why backdoors are a bad idea have also explained why Apple’s technology does not enable it to build secure backdoors for law enforcement.

The problem with Tait’s argument becomes clearer when you actually try to turn Apple’s Cloud Key Vault into an exceptional access mechanism. In that case, Apple would have to replace the HSM with one that accepts an additional message from Apple or the FBI — or an agency from any of the 100+ countries where Apple sells iPhones — saying “OK, decrypt,” as well as the user’s password. In order to do this securely, these messages would have to be cryptographically signed with a second set of keys, which would then have to be used as often as law enforcement access is required. Any exceptional access scheme made from this system would have to have an additional set of keys to ensure authorized use of the law enforcement access credentials.

Managing access by a hundred-plus countries is impractical due to mutual mistrust, so Apple would be stuck keeping a second signing key (or a database of second signing keys) for signing these messages, which must be accessed for each and every law enforcement agency. This puts us back in the situation where Apple needs to protect another repeatedly-used, high-value public key infrastructure: an equivalent situation to what has already resulted in the theft of Bitcoin wallets and of RealTek’s code-signing keys, and in Certificate Authority failures, among many other disasters.

Repeated access of private keys drastically increases their probability of theft, loss, or inappropriate use. Apple’s Cloud Key Vault does not have any Apple-owned private key, and therefore does not indicate that a secure solution to this problem actually exists.

It is worth noting that the exceptional access schemes one can create from Apple’s CKV (like the one outlined above) inherently entail the precise issues we warned about in our previous essay on the danger signs for recognizing flawed exceptional access systems. Additionally, the Risks of Key Escrow and Keys Under Doormats papers describe further technical and nontechnical issues with exceptional access schemes that must be addressed. Among the nontechnical hurdles would be the requirement, for example, that Apple run a large legal office to confirm that requests for access from the government of Uzbekistan actually involved a device that was located in that country, and that the request was consistent with both US law and Uzbek law.

My colleagues and I do not argue that the technical community doesn’t know how to store high-value encryption keys — to the contrary, that’s the whole point of an HSM. Rather, we assert that holding on to keys in a safe way such that any other party (e.g., law enforcement or Apple itself) can also access them repeatedly without high potential for catastrophic loss is impossible with today’s technology, and that any scheme running into fundamental sociotechnical challenges such as jurisdiction must be evaluated honestly before any technical implementation is considered.
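
To make the quoted argument concrete, here is a schematic Python sketch of what the modified release policy would have to look like. This is a hypothetical illustration under assumed names, not Apple's design: HMAC stands in for the asymmetric signatures a real HSM would verify. The structural point is that the exceptional-access path (Path 2 below) cannot exist without a long-lived signing secret that must be exercised for every request.

```python
import hashlib
import hmac
from dataclasses import dataclass
from typing import Optional

# Hypothetical sketch of a key vault whose release policy has been extended
# with an exceptional-access path. All names are illustrative.

# The dangerous part: one long-lived secret that must be used for every
# law enforcement request, from every agency, in every country of sale.
EXCEPTIONAL_ACCESS_SIGNING_KEY = b"repeatedly-used-high-value-secret"


@dataclass
class VaultRecord:
    user_passcode_hash: bytes  # what the vault checks today
    escrowed_key: bytes        # the user's backup encryption key


def hash_passcode(passcode: str) -> bytes:
    return hashlib.sha256(passcode.encode()).digest()


def sign_request(device_id: str) -> bytes:
    # Performed by the vendor each time access is demanded.
    return hmac.new(EXCEPTIONAL_ACCESS_SIGNING_KEY, device_id.encode(),
                    hashlib.sha256).digest()


def release_key(record: VaultRecord, device_id: str,
                passcode: Optional[str] = None,
                le_signature: Optional[bytes] = None) -> bytes:
    """Release policy of the hypothetical modified vault."""
    # Path 1: today's behavior -- only the user's passcode unlocks the key.
    if passcode is not None and hmac.compare_digest(
            hash_passcode(passcode), record.user_passcode_hash):
        return record.escrowed_key
    # Path 2: the added exceptional-access path. Its security rests entirely
    # on the signing key above never being stolen, lost, or misused.
    if le_signature is not None and hmac.compare_digest(
            le_signature, sign_request(device_id)):
        return record.escrowed_key
    raise PermissionError("no valid credential presented")


if __name__ == "__main__":
    record = VaultRecord(hash_passcode("123456"), escrowed_key=b"\x01" * 32)
    assert release_key(record, "device-42", passcode="123456") == record.escrowed_key
    assert release_key(record, "device-42",
                       le_signature=sign_request("device-42")) == record.escrowed_key
    # Illustrative arithmetic with assumed numbers: even a tiny per-use chance
    # of exposing the signing key compounds quickly under routine use.
    p = 1e-5
    for n in (1_000, 100_000):
        print(f"{n:>7,} signings -> cumulative exposure risk ~ {1 - (1 - p) ** n:.1%}")
```

Nothing like Path 2 exists in Apple's actual Cloud Key Vault, which is precisely the essay's point: the shipped design avoids any Apple-held key that would have to be used this way.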


Electronic Surveillance Failures Leading up to the 2008 Mumbai Terrorist Attacks

Long New York Times article based on “former American and Indian officials and classified documents disclosed by Edward J. Snowden” outlining the intelligence failures leading up to the 2008 Mumbai terrorist attacks:

Although electronic eavesdropping often yields valuable data, even tantalizing clues can be missed if the technology is not closely monitored, the intelligence gleaned from it is not linked with other information, or analysis does not sift incriminating activity from the ocean of digital data.

This seems to be the moral:

Although the United States computer arsenal plays a vital role against targets ranging from North Korea’s suspected assault on Sony to Russian cyberthieves and Chinese military hacking units, counterterrorism requires a complex mix of human and technical resources. Some former counterterrorism officials warn against promoting billion-dollar surveillance programs with the narrow argument that they stop attacks.

That monitoring collects valuable information, but large amounts of it are “never meaningfully reviewed or analyzed,” said Charles (Sam) Faddis, a retired C.I.A. counterterrorism chief. “I cannot remember a single instance in my career when we ever stopped a plot based purely on signals intelligence.”

[…]

Intelligence officials say that terror plots are often discernible only in hindsight, when a pattern suddenly emerges from what had been just bits of information. Whatever the reason, no one fully grasped the developing Mumbai conspiracy.

“They either weren’t looking or didn’t understand what it all meant,” said one former American official who had access to the intelligence and would speak only on the condition of anonymity. “There was a lot more noise than signal. There usually is.”
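
The "more noise than signal" observation can be made concrete with some back-of-the-envelope base-rate arithmetic; the numbers below are assumptions chosen for illustration, not figures from the article.

```python
# Illustrative base-rate arithmetic (assumed numbers): even a very accurate
# automated filter buries analysts in false positives when genuinely
# plot-related messages are vanishingly rare.
daily_messages      = 500_000_000  # assumed volume of intercepted communications
truly_suspicious    = 50           # assumed genuinely plot-related messages
true_positive_rate  = 0.99         # assumed: flags 99% of real plot traffic
false_positive_rate = 0.001        # assumed: wrongly flags 0.1% of benign traffic

flagged_real  = truly_suspicious * true_positive_rate
flagged_noise = (daily_messages - truly_suspicious) * false_positive_rate
print(f"real leads flagged per day:  {flagged_real:,.0f}")
print(f"false leads flagged per day: {flagged_noise:,.0f}")
print(f"chance a flagged item is a real lead: "
      f"{flagged_real / (flagged_real + flagged_noise):.5%}")
```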
