SSL and internet security news

WhatsApp Security Vulnerability

Back in March, Rolf Weber wrote about a potential vulnerability in the WhatsApp protocol that would allow Facebook to defeat perfect forward secrecy by forcibly changing users’ keys, allowing it — or more likely, the government — to eavesdrop on encrypted messages.

It seems that this vulnerability is real:

WhatsApp has the ability to force the generation of new encryption keys for offline users, unbeknown to the sender and recipient of the messages, and to make the sender re-encrypt messages with new keys and send them again for any messages that have not been marked as delivered.

The recipient is not made aware of this change in encryption, while the sender is only notified if they have opted-in to encryption warnings in settings, and only after the messages have been re-sent. This re-encryption and rebroadcasting effectively allows WhatsApp to intercept and read users’ messages.

The security loophole was discovered by Tobias Boelter, a cryptography and security researcher at the University of California, Berkeley. He told the Guardian: “If WhatsApp is asked by a government agency to disclose its messaging records, it can effectively grant access due to the change in keys.”

The vulnerability is not inherent to the Signal protocol. Open Whisper Systems’ messaging app, Signal, the app used and recommended by whistleblower Edward Snowden, does not suffer from the same vulnerability. If a recipient changes the security key while offline, for instance, a sent message will fail to be delivered and the sender will be notified of the change in security keys without automatically resending the message.

WhatsApp’s implementation automatically resends an undelivered message with a new key without warning the user in advance or giving them the ability to prevent it.
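To make the difference concrete, here is a minimal sketch, in Python, of the two policies described above. The session model, the function names, and the string results are assumptions made purely for illustration; neither client is actually implemented at this level of simplicity.

```python
# Toy comparison of how Signal and WhatsApp (as described above) handle an
# undelivered message when the recipient's identity key has changed.
# All names here are invented for the sketch.

from dataclasses import dataclass


@dataclass
class Session:
    identity_key: bytes          # recipient's key as currently known to the sender
    notify_on_key_change: bool   # WhatsApp's "Show Security Notifications" setting


def deliver(session: Session, new_key: bytes, policy: str) -> str:
    """Simulate delivery of a pending message when the recipient's key may have changed."""
    if new_key == session.identity_key:
        return "delivered"  # no key change, normal delivery

    if policy == "signal":
        # Signal-style: delivery fails and the sender is told; nothing is
        # resent until the sender acts.
        return "blocked: safety number changed, message not delivered"

    # WhatsApp-style: re-encrypt under the new key and resend automatically.
    session.identity_key = new_key
    note = " (notification shown)" if session.notify_on_key_change else ""
    return f"re-encrypted with new key and resent{note}"


if __name__ == "__main__":
    print(deliver(Session(b"old-key", False), b"new-key", policy="signal"))
    print(deliver(Session(b"old-key", False), b"new-key", policy="whatsapp"))
```

The key change itself is detectable in both cases; the design question is only whether delivery pauses until the sender reacts, or continues with an optional after-the-fact notification.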

Note that it’s an attack against current and future messages, and not something that would allow the government to reach into the past. In that way, it is no more troubling than the government hacking your mobile phone and reading your WhatsApp conversations that way.

An unnamed “WhatsApp spokesperson” said that they implemented the encryption this way for usability:

In WhatsApp’s implementation of the Signal protocol, we have a “Show Security Notifications” setting (option under Settings > Account > Security) that notifies you when a contact’s security code has changed. We know the most common reasons this happens are because someone has switched phones or reinstalled WhatsApp. This is because in many parts of the world, people frequently change devices and Sim cards. In these situations, we want to make sure people’s messages are delivered, not lost in transit.

He’s technically correct. This is not a backdoor. This really isn’t even a flaw. It’s a design decision that put usability ahead of security in this particular instance. Moxie Marlinspike, creator of Signal and the code base underlying WhatsApp’s encryption, said as much:

Under normal circumstances, when communicating with a contact who has recently changed devices or reinstalled WhatsApp, it might be possible to send a message before the sending client discovers that the receiving client has new keys. The recipient’s device immediately responds, and asks the sender to reencrypt the message with the recipient’s new identity key pair. The sender displays the “safety number has changed” notification, reencrypts the message, and delivers it.

The WhatsApp clients have been carefully designed so that they will not re-encrypt messages that have already been delivered. Once the sending client displays a “double check mark,” it can no longer be asked to re-send that message. This prevents anyone who compromises the server from being able to selectively target previously delivered messages for re-encryption.

The fact that WhatsApp handles key changes is not a “backdoor,” it is how cryptography works. Any attempt to intercept messages in transit by the server is detectable by the sender, just like with Signal, PGP, or any other end-to-end encrypted communication system.

The only question it might be reasonable to ask is whether these safety number change notifications should be “blocking” or “non-blocking.” In other words, when a contact’s key changes, should WhatsApp require the user to manually verify the new key before continuing, or should WhatsApp display an advisory notification and continue without blocking the user?

Given the size and scope of WhatsApp’s user base, we feel that their choice to display a non-blocking notification is appropriate. It provides transparent and cryptographically guaranteed confidence in the privacy of a user’s communication, along with a simple user experience. The choice to make these notifications “blocking” would in some ways make things worse. That would leak information to the server about who has enabled safety number change notifications and who hasn’t, effectively telling the server who it could MITM transparently and who it couldn’t; something that WhatsApp considered very carefully.
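The “double check mark” rule Marlinspike describes is the part that limits what a compromised server could do. The toy sketch below, with invented names, only shows that guard: re-encryption requests are honored for pending messages and refused for anything already marked delivered.

```python
# Hypothetical illustration of the delivered-message guard described above.
# A server-initiated re-encryption request succeeds only for messages that
# have not yet been marked delivered (the "double check mark").

class OutgoingMessage:
    def __init__(self, text: str):
        self.text = text
        self.delivered = False  # set True once the double check mark appears


def handle_reencrypt_request(msg: OutgoingMessage, new_key: bytes) -> str:
    if msg.delivered:
        # A compromised server cannot target previously delivered messages.
        return "refused: message already delivered"
    return f"re-encrypted under {new_key!r} and resent"


pending = OutgoingMessage("still in flight")
read = OutgoingMessage("already delivered")
read.delivered = True

print(handle_reencrypt_request(pending, b"new-key"))  # resent
print(handle_reencrypt_request(read, b"new-key"))     # refused
```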

How serious this is depends on your threat model. If you are worried about the US government — or any other government that can pressure Facebook — snooping on your messages, then this is a small vulnerability. If not, then it’s nothing to worry about.

Slashdot thread. Hacker News thread. BoingBoing post. More here.

Encryption Working Group Annual Report from the US House of Representatives

The Encryption Working Group of the House Judiciary Committee and the House Energy and Commerce Committee has released its annual report.

Observation #1: Any measure that weakens encryption works against the national interest.

Observation #2: Encryption technology is a global technology that is widely and increasingly available around the world.

Observation #3: The variety of stakeholders, technologies, and other factors create different and divergent challenges with respect to encryption and the “going dark” phenomenon, and therefore there is no one-size-fits-all solution to the encryption challenge.

Observation #4: Congress should foster cooperation between the law enforcement community and technology companies.

My Priorities for the Next Four Years

Like many, I was surprised and shocked by the election of Donald Trump as president. I believe his ideas, temperament, and inexperience represent a grave threat to our country and world. Suddenly, all the things I had planned to work on seemed trivial in comparison. Although Internet security and privacy are not the most important policy areas at risk, I believe he — and, more importantly, his cabinet, administration, and Congress — will have devastating effects in that area, both in the US and around the world.

The election was so close that I’ve come to see the result as a bad roll of the dice. A few minor tweaks here and there — a more enthusiastic Sanders endorsement, one fewer of Comey’s announcements, slightly less Russian involvement — and the country would be preparing for a Clinton presidency and discussing a very different social narrative. That alternative narrative would stress business as usual, and continue to obscure the deep social problems in our society. Those problems won’t go away on their own, and in this alternative future they would continue to fester under the surface, getting steadily worse. This election exposed those problems for everyone to see.

I spent the last month both coming to terms with this reality, and thinking about the future. Here is my new agenda for the next four years:

One, fight the fights. There will be more government surveillance and more corporate surveillance. I expect legislative and judicial battles along several lines: a renewed call from the FBI for backdoors into encryption, more leeway for government hacking without a warrant, no controls on corporate surveillance, and more secret government demands for that corporate data. I expect other countries to follow our lead. (The UK is already more extreme than us.) And if there’s a major terrorist attack under Trump’s watch, it’ll be open season on our liberties. We may lose a lot of these battles, but we need to lose as few as possible and as little of our existing liberties as possible.

Two, prepare for those fights. Much of the next four years will be reactive, but we can prepare somewhat. The more we can convince corporate America to delete their saved archives of surveillance data and to store only what they need for as long as they need it, the safer we’ll all be. We need to convince Internet giants like Google and Facebook to change their business models away from surveillance capitalism. It’s a hard sell, but maybe we can nibble around the edges. Similarly, we need to keep pushing the truism that privacy and security are not antagonistic, but rather are essential for each other.

Three, lay the groundwork for a better future. No matter how bad the next four years get, I don’t believe that a Trump administration will permanently end privacy, freedom, and liberty in the US. I don’t believe that it portends a radical change in our democracy. (Or if it does, we have bigger problems than a free and secure Internet.) It’s true that some of Trump’s institutional changes might take decades to undo. Even so, I am confident — optimistic even — that the US will eventually come around; and when that time comes, we need good ideas in place for people to come around to. This means proposals for non-surveillance-based Internet business models, research into effective law enforcement that preserves privacy, intelligent limits on how corporations can collect and exploit our data, and so on.

And four, continue to solve the actual problems. The serious security issues around cybercrime, cyber-espionage, cyberwar, the Internet of Things, algorithmic decision making, foreign interference in our elections, and so on aren’t going to disappear for four years while we’re busy fighting the excesses of Trump. We need to continue to work towards a more secure digital future. And to the extent that cybersecurity for our military networks and critical infrastructure allies with cybersecurity for everyone, we’ll probably have an ally in Trump.

Those are my four areas. Under a Clinton administration, my list would have looked much the same. Trump’s election just means the threats will be much greater, and the battles a lot harder to win. It’s more than I can possibly do on my own, and I am therefore substantially increasing my annual philanthropy to support organizations like EPIC, EFF, ACLU, and Access Now in continuing their work in these areas.

My agenda is necessarily focused entirely on my particular areas of concern. The risks of a Trump presidency are far more pernicious, but this is where I have expertise and influence.

Right now, we have a defeated majority. Many are scared, and many are motivated — and few of those are applying their motivation constructively. We need to harness that fear and energy to start fixing our society now, instead of waiting four or even eight years, at which point the problems would be worse and the solutions more extreme. I am choosing to proceed as if this were cowpox, not smallpox: fighting the more benign disease today will be much easier than subjecting ourselves to its more virulent form in the future. It’s going to be hard keeping the intensity up for the next four years, but we need to get to work. Let’s use Trump’s victory as the wake-up call and opportunity that it is.

Securing Communications in a Trump Administration

Susan Landau has an excellent essay on why it’s more important than ever to have backdoor-free encryption on our computer and communications systems.

Protecting the privacy of speech is crucial for preserving our democracy. We live at a time when tracking an individual — a journalist, a member of the political opposition, a citizen engaged in peaceful protest — or listening to their communications is far easier than at any time in human history. Political leaders on both sides now have a responsibility to work for securing communications and devices. This means supporting not only the laws protecting free speech and the accompanying communications, but also the technologies to do so: end-to-end encryption and secured devices; it also means soundly rejecting all proposals for front-door exceptional access. Prior to the election there were strong, sound security arguments for rejecting such proposals. The privacy arguments have now, suddenly, become critically important as well. Threatened authoritarianism means that we need technological protections for our private communications every bit as much as we need the legal ones we presently have.

Unfortunately, the trend is moving in the other direction. The UK just passed the Investigatory Powers Act, giving police and intelligence agencies incredibly broad surveillance powers with very little oversight. And Bits of Freedom just reported that “Croatia, Italy, Latvia, Poland and Hungary all want an EU law to be created to help their law enforcement authorities access encrypted information and share data with investigators in other countries.”

Smartphone Secretly Sends Private Data to China

This is pretty amazing:

International customers and users of disposable or prepaid phones are the people most affected by the software. But the scope is unclear. The Chinese company that wrote the software, Shanghai Adups Technology Company, says its code runs on more than 700 million phones, cars and other smart devices. One American phone manufacturer, BLU Products, said that 120,000 of its phones had been affected and that it had updated the software to eliminate the feature.

Kryptowire, the security firm that discovered the vulnerability, said the Adups software transmitted the full contents of text messages, contact lists, call logs, location information and other data to a Chinese server.

On one hand, the phone secretly sends private user data to China. On the other hand, it only costs $50.

Recovering an iPhone 5c Passcode

Remember the San Bernardino killer’s iPhone, and how the FBI maintained that they couldn’t get the encryption key without Apple providing them with a universal backdoor? Many of us computer-security experts said that they were wrong, and there were several possible techniques they could use. One of them was manually removing the flash chip from the phone, extracting the memory, and then running a brute-force attack without worrying about the phone deleting the key.

The FBI said it was impossible. We all said they were wrong. Now, Sergei Skorobogatov has proved them wrong. Here’s his paper:

Abstract: This paper is a short summary of a real world mirroring attack on the Apple iPhone 5c passcode retry counter under iOS 9. This was achieved by desoldering the NAND Flash chip of a sample phone in order to physically access its connection to the SoC and partially reverse engineering its proprietary bus protocol. The process does not require any expensive and sophisticated equipment. All needed parts are low cost and were obtained from local electronics distributors. By using the described and successful hardware mirroring process it was possible to bypass the limit on passcode retry attempts. This is the first public demonstration of the working prototype and the real hardware mirroring process for iPhone 5c. Although the process can be improved, it is still a successful proof-of-concept project. Knowledge of the possibility of mirroring will definitely help in designing systems with better protection. Also some reliability issues related to the NAND memory allocation in iPhone 5c are revealed. Some future research directions are outlined in this paper and several possible countermeasures are suggested. We show that claims that iPhone 5c NAND mirroring was infeasible were ill-advised.
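As a rough illustration of why mirroring defeats the retry limit, the toy loop below backs up a stand-in for the NAND image and restores it whenever the retry counter reaches the wipe threshold. The counter value, the four-digit passcode space, and the try_passcode stub are all assumptions made for the sketch; the real attack re-flashes a physical chip, not a Python dictionary.

```python
# Toy model of a NAND-mirroring brute force: clone the flash state, burn the
# allowed attempts, restore the clone to reset the retry counter, repeat.

from itertools import product
from typing import Optional

ATTEMPTS_BEFORE_WIPE = 10  # iOS-style retry limit before the data is erased


def try_passcode(guess: str, secret: str) -> bool:
    # Stand-in for the device's real passcode check.
    return guess == secret


def brute_force_with_mirroring(secret: str) -> Optional[str]:
    nand_backup = {"retry_counter": 0}   # image cloned from the desoldered chip
    nand = dict(nand_backup)             # working copy currently "in the phone"

    for digits in product("0123456789", repeat=4):
        guess = "".join(digits)
        if nand["retry_counter"] >= ATTEMPTS_BEFORE_WIPE:
            nand = dict(nand_backup)     # re-flash the mirrored image: counter resets
        nand["retry_counter"] += 1
        if try_passcode(guess, secret):
            return guess
    return None


print(brute_force_with_mirroring("4921"))
```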

Susan Landau explains why this is important:

The moral of the story? It’s not, as the FBI has been requesting, a bill to make it easier to access encrypted communications, as in the proposed revised Burr-Feinstein bill. Such “solutions” would make us less secure, not more so. Instead we need to increase law enforcement’s capabilities to handle encrypted communications and devices. This will also take more funding as well as redirection of efforts. Increased security of our devices and simultaneous increased capabilities of law enforcement are the only sensible approach to a world where securing the bits, whether of health data, financial information, or private emails, has become of paramount importance.

Or: The FBI needs computer-security expertise, not backdoors.

Patrick Ball writes about the dangers of backdoors.

Apple's Cloud Key Vault

Ever since Ivan Krstić, Apple’s Head of Security Engineering and Architecture, presented the company’s key backup technology at Black Hat 2016, people have been pointing to it as evidence that the company can create a secure backdoor for law enforcement.

It’s not. Matthew Green and Steve Bellovin have both explained why not. And the same group of us that wrote the “Keys Under Doormats” paper on why backdoors are a bad idea have also explained why Apple’s technology does not enable it to build secure backdoors for law enforcement.

The problem with Tait’s argument becomes clearer when you actually try to turn Apple’s Cloud Key Vault into an exceptional access mechanism. In that case, Apple would have to replace the HSM with one that accepts an additional message from Apple or the FBI — or an agency from any of the 100+ countries where Apple sells iPhones — saying “OK, decrypt,” as well as the user’s password. In order to do this securely, these messages would have to be cryptographically signed with a second set of keys, which would then have to be used as often as law enforcement access is required. Any exceptional access scheme made from this system would have to have an additional set of keys to ensure authorized use of the law enforcement access credentials.

Managing access by a hundred-plus countries is impractical due to mutual mistrust, so Apple would be stuck with keeping a second signing key (or database of second signing keys) for signing these messages that must be accessed for each and every law enforcement agency. This puts us back at the situation where Apple needs to protect another repeatedly-used, high-value public key infrastructure: an equivalent situation to what has already resulted in the theft of Bitcoin wallets, RealTek’s code signing keys, and Certificate Authority failures, among many other disasters.

Repeated access of private keys drastically increases their probability of theft, loss, or inappropriate use. Apple’s Cloud Key Vault does not have any Apple-owned private key, and therefore does not indicate that a secure solution to this problem actually exists.

It is worth noting that the exceptional access schemes one can create from Apple’s CKV (like the one outlined above) inherently entail the precise issues we warned about in our previous essay on the danger signs for recognizing flawed exceptional access systems. Additionally, the Risks of Key Escrow and Keys Under Doormats papers describe further technical and nontechnical issues with exceptional access schemes that must be addressed. Among the nontechnical hurdles would be the requirement, for example, that Apple run a large legal office to confirm that requests for access from the government of Uzbekistan actually involved a device that was located in that country, and that the request was consistent with both US law and Uzbek law.

My colleagues and I do not argue that the technical community doesn’t know how to store high-value encryption keys — to the contrary, that’s the whole point of an HSM. Rather, we assert that holding on to keys in a safe way such that any other party (i.e. law enforcement or Apple itself) can also access them repeatedly without high potential for catastrophic loss is impossible with today’s technology, and that any scheme running into fundamental sociotechnical challenges such as jurisdiction must be evaluated honestly before any technical implementation is considered.
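To see why any exceptional-access variant of this design needs a second, repeatedly used high-value key, consider the hypothetical sketch below. The HMAC tag stands in for a real signature scheme, and every name and the two-path check are assumptions made for illustration; the point of the quoted essay is that Apple’s actual Cloud Key Vault has only the first path.

```python
# Hypothetical escrow HSM with an added law-enforcement path. The second path
# is the part that does not exist in Apple's real design: it forces the vendor
# to keep, and repeatedly use, an extra high-value key (LE_ACCESS_KEY here).

import hashlib
import hmac
import os

USER_PASSCODE_HASH = hashlib.sha256(b"123456").digest()
LE_ACCESS_KEY = os.urandom(32)  # the additional key an exceptional-access scheme would need


def hsm_release_backup_key(passcode=None, request=None, request_tag=None) -> str:
    # Path 1: the user proves knowledge of the passcode (today's design).
    if passcode is not None and hmac.compare_digest(
            hashlib.sha256(passcode).digest(), USER_PASSCODE_HASH):
        return "backup key released to user"

    # Path 2: the hypothetical exceptional-access path. Every lawful-access
    # request exercises LE_ACCESS_KEY, which is exactly the repeated use that
    # raises the risk of theft, loss, or misuse.
    if request is not None and request_tag is not None and hmac.compare_digest(
            hmac.new(LE_ACCESS_KEY, request, hashlib.sha256).digest(), request_tag):
        return "backup key released for the authorized request"

    return "denied"


tag = hmac.new(LE_ACCESS_KEY, b"OK, decrypt device 42", hashlib.sha256).digest()
print(hsm_release_backup_key(passcode=b"123456"))
print(hsm_release_backup_key(request=b"OK, decrypt device 42", request_tag=tag))
print(hsm_release_backup_key(passcode=b"000000"))
```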

"Surreptitiously Weakening Cryptographic Systems"

New paper: “Surreptitiously Weakening Cryptographic Systems,” by Bruce Schneier, Matthew Fredrikson, Tadayoshi Kohno, and Thomas Ristenpart.

Abstract: Revelations over the past couple of years highlight the importance of understanding malicious and surreptitious weakening of cryptographic systems. We provide an overview of this domain, using a number of historical examples to drive development of a weaknesses taxonomy. This allows comparing different approaches to sabotage. We categorize a broader set of potential avenues for weakening systems using this taxonomy, and discuss what future research is needed to provide sabotage-resilient cryptography.
