SSL and internet security news


Impersonating iOS Password Prompts

This is an interesting security vulnerability: because it is so easy to impersonate iOS password prompts, a malicious app can steal your password just by asking.

Why does this work?

iOS asks the user for their iTunes password for many reasons; the most common are recently installed iOS operating system updates, or iOS apps that get stuck during installation.

As a result, users are trained to just enter their Apple ID password whenever iOS prompts them to do so. However, those popups are shown not only on the lock screen and the home screen, but also inside random apps, e.g. when they want to access iCloud, GameCenter or In-App-Purchases.

This could easily be abused by any app, just by showing a UIAlertController that looks exactly like the system dialog.

Even users who know a lot about technology have a hard time detecting that those alerts are phishing attacks.

The essay proposes some solutions, but I’m not sure they’ll work. We’re all trained to trust our computers and the applications running on them.

Powered by WPeMatico

Apple’s FaceID

This is a good interview with Apple’s SVP of Software Engineering about FaceID.

Honestly, I don’t know what to think. I am confident that Apple is not collecting a photo database, but not optimistic that it can’t be hacked with fake faces. I dislike the fact that the police can point the phone at someone and have it automatically unlock. So this is important:

I also quizzed Federighi about the exact way you “quick disabled” Face ID in tricky scenarios — like being stopped by police, or being asked by a thief to hand over your device.

“On older phones the sequence was to click 5 times [on the power button], but on newer phones like iPhone 8 and iPhone X, if you grip the side buttons on either side and hold them a little while — we’ll take you to the power down [screen]. But that also has the effect of disabling Face ID,” says Federighi. “So, if you were in a case where the thief was asking to hand over your phone — you can just reach into your pocket, squeeze it, and it will disable Face ID. It will do the same thing on iPhone 8 to disable Touch ID.”

The squeeze is either volume button pressed together with the power button. This, in my opinion, is an even better solution than the "5 clicks" because it's less obtrusive. When you do this, it defaults back to your passcode.

More:

It’s worth noting a few additional details here:

  • If you haven’t used Face ID in 48 hours, or if you’ve just rebooted, it will ask for a passcode.

  • After 5 failed Face ID attempts, it will default back to the passcode. (Federighi confirmed that this is what happened in the onstage demo when he was asked for a passcode: the phone had tried to read the faces of the people setting it up on the podium.)

  • Developers do not have access to raw sensor data from the Face ID array. Instead, they’re given a depth map they can use for applications like the Snap face filters shown onstage. This can also be used in ARKit applications.

  • You’ll also get a passcode request if you haven’t unlocked the phone using a passcode or at all in 6.5 days and if Face ID hasn’t unlocked it in 4 hours.

Also be prepared for your phone to immediately lock every time your sleep/wake button is pressed or it goes to sleep on its own. This is just like Touch ID.
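Taken together, the fall-back rules above can be sketched as a small decision function. This is a paraphrase of the reported behavior; the names and structure are mine, not Apple's implementation:

```python
def passcode_required(hours_since_face_id_use, failed_attempts,
                      just_rebooted, days_since_passcode,
                      hours_since_any_unlock):
    """Sketch of the reported Face ID fall-back-to-passcode rules.

    Hypothetical paraphrase, not Apple's actual logic.
    """
    if just_rebooted:
        return True                       # passcode after every reboot
    if hours_since_face_id_use >= 48:
        return True                       # Face ID unused for 48 hours
    if failed_attempts >= 5:
        return True                       # five failed Face ID attempts
    if days_since_passcode >= 6.5 and hours_since_any_unlock >= 4:
        return True                       # the 6.5-day / 4-hour rule
    return False
```

Each branch maps to one of the bullet points above; on a real device these checks run inside the Secure Enclave, not in app-level code.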

Federighi also noted on our call that Apple would be releasing a security white paper on Face ID closer to the release of the iPhone X. So if you’re a researcher or security wonk looking for more, he says it will have “extreme levels of detail” about the security of the system.

Here’s more about fooling it with fake faces:

Facial recognition has long been notoriously easy to defeat. In 2009, for instance, security researchers showed that they could fool face-based login systems for a variety of laptops with nothing more than a printed photo of the laptop’s owner held in front of its camera. In 2015, Popular Science writer Dan Moren beat an Alibaba facial recognition system just by using a video that included himself blinking.

Hacking FaceID, though, won’t be nearly that simple. The new iPhone uses an infrared system Apple calls TrueDepth to project a grid of 30,000 invisible light dots onto the user’s face. An infrared camera then captures the distortion of that grid as the user rotates his or her head to map the face’s 3-D shape — a trick similar to the kind now used to capture actors’ faces to morph them into animated and digitally enhanced characters.
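The structured-light idea in that quote can be illustrated with the standard triangulation relation such systems rely on: a projected dot reflecting off a nearer surface shifts farther across the camera image, and depth falls out of that disparity. This is the generic textbook relation, with made-up parameter values; Apple's actual TrueDepth pipeline is not public.

```python
def depth_from_disparity(baseline_mm, focal_px, disparity_px):
    """Standard structured-light / stereo triangulation:
    depth = baseline * focal_length / disparity.
    Generic relation only, not Apple's actual algorithm."""
    if disparity_px <= 0:
        raise ValueError("dot not displaced; depth cannot be recovered")
    return baseline_mm * focal_px / disparity_px

# A dot on a nearer surface is displaced more (larger disparity):
near = depth_from_disparity(baseline_mm=25.0, focal_px=600.0, disparity_px=50.0)
far = depth_from_disparity(baseline_mm=25.0, focal_px=600.0, disparity_px=10.0)
# near is 300 mm, far is 1500 mm
```

Repeating this for each of the 30,000 dots yields the depth map the quote describes.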

It’ll be harder, but I have no doubt that it will be done.

More speculation.

I am not planning on enabling it just yet.


iOS 11 Allows Users to Disable Touch ID

A new feature in Apple’s new iPhone operating system — iOS 11 — will allow users to quickly disable Touch ID.

A new setting, designed to automate emergency services calls, lets iPhone users tap the power button quickly five times to call 911. By default this doesn’t automatically dial the emergency services; it brings up the option to do so, and it also temporarily disables Touch ID until you enter a passcode.
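The five-press trigger can be sketched as a sliding-window check over button-press timestamps. The window length here is a guess for illustration, not Apple's actual parameter:

```python
def emergency_triggered(press_times, window_s=2.0, presses=5):
    """Return True if `presses` power-button presses fall within any
    `window_s`-second window. Toy model; the real window length and
    debouncing behavior are assumptions."""
    press_times = sorted(press_times)
    for i in range(len(press_times) - presses + 1):
        # Compare the first and last press in each candidate window
        if press_times[i + presses - 1] - press_times[i] <= window_s:
            return True
    return False
```

On a real device, crossing this threshold would bring up the SOS screen and lock out Touch ID until the passcode is entered.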

This is useful in situations where the police cannot compel you to divulge your password, but can compel you to press your finger on the reader.


Fighting Leakers at Apple

Apple is fighting its own battle against leakers, using people and tactics from the NSA.

According to the hour-long presentation, Apple’s Global Security team employs an undisclosed number of investigators around the world to prevent information from reaching competitors, counterfeiters, and the press, as well as hunt down the source when leaks do occur. Some of these investigators have previously worked at U.S. intelligence agencies like the National Security Agency (NSA), law enforcement agencies like the FBI and the U.S. Secret Service, and in the U.S. military.

The information is from an internal briefing, which was leaked.


Hackers Threaten to Erase Apple Customer Data

Turkish hackers are threatening to erase millions of iCloud user accounts unless Apple pays a ransom.

This is a weird story, and I’m skeptical of some of the details. Presumably Apple has decided that it’s smarter to spend the money on secure backups and other security measures than to pay the ransom. But we’ll see how this unfolds.


Apple's Cloud Key Vault

Ever since Ivan Krstić, Apple’s Head of Security Engineering and Architecture, presented the company’s key backup technology at Black Hat 2016, people have been pointing to it as evidence that the company can create a secure backdoor for law enforcement.

It’s not. Matthew Green and Steve Bellovin have both explained why not. And the same group of us that wrote the “Keys Under Doormats” paper on why backdoors are a bad idea have also explained why Apple’s technology does not enable it to build secure backdoors for law enforcement.

The problem with Tait’s argument becomes clearer when you actually try to turn Apple’s Cloud Key Vault into an exceptional access mechanism. In that case, Apple would have to replace the HSM with one that accepts an additional message from Apple or the FBI — or an agency from any of the 100+ countries where Apple sells iPhones — saying “OK, decrypt,” as well as the user’s password. In order to do this securely, these messages would have to be cryptographically signed with a second set of keys, which would then have to be used as often as law enforcement access is required. Any exceptional access scheme made from this system would have to have an additional set of keys to ensure authorized use of the law enforcement access credentials.

Managing access by a hundred-plus countries is impractical due to mutual mistrust, so Apple would be stuck with keeping a second signing key (or database of second signing keys) for signing these messages that must be accessed for each and every law enforcement agency. This puts us back at the situation where Apple needs to protect another repeatedly-used, high-value public key infrastructure: an equivalent situation to what has already resulted in the theft of Bitcoin wallets, RealTek’s code signing keys, and Certificate Authority failures, among many other disasters.

Repeated access of private keys drastically increases their probability of theft, loss, or inappropriate use. Apple’s Cloud Key Vault does not have any Apple-owned private key, and therefore does not indicate that a secure solution to this problem actually exists.
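To make that point concrete, here is a toy model of the modified vault the essay describes. All names are hypothetical, and HMAC stands in for the asymmetric signature scheme a real HSM would verify; the point is only that the second key must stay online and be used for every single request:

```python
import hashlib
import hmac
import secrets

# The extra signing key an exceptional-access design forces Apple to
# keep available and exercise per request -- exactly the repeatedly
# used, high-value key the essay warns about. (Toy stand-in.)
SECOND_SIGNING_KEY = secrets.token_bytes(32)

def sign_request(message):
    """Authorize a law-enforcement request with the second key."""
    return hmac.new(SECOND_SIGNING_KEY, message, hashlib.sha256).digest()

def hsm_release(passcode, stored_passcode, le_request=None, le_tag=None):
    """Modified vault: release the escrowed key for the correct user
    passcode (the normal path), or for a request carrying a valid tag
    under the second key (the added, risky path)."""
    if passcode is not None and hmac.compare_digest(
            passcode.encode(), stored_passcode.encode()):
        return b"escrowed-backup-key"
    if le_request is not None and le_tag is not None and hmac.compare_digest(
            sign_request(le_request), le_tag):
        return b"escrowed-backup-key"
    return None
```

Every call to `sign_request` is another use of the online key, which is the crux of the argument: the original Cloud Key Vault never needs such a key, and the modified one cannot work without it.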

It is worth noting that the exceptional access schemes one can create from Apple’s CKV (like the one outlined above) inherently entail the precise issues we warned about in our previous essay on the danger signs for recognizing flawed exceptional access systems. Additionally, the Risks of Key Escrow and Keys Under Doormats papers describe further technical and nontechnical issues with exceptional access schemes that must be addressed. Among the nontechnical hurdles would be the requirement, for example, that Apple run a large legal office to confirm that requests for access from the government of Uzbekistan actually involved a device that was located in that country, and that the request was consistent with both US law and Uzbek law.

My colleagues and I do not argue that the technical community doesn’t know how to store high-value encryption keys — to the contrary, that’s the whole point of an HSM. Rather, we assert that holding on to keys in a safe way such that any other party (i.e., law enforcement or Apple itself) can also access them repeatedly without high potential for catastrophic loss is impossible with today’s technology, and that any scheme running into fundamental sociotechnical challenges such as jurisdiction must be evaluated honestly before any technical implementation is considered.


iPhone Zero-Day Used by UAE Government

Last week, Apple issued a critical security patch for the iPhone: iOS 9.3.5. The incredible story is that this patch is the result of investigative work by Citizen Lab, which uncovered a zero-day exploit being used by the UAE government against a human rights defender. The UAE spyware was provided by the Israeli cyberweapons arms manufacturer NSO Group.

This is a big deal. iOS vulnerabilities are expensive, and can sell for over $1M. That we can find one used in the wild and patch it, rendering it valueless, is a major win and puts a huge dent in the vulnerabilities market. The more we can do this, the less valuable these zero-days will be to both criminals and governments — and to criminal governments.

Citizen Lab blog post and report. New York Times article. More news articles.


Apple Patents Collecting Biometric Information Based on Unauthorized Device Use

Apple received a patent earlier this year on collecting biometric information of an unauthorized device user. The obvious application is taking a copy of the fingerprint and a photo of someone using a stolen smartphone.

Note that I have no opinion on whether this is a patentable idea or the patent is valid.


Apple Copies Your Files Without Your Knowledge or Consent

EDITED TO ADD (10/28): This is a more nuanced discussion of this issue. At this point, it seems clear that there is a lot less here than described in the blog post below.

The latest version of Apple’s OS automatically syncs your files to iCloud Drive, even files you choose to store locally. Apple encrypts your data, both in transit and in iCloud, with a key it knows. Apple, of course, complies with all government requests: FBI warrants, subpoenas, and National Security Letters — as well as NSA PRISM and whatever-else-they-have demands.

EDITED TO ADD (10/28): See comments. This seems to be way overstated. I will look at this again when I have time, probably tomorrow.
