
How private is your iPhone data, and how to protect your privacy


At the time of writing it’s US presidential primary season, and privacy is one of the few areas of genuine disagreement. (Ted Cruz is against expansion of governmental surveillance, Trump and Rubio are loudly in favour of it, and Bernie Sanders has called NSA activities “Orwellian”. Hillary Clinton’s position, as on many things, remains somewhat unclear.)
Most of all this battle will be fought in the realm of technology, where corporate behemoths Apple and Google represent (at least in the mind of the average tech user) opposite ends of the spectrum. Apple makes lots of noise about protecting its users’ privacy, while Google… well, we’ll talk about that in a moment.

Still, talk is cheap. If you’re wondering how seriously Apple takes privacy – and about the safeguards in place to protect the data stored on your iPhone or other Apple device or service, such as the potentially sensitive medical data collected by CareKit apps – then wonder no longer, because we’ve put together a list of five reasons why we believe that Apple respects customers’ data privacy more than Google.

iPhones are equipped with a number of powerful privacy measures

The iPhone is not easy to break into, and quite aside from Apple’s corporate position on privacy, the smartphone itself has several protective features that help to safeguard your privacy.

Best iPhone privacy measures: Passcodes

First up: we always recommend that readers set a passcode for their iPhones. This simple measure can be surprisingly effective at stopping people from getting at your data, as the FBI recently discovered.

How to improve your iPhone privacy: As simple as an iPhone’s passcode can be – we’d recommend a custom alphanumeric code rather than four digits, but even the latter is a deterrent to casual identity theft – it takes a lot of work to crack one. This is largely because iOS builds in delays after wrong guesses: each passcode attempt is deliberately engineered to take 80 milliseconds to compute, and if you get it wrong six times in a row the iPhone is locked for a minute, with further incorrect guesses resulting in longer delays. These measures prevent hackers from using brute force to machine-guess hundreds of codes in quick succession.
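To put that 80-millisecond floor in perspective, here’s a rough, hedged back-of-the-envelope calculation in Swift. The figures are illustrative only, and the sketch deliberately ignores the escalating lockout delays, which would slow a real attack even further:

```swift
import Foundation

// Worst-case time to try every possible passcode, assuming the
// ~80ms-per-attempt computation described above. Illustrative only:
// it ignores the escalating lockouts, which slow a real attack further.
func worstCaseBruteForce(symbols: Double, length: Int,
                         secondsPerGuess: Double = 0.08) -> TimeInterval {
    return pow(symbols, Double(length)) * secondsPerGuess
}

let fourDigit = worstCaseBruteForce(symbols: 10, length: 4)
print("4-digit passcode: \(fourDigit / 60) minutes")                 // ~13 minutes

let sixAlphanumeric = worstCaseBruteForce(symbols: 36, length: 6)
print("6-char alphanumeric: \(sixAlphanumeric / 31_536_000) years")  // ~5.5 years
```

Run as-is, the sketch reports roughly 13 minutes for a four-digit code and about five and a half years for six lowercase-alphanumeric characters – and that’s before the lockout delays kick in.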

The six-wrong-attempts delay is always activated, but there’s a second more drastic measure you can choose to activate if you are carrying highly sensitive or business-critical data. If you want, iOS will erase your data if someone (including you!) gets the passcode wrong 10 times in a row. Go to Settings > Touch ID & Passcode, enter your passcode and then scroll down to Erase Data. But only do this if you are willing to run the risk of accidentally erasing everything if you get drunk.


Best iPhone privacy measures: Touch ID

The iPhone 5s and later come with Touch ID fingerprint scanners. You can use your fingerprint to unlock the device itself, but third-party developers have for some time been able to build Touch ID into their apps – enabling you to fingerprint-protect password keepers, banking data, health data and so on. As of iOS 9.3, you can use Touch ID – and passwords, for that matter – to protect individual notes in the Notes app.
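To illustrate how third-party developers hook into Touch ID, here’s a minimal sketch using Apple’s LocalAuthentication framework. The function name and reason string are illustrative, not taken from any particular app:

```swift
import Foundation
import LocalAuthentication

// A minimal sketch of gating sensitive app content behind Touch ID.
func unlockSensitiveData(completion: @escaping (Bool) -> Void) {
    let context = LAContext()
    var error: NSError?

    // Make sure the device supports biometrics and has a finger enrolled.
    guard context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                                    error: &error) else {
        completion(false)   // fall back to the app's own passcode/password UI
        return
    }

    // Prompt the user; iOS displays the system fingerprint dialog.
    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Unlock your protected data") { success, _ in
        DispatchQueue.main.async { completion(success) }
    }
}
```

Apps that want the system passcode as a fallback can evaluate the broader .deviceOwnerAuthentication policy instead of the biometrics-only one shown here.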

Fingerprints aren’t necessarily more secure than passcodes and passwords – a reasonably long alphanumeric passcode is extraordinarily time-consuming to crack – but they are far more convenient, which makes it much more likely that we will use them.

But the benefits of Touch ID are not straightforward, and my colleague Glenn Fleishman discusses this in a separate article, The scary side of Touch ID. As he puts it:

“Someone might be able to coerce a password from you with a wrench… But it still requires that threat and your acquiescence. […] Mobile fingerprint sensors change that equation dramatically. An individual who wants some of your information must only get hold of your device, ensure it hasn’t been rebooted, and hold an appropriate digit still for long enough to validate one’s fingerprint.

“As I touch, touch, touch, I think about Hong Kong and mainland China; about Afghanistan and Iraq; about Ferguson, Missouri, and police overreach and misconduct; and extrajudicial American operations abroad and domestic warrantless procedures and hearings about which we know few details. I think about the rate of domestic violence in this country.

“As a nonconsensual method of validating your identity wherever you’re carrying a device, coupled with software that likewise recognises it, Touch ID requires a bit more thought than just registering your fingerprints.”

How to improve your iPhone privacy: Here’s a small related item of interest to anyone who wishes to keep their iPhone as private as possible. It’s been ruled, in the US at least, that police can force a suspect to use Touch ID to unlock a device – following the reasoning that a fingerprint is a piece of physical evidence – whereas a passcode is viewed as knowledge and is protected by the Fifth Amendment… not that there is any logical way for police to extract this information short of waterboarding.

In other words, for the extremely privacy-conscious, securing an iPhone with a passcode alone is actually a better choice than using Touch ID.


Best iPhone privacy measures: Secure Enclave

We’ll be talking again about Apple’s privacy battle with the FBI in more detail in a bit, but it’s worth discussing one technical aspect of that case here. The iPhone belonging to one of the shooters in the San Bernardino case (or rather, belonging to his employer) is a 5c model, and this – the company claims – is crucial to Apple’s ability to open it up. iPhones more recent than this are equipped with security measures that mean even Apple’s own engineers wouldn’t be able to access the data inside.

As well as introducing Touch ID, the iPhone 5s was also the first iPhone to feature a security measure that Apple calls the Secure Enclave. This is an area of the processor chip – a separate processor in its own right, essentially – that stores the fingerprints and other security-critical data. But it is also a crucial part of the encryption setup.

“The Secure Enclave uses a secure boot system to ensure that the code it runs can’t be modified,” explains Mike Ash, “and it uses encrypted memory to ensure that the rest of the system can’t read or tamper with its data. This effectively forms a little computer within the computer that’s difficult to attack.”

(I’m obliged to Mike for virtually all of my understanding of the Secure Enclave’s technicalities, but he acknowledges in turn that his findings partly derive from Apple’s published security guide: the security measures mean that a lot of the Secure Enclave’s details remain unverifiable.)
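To make that ‘computer within the computer’ description more concrete, here is a hedged sketch of how an app can ask for a private key to be generated inside the Secure Enclave, using the SecKeyCreateRandomKey API Apple shipped in iOS 10 (slightly after the iOS 9.3 era this article covers); the application tag is a hypothetical example:

```swift
import Foundation
import Security

// Sketch: generating a private key that lives inside the Secure Enclave.
// The app only ever receives an opaque reference; the key material itself
// never leaves the enclave. Requires hardware with a Secure Enclave –
// kSecAttrTokenIDSecureEnclave fails on older devices like the 5c.
func makeSecureEnclaveKey() -> SecKey? {
    let attributes: [String: Any] = [
        kSecAttrKeyType as String:       kSecAttrKeyTypeECSECPrimeRandom,
        kSecAttrKeySizeInBits as String: 256,
        kSecAttrTokenID as String:       kSecAttrTokenIDSecureEnclave,
        kSecPrivateKeyAttrs as String: [
            kSecAttrIsPermanent as String:    true,
            // Hypothetical tag for looking the key up again later.
            kSecAttrApplicationTag as String: "com.example.enclave-key".data(using: .utf8)!
        ]
    ]
    var error: Unmanaged<CFError>?
    return SecKeyCreateRandomKey(attributes as CFDictionary, &error)
}
```

Because the returned SecKey is only a reference, even a compromised app process can ask the enclave to sign or decrypt data on its behalf but can never read the key bits themselves.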

The generally agreed plan for Apple to break into the shooter’s iPhone 5c involves the company’s engineers creating and installing a custom build of iOS – one without the security measures that prevent brute-forcing of the passcode. Because the 5c predates the Secure Enclave, this approach could plausibly work; on later models, it is surmised, the OS running on the Secure Enclave features defensive measures that would delete the keys to the encrypted data if new firmware were installed.

Apple is publicly committed to user privacy

Following the San Bernardino shootings of December 2015, the FBI obtained a warrant to search an iPhone 5c belonging to one of the shooters, Syed Rizwan Farook (the phone was technically the property of Farook’s employers, which was a factor in obtaining permission to do this). Yet the FBI were unable to get into the device because it was locked with a passcode, and sought – and obtained – a court order instructing Apple to open the phone up.

But Apple refused, and published its reasons in an open letter from its CEO, Tim Cook, on 16 February 2016.


“The implications of the government’s demands are chilling,” the letter reads. “If the government can use the All Writs Act to make it easier to unlock your iPhone, it would have the power to reach into anyone’s device to capture their data. The government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone’s microphone or camera without your knowledge.

“Opposing this order is not something we take lightly.”

Indeed, at its 21 March ‘Let us loop you in’ launch event, Apple took time before mentioning any of its new products to reiterate its determination to stare down the FBI.

“We did not expect to be in this position, at odds with our own government,” said Tim Cook. “But we have a responsibility to help you protect your data and protect your privacy. We owe it to our customers and we owe it to our country. We will not shrink from this responsibility.”


Apple has talked about the importance of data privacy many, many times in the past, but this is the clearest statement yet that the company is prepared to back that principle with concrete action.

I personally feel that Cook has been outmanoeuvred to a certain extent. It’s about the worst case on which to make a stand that you could imagine: the most deadly domestic terrorist attack the US has faced since 9/11, a subject on which the US public will surely, surely take the side of law enforcement. (Sure enough, a Pew Research Center poll found that 51 percent of Americans think Apple should hack the phone, compared to 35 percent who think it should not.)

And it’s the worst time: presidential primary season, when Republicans are queueing up to act tough (Donald Trump has asked who Apple think they are for making this statement, but then again this is the genius who said they should make “their damn computers and things” on home soil) and Democrats won’t dare support an unpopular cause.

But this makes the move even more admirable. I don’t think Apple is doing this because it’s a good strategic move – although caring about your customers is a pretty good business model that’s served Apple well over the years – but because it believes this is the right thing to do.

Lots of tech companies talk about privacy, and indeed in this case many other major tech firms, including Microsoft and even Google, have come out in solidarity with Apple’s stance. But there’s a difference between saying and doing.

I also couldn’t help but notice that there was a fair gap between Apple’s statement and most of the supportive comments, as if the other companies were looking to see who else would commit themselves before jumping in. In fact, NSA whistleblower Edward Snowden tweeted on 17 Feb 2016 at 4:43pm that “silence means @google picked a side, but it’s not the public’s”, and Google boss Sundar Pichai’s admittedly admirable response came more than seven hours later.

Apple is powerful enough to stand up to overreaching governmental prying, and it has a business model that depends on loyal customers who love the company and its products so much that they are willing to pay more than the going rate for their smartphones. It also makes sense for the company, from a PR point of view, to act in a way that highlights the contrast with Google’s data-driven philosophy.

Apple has the means, and it has the motive, to safeguard its users’ privacy.
