
Apple "thinks differently" about encryption and privacy


Backup encryption comes to iCloud

In what it calls “Advanced Data Protection”, Apple has announced that it’s finally implementing end-to-end backup encryption for most iCloud services.

Always big on data privacy for users, Apple has been considering full encryption of user data for many years but has been loath to implement it, due in part to concerns from law enforcement agencies, such as the FBI, about losing access to user data in cases of national security.

On the other hand, Apple had been pressured for a number of years by groups such as the Electronic Frontier Foundation, who argued that end-to-end data protection was a must to safeguard user privacy.

However, end-to-end user encryption is finally here for most - but not all - iCloud services. The only services that won’t be encrypted on backup are iCloud Mail, Contacts and Calendar, due to the need to keep them integrated with other vendors’ systems.

Advanced protection

With Standard Protection, Apple holds the encryption keys for your data in Hardware Security Modules in its data centres. This means that if you’re somehow locked out, Apple can provide the keys and let you back in. Of course, holding the encryption keys means that, under Standard Protection, Apple could ultimately gain access to your files.

However, with Advanced Protection, the keys are held only on your trusted devices, meaning that only you have access to the data.

You also have the responsibility of keeping hold of your keys, so if you forget your iPhone or iPad passcode and your Mac password, well, bang goes your data: it’s unrecoverable.

With Advanced Protection for your iCloud data, Apple - and everyone else - has absolutely zero way of accessing your stored files, pictures and messages.
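
To make the distinction concrete, here’s a minimal sketch of the end-to-end model in Swift, using CryptoKit. The function names and flow are illustrative assumptions, not Apple’s actual iCloud implementation; the point is simply that under the Advanced model the key is generated and kept on the device, and only ciphertext ever leaves it - whereas under Standard Protection the server side would hold a copy of the key too.

```swift
import CryptoKit
import Foundation

// The key lives only on the trusted device. (In practice it would be
// protected by the Secure Enclave / keychain, not held in a variable.)
let deviceOnlyKey = SymmetricKey(size: .bits256)

// AES-GCM gives confidentiality plus integrity; `combined` bundles the
// nonce, ciphertext and authentication tag into one blob for upload.
func encryptForUpload(_ plaintext: Data, using key: SymmetricKey) throws -> Data {
    try AES.GCM.seal(plaintext, using: key).combined!
}

func decryptDownloaded(_ blob: Data, using key: SymmetricKey) throws -> Data {
    try AES.GCM.open(AES.GCM.SealedBox(combined: blob), using: key)
}

do {
    let note = Data("Meeting notes".utf8)
    let blob = try encryptForUpload(note, using: deviceOnlyKey)
    // Only `blob` ever reaches the server: without deviceOnlyKey, neither
    // Apple nor an attacker who breaches the server can read it. Lose the
    // key, though, and the data is gone for good.
    let restored = try decryptDownloaded(blob, using: deviceOnlyKey)
    assert(restored == note)
} catch {
    print("Crypto error: \(error)")
}
```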

Naturally, this sparks off a rumour mill on sites like Hacker News. User data, of course, is a great revenue stream for a lot of tech firms. So why is Apple relinquishing access to it?

Apple repositions itself

In an era when all user content is invisibly monetised, Apple seemingly isn’t interested in scanning your photos and data to then attempt to make money from them. Google and Meta will - behind the scenes - scan your locations, photos and purchases and then apply predictive AI to try to make money out of you.

Of course, Apple differs from them by primarily selling hardware to a “bought-in” consumer base. Apple is effectively growing trust among its user base by saying: no, we don’t want to monetise your data.

Nor does it want to spend huge amounts of time and money scanning user data.

Is privacy a good thing?

Complete privacy of all user data is a contentious issue. Apart from the need for government agencies to be able to access user data where terrorist activity is suspected, there’s also the thorny issue of users hiding sexual abuse images.

Apple had considered implementing client-side scanning - pattern matching images uploaded to iCloud to detect known abusive images - but paused the implementation after criticism that the technology returned false positives, and that scanning for one type of illegal content would inevitably lead to the scope being expanded.
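
To see why false positives are a worry, consider how perceptual-hash matching works in principle: images are reduced to short fingerprints, and a match is declared when two fingerprints are “close enough”. The sketch below is a toy stand-in - it is not Apple’s NeuralHash, and the 64-bit hashes and threshold are invented for illustration - but it shows how an unrelated image can land inside the match radius.

```swift
// Counts the number of differing bits between two fingerprints.
func hammingDistance(_ a: UInt64, _ b: UInt64) -> Int {
    (a ^ b).nonzeroBitCount
}

// A match is declared whenever a fingerprint lands within `threshold`
// bits of any entry in the known-image database.
func matchesKnownImage(_ fingerprint: UInt64,
                       against database: [UInt64],
                       threshold: Int = 4) -> Bool {
    database.contains { hammingDistance($0, fingerprint) <= threshold }
}

let knownHashes: [UInt64] = [0b1011_0110_0101, 0b1110_0001_1010]
let benignPhoto: UInt64   = 0b1011_0110_0111  // differs from the first entry by one bit
print(matchesKnownImage(benignPhoto, against: knownHashes))  // true - a false positive
```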

The company’s policy is now to work with community advocates and offer “Communication Safety” features, along with guidance via Siri, Safari and Spotlight search, to provide on-the-spot resources for reporting CSAM and getting help.

Apple’s policy, however, puts it at odds with the UK Government’s proposed Online Safety Bill, which would require providers to be able to access user content in order to protect children.


