In regard to Apple’s open letter to customers

On 16 February 2016, Apple CEO Tim Cook published an open letter to customers explaining that the United States government and the FBI had asked the company to take unprecedented measures: to provide access to a specific user’s iPhone data in instances where the phone’s owner had not granted access.

Cook’s letter is an important milestone in the debate over personal data encryption and how we treat the agency of those who use technology. As our devices get smarter, we trust them with more of our data. However, as a user of any kind of technology, does wanting your emails, personal data, health records, text messages and more to be encrypted make you a ‘threat’ to society and the justice system? This question will, I believe, be incredibly important in the years ahead, and it is part of a bigger debate over personal liberty and privacy which, in my opinion, dates back to 9/11, if not earlier.

Apple are taking a stance on behalf of their customers, who choose to store a great deal of important, private data on their devices. If any tech company is in a position to encourage this debate, it’s Apple. The iPhone is considered the market leader of the smartphone industry, and Apple’s high-end desktop computer systems are favoured by many. Apple encrypts iOS smartphone data by default and encourages users to set up passcodes or use their thumbprints to secure that data: no passcode, no entry. On OS X, encryption is opt-in with FileVault, but on iOS, and in iMessage in particular, it’s standard. Your passcode acts as a secret key to the encryption on your device; without it, there is no way to access that data short of breaking the encryption itself.

The FBI are asking that this system be overhauled for their use: that Apple (and, I presume, other tech companies) include back-door access to bypass, or at least brute-force without danger of deleting data, the lock-screen passcode system. The FBI and the United States government have made assurances that such access would only be used in extreme circumstances, yet Apple are not convinced that it would never be misused or fall into the wrong hands, creating a major security risk for every single person who owns an Apple smartphone.
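To illustrate the “passcode as secret key” idea, here is a minimal sketch of passcode-based key derivation using PBKDF2. This is illustrative only, not Apple’s actual design: on iOS the passcode is also entangled with a device-unique hardware key inside dedicated silicon, so the derivation cannot even be attempted off the device.

```python
# Illustrative sketch: deriving an encryption key from a passcode.
# (Assumed scheme -- real iOS entangles the passcode with a hardware key.)
import hashlib
import os

def derive_key(passcode: str, salt: bytes) -> bytes:
    # Many slow iterations make each brute-force guess expensive.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 100_000)

salt = os.urandom(16)
key = derive_key("1234", salt)
assert key == derive_key("1234", salt)  # same passcode -> same key
assert key != derive_key("1235", salt)  # wrong passcode -> a useless key
```

The point of the slow, salted derivation is that the encrypted data is only as recoverable as the passcode is guessable: there is no master key sitting beside the data to hand over.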

I feel that Apple and Tim Cook are right to take this stance, and I hope that other technology companies follow suit. We now know that GCHQ have the ability to ‘tap’ internet connections and pass that data to the NSA and other international intelligence agencies. Such capability means that transmissions can be recorded at the point where data passes from your device to another server. Much has been said about the NSA’s ability to break encryption standards for information transferred over the internet, but right now, to access data stored only on one device (without spending hundreds of years attempting to brute-force entry), intelligence agencies need back-door access. If such access were granted, how could any user of technology assume any level of security over their information? How could anybody trust assurances that access would be kept internal to the FBI when data breaches do happen? In the case of Edward Snowden, we know for a fact that external contractors are able to find, access and leak protected data; how could anybody trust that such access wouldn’t eventually find its way into the wrong hands?
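The “hundreds of years” claim is easy to sanity-check with back-of-the-envelope arithmetic. The figures below are assumptions for illustration (a generously fast attacker, and an assumed per-guess delay for on-device attempts), not published specifications; they show why agencies want the passcode system bypassed rather than attacking the encryption itself.

```python
# Back-of-the-envelope arithmetic (assumed figures, for illustration).

# Attacking the encryption directly: exhausting a 256-bit key space,
# even at an assumed trillion guesses per second, is hopeless.
guesses_per_second = 1e12
aes256_keys = 2 ** 256
years = aes256_keys / guesses_per_second / (60 * 60 * 24 * 365)
print(f"~{years:.1e} years to exhaust a 256-bit key space")

# Attacking the passcode instead: a 6-digit passcode has only a million
# combinations, but each guess must run on the device, where an assumed
# ~80 ms hardware-bound derivation (plus escalating lockouts and the
# auto-erase option) throttles the attack.
passcodes = 10 ** 6
seconds = passcodes * 0.08
print(f"~{seconds / 3600:.1f} hours to try every 6-digit passcode on-device")
```

The asymmetry is the whole argument: the encryption is effectively unbreakable, so the only practical route in is to make the device itself cooperate, which is exactly the back door Apple are refusing to build.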

There is no reason to believe that, denied mainstream encryption, crime would not simply move further underground, with terrorists resorting to more covert, black-hat methods to avoid their communications being noticed. After the November 2015 Islamic State attacks in Paris, it was found that the attackers had communicated using unencrypted SMS texts, yet NSA director Michael Rogers was quoted as suggesting that the attacks were enabled by encryption. Whilst the NSA continue to attack encryption, using terrorist atrocities as scaremongering to convince the public that their right to privacy is effectively null and void (lest we ‘let the terrorists win’), personal data security is increasingly under threat.

Is it wrong to expect a level of privacy in our data? Are we potential terrorists for not wanting our emails to be snooped on, or our messages to be read by analysts at the NSA, GCHQ or your national intelligence agency of choice? We are right to feel violated if somebody reads our most intimate journals or diaries without permission, and we should expect that same level of privacy over our digital data too. We all need to stand up for the privacy of our data and not fall for the lies and appeals to emotion surrounding the encryption debate, lest we face a future where a personal expectation of privacy is no longer a default human right.