Sunday, February 21, 2016

Would Apple's "Backdoor" Create More Issues than it Solves?

taken from http://imgur.com/mhYai1M


The decision of whether Apple should be forced to create a system to unlock the phone of the San Bernardino shooter has tremendous implications for the future security of our information, and the precedent it sets will be the first in a series of decisions that either continue to limit or dangerously expand the exposure of our personal data. Apple’s argument for denying the request, backed by what seems to be a general consensus among tech companies and security experts, is that creating this “backdoor” would essentially mean creating a master key that could be used to hack into any iPhone in the future, and that the technology would eventually leak to hackers and criminals, sacrificing the security of encrypted personal information everywhere. 

It’s important for the FBI to be able to access devices when doing so could solve a crime, especially in a tragic terrorist case of this scale. That seems clear cut, yet in this case both sides’ arguments are grounded. On one hand, tech companies like Apple are usually cooperative in assisting law enforcement, which matters because it expands the resources and reach of an investigation. On the other hand, data security experts suggest that the more often companies design ways to circumvent their own systems, the more likely it is that criminal and government hackers will figure out how to do the same. So although intentions are good, one could argue that granting the request would create a vicious cycle in which data-based companies are forced to build the tools of their own demise, resulting in more hacking crimes and dissatisfied customers. Security professional Rich Mogull explains the increased exposure and the civil rights implications that setting this precedent would entail:

…No legal case applies in a vacuum. If this goes through, if Apple is forced to assist, it will open a floodgate of law enforcement requests. Then what about civil cases? Opening a phone to support a messy divorce and child custody battle? Or what about requests from other nations, especially places like China and the UAE that already forced BlackBerry and others to compromise the security of their customers?
And once the scale of these requests increases, as a security professional I guarantee the tools will leak, the techniques will be exploited by criminals, and our collective security will decline. It really doesn’t matter if it’s the iPhone 5c or 6s. It really doesn’t matter if this is about dead terrorists or a drug dealer. It doesn’t matter what specific circumvention Apple is being asked to create.

Donald Trump also weighed in after Apple’s refusal, siding with the government and claiming the technology would only be used once, in this particular case, to “find out what happened and why it happened.” According to legal and security experts, this is simply not the case. Because no precedent currently exists for law enforcement to compel companies to create tools that defeat their own security, and because of the likelihood that the resulting techniques would spread into the wrong hands, the outcome of this situation goes far beyond this one iPhone.


The population is about evenly split on whether it supports the FBI’s request, according to a USA TODAY poll conducted by SurveyMonkey. But in light of more recent developments, the FBI’s motives and its handling of the investigation thus far have become a little foggy. It turns out Apple engineers have already been deeply involved in helping law enforcement access the phone, as they usually are, and only now are they being publicly called out and portrayed as if their refusal were just a marketing scheme. 

What didn’t come to light with the original request and the public letter to Apple customers was that a San Bernardino official reset the password on the iCloud account associated with the iPhone in an attempt to access information shortly after the attack, and according to Apple, doing this made the information on the phone impossible to obtain without creating the "backdoor" that sacrifices the security of all devices. It was later revealed that the FBI itself gave San Bernardino County the order to reset the password, and Apple suggests that if this hadn't happened, it would have had better luck accessing the information using other methods that don’t break down its entire operating system. So it seems the FBI either willingly set this situation up, using a large, emotional case to catalyze a favorable legal precedent that might make its job easier in the future, or made an early mistake handling the iPhone when it was first retrieved and wanted Apple to tend to its screw-up, or maybe some combination of both. 

So Apple at least deserves some credit for protecting the public’s information instead of letting itself be used as a scapegoat to cover for the FBI. This is not to undermine the importance of law enforcement, of this tragedy, or of uncovering all the details, but if it will really sacrifice data security for years to come (as experts say it will) in an increasingly data-driven country and world, forcing Apple to create this key to all of its encrypted information will likely create more security problems than it solves, leaving law enforcement with more technologically capable criminals to deal with. Apple made a socially conscious decision in denying the request.

2 comments:

  1. Apple creating a backdoor would in turn create a slew of problems for the company and its customers. I do believe Apple also has a financial motivation to prevent such a leak from happening, because it risks losing customers to other companies like Microsoft. If a company can come up with a way to protect its customers' information better than Apple can, it could take significant market share away from Apple. Beyond that, Apple's creation of a backdoor could create numerous unforeseen social and financial risks for its customers.

    1. Very true, Apple definitely has some financial motivation to protect its operating system. Other tech companies like Microsoft, however, have actually risen up in support of Apple during this time. I think the most pressing issue isn't this particular backdoor, but the fact that doing it for the government once sets a precedent that would force all companies, even Microsoft, to comply with such demands. It's very interesting, as it's one of the first times we see this new wave of cloud technology being weighed against privacy law. A lot of experts in the field think this would be horrible for our data security going forward, and it's already a pretty rampant issue, so I hope law enforcement and the courts keep all of this in mind.
