From the readings and in your opinion, should technology companies implement backdoors in their products for the benefit of the government? Are companies like Apple ethically responsible for protecting the privacy of their users, or are they ethically responsible for helping to prevent violent or harmful activities that their platforms may enable? How are these two conflicting goals to be balanced in a world of free-flowing communication and extreme terrorism?
- If you are supportive of government backdoors, how do you respond to concerns of privacy and intrusion? Are worries about Big Brother simply paranoia? When and why does national security trump individual privacy?
- If you are against government backdoors, how do you respond to concerns of national security? Isn’t saving lives or protecting our nation worth a little less individual privacy? How do you counter the argument: If you’ve got nothing to hide, you’ve got nothing to fear?
I would like to preface this by saying that the San Bernardino shooting was a terrible tragedy, and in no way do I support terrorism or acts of violence against residents of the United States—citizen or not. Also, this blog post involves a lot of rambling, so I apologize in advance—I had a lot of thoughts on this one.
I understand Apple’s apprehension about creating an operating system that would be able to break through the security encryption on an iPhone. As seen in this very case, a lot of people use iPhones (even terrorists), so a program that could unlock any phone with sensitive information on it would be useful in this situation, and maybe in other federal investigations similar to this one. But who says that’s all it would be used for?
We are not asking to expand the government’s surveillance authority, but rather we are asking to ensure that we can continue to obtain electronic information and evidence pursuant to the legal authority that Congress has provided to us to keep America safe. (Going Dark)
I know that the FBI is meant to protect me, but the quote above reads to me as them simply saying they want more access to information they shouldn’t necessarily have access to.
Also, if FBI employees expect their own personal phones and devices to stay encrypted, then why are they asking Apple to write a program that could ultimately put themselves at greater risk? It seems to me that the FBI is being a tad short-sighted, while Apple is looking at the bigger picture of what creating this backdoor could mean for iPhone users across the world. There is a reason such a backdoor does not already exist, even though there is the capability for it to exist. Apple saying that they don’t have a program to break into a locked phone is something I find comfort in—as an iPhone user, I feel reassured knowing that my information is important to them, and that they will protect it the same way they’d want their own information protected. After watching a few episodes of Mr. Robot, I now see how easy it is for a hacker to get my information, hack my computer, and, in the case of the show, blackmail me into terrorist activities—and isn’t that what this whole case is fighting against?
Eileen Decker brought up three points in looking at the FBI’s case against Apple: the company’s distance, or “remove,” from the case; whether the government’s request places an “undue burden” on Apple; and whether the company’s assistance is “necessary.” In this case, Apple does not seem to meet any of the criteria above. The iPhone in question, though designed by Apple, is not their specific problem anymore—they have no claim on the device. It would be an additional burden on Apple to take the time to write this decryption code, which, if it fell into the wrong hands, could be very dangerous. The FBI is asking Apple employees to take on forensic roles in this case—doesn’t the FBI have people for that already?
Ken Dreifach, an Apple lawyer, explained their side very well, ultimately arguing that the FBI is overstepping its bounds by asking Apple to do this. The government is imposing power it does not rightfully have over the company in order to learn something that may or may not help the investigation—for all we know, there could be nude pictures saved on the phone, or maybe nothing at all.
In response to the argument “if you’ve got nothing to hide, you’ve got nothing to fear,” I will say there are some things that I would rather people didn’t know about me. I find it is always better to keep an air of mystery about me. I don’t have anything to hide from people, really—other than my bank account numbers, Social Security number, and things of that nature—but there are some things I don’t feel fully comfortable with the entire world having the possibility of knowing. An experience earlier this year cemented my position on this: if I want someone to know something about me, then I would like to be the one sharing it, rather than someone finding out without my knowing. That, I think, is my biggest issue—not being the one in control of how my information is spread. And if Big Brother has his hand in the cookie jar, then who’s to say he won’t share such information without my knowledge or permission? Or what if someone is able to take that information from Big Brother (maybe using Apple’s supposed new decryption algorithm)? Then where does that leave me?