[“Meat without Misery” Podcast]: Apple v. FBI and the binary nature of encryption

 
ColinF
Total Posts:  2
Joined  20-02-2016

20 February 2016 12:23
 

For a non-computer-scientist this can be very hard to understand but the reason why the tech community stands with Apple is because they are computer scientists and they know the following to be true:

Either everyone gets encryption, or no one does.

The phrase “secure back door” is an oxymoron.

If Apple builds a key, the key can be used by anyone who obtains or guesses it; the key opens the door even if the FBI isn’t the one holding it. Every person in the world gets a copy of the lock, and they get to go to work on it in private, taking all the time and resources they want to figure out how to get in. And when they do, no alarm will be tripped, no one but them will know, and the key will open every similar lock in the world.
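To make the lock metaphor concrete, here is a deliberately toy Python sketch (the XOR "cipher", the key names, and the escrow design are all hypothetical illustrations, not real cryptography): each device encrypts under its owner’s key, but a back door means the owner’s key is also stored wrapped under one vendor master key. Whoever obtains that single key, FBI or not, opens every device.

```python
# Toy illustration only -- not real cryptography. The names and the
# escrow design are hypothetical, chosen to mirror the lock metaphor.
import hashlib
import os

MASTER_KEY = b"vendor-backdoor-key"  # one copy effectively baked into every device

def keystream(key: bytes, n: int) -> bytes:
    """Derive n pseudo-random bytes from a key (SHA-256 in counter mode)."""
    out, ctr = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + ctr.to_bytes(8, "big")).digest()
        ctr += 1
    return out[:n]

def xor(data: bytes, key: bytes) -> bytes:
    """XOR data with a key-derived stream; applying it twice undoes it."""
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

def lock(plaintext: bytes, user_key: bytes):
    """Encrypt under the user's key, but escrow that key under MASTER_KEY."""
    return xor(plaintext, user_key), xor(user_key, MASTER_KEY)

# Two strangers lock their phones with independent random keys...
ct_a, escrow_a = lock(b"alice's diary", os.urandom(32))
ct_b, escrow_b = lock(b"bob's photos", os.urandom(32))

# ...yet the single master key unwraps both escrowed keys and opens both
# "locks" -- silently, with no alarm tripped:
print(xor(ct_a, xor(escrow_a, MASTER_KEY)))  # b"alice's diary"
print(xor(ct_b, xor(escrow_b, MASTER_KEY)))  # b"bob's photos"
```

Note the asymmetry: the users never chose or saw `MASTER_KEY`, yet their security rests entirely on it never leaking.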

In a connected world, the “room” analogy you provided is misleading because “the room” isn’t in my house (which sounds removed from the rest of the world), the room is everywhere and accessible to everyone. We all use these rooms every day to store our sensitive information and because the room is within reach of everyone all the time, either it has to be made to be impenetrable to everyone but me, or it’s penetrable by anyone.

For your DNA analogy, it would be more apt if taking the concoction made every person who took it immune to all disease, but also made their DNA untraceable. Now is it ethical for the drug company to make it? The benefits far outweigh the risks.

If you want someone to talk to about this on your podcast, I can think of no one better than Bruce Schneier.
https://www.schneier.com/

Best regards,
Colin

[ Edited: 20 February 2016 12:33 by ColinF]
 
Jeff_Lebowski
Total Posts:  8
Joined  20-02-2016

20 February 2016 14:14
 

Thank you, Colin, for posting this explanation. It was so disappointing and painful to listen to Sam’s take on this issue. He demonstrated a profound ignorance of the science of encryption. His position is based on emotional and philosophical arguments rather than the scientific facts. He was dismissive of the scientists who actually know the facts and, instead, based his opinion on the ignorant whining of some law enforcement types. Depressing.

Anyway, it is worth repeating the following scientific fact:

Either everyone gets encryption, or no one does.

That is not hyperbole. That is not paranoia. It’s not marketing. There is no middle ground. It’s a fact like gravity is a fact.

Colin’s link is great. Here’s another one:

https://securosis.com/mobile/do-we-have-a-right-to-security/full

[ Edited: 20 February 2016 14:24 by Jeff_Lebowski]
 
ColinF
Total Posts:  2
Joined  20-02-2016

20 February 2016 14:22
 

Thanks Jeff. I was cringing as he spoke as well, but I can’t blame Sam for not knowing any better.

We all have our strengths and I think he adequately framed his lack of knowledge on the topic, and said he hadn’t thought deeply about it. You can tell in what he does say that he doesn’t understand why there is a trade-off, and that’s understandable.

I hope my explanation or someone else’s helps make it clearer for him. Getting Bruce or other security expert on the show would be a great education for all. The government’s spin machine is going full bore on this issue and they’ll get their way if people aren’t made aware of what’s at stake.

[ Edited: 20 February 2016 14:24 by ColinF]
 
SkepticX
Total Posts:  14327
Joined  24-12-2004

20 February 2016 16:17
 

This is similar to, but more far-reaching than, the American Library Association’s refusal to keep accessible records on what library patrons borrow (and presumably read) as required by the Patriot Act. The “secure back door” is a pretty obvious privacy-ethics failure to me, in the same category as “if you haven’t done anything wrong you have nothing to fear”; but then, as I understand it, plenty of software out there already has back doors.

 

[ Edited: 20 February 2016 16:45 by SkepticX]
 
 
davide
Total Posts:  2
Joined  20-02-2016

20 February 2016 16:17
 

I hope Sam reads this thread, as I think a number of posts here may set him on a more enlightened view of data security. For my part, I would like to appeal to Sam’s intuition on an earlier point he made in this podcast and how it bears on this issue: Ted Cruz holding the keys to all the private information of every citizen and organisation within US jurisdiction (and many outside it). I’m sure Sam’s imagination won’t fail him in realising why such a situation needs to be avoided.

 
harlan
Total Posts:  13
Joined  20-02-2016

20 February 2016 17:15
 

As a fellow computer scientist, I have to agree that Sam’s arguments fail to say anything of worth on the topic of encryption. I also think the standard pro-security position may be premature. The truth is that, now and for the foreseeable future, digital security is apparently binary. As others have already stated, either everyone gets encryption or no one does. While accepting this claim ends the current debate regarding Apple, it opens up another, far more interesting question: if it is the case that either everyone gets encryption or no one does, should anyone have encryption? Is it actually a good thing that information can be hidden completely from others? Does the good of preventing governments from breaking people’s privacy outweigh the bad of making it impossible for the public to effectively monitor the actions of government? Are technologies that inherently act to constrict the flow of information ever a good idea? What other concerns and tradeoffs are imposed by the dichotomy of digital security?

 
Jeff_Lebowski
Total Posts:  8
Joined  20-02-2016

20 February 2016 17:53
 
harlan - 20 February 2016 05:15 PM

If it is the case that either everyone gets encryption or no one does, should anyone have encryption?

Harlan,

If you haven’t read the article I linked to, please give it a read. The consequences of weakened security or no encryption at all are far worse than the negative consequences of encryption for all.

https://securosis.com/blog

 
harlan
Total Posts:  13
Joined  20-02-2016

20 February 2016 20:11
 
Jeff_Lebowski - 20 February 2016 05:53 PM
harlan - 20 February 2016 05:15 PM

  If it is the case that either everyone gets encryption or no one does, should anyone have encryption?

Harlan,

If you haven’t read the article I linked to, please give it a read. The consequences of weakened security or no encryption at all are far worse than the negative consequences of encryption for all.

https://securosis.com/blog

I don’t think that article answers my question; it just restates my question in different terms. The article you linked to points out that digital security is a dichotomy. I accept that claim and am asking the obvious questions that the article raises but never discusses: do we actually have a right to security? What are the tradeoffs? Can we actually justify a right to security in today’s environment? In general? My intuition says the answer is yes, but I don’t have any good arguments.

For the sake of playing devil’s advocate: in general, the spread of information is a good thing. Only by exposing ideas can good ideas be discovered and spread, and bad ones be proven bad and allowed to die. Security, in the digital world, acts in direct opposition to the spread of information. Is this justified?

[ Edited: 20 February 2016 20:14 by harlan]
 
EN
Total Posts:  19017
Joined  11-03-2007

20 February 2016 21:05
 

I’m totally for Apple on this one.  Giving the government access is death.

 
NL.
Total Posts:  5257
Joined  09-11-2012

20 February 2016 21:07
 

I think he said something about the difficulty of finding analogies in this case, and there I agree: with new technologies and new situations, it’s difficult to know which analogies are appropriate. (In the Aaron Swartz case, if I remember correctly, there was argument over whether what he had done was analogous to checking out too many library books and not returning them, or to stealing.) For example, Harris gives some emotionally intense examples of murder victims whose cases have, in some instances, been solved because they were able to record the crime on an unlocked phone; in others it was known that they were using the phone when they died, but the phone remains locked and no one knows what’s on it. But what is this analogous to? To not being able to get a warrant to search a house? To not being able to routinely torture people during interrogation if you know they have information ‘locked’ in their heads? To not being able to have cameras on street corners to record possible crimes? To not being able to have cameras in houses to record possible crimes? That we could take actions that would increase the ability to solve crimes is certainly true in many equations; what’s not clear is where we draw boundaries and make trade-offs as high as murders going unsolved for the sake of privacy or rights protection.


Harris used two analogies that I heard - one about being able to build a room in your house that no one could get into but you, one about taking a pill that could change your DNA. I really didn’t understand the first one as an argument for his position, since so far as I know you are actually well within your rights to build a room on your own property that only you can access. It might be logistically impossible but to the degree that it isn’t, I don’t think the legality of such an act would be in question at all. Maybe I’m wrong on that, though. The ability to scramble one’s DNA speaks more to the intuition that certain kinds of “encryption” could be very harmful, although it doesn’t have an analogy in terms of why one would want to scramble their DNA for entirely honorable reasons (hackers from around the globe cannot, at present, get into your DNA and do things to it or use it in nefarious ways.)


My personal intuition is that the ability to access a phone is like the ability to access a house, car, storage locker or any other piece of tangible physical property - the way our world is set up today, it has always been acceptable for law enforcement to access those things with a warrant. If there is a case to be made that phones and computers are different, however, I think it is that a house is only accessible to people in the immediate environment whereas a phone is accessible across the globe, so that proper security from criminals and hackers simply isn’t possible without across-the-board unbreakable protection. I don’t know enough about the technology to know if that’s true, though.

 
 
GAD
Total Posts:  15567
Joined  15-02-2008

21 February 2016 01:11
 

If Apple can unlock the phone they should. Not a backdoor to every phone, but if they can do it on this one, they should. These were mass murderers; why wouldn’t they? What is the counter-argument: that the government might see who you are having an affair with, what your sexual fantasies are, who your drug dealer is? For this you would make sure mass murderers are protected? If it were someone you loved that they murdered, would you feel the same? It’s really easy to promote fear and paranoia when it’s just anonymous people on TV who got murdered, bad luck for them, but your public Facebook, Twitter and Instagram are far more important.

 
 
Twissel
Total Posts:  2262
Joined  19-01-2015

21 February 2016 02:18
 

I, too, am very disappointed with Harris’ stance on the subject. I can only think that he is not really aware of the scope of the subject.

For one thing, it is incredibly narrow-minded to view the subject from the POV of the US only: oppressive regimes worldwide have made massive investments in technology designed to let them spy on their populations, dissidents in particular. Any weakening of encryption will make it much easier for them to oppress opposition.

But even staying in the US, we have this tiny thing called the 5th Amendment, the right against self-incrimination: whatever I put under encryption on my electronic device I obviously consider private and do not wish others to know. And I should not be forced to reveal that information if it might open me, or people I wish to protect, up to prosecution. Using my data, even after my death, to investigate me and others is tantamount to incriminating myself.
For this reason alone, encrypted data of mine should not be admissible in court against me.

The scenario of a phone being the only source of intel in your own death is pretty far-fetched. But even if true, there would be a simple technological work-around: a two-level encryption scheme, one level for my-eyes-only encryption and another for emergency-unlock encryption, with an emergency key deposited with a spouse, parents and/or your lawyer. This is exactly analogous to what people tell others in confidence (maybe in a death-bed confession) and what secrets they choose to take to their graves.
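A two-level scheme like this can be sketched with toy envelope encryption (illustrative only; the primitives and names here are hypothetical, not production cryptography): a fresh random data key encrypts the content, and that data key is wrapped twice, once under the owner’s key and once under an emergency key the owner deposits with a spouse or lawyer.

```python
# Toy sketch of a two-level ("envelope") scheme -- illustrative only,
# not real cryptography.
import hashlib
import os

def keystream(key: bytes, n: int) -> bytes:
    """Derive n pseudo-random bytes from a key (SHA-256 in counter mode)."""
    out, ctr = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + ctr.to_bytes(8, "big")).digest()
        ctr += 1
    return out[:n]

def xor(data: bytes, key: bytes) -> bytes:
    """XOR with a key-derived stream; self-inverse."""
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

def seal(plaintext: bytes, owner_key: bytes, emergency_key: bytes):
    data_key = os.urandom(32)  # fresh key for this content
    ct = xor(plaintext, data_key)
    # Wrap the data key twice: one copy for the owner, one for escrow.
    return ct, xor(data_key, owner_key), xor(data_key, emergency_key)

owner_key = os.urandom(32)
emergency_key = os.urandom(32)  # deposited with spouse/lawyer

ct, wrap_owner, wrap_emergency = seal(b"for my eyes only", owner_key, emergency_key)

# Either key holder can unwrap the data key and decrypt:
print(xor(ct, xor(wrap_owner, owner_key)))          # b'for my eyes only'
print(xor(ct, xor(wrap_emergency, emergency_key)))  # b'for my eyes only'
```

The crucial difference from a vendor back door is that both keys are chosen and distributed by the owner, not baked into every device on earth.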

These are questions that will have a massive impact on the future of everything: soon we will start to do all identification and payment with phones and the like. Blanket weakening of security to help law-enforcement in a very few cases is risky for everyone. After all, politicians and law-enforcement themselves love iPhones because of their strong crypto.

Concerning the DNA analogy: again, Harris is far behind the state of the art. Even today, criminals sometimes use “DNA bombs”, i.e. they collect traces of DNA from very public places like buses, subways, coffee shops etc. and disperse them over the scene of their crime. This does not remove their DNA, but it hides it under a stack of other DNA, making it basically impossible to prove that you and only you were there.

 
 
ColinF
Total Posts:  2
Joined  20-02-2016

21 February 2016 06:07
 

There seems to still be some misunderstanding on this issue, so I’ll try to clarify. There are two aspects to this issue: technical, and sociopolitical.

Technical

There is no such thing as a secure back door. It’s mathematically impossible. The technical limitations are not about will or politics.

There are sometimes ways around the encryption, and Apple has always been very helpful working with authorities on a device-by-device basis. What they are refusing to do, essentially, is remove security from all their devices. (Adding a back door is the same as removing security altogether; it would only stop those who wouldn’t try to get in anyway.)
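One way to see why a back door amounts to removing security altogether: any deliberate weakness effectively shrinks the key space an attacker must search. A toy Python sketch (the numbers and the key-check function are hypothetical illustrations): a 20-bit key space falls to exhaustive search in about a second on a laptop, whereas a full 128- or 256-bit space is beyond any conceivable computation.

```python
# Toy brute-force demo of a deliberately weakened 20-bit key space.
import hashlib

def derive(key_int: int) -> bytes:
    """Stand-in for whatever an attacker can check given a guessed key."""
    return hashlib.sha256(key_int.to_bytes(4, "big")).digest()

secret = 0x9F3C1        # hypothetical key, somewhere in [0, 2**20)
target = derive(secret)  # what the attacker observes

# Exhaustive search: trivial at 2**20 guesses, hopeless at 2**128.
found = next(k for k in range(2**20) if derive(k) == target)
print(found == secret)  # True
```

There is no way to build the weakness so that only "authorized" searchers can exploit it; whoever runs the loop gets in.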

Sociopolitical

Do we have a right to think and act on new ideas that challenge the establishment?
Do we have a right to free association?
Do we have a right to privacy at all?

If we aren’t putting the digital genie back in the bottle, then we now live in a world where whatever we are thinking is put down in a digital format that is electronically accessible from anywhere on the planet. Our free associations happen in the cloud. If our ideas and associations cannot be protected from those in power, then we are necessarily deciding that nothing should ever change without the approval of those in power.

Does that sound like a healthy structure for a society to adopt or does it sound like a recipe for despotism? It’s quite obviously the latter. Asking that everyone give up secrecy/privacy is the old “If you aren’t doing anything wrong then you don’t have anything to hide” canard, in new clothing.

And a back door doesn’t just hand over our security to a trusted government, we’ll be giving it up to anyone with the will to get through it (i.e., criminals).

Anyone interested in stealing your identity will have a way to do it. Anyone interested in gathering enough information about your family to convince your child they are a family friend and should go with them, can. In general, anyone who can see a way to use the private details of your life to exploit them for their own gain, can.

Yes, people can do bad things with secrecy. Yes, we can dream up an exceptional situation where the only ethical choice is to break our standard protections for individual liberty.

But what’s at stake right now is about our *standard* protections, not exceptional situations. The government wants day-in/day-out access to all information all the time [edit: because mathematically, conditional access can’t be granted—it’s all or nothing]. It’s a reckless and dangerous path to go down.

Any time some politician asks for this, we should be looking at them as being as bumbling and clueless as Keystone Cops or “the internet is a series of tubes” guy. Their aspirations are ignorant and shortsighted.

[ Edited: 21 February 2016 06:16 by ColinF]
 
EN
Total Posts:  19017
Joined  11-03-2007

21 February 2016 06:52
 

What we share in private are our private thoughts, and it is a dangerous precedent to allow the government to invade those, even if we are talking about murderers.  In the future perhaps even the thoughts in our heads will be subject to investigation - I would prefer to have the limits of government investigation established now.  This looks like as good a place as any to draw a line in the sand.  There is an expectation of privacy that should not be violated.

 
Dennis Campbell
Total Posts:  19783
Joined  20-07-2007

21 February 2016 07:26
 
EN - 20 February 2016 09:05 PM

I’m totally for Apple on this one.  Giving the government access is death.

That makes two of us.

[ Edited: 21 February 2016 07:30 by Dennis Campbell]
 
 
Jontwojcik
Total Posts:  1
Joined  21-02-2016

21 February 2016 08:03
 

Something I have yet to see anyone post: even if encryption were made illegal, or back doors were required by law, there is no stopping other countries or individuals from building their own encryption, and those individuals most certainly include criminals and terrorists.
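Worth underlining with a sketch: encryption is just arithmetic. A working (toy, deliberately insecure) stream cipher fits in a dozen lines of standard-library Python, which is why banning strong crypto or mandating back doors cannot stop a determined party from rolling their own.

```python
# A complete toy stream cipher in standard-library Python -- the point
# is that encryption is only math, so it cannot be legislated away.
import hashlib

def encrypt(key: bytes, msg: bytes) -> bytes:
    # Build a keystream with SHA-256 in counter mode, then XOR it in.
    ks, ctr = b"", 0
    while len(ks) < len(msg):
        ks += hashlib.sha256(key + ctr.to_bytes(8, "big")).digest()
        ctr += 1
    return bytes(m ^ k for m, k in zip(msg, ks))

decrypt = encrypt  # XOR keystream cipher: the same operation both ways

ct = encrypt(b"passphrase", b"meet at dawn")
print(decrypt(b"passphrase", ct))  # b'meet at dawn'
```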

 