Re: Guest Post - Apple

1

To me it seems like a considerable overreach of government power, in that government is claiming the right to order any amount of work on the part of people who did not commit a crime. The employees at Apple did nothing wrong -- how much labor can the government compel out of them?


Posted by: Walt Someguy | Link to this comment | 02-29-16 7:46 AM
horizontal rule
2

I don't know why you can't trust Apple in this. Their business and technical interests are perfectly aligned with what they're saying.

Anyhow, fuck the FBI on this. They specifically and carefully constructed this case so as to have maximally unsympathetic defendants and a maximally difficult problem so that they could create a legal precedent that companies can be forced to build backdoors for not-otherwise-backdoored products. They are doing this because they have tried for years (decades, even) to legislate bad computer security and have failed.

Apple can and will build a new system that doesn't allow firmware updates without a passcode, but that's also bad, because automatic -- or at least frictionless -- installation of security updates is the only way to even wave at real computer security. What a finding in favor of the FBI will mean is that software which accepts automatic updates is presumptively backdoored, and only systems which are not updated are presumptively secure, except that systems that are not updated are not secure, because discovered security issues have not been patched.

(I probably am not going to stay in this thread because Tigre (at least) will presumably show up and be a moron and I don't need that in my day, so I tried to get all my thoughts out at once.)


Posted by: Beefo Meaty | Link to this comment | 02-29-16 7:46 AM
horizontal rule
3

note that:
1. one of the courts that might use the precedent is a secret rubber stamp that doesn't reliably disclose its orders (FISC)
2. if Apple can be forced to sign a malware update here, most tech experts think it could be forced to do so remotely


Posted by: Katherine | Link to this comment | 02-29-16 7:52 AM
horizontal rule
4

Hmm, the author is not Ttam, for the record.


Posted by: heebie-geebie | Link to this comment | 02-29-16 7:56 AM
horizontal rule
5

Slippery slope!


Posted by: E. Messily | Link to this comment | 02-29-16 8:03 AM
horizontal rule
6

In 3 weeks Apple will be announcing the iPhone SE, which will basically be an iPhone 6 (maybe 6S) in an iPhone 5 body, or at least an iPhone 5-size body.


Posted by: JRoth | Link to this comment | 02-29-16 8:04 AM
horizontal rule
7

The other day I heard a (former?) gov't official say, "If Apple doesn't want China to demand the same thing the FBI is demanding, then they can just not do business in China." This was intended as a serious statement--that the FBI opening the floodgates on this issue isn't a problem, and Apple should just suck it up. Christ, what an asshole.


Posted by: JRoth | Link to this comment | 02-29-16 8:08 AM
horizontal rule
8

7: the questions that come to mind are:
Has the Chinese government already asked Apple to do this?
If "yes", what did Apple say?
How sure are we that Apple would not, on orders from the PSB, create such a backdoor and then refuse to admit its existence?

If "no", why not? Has it just never occurred to them? Will whether Apple agrees to this or not in the US actually make any difference to the PSB?
(I suspect that the PSB simply uses rubber-hose cryptanalysis, anyway.)


Posted by: ajay | Link to this comment | 02-29-16 8:14 AM
horizontal rule
9

Yeah, if I were in your situation, I would hold out for the 5se. It's worth noting that it's not actually in a 5 case. It's a 6-like case (rounded edges! metal back!) but the size of a 5. So it's the best of both worlds really.


Posted by: Unfoggetarian: "Pause endlessly, then go in." (9) | Link to this comment | 02-29-16 8:15 AM
horizontal rule
10

I'm stuck on 1 -- surely Apple can keep the modified OS offline, in their possession: they get the phone from the FBI, modify the OS, find the passcode, restore the original OS, return the phone to the FBI with a piece of paper that says "1-2-3-4-5," then send them a bill for $2,000 per person-hour required for the unlocking. And they only do so in the future when required by a valid court order. Where's the broader security risk?


Posted by: SP | Link to this comment | 02-29-16 8:24 AM
horizontal rule
11

Legal slippery slope, related to the above: if a Californian court can compel Apple to co-operate in this case, other courts in other states will attempt the same thing and Apple will be forced to comply. Either Apple protects all phones or it protects none. One case will be followed by others. But I can't take this seriously as a slippery slope, because the courts are there, surely, to decide between proper and improper applications. Of course law enforcement will try it on. But it is one of the functions of the judiciary to stop them doing so, and if you can't trust the courts to do that, you have a problem that is social, legal and political, not technical.

I'm not sure I understand this. The courts (OK, a court) have already decided, and said that it's proper. So the slippery slope has already been slid down, at least for that court's jurisdiction.


Posted by: Ginger Yellow | Link to this comment | 02-29-16 8:25 AM
horizontal rule
12

I'm firmly in Apple's camp here, but my stance on privacy is so far out of the mainstream that I have no idea how this is perceived generally. I suspect that most people are sort-of sympathetic to the FBI's stance, since people in the US seem eager to jettison civil liberties if you wave a Muslim at them.


Posted by: togolosh | Link to this comment | 02-29-16 8:26 AM
horizontal rule
13

2 - go fuck yourself. I don't have strong views on this case, but how surprising that you personally would show up to be a shill for them. What a fucking asshole you are.


Posted by: R Tigre | Link to this comment | 02-29-16 8:37 AM
horizontal rule
14

My sense on the legal slippery slope (without any authoritative specific knowledge) is that in general it's a real concern, just because of the nature of precedent as kind of a one way ratchet on things like this.

You start with a precedent saying that under really important, national-security-related circumstances, law enforcement can compel Apple to crack a phone, and that the courts will decide when it's important enough. And then the FBI makes applications in a thousand less important cases across the country. 995 judges tell them to go pound sand, but five judges (or two judges a couple of times each), say yes. Now the FBI has precedent in less important circumstances to show the next set of judges, and it probably persuades a few more judges to loosen the standard, and so on, and so on.

It's basically the same principle as "Mom already said it was all right", and it gives rise to a tendency for what are supposed to be limited exceptions to general rules to expand.


Posted by: LizardBreath | Link to this comment | 02-29-16 8:43 AM
horizontal rule
15

10: That's what I've been wondering. I think the answer is that (1) isn't really the concern. They really care about (2) and (3) and are just throwing (1) in because it is convincing to members of the public who aren't going to be thinking through the issue completely.


Posted by: rob helpy-chalk | Link to this comment | 02-29-16 8:46 AM
horizontal rule
16

Apple is being asked to do two things: (1) to disable the counter that limits incorrect password attempts to 10, allowing unlimited attempts, and (2) to disable the timeout between password attempts. Both of these modifications are to the firmware of the phone. I have also read about, but don't know the details of, a third "fix" which would disable a check that the password is actually being entered on the keyboard, as opposed to generated in software. All these changes are to make cracking the password by brute force feasible.

They are not, so far as I know, being asked to actually crack the password itself. The FBI will do that, presumably relying on the fact that most passwords are easily crackable ("password," "1234," etc.). If the password is a hard one it might literally take years to discover it, but it probably isn't.

I presume also that if the FBI is doing the cracking, they will have possession of the phone, which means that it is likely that they can copy the "fbiOS" and use it as much as they want.

Apple is planning to make what they are being asked to do impossible in future iPhones. However, that has potential negative effects, as Sifu wrote in 2.
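
A rough back-of-the-envelope sketch of why those firmware changes are the whole ballgame. The ~80 ms per attempt figure is just the commonly cited cost of the phone's hardware key-derivation step, so treat it (and everything below) as an illustrative assumption, not Apple's actual numbers:

    # Brute-force time once the retry limit and escalating delays are gone.
    # ASSUMPTION: ~80 ms per attempt (hardware key derivation); illustrative only.
    SECONDS_PER_ATTEMPT = 0.08

    def worst_case(keyspace: int) -> str:
        seconds = keyspace * SECONDS_PER_ATTEMPT
        if seconds < 3600:
            return f"{seconds / 60:.1f} minutes"
        if seconds < 2 * 86400:
            return f"{seconds / 3600:.1f} hours"
        return f"{seconds / 86400:.0f} days"

    for name, keyspace in [
        ("4-digit PIN", 10**4),
        ("6-digit PIN", 10**6),
        ("6-char lowercase alphanumeric", 36**6),
    ]:
        print(f"{name}: {worst_case(keyspace)} worst case")

    # Roughly: 13 minutes for a 4-digit PIN, under a day for a 6-digit PIN, and
    # years for a real alphanumeric passcode -- the "might literally take years"
    # point above.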


Posted by: DaveLMA | Link to this comment | 02-29-16 8:49 AM
horizontal rule
17

I don't see why Apple can't do the cracking too. A brute force attack, once it's allowed, it's the most complicated thing in the world, hence the name. That way they don't have to hand the modified firmware over and risk distribution/unauthorized use. Maybe if they do need special attack methods they can tell the FBI to hand over their software and watch the FBI come up with excuses about why that's a national security risk, Apple might release it to the wild/misuse it in the future, etc.


Posted by: SP | Link to this comment | 02-29-16 8:54 AM
horizontal rule
18

it's s/b isn't


Posted by: SP | Link to this comment | 02-29-16 8:54 AM
horizontal rule
19

On the merits, it seems like it almost entirely hinges on a technical question where Apple and the government disagree. If the request is merely to take this particular phone (with the consent of its owner, which was the county of San Bernardino) and do some work to unlock it, that's 100% consistent with practice for lots of other technologies under the all writs act and leads to no slippery slope away from where we are now at all, that I can see. (There might be bad international consequences for Apple from this in other countries that force Apple to do similar work for worse reasons, but note that the same is true right now for other US companies with other technologies). On the other hand Apple says doing this will lead to a readily-available and exploitable hole that makes their phones less securable for everyone and makes them vulnerable to breach absent the kind of specific court order at issue here. If that's true, that's a risk for everyone and one that a court should take into consideration. Since I have no idea what the facts really are and both sides have an incentive to shade them, it's hard to come to a strong conclusion. I do strongly believe that tech companies shouldn't be above the law, but on the other hand the courts shouldn't force widespread ancillary privacy problems nationwide just to get additional information in a single case.


Posted by: RT | Link to this comment | 02-29-16 8:55 AM
horizontal rule
20

I honestly don't understand how people can be skeptical about the slippery slope argument here, as we've seen it in action for 15 years now. It's not just that most "terrorism" cases are less compelling than this (but this precedent would be used for them): it's that we've seen law enforcement agencies use the Patriot Act against drug dealers, because why not?

As for the "just write it and unwritten it" argument, what are you going to do, murder the engineers once the project is done? We know it's technically feasible, but that doesn't mean it's a solved problem. Once the problem is solved, it's Pandora's box. In fact, come to think of it, there's a pretty clear path (aside from direct leak) for the information to escape: the next iOS contains code specifically to prevent this particular path of hacking, which code acts as bread crumbs for outsiders.


Posted by: JRoth | Link to this comment | 02-29-16 8:58 AM
horizontal rule
21

Anyway, in terms of the government ask, it's not insubstantial, and it's sort of weird* that it tends to get hand-waved. The government can't traditionally compel private companies to do extensive work on their behalf (there's phrasing along the lines of "reasonable assistance"). They can make a locksmith cut a duplicate key; they can't make him design a new lock.

*I suspect it comes down to blind faith in tech--obviously if I can think of it, Apple (or whoever) can do it fairly easily


Posted by: JRoth | Link to this comment | 02-29-16 9:05 AM
horizontal rule
22

seems to me that demonstrating that such a workaround can be done is the problem. it's not that Apple's particular hack will get out, it's that people will learn that such a hack is possible - once the world knows it's possible to get around the security measures, then other people will get to work on doing their own versions of it. and Apple probably already knows that such a hack could be accomplished in the wild. they just don't want to acknowledge that it's possible at all.


Posted by: cleek | Link to this comment | 02-29-16 9:07 AM
horizontal rule
23

13 seems unnecessary. 2 seems more or less correct.

I agree with 19 that, "I have no idea what the facts really are and both sides have an incentive to shade them" but I am strongly on Apple's side on this one.

Mostly because (1) I think it would be bad precedent. (2) Based on very little information* I have the sense that the FBI is almost comically bad at understanding and dealing with computer crime and issues of either technology or privacy. (3) Post Snowden, it's clearly become good PR for tech companies to push back against requests by the government to enable backdoors, and I think that's a very good thing. It's not a substitute for a real public conversation about the trade-offs between privacy and access, but I think that sort of public push-back makes a serious conversation more likely.

* Mostly the book about cybercrime Fatal System Error but also something like this recent article about the Zoe Quinn case in which the FBI seems completely uninterested -- perhaps that's outside of their mandate, but it doesn't fill me with confidence.


Posted by: NickS | Link to this comment | 02-29-16 9:11 AM
horizontal rule
24

You start with a precedent saying that under really important, national-security-related circumstances, law enforcement can compel Apple to crack a phone, and that the courts will decide when it's important enough. And then the FBI makes applications in a thousand less important cases across the country. 995 judges tell them to go pound sand, but five judges (or two judges a couple of times each), say yes. Now the FBI has precedent in less important circumstances to show the next set of judges, and it probably persuades a few more judges to loosen the standard, and so on, and so on.


My understanding, though, is that in this case they didn't even have to use the "really important national security issue" argument (in court). The order is just based on needing to execute a search warrant.

same is true right now for other US companies with other technologies). On the other hand Apple says doing this will lead to a readily-available and exploitable hole that makes their phones less securable for everyone and makes them vulnerable to breach absent the kind of specific court order at issue here.

My understanding is that what the FBI is demanding they do is create a custom firmware that would only be installable on this one phone, but only because it's keyed to the phone's unique ID. It would be trivial to make it work on other phones (of pre-Touch ID models) just by changing the key.


Posted by: Ginger Yellow | Link to this comment | 02-29-16 9:11 AM
horizontal rule
25

China hasn't asked Apple to do this, because a) these things exist for older phones and b) they already have the capacity to sniff traffic that's being backed up to the cloud anywhere in the country. But yeah, once the tool exists, I don't see any reason why they wouldn't come to Apple over and over with legal-in-China requests, even if they don't get a copy of the magic firmware themselves.

The endgame for Apple, I guess, is to make a rock so heavy they cannot lift it -- that is, to remove the ability to update the security firmware if the passcode hasn't been entered -- from the next generation of iPhone onward.


Posted by: | Link to this comment | 02-29-16 9:13 AM
horizontal rule
26

20 was me.

21.last -- Apple said in their pleading that it would take a team of about 10 engineers about four weeks, so it's real effort but not unimaginably so. (I saw Paul Kedrosky scoffing at this as a de minimis inconvenience to Apple, but I suspect that's their honest guess about the level of effort required, and it's still saying "almost an engineer-year".)


Posted by: snarkout | Link to this comment | 02-29-16 9:16 AM
horizontal rule
27

20 -- the only problem with the slippery slope argument is that it's very standard, right now, for the government to order big companies to do technical work on their products to aid in a search warrant. The classic example is a "pen register" on phone calls in the 70s, but there are all kinds of other examples, including things done by chemical companies. So if that's the slope, we've already slid down it for quite some time now. So long as the search is (a) not remote from the criminal case and (b) doesn't lead to the infringement of rights of people in no way subject to a criminal case or warrant, the government can do this kind of thing right now and has since anyone here has been born. The core question for the Apple case is whether the government's order will lead to substantial privacy breaches (whether by hackers or someone else) in situations *other than* the relatively controlled situation of a validly issued federal warrant. And that seems like an issue on which people are saying diametrically different things.


Posted by: RT | Link to this comment | 02-29-16 9:17 AM
horizontal rule
28

22: No, it couldn't be accomplished in the wild. I mean, maybe the NSA (or China's equivalent) could do it, but only Apple can sign a firmware update that the phone will install. Actually writing the firmware probably is something that a dedicated team of people could do in a ton of places/situations, but only Apple can issue it to the targeted phone as a valid update.
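
Since "only Apple can sign it" is carrying all the weight here, a minimal sketch of what that acceptance check amounts to. This is not Apple's actual image format or key hierarchy -- just the generic shape of signed-update verification, using RSA from the Python cryptography package for illustration:

    # A device ships with the vendor's public key baked in and only accepts
    # images whose signature verifies against it. Illustrative sketch only.
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives import hashes, serialization
    from cryptography.hazmat.primitives.asymmetric import padding

    def firmware_is_acceptable(image: bytes, signature: bytes, vendor_pubkey_pem: bytes) -> bool:
        """True only if `signature` over `image` verifies against the baked-in vendor key."""
        pubkey = serialization.load_pem_public_key(vendor_pubkey_pem)
        try:
            pubkey.verify(signature, image, padding.PKCS1v15(), hashes.SHA256())
            return True
        except InvalidSignature:
            return False

    # Anyone can write a malicious image; without the vendor's private signing
    # key they cannot produce a signature that makes this return True.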


Posted by: snarkout | Link to this comment | 02-29-16 9:18 AM
horizontal rule
29

But pen registers are covered by specific legislation, not the All Writs Act, and the holdings about them have rested on the fact that the phone company already knows who you're calling, whereas Apple's point is that they don't know the passcode for the phone. (Additionally, for forty years the governing law in the US was that wiretapping by the government wasn't a violation of the Fourth Amendment, which seems to me to be obviously different than ordering a private actor to unlock someone's computer.)


Posted by: | Link to this comment | 02-29-16 9:25 AM
horizontal rule
30

Damn it, 29 was me again.


Posted by: snarkout | Link to this comment | 02-29-16 9:25 AM
horizontal rule
31

Here's the FBI application that resulted in the order, if people are interested. The national security stuff is background to the original search warrant, but it doesn't play any role in the All Writs application proper, which is all about whether Apple is connected to the matter, whether it would be burdensome to comply, and whether it is necessary for them to do so to effect the warrant.


Posted by: Ginger Yellow | Link to this comment | 02-29-16 9:25 AM
horizontal rule
32

seems to me that demonstrating that such a workaround can be done is the problem.
Isn't that the argument companies use to prosecute white-hat hackers, and don't people usually call bullshit on that argument?


Posted by: SP | Link to this comment | 02-29-16 9:45 AM
horizontal rule
33

29 - I think that the AWA applied to pen registers before there was a statutory regime (could be wrong, not gonna look it up) but in any event I think that's a side issue. The AWA clearly under current law allows the government to order manufacturers to perform some technical steps to assist the government in executing a valid warrant, but those steps must be reasonable and not unduly affect the interests of people who have nothing to do with the investigation.

Think about it this way. If you buy an ultra-secure physical safe from a manufacturer, and put your key mafia conspiracy documents in there (using this hypothetical because the core issue has nothing to do with "national security"), the government can and has been able, at least in the 20th century, to make the safe manufacturer open the safe under the All Writs Act. On the other hand, the government probably can't (and shouldn't be able to) order the safe manufacturer to build a universal key to the safe, which easily could get into the wrong hands and put at risk the contents of the safes of people who have nothing to do with the mafia investigation. The question here is whether what the government is asking Apple to do is more like the former or the latter, and I for one have no real idea.


Posted by: RT | Link to this comment | 02-29-16 9:57 AM
horizontal rule
34

Eh, ironically iphone related grammar errors in that comment, but you get the idea.


Posted by: RT | Link to this comment | 02-29-16 9:59 AM
horizontal rule
35

26.1 was giving me existential angst, but I think it's supposed to refer to 25.

Anyway, 29 gets to an important point: we don't typically see a lot of All Writs Act use because there's usually other, more specific legislation. The reason there isn't any here is that they passed a specific law a few years ago that covers this sort of thing generally (it's under that law that Apple has done less intrusive things under previous warrants*) but doesn't cover this. The FBI tried to get this covered in another law, and Congress decided not to pass it. Which goes back to LB's "Mom said it was OK" metaphor: Congress said no, so the FBI is claiming that it's OK under some other authority. But the no has already been stated by, you know, the duly elected authorities.

*or whatever


Posted by: JRoth | Link to this comment | 02-29-16 10:17 AM
horizontal rule
36

2) Might not the FBI have multiple, overlapping reasons for wanting this? It seems to me that there is an element in some responses of never giving law enforcement anything it wants or asks for, and I don't see how that follows from the fact that it often wants or asks for things it shouldn't have.

The idea that large tech companies can pick and choose which laws they should obey is disturbing.

I quite see that Apple has to fight this case, for reputational reasons if nothing else, but it doesn't seem to me automatically catastrophic if they lose it. LB's case, that the government would then play "But Mom said" in 1000 courts, is the strongest argument, but why in that case should the wrong decision become the precedent?


Posted by: Mustapha Mond | Link to this comment | 02-29-16 10:23 AM
horizontal rule
37

I thought the 6 form factor was annoying... not so much of itself but the phone didn't ride comfortably in places where I used to put it. BUT it's flat enough that I can put it in a water-resistant case that doesn't feel like a brick, so I use one. Also, the camera is really really better.

As to #1, the amount of work required of Apple is trivial, even if NSA can't supply the desired code mods. (Compared with ordinary government regulation.) The bad thing would be if the FBI were able to insist that Apple provide this kind of capability in future models.


Posted by: mud man | Link to this comment | 02-29-16 10:35 AM
horizontal rule
38

36: What the fuck does "The idea that large tech companies can pick and choose which laws they should obey is disturbing." have to do with anything in this case? Apple isn't entitled to challenge court orders because it's a large tech company?


Posted by: Walt Someguy | Link to this comment | 02-29-16 10:41 AM
horizontal rule
39

The idea that large tech companies can pick and choose which laws they should obey is disturbing.

I don't find it remotely as disturbing as the deference that the judicial branch has shown to the national security state.


Posted by: JRoth | Link to this comment | 02-29-16 10:42 AM
horizontal rule
40

Further to 38, the order explicitly gives Apple the right to make an application saying compliance would be unreasonably burdensome (let alone formally appealing the order).


Posted by: Ginger Yellow | Link to this comment | 02-29-16 10:53 AM
horizontal rule
41

38: Of course it is entitled to challenge the court order. But there is a tone in some commentary that the FBI is not entitled to ask for the order, nor the court to grant it. Sifu, for example, seems furious that the FBI has chosen the strongest possible case, as if that were cheating. It's not. First, their case is genuinely strong, and second, that's how the game is played.

39: If those are really the alternatives, we're pretty fucked. My preference would be for a state where the judiciary acted to check both the executive and large powerful companies.


Posted by: Mustapha Mond | Link to this comment | 02-29-16 10:54 AM
horizontal rule
42

the FBI has chosen the strongest possible case

"fabricated" would be a better word than chosen; there's good reason to think that the FBI directed San Bernardino to change the password on the account in order to create this circumstance.

"We had to use C4 to blow the door because it was locked."

"Why didn't you use a key or pick the lock?"

"We threw the key in the ocean and melted the lock with a torch, so you can see we had no choice."


Posted by: JRoth | Link to this comment | 02-29-16 10:58 AM
horizontal rule
43

there's good reason to think that the FBI directed San Bernardino to change the password on the account in order to create this circumstance.

Apple's also already cooperated with the FBI in this case; they suggested that they try to connect the phone to a WiFi network it had previously been connected to, to see if that would trigger an automatic iCloud backup. Only the FBI couldn't do that with the guy's home network, because the landlord let the press in...


Posted by: Josh | Link to this comment | 02-29-16 11:02 AM
horizontal rule
44

35's not generally the way the law works but raises an interesting point. It's important to understand that the All Writs Act is just a law that says courts can issue various kinds of traditional orders ("writs") in support of their jurisdiction. Generally it's now understood that this includes orders to third parties reasonably necessary to the execution of a valid search warrant. The government asks for the order, but the Court has the responsibility of balancing various factors and issuing it. The Court either has authority under the All Writs Act to issue (based on the court's discretionary analysis of the government's request) a court order asking for this kind of thing, or it doesn't. In a very close case the existence of other legislation might be a reason for finding the request unreasonable or excessive, but this is an issue broadly seen as in the discretionary domain of courts, not Congress.

Now, to be clear, Apple is a big enough deal and is making a big enough deal from this case that undoubtedly part of their long-term strategy is to get a ruling from an appellate court that the AWA -- which is nothing more than a statement that federal courts have the power to issue certain kinds of orders ancillary to their jurisdiction -- should be applied extremely narrowly (perhaps especially to tech companies). IOW they will ultimately be trying to use the judicial system to move and change the law away from judicial discretion in this area. Trying to move the law in this way (asking for common-law judges to change or limit common-law rules) is how our system works and is absolutely their right. But we should understand it as a change -- current law does allow many of these kinds of decisions and balancings of interest to be done by judges through basically common-law decisionmaking, rather than by statute.

The questions of which branches get to do the balancing (executive+judicial, or legislative+executive+judicial), and how that gets done, are interesting, and Apple will certainly be pushing on this area of the law. Right now a court clearly has a discretionary power, within limits, to order a third-party manufacturer to take steps reasonably necessary to effectuate a proper governmental search. The balancing of interests is generally done by a court. Is that less democratic than balancing done by the legislature? Hard to say. Court discretion in this area hasn't traditionally been seen as a problem; it's just a delegation of an issue to judicial/common law rather than legislative authority. But Apple will try to, and may well succeed in, changing the law in this area, based on arguments about their technology. Note that generally shifting decision making strictly to Congress is likely in Apple's interest because as a gigantic company they expect to influence legislation, and perhaps also because in our current broken system legislation is difficult to impossible to obtain and they feel like they're better off with no rules than judge-made rules. On the other hand we do often think of rules set by statute as more "democratic" than rules set by judges -- but not always.


Posted by: RT | Link to this comment | 02-29-16 11:04 AM
horizontal rule
45

8: I think we can be absolutely certain that if China had made such a request of Apple, and Apple had given in, neither party would publicise the fact.

Also, where are Apple's chips actually fabbed?


Posted by: Nworb Werdna | Link to this comment | 02-29-16 11:14 AM
horizontal rule
46

I found this post pretty interesting. It's asking what additionally has to be done if the FBI wants to present the extracted information as evidence, in terms of being able to defend the forensics in court. Does this imply a lot more disclosure than the FBI's order directly requests? That author thinks so. I'm a little less clear why it's necessary - it seems like a magic oracle that gives you the device PIN (Apple's custom-created fbIOS and the FBI's brute-forcing together) doesn't have to be defended itself, once it's shown that the PIN works to unlock it.


Posted by: Nathan Williams | Link to this comment | 02-29-16 11:20 AM
horizontal rule
47

That author's other posts are interesting and on point, too. He seems to think that "unlock the phone" is a sketchy way to access the information, in terms of what else the phone might do while booted up.

He also thinks there isn't likely much to be found on it, which makes sense to me, and raises a point I'm unclear on: what is the FBI seeking that gets them a warrant for this phone? I don't like the idea that being accused of a crime opens up a fishing expedition into all of a person's electronic data.


Posted by: Nathan Williams | Link to this comment | 02-29-16 11:26 AM
horizontal rule
48

47 - Your last question has an easy answer, or really two easy answers. There's reason to think that the phone contains messages used in the immediate planning or aftermath of the attack, and of course the government can look for evidence of that. Moreover, there's no actual privacy issue here at all, because the phone's actual owner, which is the County of San Bernardino, has consented to the search. Absent the technical problem with accessing the information, there is no legal question whatsoever that the government could search the phone in full. The only thing at issue here is the propriety of the government request to force Apple to do whatever it is asking Apple to do to get the information -- there's no dispute that the underlying information is 100% validly accessible by the government.


Posted by: R Tigre | Link to this comment | 02-29-16 11:32 AM
horizontal rule
49

He also thinks there isn't likely much to be found on it, which makes sense to me, and raises a point I'm unclear on: what is the FBI seeking that gets them a warrant for this phone?

They say they're looking for contacts suggesting a wider network of conspirators, or at least terrorist contacts. Seems unlikely given what he did with the non-work phones.

46: I don't think they have any intention of using it as evidence in court.


Posted by: Ginger Yellow | Link to this comment | 02-29-16 11:36 AM
horizontal rule
50

13: and yet he was totally right.


Posted by: Asteele | Link to this comment | 02-29-16 11:37 AM
horizontal rule
51

OK, the point that the County owns it is certainly relevant, and I've even pointed that out to other people. I should have remembered that. (But it seems like there's starting to be a line of other applications for this tool, and they probably aren't all owner-consented searches.)


Posted by: Nathan Williams | Link to this comment | 02-29-16 11:44 AM
horizontal rule
52

On the technical point of applying this tool to other phones if it gets out of Apple's control: it seems quite possible, but I don't think all the necessary details are public. Only Apple can create a valid signed OS image, and they can make it check the device's ID, but it's entirely possible that you can take a random iPhone and change its ID, with a moderate level of electronics sophistication. So even with a tool that just unlocks Farook's device, it may be that you can trivially make any other phone of interest claim to be Farook's in the relevant ways.

That ID data may well not have been kept in the "Secure Enclave" part of the newer phones, but it probably should be going forward.
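
A toy illustration of the "keyed to the phone's unique ID" restriction, and of why it's only as strong as the ID the device reports. The ID value and the check below are hypothetical; nothing here reflects the real image format:

    # The signed image refuses to run unless the device claims the target ID.
    # HYPOTHETICAL values; the real check and where the ID lives aren't public here.
    HARDCODED_TARGET_ID = "000012345ABCDE"

    def image_will_run(device_reported_id: str) -> bool:
        return device_reported_id == HARDCODED_TARGET_ID

    print(image_will_run("000012345ABCDE"))   # True: the target phone
    print(image_will_run("0000DEADBEEF00"))   # False: any other phone...
    # ...unless that other phone can be made to *report* the target ID, which is
    # exactly the spoofing concern above.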


Posted by: Nathan Williams | Link to this comment | 02-29-16 11:48 AM
horizontal rule
53

Additionally, "pretend to be another device so the software does X" is a mainstay of video-game-console hacking/unlocking tricks, so there's a lot of prior experience to bring to the problem. If the image gets out I would assume that it can be used on all such devices in short order.


Posted by: Nathan Williams | Link to this comment | 02-29-16 11:53 AM
horizontal rule
54

51 -- I can't read that article, but it's certainly right that other searches aren't (and of course won't be) all owner-consented. But, they'd have to be valid searches, with warrant issued, for other reasons. Obviously the government can't ever search iPhones with an *invalid* warrant, and it's not claiming it can.

To the extent that searches of iPhones (whether in this case or some other case) *only* affect people whose information *is* validly searchable, and make the information accessible *only* in the context of an otherwise-proper search, it's not a problem under current law, at all.

Once again, the real question, I think, is whether the technical work done by Apple could lead to reading iPhone information of people whose phones are *not* the subject of valid searches or by people who are *not* government investigators acting pursuant to existing law. However, I have no idea at all about how the technology in fact works here or whether the government's request would in fact meet those conditions.

(This puts aside the issue raised in 44, which is that Apple would like to take this kind of thing out of the ordinary realm of things that are discretionary to a court and put it into some kind of special legislative regime. But that's not the way the system currently works, and if the actions aren't unduly burdensome for Apple and wouldn't in any way affect the rights of anyone not under government investigation, then Apple under most understandings of the law could be validly ordered by a court to do this)


Posted by: R Tigre | Link to this comment | 02-29-16 12:01 PM
horizontal rule
55

That it would take 10 engineers for 4 weeks seems a ridiculous reason not to do it (unless the sin really shines from their back sides - which they might well think it does).

On the other hand, it is like creating a strain of a potentially deadly virus. You can take every precaution to keep it locked up, but if it gets out there's no defense.

Everything has sides, so maybe it's just as well to let the courts at it.


Posted by: JoB | Link to this comment | 02-29-16 12:14 PM
horizontal rule
56

they could create a legal precedent that companies can be forced to build backdoors for not-otherwise-backdoored products

The privacy folks I follow on Twitter seem to think that this is the heart of the matter. This fight isn't going to end. The government really seems to think that encryption is illegitimate, and they'll keep trying to do everything they can to undermine or get around it.


Posted by: ogged | Link to this comment | 02-29-16 12:33 PM
horizontal rule
57

Also, where are Apple's chips actually fabbed?

I believe some chunk of them are actually made in Austin, but it varies: from generation to generation (maybe within generations?), the chip manufacturer has varied (Apple is doing the design, so it's a contracting deal, not a purchasing one).

Anyway, it's been discussed in the past whether Apple is vulnerable to political pressure from China because of the manufacturing, or even just because they want to be in the market.


Posted by: JRoth | Link to this comment | 02-29-16 1:09 PM
horizontal rule
58

56 - I barely understand what the word "backdoor" means but I have read the government's brief on this. If the argument is that the specific order here leads to a slippery slope on encryption, the government specifically and strenuously objects to this. The government goes to great lengths to argue precisely that this is not what it wants and is not what would happen. I am not remotely qualified to know whether the government is right or wrong about this in fact. But the government concedes that encryption is legitimate generally and says that all it wants is a technical fix that applies only to individual phones (that is, individual phones already subject to valid search warrants, and nothing else). The government goes to great lengths to emphasize that it believes what it wants poses no threat whatsoever of a broader technological undermining of encryption or phone privacy.

Rhetorically, the government's lawyers are absolutely explicit that the government is *not* trying to establish that all encryption is illegitimate. In fact, they are encouraging the Court to weigh the *absence* of this problem in their favor in upholding the order.

Apple says that the government is wrong. Apple claims that *this order itself* would lead to privacy risks for people not subject to a valid search warrant and who have nothing to do with this or any other criminal case. But the specifics of the order are the territory on which both sides are fighting -- not a broader claim that encryption itself is or is not invalid.

Whether or not these claims from government lawyers are sincere, they are there. It makes it very unlikely that whatever precedent arises from this case is going to lead to the government going into Court and saying "aha! you found in our favor! Even though encryption wasn't at issue in that case, now all encryption is illegitimate," because the case has been argued on very very different, essentially contradictory, grounds. That doesn't mean that Apple shouldn't win *if this specific court order* puts the privacy of totally unrelated people at risk and is itself a "backdoor" (assuming I understand correctly what that means).


Posted by: R Tigre | Link to this comment | 02-29-16 1:09 PM
horizontal rule
59

56: I understand that this is the claim -- and I'm sure the sincere belief -- of some of those who oppose this order. They see a slippery precipice rather than a slippery slope. And if you take the view that the US government is completely corrupt and has no interest in the rights of criminals, and that it will immediately renege and cheat on the outcome of any deal reached here, then that makes sense.

Elements within the state no doubt do feel that encryption is in itself illegitimate. Others are probably more grownup.

But the fight, even if it's not going to end, is going to be fought on fresh ground. Strong, widespread encryption is out there now and that's not going to change. People who really want it have access to all kinds of privacy, although it's effortful and time consuming.


Posted by: Mustapha Mond | Link to this comment | 02-29-16 2:08 PM
horizontal rule
60

Apple claims that by digitally signing the new OS, which must be done to install it, they would be affirming that it is a valid product -- that the government is making them say something they do not believe, a violation of their First Amendment rights.

Apparently there is precedent for treating code as speech, going back to when cryptographic software was export-controlled in the 90s.

And they have Ted Olson for their lawyer, somebody who knows about being a victim of terrorism and arguing all the way up to the Supreme Court.


Posted by: Robert | Link to this comment | 02-29-16 2:26 PM
horizontal rule
61

"All encryption is illegitimate" is obviously a ridiculous claim that would make e-commerce impossible, for example. The government is fine with encryption constraining everybody else, just itself.


Posted by: Walt Someguy | Link to this comment | 02-29-16 2:32 PM
horizontal rule
62

I know, how about they make a workaround for the firmware that checks whether firmware is digitally signed!


Posted by: SP | Link to this comment | 02-29-16 2:32 PM
horizontal rule
63

59: "if you take the view that the US government is completely corrupt" or even if one doesn't take that view, the idea that the government or Apple could keep the breakable IOS out of the hands of other states or corporations for very long is belied by the history of secrets so far.


Posted by: Biohazard | Link to this comment | 02-29-16 2:34 PM
horizontal rule
64

Why would a government not legitimately believe encryption - I mean absolute unassailable encryption - is illegitimate? I'm all for checks and balances but when everything is checked - and everything balanced - up to and including the Supreme Court ... what's the hold-up?


Posted by: JoB | Link to this comment | 02-29-16 2:36 PM
horizontal rule
65

63: Really? Obviously we only get to hear about the cases where secrets have not successfully been held, but there are quite a lot of things we suppose Apple has managed to keep secret successfully -- starting with the keys with which they sign their software. If I read the court order rightly, the fbiOS it's asking for need never leave Apple's office. It is also valid for a pretty limited subset of iPhones: earlier ones can be broken without it; later ones can't be broken with it. If it ever did escape into the wild, they could issue a patched update over the air.

It seems to me that you're assuming the slipperiness of the slope which is exactly the point at issue.


Posted by: Mustapha Mond | Link to this comment | 02-29-16 3:15 PM
horizontal rule
66

It's entirely possible that both Apple's network and the FBI are already compromised by Chinese hackers, and probably also by the NSA. The risk of a copy of the software getting out is well above zero.


Posted by: Spike | Link to this comment | 02-29-16 3:51 PM
horizontal rule
67

65: If there's slope at all it comes with some slippery stuff spilled on it. It's only a question of how much and how viscous. I'll grant you they've managed to keep the exact composition of the angel-killing chemtrails and the Area 51 tech secrets but the trivial stuff like A-bombs and Snowden files gets out.


Posted by: Biohazard | Link to this comment | 02-29-16 4:16 PM
horizontal rule
68

45. I believe at TSM, so Taiwan. I also believe that TSM is pretty much a pure foundry, not at all a chip designer.

65, and others asking about backdoors: Schneier and others explain. The "later ones can't be broken with it" requires disabling something that exists now and is very useful, automated updates. This unfortunate consequence should be compensated for by some real social benefit if it must be done anywhere. What amounts to legal pawn-pushing over a phone unlikely to contain anything useful (honestly, which shitty jihadi websites the guy looked at, does it matter?) is not a benefit.


Posted by: lw | Link to this comment | 02-29-16 4:21 PM
horizontal rule
69

Apparently the lawyers might find interesting the denial of a similar government motion a New York federal judge just issued today.


Posted by: Minivet | Link to this comment | 02-29-16 4:59 PM
horizontal rule
70

69 -- that is indeed a really really interesting order. It pushes existing law very hard, and takes an extremely narrow reading of judicial power the all writs act (undoubtedly prompted by some good lawyering by Apple) and in a way that could have some really bad consequences in other cases.

It's effectively a ruling about implied Congressional pre-emption of judicial power by Congress' failure to do something while acting in a somewhat related area. If it's anything like the history of implied preemption in other areas, that bodes extremely ill -- Courts and companies can basically just pick and choose through a range of statutes until they find some basis for claiming that Congress's ability to act in an area where it didn't act preempts the Court. I predict that this aspect of the ruling will not survive review, but I could be wrong. The Magistrate finds the search order unnecessary based on a traditional factor analysis, making his legal argument dicta anyway, and I'd expect that part of his order to hold up.


Posted by: R Tigre | Link to this comment | 02-29-16 5:15 PM
horizontal rule
71

judicial power *under* the All Writs Act.


Posted by: R Tigre | Link to this comment | 02-29-16 5:16 PM
horizontal rule
72

Oh, PDF warning for 69.


Posted by: Minivet | Link to this comment | 02-29-16 5:46 PM
horizontal rule
73

The line in one of the government's filings that made people think they were against encryption as such was this one.

"Apple has attempted to design and market its products to allow technology, rather than the law, to control access to data which has been found by this Court to be warranted for an important investigation,"


Posted by: ogged | Link to this comment | 02-29-16 8:40 PM
horizontal rule
74

There's no question whatsoever that the government is generally against encryption as such. Anyone who argues otherwise is completely unfamiliar with the history of the subject.


Posted by: Josh | Link to this comment | 02-29-16 9:24 PM
horizontal rule
75

I think the issue is the cryptographic *signing*, that says this blob of code is genuine Apple software, and (only somewhat) implicitly that this blob of code is not going to do anything malicious. So I consider this less a case of compelling Apple to *break* into the phone than of compelling Apple to authenticate a malicious piece of code.
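
The signing step itself is the part only the key holder can perform, which is what the order would compel. A minimal sketch of that side (again RSA via the Python cryptography package, purely as illustration; the blob is a stand-in, not real firmware):

    # Producing the signature that the phone's built-in check will accept is the
    # act of authentication being compelled. Illustrative sketch only.
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import padding, rsa

    private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

    firmware_blob = b"...the custom image..."  # stand-in bytes, not real firmware
    signature = private_key.sign(firmware_blob, padding.PKCS1v15(), hashes.SHA256())

    # Shipping `signature` alongside the blob is the attestation "this code is
    # genuine," made with a key nobody else is supposed to possess.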

Analogy warning:
have there been cases where the government has compelled a notary to participate in some sort of sting?


Posted by: BA | Link to this comment | 02-29-16 9:32 PM
horizontal rule
76

Once an fbiOS has been created, it will be an immensely valuable item to a variety of white hats and black hats, government and non-government. Suppose Apple has it stored in a deep vault.

What if a group of bad guys now kidnaps an Apple executive's family, demands a copy, and makes a credible threat to torture and kill them? Does the FBI have the right to create that possibility, subject Apple employees and their families to that danger?

Even ignoring that possibility, the existence of an fbiOS has lessened the security of the contents of 200 million iPhones. Suppose 1% of their owners sue Apple for recompense.


Posted by: Bob Munck | Link to this comment | 02-29-16 9:38 PM
horizontal rule
77

76 seems a bit extreme. I don't know how Apple's update signing system works, but couldn't someone already kidnap someone with permissions to access the key and create a properly signed update?


Posted by: sral | Link to this comment | 03- 1-16 12:50 AM
horizontal rule
78

76: The awesome power of stipulation. You forgot only that the kidnapped executive's family have been strapped to timebombs and will be released into a farmers' market if she does not co-operate. Because, obviously, knowing how to crack passwords on a 2013 iPhone would shift the balance of global power the same way that knowing how to make an atom bomb did and anyone would commit any crime to get at it.

More generally, if Apple's network is already compromised by the Chinese, or by the NSA, then the thing we are worrying about has already happened and all this is theatre. Given Apple's signing keys, which are the real secrets here, bad state actors can write anything they like and shove it on any phone they want. To quote Bruce Schneier:

Have the Chinese, for instance, written a hacked Apple operating system that records conversations and automatically forwards them to police? They would need to have stolen Apple's code-signing key so that the phone would recognize the hacked OS as valid, but governments have done that in the past with other keys and other companies. We simply have no idea who already has this capability.

We have no idea, but the only sane way for the discussion to proceed is to assume that it deals with realities: that the FBI can't do what they want without Apple's co-operation, and that Apple can in fact limit that co-operation.


Posted by: Mustapha Mond | Link to this comment | 03- 1-16 12:59 AM
horizontal rule
79

73: Isn't it right on principle that the law should ultimately be in control of access to data? I mean they can search houses if they follow the right process. That does not mean they are on principle against keys.

I understand that the problem people have with what the government is asking here is: a. this could lead to a situation where it will become very easy to search houses (like installing cameras in all of them) and b. so easy that some government officials will be tempted to just skip the right process (which they have a track record of doing). In the end the balance is not easy and has to come from the judicial and legislative processes. So it's essentially about precedent and there is nothing wrong with it, certainly not in a judicial system where precedent is crucial.

Now, apparently nobody contests that Apple can do it if they want. Isn't that scary as well because, who controls them? If the answer is: the market, isn't that even more scary?

What if somebody decided to dispense with the kidnapping & just hire those engineers?


Posted by: JoB | Link to this comment | 03- 1-16 1:51 AM
horizontal rule
80

41 is the single most cynical thing said on this entire thread, more cynical about the government than anything said by the pro-Apple side. It's how the game is played? The FBI is playing a game where it tries to maximally infringe on our civil liberties, and we're supposed to be understanding of that? I don't think so. It's not the job of the FBI to try to maximize FBI power.


Posted by: Walt Someguy | Link to this comment | 03- 1-16 2:05 AM
horizontal rule
81

79: Apple is trying to put themselves in the position where they can't do it. Unfortunately the FBI has identified a way to make them do it by exploiting the software update mechanism. This means the next iteration of the iPhone will have a worse software update mechanism to take away this attack vector on the phone.

Anyway, how far are you willing to go? If I encrypt a file with my own private key, how far are you prepared to let the government access that data?
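
For concreteness on that hypothetical, a sketch of a file locked with a key only the user holds, using the Fernet recipe from the Python cryptography package (symmetric rather than a true private key, purely for illustration). The point is that there is no third party with anything to hand over:

    # A file encrypted under a key that never leaves the user's hands.
    from cryptography.fernet import Fernet, InvalidToken

    key = Fernet.generate_key()              # stays with the user; no vendor copy
    ciphertext = Fernet(key).encrypt(b"the file's contents")

    # With the key, decryption is trivial:
    assert Fernet(key).decrypt(ciphertext) == b"the file's contents"

    # Without it, a wrong key just fails; there is no Apple-shaped intermediary
    # whose cooperation would help:
    try:
        Fernet(Fernet.generate_key()).decrypt(ciphertext)
    except InvalidToken:
        print("wrong key: nothing recoverable, no one to compel")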


Posted by: Walt Someguy | Link to this comment | 03- 1-16 2:12 AM
horizontal rule
82

It's not what I want in this specific case. It is that I want the law and courts to have the last say. If you encrypt a file that holds a description of your tax evasion and breaking that encryption has no knock-on effects (it's just the knock-on effects that make the Apple case difficult to judge), sure: let 'em break it (if they have probable cause and a court order and all that).

Also, the 'unbreakable' thing, I don't buy it. You can encrypt one specific transaction in e-commerce in 'almost' unbreakable ways, but to deny access to a device's memory in an unbreakable way? And, in the end, why make something unbreakable? It is not like this Bitcoin thing attracted the nicest people in the world. Make it such that it can only be broken following due process. I know that will sound naïve to most of you but Snowden did show that breaking without due process is also pretty difficult to hush up.

In short I disagree with what Snowden says here: http://techcrunch.com/2015/06/17/but-bring-the-hammer-if-it-betrays-us/


Posted by: JoB | Link to this comment | 03- 1-16 2:35 AM
horizontal rule
83

But you want to arrogate to the government the authority to not just break the encryption, but compel third parties to contribute.


Posted by: Walt Someguy | Link to this comment | 03- 1-16 2:44 AM
horizontal rule
84

81 -- surely the answer is "as far as is reasonable." That is, the answer depends on balancing the government's legitimate need to obtain information in specific cases without being prevented from obtaining it by corporate design (remember these are all by definition legitimate, warrant-issued searches) against the risk that the government's need for more information in any particular case, even if legitimate, will on net increase the general public's risk of a security breach by non-state actors or cause other problems.

The relevant balancing is likely to look very different given the specifics of what is actually being asked for and why, but somebody has to do the balancing. It could be courts or it could be Congress or it could be some other regulator. But someone has to do it. I don't think anyone wants to live in a world where there's an unlimited right of corporations to create "encryption" of any kind that can't be accessed by governments of any kind for any public purpose. I also don't think anyone wants to live in a world where governments in fact make information on people's phones generally or materially less secure by demanding "fixes" for encryption that do in fact break phones and leave them less secure. Weighing the relevant factors is going to be fairly fact-specific and case-specific. Neither "encrypting corporations good, government bad!" nor "government good, encrypting corporations bad!" seems like at all the right answer. This is a regulatory balance problem, and these things need some kind of legitimate fact-specific regulation, not argument from first principles.


Posted by: R Tigre | Link to this comment | 03- 1-16 2:48 AM
horizontal rule
85

But a world in which "write a custom version of your operating system" is reasonable is a world in which reasonable barely acts as a constraint on government power. I could probably do the hack myself if you gave me the phone, the key, and the iOS source code. Can the government order me to do it?

Anyway, JoB seems to be arguing for some sort of maximalist position, not constrained by "reasonableness".


Posted by: Walt Someguy | Link to this comment | 03- 1-16 3:04 AM
horizontal rule
86

85 - Maybe the government can make you write a custom version of your operating system as part of a criminal case -- if the costs to you are low, the public need is material and great, the risks to the general public as a whole are minimal, and there's legitimate due process (review by a legitimately authorized court or a regulator). If all that's true, why not? It's consistent with all kinds of things the government can make you do right now. If none of those things are true (costs are very high, the public need is low, risk to general public of inadvertent bad consequence is high, there is no process or ability to challenge the order) then the government certainly shouldn't make you do that. But in general this is the kind of balancing you're going to need.


Posted by: R Tigre | Link to this comment | 03- 1-16 3:14 AM
horizontal rule
87

80: I don't know that 41 is that appallingly cynical. I didn't mean that the sole or overriding purpose of the FBI is to maximise its own power: the game I was referring to was the business of getting favorable legal precedents. The FBI's lawyers, just the same as Apple's, will attempt to get the most favorable interpretation possible of the law. That's their job and ultimately it is courts and judges which decide whether they succeed.

As for the question of whether you should be coerced to reveal your private key, I'm not sure. There are rules against self-incrimination, and rules governing the kind of pressure that can be put on people (rubber hose cryptography). I don't know enough about American law to go beyond that. If, as a general principle, the courts can sometimes demand that you hand over evidence in a criminal case, I don't see that encrypting that evidence changes that principle, though it does change the facts.

Some forms of coercion do seem widespread. I believe, for example -- lawyers please correct -- that a court could demand that you hand over large sums of money gained by criminal means that you have hidden in a Swiss bank account. They can't force you to hand over the bank details. They could prolong your sentence if you did not co-operate. That doesn't seem improper or unprecedented even if its application in specific cases can be both.


Posted by: Mustapha Mond | Link to this comment | 03- 1-16 3:36 AM
horizontal rule
88

Of course it's appallingly cynical. It's not the job of the FBI to get the most favorable interpretation possible of the law for the FBI. The FBI should be constrained by the general public interest. It's this kind of cynicism -- the knowledge that the FBI will try to game the system to its advantage separate from the general public interest -- that drives people to the pro-Apple side of the argument.


Posted by: Walt Someguy | Link to this comment | 03- 1-16 3:44 AM
horizontal rule
89

But it is surely the job of the FBI's lawyers to do exactly that? I agree that the FBI has interests separate to the general public interest, which will sometimes conflict with it. So does Apple. To recognise this is simply to acknowledge the importance of politics in their widest sense. But it doesn't mean that either or both will always be acting against the GPI -- even assuming general agreement about what the GPI demands. Both the FBI and Apple could disagree in good faith about that, and I'd have thought that the courts were the place to decide and to weigh up the contending interests and interpretations.


Posted by: Mustapha Mond | Link to this comment | 03- 1-16 5:08 AM
horizontal rule
90

85: No, I'm all for "restricted by reasonableness" as expressed by law and adjudicated by courts.

In essence my faith in the FBI's good intentions when left to its own is slim, and my faith in Apple even slimmer, which is why I like laws and courts and appeals and public debate.

Also I don't think you should be compelled to give your key - as long as you know your house is going to be searched. And in Apple's case, it's not their house that is being searched so that analogy doesn't stick.


Posted by: JoB | Link to this comment | 03- 1-16 6:20 AM
horizontal rule
91
Manhattan District Attorney Cyrus Vance will also testify in support of the FBI, arguing that default device encryption "severely harms" criminal prosecutions at the state level, including in cases in his district involving at least 175 iPhones.
Vance's office has drafted legislation it wants Congress to enact that would go beyond the single court case and require companies like Apple to ensure that their devices could be accessed in unencrypted form.

I don't really get people like this, who are so mentally captured by their jobs. As a citizen, does it really seem like a good idea to do away with the possibility of private communication? Who really thinks that?

Anyway, I'm in agreement with Walt in this thread. I was completely unaware that the judiciary could sentence people ad hoc to forced labor, without them being accused, let alone convicted, of a crime. That seems like the craziest part of this whole thing.


Posted by: ogged | Link to this comment | 03- 1-16 6:45 AM
horizontal rule
92

I was completely unaware that the judiciary could sentence people ad hoc to forced labor, without them being accused, let alone convicted, of a crime. That seems like the craziest part of this whole thing.

Of course it can. That's what subpoena powers are all about. The government can sentence you, an innocent person, to go through (for example) all your sales records to find the name of the man who bought a certain pair of shoes which he then wore while committing a robbery. It can ban you from going into a room in your own hotel for days at a time just because it says it's a "crime scene". It can even sentence you to several weeks' imprisonment in a draughty court room listening to incoherent idiots give evidence (or indeed to several days giving evidence yourself).


Posted by: ajay | Link to this comment | 03- 1-16 6:57 AM
horizontal rule
93

None of those is really the same thing, as I'm sure you realize.


Posted by: ogged | Link to this comment | 03- 1-16 7:19 AM
horizontal rule
94

I don't think anyone wants to live in a world where there's an unlimited right of corporations to create "encryption" of any kind that can't be accessed by governments of any kind for any public purpose.

Why not? (And why is encryption in scare quotes?)


Posted by: Josh | Link to this comment | 03- 1-16 7:38 AM
horizontal rule
95

93: Why not? They're different in the wider implications an fbiOS itself potentially has, but not in the ad hoc forced labor sense. A third-party subpoena is always going to be a pain in the ass for an innocent bystander, and they're subject to a similar burden analysis. But if the court concludes that it's really not a big deal (in comparison to its relevance/usefulness) for my organization to pull together and turn over a ton of crap so it can be used in a lawsuit to which it is not a party, we have to oblige. The prospect of an fbiOS creating harm outside this particular case is certainly relevant to whether the "forced labor" should be forced here, but the issue isn't crossing the line into "forced labor" itself.

94: Because it would allow them to conceal evidence of wrongdoing that it would otherwise be obliged to disclose? Do you seriously think that, if you were suing a corporation and sent them a whole bunch of perfectly legitimate document requests for stuff you'd otherwise be allowed to see, and they said "sorry, they're encrypted," that should be the end of the matter?


Posted by: potchkeh | Link to this comment | 03- 1-16 8:00 AM
horizontal rule
96

91: I take it you have never filled out a tax return, then. That is the quintessence of the government forcing you to labour for them.

As to the substantive point about Cyrus Vance -- that seems to me exactly the sort of thing that the courts should slap down. Cops being captured by their jobs is hardly a new thing or one that was summoned into existence by technology, and it's one thing the rest of the judicial apparatus exists to check.

94: The obvious reason why not is tax evasion. Or, if you want to be dramatic, paedophilia


Posted by: Mustapha Mond | Link to this comment | 03- 1-16 8:02 AM
horizontal rule
97

quintessence

It's getting Crooked Timber all up in this thread.


Posted by: Bave | Link to this comment | 03- 1-16 8:07 AM
horizontal rule
98

The obvious reason why not is tax evasion. Or, if you want to be dramatic, paedophilia

All these arguments seem to me to boil down to "the private might be criminal, therefore, nothing should be private."

As for the labor thing, yeah, the government can compel you to do all sorts of inconvenient stuff, but in this case it's trying to conscript expertise in a way that seems different to me.


Posted by: ogged | Link to this comment | 03- 1-16 8:18 AM
horizontal rule
99

None of those is really the same thing, as I'm sure you realize.

They are: they are all examples of the judiciary's power to impose considerable (though reasonable) burdens on innocent people in order to help it do its job of enforcing the law and ascertaining guilt.

Do you seriously think that, if you were suing a corporation and sent them a whole bunch of perfectly legitimate document requests for stuff you'd otherwise be allowed to see, and they said "sorry, they're encrypted," that should be the end of the matter?

That is a slightly different matter. "They're encrypted", for your own documents, is no more of a plausible defence than "they're locked up" and both would land the corporation in front of a contempt of court charge. The Apple case is different because Apple does not actually have access to the documents itself -- they have been encrypted by a third party, who is not available to be subpoenaed because they're dead.


Posted by: ajay | Link to this comment | 03- 1-16 8:22 AM
horizontal rule
100

All these arguments seem to me to boil down to "the private might be criminal, therefore, nothing should be private."

Or alternatively "the private might be criminal, therefore law enforcement should have the power to issue and to effectively execute search warrants".


Posted by: ajay | Link to this comment | 03- 1-16 8:26 AM
horizontal rule
101

99.last: Yes, they're slightly different, but I took it to be what Tigre was getting at up in the quoted bit that 94 was taking issue with (perhaps wrongly).

98.1: No, they're arguments that privacy isn't absolute--if someone convinces a court that privacy should be pierced, the fact that there are technologies that make the piercing prohibitively difficult shouldn't be the decisive factor. That's not to say anything about whether privacy should be pierced in any particular case, and the possibility that piercing it here might entail piercing it everywhere certainly ought to be relevant to what should happen here.


Posted by: potchkeh | Link to this comment | 03- 1-16 8:28 AM
horizontal rule
102

97: The OP piece is way too short for that ;-)

98: The private might be criminal therefore nothing should be irreversibly secret. Also, the government might be criminal, therefore everything should be able to be done privately. That difference between secret and private is then down to the law and the courts. In this case, well, it's an interesting case.

Compelling a company to do something is not the same thing as ordering an individual.


Posted by: JoB | Link to this comment | 03- 1-16 8:30 AM
horizontal rule
103

100: Exactly. Privacy is not binary. There are many gradations between something known only to me and something broadcast to the whole world.

98.2: The thing-that-is-not-at-all-an-analogy which springs to mind is wartime, when the government certainly has a right to conscript your expertise. But it's probably a red herring here, because the war on drugs certainly doesn't qualify as a real war, and the "war on terror" probably doesn't either. So whether the government has a right to conscript expertise in peacetime is the relevant question here. Ex Recto, I think it can probably conscript companies but not individuals.


Posted by: Mustapha Mond | Link to this comment | 03- 1-16 8:48 AM
horizontal rule
104

whether the government has a right to conscript expertise in peacetime is the relevant question here

It's a tricky one, actually. I mean, it can conscript you because of attributes of your position: you were in the bank the day it got robbed, say, or you are the landlord with the pass key that will open the door of the building where the crime took place. But it can't conscript you to do stuff just because you would be good at it; you can't subpoena a forensic scientist to look at your sample, you have to ask her nicely, and she's at liberty to say no. What it's asking of Apple falls into both categories; it's being asked to both write the fbiOS and sign it.


Posted by: ajay | Link to this comment | 03- 1-16 9:00 AM
horizontal rule
105

Couldn't the FBI write the OS themselves and then just make Apple sign it?


Posted by: nosflow | Link to this comment | 03- 1-16 9:05 AM
horizontal rule
106

In essence my faith in the FBI's good intentions when left to its own is slim, and my faith in Apple even slimmer

I'm mystified by this. I know some people really, really disliked Steve Jobs, but AFAIK he never encouraged anyone to commit suicide by blackmail. WTF has the FBI ever done to justify even a tiny bit of trust on the part of the American people? I suppose it's somewhat more respectable than the CIA?

Here's the thing: look at it purely cynically. The FBI's narrow interest is to get access to more data (not just this phone, but communications generally). Their broader interest is accruing power. They are, per 41, therefore pushing the law to its limits here in order to achieve those goals.

Apple's broader interest is to sell more phones (and other devices). They believe that a key avenue for doing this is providing maximum security on devices--this is well-established and longstanding, but has also grown in importance.

So this is the source of their disagreement. But my point is, which of these two has interests at variance with the public at large? Unless you have a child-like faith in the bona fides of the FBI, one of these organizations wants to intrude on your rights, and the other wants to sell you a phone that will (ostensibly) protect your rights. I don't see an angle, in this case, where Apple's interests diverge from the general public's. Investigators have already said in public that they don't expect to find jack shit on the phone, so let's set aside any self-important bullshit about Protecting America. After all, if this case is only about this one phone, then we know that the national security stakes are low. Or maybe the FBI is full of shit when they say it's just this one phone?


Posted by: JRoth | Link to this comment | 03- 1-16 9:08 AM
horizontal rule
107

So this "conscription " discussion is exactly the kind of ignorant dude bullshit that makes for unhelpful comment threads.

Setting that aside, 100 is profoundly stupid. Everyone who understands encryption and computer security says you can't have a method that lets the government execute warrants by accessing encrypted data without that method being pretty readily available to non-state actors including criminals. So it's not just a question of enabling a certain type of warranted search. It's effectively breaking the security measure in question, rendering it untrustworthy.


Posted by: Bave | Link to this comment | 03- 1-16 9:08 AM
horizontal rule
108

100.last law enforcement should have the power to issue and to effectively execute search warrants

We don't live in an ideal world, and Apple is being asked to produce a dangerous technology/data blob (basically an Apple-signed, deliberately weakened firmware image, for starters). In spite of their best efforts, it may leak. The more often Apple is asked to do this, and they will be asked again, the more likely this outcome becomes.

At that point one really doesn't need a search warrant to use it, one just uses it.


Posted by: DaveLMA | Link to this comment | 03- 1-16 9:14 AM
horizontal rule
109

Hard to see good faith in the argument that both the FBI and Apple are trying to maximize a favorable interpretation of the law and so they're just doing the same thing. Apple is trying to do that in this case, but they didn't bring the case and they haven't been working for years to position themselves to bring the case. They're not out there looking for ways to force their encryption upon the world because they believe tech companies can pick and choose what laws to follow, or whatever. They're just offering a feature to customers* that the US government is trying to make illegitimate in whatever way it can.

*Not that it matters, but I'm not one, unless you count an 11 year old iPod.


Posted by: fake accent | Link to this comment | 03- 1-16 9:16 AM
horizontal rule
110

107 is unusually (I hope) dickish.


Posted by: ajay | Link to this comment | 03- 1-16 9:17 AM
horizontal rule
111

105: Wouldn't that require handing the FBI the existing iOS to modify? I don't think they could possibly write it from scratch. Doing that would be insanely insecure.


Posted by: JRoth | Link to this comment | 03- 1-16 9:21 AM
horizontal rule
112

110 "dickish"

Perhaps "exasperated" would be a better word. If you read up on the technical issues (for example, read this) and see how they are totally not understood (or deliberately distorted) by the press, it's very frustrating.


Posted by: DaveLMA | Link to this comment | 03- 1-16 9:23 AM
horizontal rule
113

110 is pretty rich.


Posted by: Josh | Link to this comment | 03- 1-16 9:24 AM
horizontal rule
114

I don't think anyone wants to live in a world where there's an unlimited right of corporations to create "encryption" of any kind that can't be accessed by governments of any kind for any public purpose.

"Corporations" here is a red herring, or possibly bad faith. A slightly different case would have seen the FBI working out on some indie developer or university professor who volunteered to be an OpenSSL maintainer or such.


Posted by: Alex | Link to this comment | 03- 1-16 9:33 AM
horizontal rule
115

112.2: The FBI already got to it! Or maybe Apple.


Posted by: Walt Someguy | Link to this comment | 03- 1-16 9:34 AM
horizontal rule
116

111: more to the point, it would involve handing the FBI the Apple key in order to authenticate the modified OS which they had written, which would be even more insanely insecure, because then the FBI could hack any other similar iPhone without Apple even getting involved. (And because of this would be much less likely to pass the court's scrutiny in terms of acceptable burden.)


Posted by: ajay | Link to this comment | 03- 1-16 9:35 AM
horizontal rule
117

Everyone is aware, I think but want to check, that the government has actively been trying for the same power in many cases, not just this big one - the link in 69 was about a drug dealer's iPhone.


Posted by: Minivet | Link to this comment | 03- 1-16 9:38 AM
horizontal rule
118

107 -- well, to what extent 107 is true as applied to very specific facts -- or, really, how materially important it is -- is precisely what's at issue in this case. You could, theoretically, have a security breach that allows the government to search otherwise-inaccessible data with only a very small risk of broader public harm, or one with a huge risk. In this particular case, much of the government's argument is that 107 is simply not true here. That is, it says the search it's asking for has no real risk of causing negative consequences to the general public. The government might be wrong about that, but that's the ground they're fighting on and the issue isn't simply being ignored.

101 was indeed the kind of thing I was trying to get at. On "conscription," I mean, I'm not the government, but I signed and issued myself (as a lawyer in a civil case) a records and testimony subpoena yesterday on a third party big company that will require a few of their employees to spend potentially several weeks, and considerable money, and engage in considerably technical searches for documents, simply because the evidence is significantly relevant to a civil case in which this company is not a party. That's not quite the *same* thing as what the government is asking Apple to do but it is most definitely forcing a third party company to spend substantial time and money and uncompensated labor gathering evidence for my client. And this is in a purely private civil case where only money is at issue between the parties. The government can and does do much more.


Posted by: R Tigre | Link to this comment | 03- 1-16 10:01 AM
horizontal rule
119

114 -- "corporations" was meant to be illustrative of the risks involved, but, sure, the same principle should apply to any private party. I don't think we want a world in which there is an absolute right of private parties to create encrypting technology which prevents governments or courts from accessing information for any purpose at any time in any circumstance. At the same time there are obvious reasons to not want the government to inadvertently facilitate widespread private-party theft of private information by, in fact, "breaking" security systems. Balancing those two concerns is a classic kind of regulatory issue -- there are better and worse ways to do it, but there are competing legitimate concerns on both sides and somebody needs to balance them based on the specific facts of any given case.


Posted by: R Tigre | Link to this comment | 03- 1-16 10:10 AM
horizontal rule
120

I don't think we want a world in which there is an absolute right of private parties to create encrypting technology which prevents governments or courts from accessing information for any purpose at any time in any circumstance.

This is... not what "unbreakable encryption" means, nor has anyone ever suggested such a thing. (Not least because if it existed it would be impossible for the person who encrypted a thing to ever decrypt it.)

What exactly do you think encryption *is*?


Posted by: Josh | Link to this comment | 03- 1-16 10:17 AM
horizontal rule
121

120: Insert "... without the cooperation of the data's owner" as read, and I think that is many forms of encryption as it exists today. My safe can be drilled open without my cooperation, but decrypting the disk image that I use for "secret stuff" without my cooperation requires (a) guessing my key, hopefully I've made that hard-to-impossible (b) breaking AES-256, impossible as far as we know (c) finding some technical flaw in the implementation, more likely than the other two but not at all guaranteed or possible to do in bounded time.

For many legal purposes I can be compelled to reveal that key and thus the data, which is probably the most important legal means of access, but if I'm dead that data is just gone forever.
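
Concretely, the usual construction looks something like this toy Python sketch (using the pypi cryptography package; a generic illustration, not what my disk-image tool or Apple actually ships). The key only ever exists as a function of the passphrase, so there's nothing to seize except me:

    import os, hashlib
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM  # pip install cryptography

    def encrypt(passphrase: bytes, plaintext: bytes):
        salt = os.urandom(16)
        # A slow KDF turns the passphrase into a 256-bit AES key and makes guessing expensive
        key = hashlib.pbkdf2_hmac("sha256", passphrase, salt, 200_000)
        nonce = os.urandom(12)
        return salt, nonce, AESGCM(key).encrypt(nonce, plaintext, None)

    def decrypt(passphrase: bytes, salt: bytes, nonce: bytes, ciphertext: bytes) -> bytes:
        key = hashlib.pbkdf2_hmac("sha256", passphrase, salt, 200_000)
        return AESGCM(key).decrypt(nonce, ciphertext, None)  # wrong passphrase -> exception, not data

    salt, nonce, blob = encrypt(b"correct horse battery staple", b"secret stuff")
    assert decrypt(b"correct horse battery staple", salt, nonce, blob) == b"secret stuff"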


Posted by: Nathan Williams | Link to this comment | 03- 1-16 10:25 AM
horizontal rule
122

I really wish Sifu had stayed in this thread instead of just that one drive-by, because he knows a lot about this subject and a hell of a lot more than me, but at least I didn't spend 20 years obsessively reading the cipherpunks email list and Bruce Schneier for nothing.

The private might be criminal therefore nothing should be irreversibly secret.

This is nonsense.

There are only two kinds of cryptography: those that will keep your kid sister from reading your diary and those that will keep major governments and their alphabet agencies from reading your files (paraphrasing Schneier here from his Applied Cryptography).

In truth nothing is irreversibly secret, since attacks only get better. This is axiomatic. Hardware also gets faster and more capable, which means that attacks only get better and faster. What may be reasonably secure today will be broken tomorrow or next week. If we're lucky, what now appears to be secure until the heat death of the universe may last a hundred or so years.
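
(For scale, a back-of-the-envelope in Python, nothing more: raw exhaustive search is never what does it, even at a guessing rate nobody has; the breaks come from better cryptanalysis and broken implementations.)

    # Back-of-the-envelope: at an absurdly generous 10**18 guesses per second,
    # searching half of a 256-bit keyspace still takes on the order of 10**51 years.
    seconds = 2**255 / 10**18
    print(f"{seconds / (60 * 60 * 24 * 365):.1e} years")   # ~1.8e+51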

The practical consequence of this is that if you want to protect your ecommerce session from Chinese hackers, or your government OPM files from being hacked by Chinese hackers, you need the kind of encryption that will keep major governments out. Basically if you want a modern electronically connected global economy you need the kind of encryption that will keep major governments out.

Now that we seem to be heading inexorably towards the internet of things we'll need unbreakable cryptography even more. Unless we want our internet-capable thermostats, or pacemakers, or whatnot to be hacked.

All this aside, I think that the breakability of strong publicly available encryption is not immediately relevant to this case. The FBI is asking Apple to do something that will irreversibly damage one of the major security tools available to us. One that solves a major security problem that took a long time to solve and that depends upon trust. And for what? Even if there were actionable information to be gleaned from that phone (and c'mon, really?), the social costs are far too high.


Posted by: Barry Freed | Link to this comment | 03- 1-16 10:27 AM
horizontal rule
123

119

I don't think we want a world in which there is an absolute right of private parties to create encrypting technology which prevents governments or courts from accessing information for any purpose at any time in any circumstance.

That's the underlying issue, really. A lot of people think that that is exactly the world we should have, rather than one in which governments (not just our own) have an absolute right to access any piece of information for any purpose at any time.

The FBI must believe that a lot of people do want a world where there is a right to strong encryption, given the effort they have put into saying "No, no, this is a minor case and has nothing to do with enabling the panopticon."


Posted by: DaveLMA | Link to this comment | 03- 1-16 10:27 AM
horizontal rule
124

...but at least I didn't spend 20 years obsessively reading the cipherpunks email list...

Apparently 20 years was not long enough to know it's spelled "cypherpunks". *sigh*


Posted by: Barry Freed | Link to this comment | 03- 1-16 10:29 AM
horizontal rule
125

119: isn't that exactly the right claimed by Phil Zimmermann and the other crypto libertarians of the nineties? The whole point about PGP was that the government simply could not access the information thus encrypted without the right key, and that anyone -- terrorist, child abuser, banker, hobbyist -- could access the technology, because the alternative would be government tyranny. Whether they were right or wrong, that is the world we live in now.

Of course most people can't be bothered with strong encryption, but if you really need to hide something it's available.


Posted by: Mustapha Mond | Link to this comment | 03- 1-16 10:34 AM
horizontal rule
126

Hey, can't the FBI just use ResEdit? That's what I would do.


Posted by: Barry Freed | Link to this comment | 03- 1-16 10:37 AM
horizontal rule
127

121 -- yes, that's exactly right; "without the explicit key being provided by the owner" is the key phrase and gets at the issue exactly. To be clear, it may well be that on net it's necessary for some (otherwise 100% validly accessible) information to be permanently lost to the government and the courts when its owner can't provide a key, to protect information privacy more broadly for the general public. But in some cases that might not be necessary, and you need someone to look at specific facts and balance the harms.

To 122, again, the government in these cases 100% acknowledges this now-standard argument as a theoretical possibility, but says it doesn't apply w/r/t what it is asking Apple to do here. The government could well be wrong about that. But they aren't ignoring the issue or pretending like it doesn't exist. They are just saying that on the specific facts at issue the balance of harms favors their request to Apple, and that the parade of horribles about generally undermining useful encryption is not, in fact, applicable and won't, in fact, happen. So general cypherpunk bromides or whatever aren't likely to be helpful.


Posted by: RT | Link to this comment | 03- 1-16 10:55 AM
horizontal rule
128

They are just saying that on the specific facts at issue the balance of harms favors their request to Apple, and that the parade of horribles about generally undermining useful encryption is not, in fact, applicable and won't, in fact, happen.

I honestly didn't think you were this naive.


Posted by: Josh | Link to this comment | 03- 1-16 11:12 AM
horizontal rule
129

Here is the real question: is what the FBI is actually asking Apple to do IN FACT likely to lead to increased widespread illegitimate information theft by private parties, or not? The government says no, Apple says yes, both sides have strong incentives to shade the truth in their favor, and knowledgeable technical people (as far as I can tell, which isn't very far) appear to be taking both sides. Whatever prior pro- or anti- Apple or pro- or anti-FBI bias you have is not going to give you the answer to that question. If the answer is "no," and the burden on Apple not too great, then the government should probably get the information.* If the answer is "yes," the government shouldn't get to order Apple to do this.

* this ignores some interesting statutory and First Amendment arguments that Apple has, which may be right or wrong, but our discussion has also largely ignored them.


Posted by: R Tigre | Link to this comment | 03- 1-16 11:33 AM
horizontal rule
130

both sides have strong incentives to shade the truth in their favor

I don't understand this claim. We know that Apple has cooperated in a significant number of other cases. So Apple does not perceive their market position to be undermined by cooperating with government efforts to recover data from phones. So if this case is really no big deal, why are they claiming otherwise? What's their incentive? Jobs I could imagine taking a pure Fuck You stand, but that's not Cook's style. The manpower ask is IMO significant, but not significant enough to justify this battle on its own.

So what story are you telling? What is Apple's incentive to shade here? I mean, any legal filing pushes and shades, but if the underlying claim (that this will "break cryptography", for want of a better term) is false, why are they making it at all?


Posted by: JRoth | Link to this comment | 03- 1-16 12:23 PM
horizontal rule
131

130 -- they get a large PR boost as an American company, particularly abroad in Europe and Asia and countries where the NSA scandal was a big deal. They (maybe) avoid having to do the same thing in some other countries (though a truly nefarious country will make them do it anyway, and be silent about it). They boost their own reputation for security. And, since their business model is basically to own and license hardware and software on devices that they want to be (and largely are) ubiquitous internationally, they increase their own reputation for truly internationally-safe products, avoid the risk of ubiquitous government cooperation, and disassociate their brand from the United States government. That's just off the top of my head. They probably have pure cost-based reasons to resist, too, especially if they anticipate 100s of similar requests a year. Compliance with subpoenas is a huge pain in the ass that does absolutely nothing for a company's bottom line.


Posted by: R Tigre | Link to this comment | 03- 1-16 12:34 PM
horizontal rule
132

130 -- It's also possible that Apple believes that if they give in once on anything, they will have to give in -- or fight -- 100s of times a year on every case that any agency brings.
The kind of case-by-case reasoning between different principles that Tigre advocates is not only repugnant to some temperaments but time-consuming and expensive for the companies involved.


Posted by: Mustapha Mond | Link to this comment | 03- 1-16 12:42 PM
horizontal rule
133

Yes, that's probably right. Also don't discount the fact that they have good and creative litigators who probably sold them on their ability to push new statutory and constitutional arguments. Given the political climate they probably figured the risk/reward was worth pushing back and trying to fight on the law a bit. Sometimes companies make these kinds of decisions, which is excellent news for people like me (and I wish, specifically, me, but sadly not often) when it happens.


Posted by: R Tigre | Link to this comment | 03- 1-16 12:46 PM
horizontal rule
134

||

For Tigre -- this seems like his kind of thing

[T]his is also, roughly, the thesis of Ghettoside, Jill Leovy's 2015 book about homicide in LA's predominantly black southern neighborhoods. Ghettoside was met with hosannas when it was published in January of last year, in part because it made a remarkable case for the roots of endemic urban violence. The trouble, Leovy argued, was not that poor black people lacked discipline or restraint or civility, as conservatives have long alleged, but it was not so simple as overbearing, brutal police practices either.

Rather, Ghettoside went, the unrest south of the 10 was the consequence of both over- and underpolicing: cops too quick to earn distrust by harassing citizens for minor crimes and too slow to take the big cases seriously, barely investigating murders and assaults that could not be immediately solved.

...

Leovy's theory of violence may be newly popular, but reading Eternity Street it's clear that the Los Angeles Leovy saw was not new. If anything, it was one of the last vestiges of the old Los Angeles, the way the whole city was for a long time. Failures of authority, retribution, sublegal systems of law: These were not some strange features of particular neighborhoods in a particular time, but rather something carried forward from the oldest strands of Los Angeles DNA.

If nothing else, Eternity Street adds context to Ghettoside, an understanding of continuity that's vital to the discussions Leovy has inspired. It is our good fortune that these two books came out so close to one another, that they independently came upon similar theories of Los Angeles, that they speak to the centuries before and after the 20th, allowing a city with a history and a future beyond periods of Hollywood -- as troubling as that future and that history may be.

|>


Posted by: NickS | Link to this comment | 03- 1-16 12:56 PM
horizontal rule
135

Thanks. Ghettoside is great and I would recommend it to literally anyone with any interest in police issues, anywhere.


Posted by: R Tigre | Link to this comment | 03- 1-16 1:01 PM
horizontal rule
136

Thanks. I take your recommendation seriously -- I know that you're interested in the issue.


Posted by: NickS | Link to this comment | 03- 1-16 1:07 PM
horizontal rule
137

131: But they also take a PR hit as a company that refuses to fight terrorists. And there's no PR cost to cooperating, because unless they refuse, this case just falls into the broad category of "helping the FBI", which, as I've said, we know they've done hundreds of times before.

I take your point that there's upside to the battle, but I think the track record shows that Apple's default response to the FBI is yes, not no. So there are presumably objective facts pushing in the direction of this fight. Perhaps you're right that it's the burden, but Apple's team deems security the higher moral ground.

Probably worth noting here that DoJ completely fucked over Apple on the e-book ruling, including forcing Apple to pay ridiculous sums for an oversight guy who's basically gotten the ultimate sinecure. So there's probably an institutional resistance to being asked to do large amounts of free work for the gov't.


Posted by: JRoth | Link to this comment | 03- 1-16 1:16 PM
horizontal rule
138

Probably worth noting here that DoJ completely fucked over Apple on the e-book ruling, including forcing Apple to pay ridiculous sums for an oversight guy who's basically gotten the ultimate sinecure. So there's probably an institutional resistance to being asked to do large amounts of free work for the gov't.

I stupidly hadn't thought of that despite once being (very, very peripherally) involved in that case. But of course you're right that their in-house lawyers truly hate the DOJ in general these days.


Posted by: R Tigre | Link to this comment | 03- 1-16 1:19 PM
horizontal rule
139

138: I don't think I've seen it mentioned anywhere, I was just sort of spinning out the implications of each case taking close to an engineer-year to complete, and the FBI bringing them a hundred (or more) phones a year. They'd basically have a dedicated in-house iPhone-cracking division.

Probably not every crack would be unique, but I am skeptical about the various ways people want to downplay the FBI's ask. One of the cans of worms I think this opens is basically requiring Apple to implement any workaround that anyone can conceive of; that is, this case is clearly a big ask, technically speaking, so I'm not sure why anything that's remotely possible would be off-limits. And I tend to think that would slide towards things that Apple doesn't even think are possible, but could conceivably be. Why not ask them to throw a few thousand hours at the problem?


Posted by: JRoth | Link to this comment | 03- 1-16 1:39 PM
horizontal rule
140

so I'm not sure why anything that's remotely possible would be off-limits.

Well, under totally standard subpoena law, it's a perfectly valid objection for Apple to say "this is a ridiculous and undue and unnecessary burden," and set out evidence explaining why. But that's a very inexact standard, and there's considerable uncertainty about when a burden becomes "unreasonable." And of course things that might look manifestly unreasonable for a small company or an individual become harder to sell as unreasonable when you're the most profitable company in the world.


Posted by: R Tigre | Link to this comment | 03- 1-16 1:44 PM
horizontal rule
141

140: Right, and I think that's precisely the worry. It would be insane for the FBI to demand 1600 hours* of uncompensated labor from me, but it's so easy to say that Apple is a huge company with many employees. But which employees can do this work? I have only an outsider's sense of these things, but is this scutwork that any aspiring programmer can do? Even if it is on a technical level (which I tend to doubt), do they really give someone like that access to what is pretty much their most valuable intellectual property? So do they now have to pull some of their best employees off of real work to do this work? That means other work is being done by inferior employees, ones who can't do or can't be trusted to do this work. Does Apple literally have to hire top notch programmers and engineers just to be dedicated to this work? Is that reasonable simply because they're wealthy?

I get that the judges will in theory be balancing these questions, and also that it's probably not the core issue, but it also makes me very uncomfortable.

BTW, Comey did, indeed, testify today that of course they'll come back and ask Apple to open more devices. Shocker.

*actually, the amount quoted above is 10 people for 4 weeks; is that 4 weeks at crazy Silicon Valley hours, or is that 8 hours a day because screw the government, or is it 8 hours a day doing FBI work and 8 hours doing Apple work?


Posted by: JRoth | Link to this comment | 03- 1-16 2:33 PM
horizontal rule
142

Apple probably has -- maybe 5? -- people right now who do almost nothing but work on subpoenas, and probably 20-30 total who do nothing but work on document requests of various kinds for court actions in which Apple is a party. Including certainly some very well-trained IT people, and, for them, maybe a real serious products engineer or two. Mostly civil requests, not criminal, but the same group would probably deal with both. That'd be a different group than whoever would be doing whatever is asked for on the iPhone operating system. More or less pulling a semi-informed number out of my butt, I'd expect that Apple already spends hundreds of thousands of dollars per year right now exclusively complying with third-party subpoenas, and multiple millions on document production in first-party litigation. Those aren't trivial costs but they are a big company.


Posted by: R Tigre | Link to this comment | 03- 1-16 2:51 PM
horizontal rule
143

The Horrible Example is BlackBerry, who actually did give in to one blackmailer and ever since have had to constantly fight demands from jurisdictions ranging from iffy to flagrantly tyrannous.


Posted by: Alex | Link to this comment | 03- 1-16 3:12 PM
horizontal rule
144

Are there any programmers in this thread who think that it's possible to create a backdoor that can be used for government warrants/subpoenas without also severely damaging the security of the product?

I don't believe that I know any programmers who have any experience with security who believe it's possible to create a backdoor without compromising security. I'm sure that some exist somewhere, but I've never met one and if you are one, I'd be curious to hear what the actual proposed mechanism is.


Posted by: sral | Link to this comment | 03- 1-16 4:09 PM
horizontal rule
145

I'd be interested in hearing the answer to 144.

But, also, is what the government is asking for in this case a "backdoor"? The government says explicitly in its papers that it is not asking for a "backdoor" and that what it is requesting does not require Apple to create or apply a backdoor applicable to all iPhones. That's not a rhetorical question -- if it is in fact a "backdoor," that would be interesting to know.


Posted by: R Tigre | Link to this comment | 03- 1-16 4:17 PM
horizontal rule
146

You know, in retrospect, the OP here was incredibly strong, and no one has really dug into the arguments. I don't know if there's anything lamer than reposting the OP, but this all seems interesting and un-addressed:

There is also one fact: strong encryption is unbreakable and widely available. The security services lost that fight for good in the 1990s. So there are some things that Apple simply cannot help the FBI with, even should they want to do so. The question then becomes, why shouldn't they do what they undoubtedly can?
So far as I understand it, three arguments have been put forward why they shouldn't. All seem variants of the slippery slope, though one is technical, one legal, and one geopolitical.
1) The technical argument is that you can't make the necessary fbiOS so that it works on only one phone. If it works on the phone of interest, then it will work on all others of the same model. I don't know if this is in fact true: if the phone remains in Apple's physical custody it's hard to see how the fbiOS could be copied off it by the authorities for further, unauthorised use. But let's assume that the program does escape from this particular phone and is available to the security services. The technical version of the slippery slope argument is that they would then deploy it without any legal constraints.
1 b) Technico-legal: if Apple can be compelled by a court order to help get at the contents of this particular model of phone, then subsequent court orders might compel them to co-operate with any other model of phone they make. This looks like a hybrid technico-legal slope: older iPhones can be cracked without difficulty by Apple and it's clear this used to happen on a routine basis. Post-Snowden Apple made the decision to build their phones so it would be much harder to crack them (see encryption point above) and the company could presumably in the future make them proof against fbiOS-type upgrades. Of course that would make it impossible to fix bugs in the firmware, too, but that's a tradeoff the company and its customers might think worthwhile.


So, the real question, as I understand it, is "would whatever Apple does with an fbiOS in fact escape and be available to people who want to hack into iPhones in non-judicially regulated situations, or not?" The OP, if I understand it right, seems to contemplate the risk as being that the government would get improper control over the fbiOS. That seems extremely unlikely to me -- whatever Apple did wouldn't be used by the FBI without Apple's permission and without court supervision in other cases. If it was used, it would only be used to search phones the government has a legal right to search anyway. The real threat would be the thing leaking out into private hacker hands, not US government hands, but the (presumably informed) OP author doesn't seem to think this is a threat or a reasonable possibility. What are the facts?


Posted by: R Tigre | Link to this comment | 03- 1-16 4:35 PM
horizontal rule
147

144:

Paradigmatically, a backdoor in a system provides some form of access that isn't subject to normal authentication procedures. That's not quite what Apple is being asked to create: fbiOS would weaken the authentication capabilities of the system so as to make it possible for the FBI to force their way through security barriers, which isn't the same thing as providing a means of bypassing those barriers altogether.

But if the legal issue comes down to balancing law enforcement needs against possible damage to the security interests of uninvolved iOS users, then the difference between creating a backdoor and making a system brute forceable where it previously wasn't doesn't seem of much importance.
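
To see why "merely brute forceable" is cold comfort, here's a toy loop (Python, nothing to do with the actual iOS code, with a deliberately cheap stand-in for the device's passcode check): once the retry delays and the ten-guess wipe are gone, a 4-digit passcode is at most 10,000 guesses.

    import hashlib, itertools, os, time

    def slow_check(guess: str, salt: bytes, stored: bytes) -> bool:
        # Stand-in for the device's passcode check; the iteration count just makes each guess cost something
        return hashlib.pbkdf2_hmac("sha256", guess.encode(), salt, 50_000) == stored

    salt = os.urandom(16)
    stored = hashlib.pbkdf2_hmac("sha256", b"7297", salt, 50_000)    # the passcode we "don't know"

    start = time.time()
    for digits in itertools.product("0123456789", repeat=4):        # at most 10,000 candidates
        guess = "".join(digits)
        if slow_check(guess, salt, stored):
            print(f"recovered {guess} after {time.time() - start:.0f} seconds")
            break

With the escalating delays and the erase-after-ten option in place, that same loop is useless -- which is exactly what the order asks Apple to switch off.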


Posted by: lambchop | Link to this comment | 03- 1-16 5:04 PM
horizontal rule
148

Thanks. Is there a difference in how likely the fbiOS would be to escape to parties outside of Apple? Obviously, the FBI or law enforcement could ask Apple to use it again (as long as they had a legitimate need to search a phone and the use of the program was necessary to do the search). But what would be the impact of this thing on users whose phones are not subject to criminal subpoenas by the federal government?


Posted by: R Tigre | Link to this comment | 03- 1-16 5:30 PM
horizontal rule
149

144:
the backdoor already exists: as long as the software update being loaded onto the phone is (cryptographically) signed by Apple, it will be accepted. (I believe more recent iPhones only allow software updates if the user unlocks them.)

To me the scary part is the misuse of the signing key. I see it as a form of coerced lying. To me the next step on the slippery slope is that Apple is asked to include some government-supplied trojan software in the next update of, say, iMessage that is sent to user 'BA'. When I click "update" I trust
(1) that the update summary is accurate and when it says "fixes numerous security vulnerabilities" that this is what the update is about; and
(2) that the phone verified (the cryptographic signature) that the software is genuinely from Apple.
In this example both of these trusts are violated. In the present case, the FBI is only asking for (2). But I don't see a big difference between (1) and (2).
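
The mechanics of (2) are nothing more exotic than this kind of check -- a schematic Python sketch using the pypi cryptography package's Ed25519 API, not Apple's actual signing chain, which is more elaborate but has the same trust structure. The device holds only the public key and installs whatever verifies:

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    # Vendor side: the signing key never leaves the vendor
    signing_key = Ed25519PrivateKey.generate()
    update_image = b"...firmware bytes, benign or otherwise..."
    signature = signing_key.sign(update_image)

    # Device side: it ships with only the public key, and the signature is its entire basis for trust
    public_key = signing_key.public_key()
    try:
        public_key.verify(signature, update_image)
        print("signature valid -- install proceeds, no matter what the image actually does")
    except InvalidSignature:
        print("rejected")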


Posted by: BA | Link to this comment | 03- 1-16 5:58 PM
horizontal rule
150

149: One thing I don't understand is, why is making the actual fbiOS expected to take a large amount of effort?

I would have guessed that all they need to do is make a new version that removes a single check, and then sign it. Maybe they have a qualification process for releases that takes a month, but if they skip that (and why not?), what's the actual work involved that's going to take 40 person-weeks? I must be missing something obvious here because everyone seems to think it's obvious that it should be a lot of effort to create this program.
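
For what it's worth, the naive picture in my head is a guard roughly like this (a made-up Python cartoon, obviously not the iOS source), which is why "delete a few lines, add a cable input path, re-sign" sounds like days rather than weeks to me:

    import time

    MAX_FAILURES = 10                                    # the user-enabled "erase after 10 tries" option
    DELAYS = {5: 60, 6: 300, 7: 900, 8: 900, 9: 3600}    # escalating lockouts in seconds (made-up numbers)

    def wipe_class_keys() -> None:
        print("keys destroyed")                          # stand-in for discarding the data-protection keys

    def on_failed_attempt(state: dict) -> None:
        state["failures"] += 1
        if state["erase_on_failure"] and state["failures"] >= MAX_FAILURES:
            wipe_class_keys()                            # data becomes unrecoverable by anyone, forever
        else:
            time.sleep(DELAYS.get(state["failures"], 0)) # no delay for the first few failures

    state = {"failures": 9, "erase_on_failure": True}
    on_failed_attempt(state)                             # the tenth failure wipes the keys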


Posted by: sral | Link to this comment | 03- 1-16 6:53 PM
horizontal rule
151

150: That would be my guess. For Apple, a bug that affects 1% of users is a disaster. So, if they dispense with their usual carefulness, I'd assume that it could be done much faster.


Posted by: BA | Link to this comment | 03- 1-16 7:07 PM
horizontal rule
152

148: You'll get varying estimates of this, but my guess is that there's a good chance the program will leak out -- if not publicly, at least to some third party.

Microsoft certainly didn't want the Windows source code to leak out, but it has. My understanding is that there are even multiple different versions floating around in public. In 2009 (ish?), some hackers compromised a large number of major American companies, including Google, which has the best security practices of any large company I've seen.

The thing I don't understand from 150 is, shouldn't this also apply to the key? I'd expect that an attacker who can compromise Apple and obtain the key they use to sign updates can also produce a slightly modified version of a binary with one check removed and then sign it. The latter seems trivial compared to the former.

I don't think there's a good solution to this problem. In the pre-automatic updates world, most people ran out of date software with known vulnerabilities that could be exploited remotely. If you require consumers to sign off on updates with a PIN, a large fraction simply won't do it and they'll run out of date and easily compromised software. But if you don't, anyone with the appropriate key can push a fake update.


Posted by: sral | Link to this comment | 03- 1-16 7:10 PM
horizontal rule
153

If I'm reading 149 correctly, the change only matters (or would weaken other people's security) if Apple made (or encouraged) users to update to a new version of their software which contained the new security flaw. But, other than the government maybe wanting this to happen in the future as it slides down the slippery slope, is there any evidence that Apple actually would do this, or would have to do it?


Posted by: R Tigre | Link to this comment | 03- 1-16 7:13 PM
horizontal rule
154

152 -- thanks.


Posted by: R Tigre | Link to this comment | 03- 1-16 7:14 PM
horizontal rule
155

How likely is it that, once it exists, NSA and its ilk do not get ahold of it by other means - whether by a secret FISA order or completely extralegally?


Posted by: Minivet | Link to this comment | 03- 1-16 7:19 PM
horizontal rule
156

150, 151: Obviously I'm no expert, but a couple things occur to me:

1. This is a complete iOS replacement. Even though it's only changing a couple things (not just one: it's allowing unlimited attempts, removing timeouts between attempts, and allowing input via cable. The last one is potentially a big deal, I don't know, but it's a far sight different from changing the value of the "timeout=X" line), every other aspect of the OS needs to work. Big chunks can be ignored, but I doubt the code is so well wrought that it's a simple matter to go in, change a few lines, and be done*.

2. The FBI will be really, really pissed if they brick the phone. Indeed, given the situation, I'm 100% certain the FBI would accuse Apple of sabotaging the process. So that 1% failure rate is actually too high here. You don't necessarily need as much testing as for a gold master public release, but this isn't hobbyist messing around, either.

3. They've never done anything like this. Deconstructing a wall isn't the exact same skillset as building one. Nobody involved is going to feel 100% confident/comfortable.

I'm not saying the 1600 hour estimate is clearly right, I'm just saying that I don't think it's obviously way wrong. 25% heavy, sure, maybe. But is it really 1 manager, 1 engineer, and 2 code monkeys for a week? I doubt it.

*example: for a number of functions, you don't need the passcode to get into the phone at all (emergency call, taking pictures, receiving calls). But then the passcode is triggered if you rummage around from there. Presumably an ideally-written OS would just call back to a single module, but A. who knows if it's that well-written, and B. you still have to check.


Posted by: JRoth | Link to this comment | 03- 1-16 7:43 PM
horizontal rule
157

I'm not saying the 1600 hour estimate is clearly right, I'm just saying that I don't think it's obviously way wrong. 25% heavy, sure, maybe.

Not that it's particularly relevant to the larger argument, but since everyone seems fixated on this 40 person-week estimate, that's not what Apple said in its pleading. Here's what they said:

Although it is difficult to estimate, because it has never been done before, the design, creation, validation, and deployment of the software likely would necessitate six to ten Apple engineers and employees dedicating a very substantial portion of their time for a minimum of two weeks, and likely as many as four weeks.

So we're talking something between 12 and 40 person-weeks (really less than that, because it's a "substantial portion" of their time, not all of it, but whatever). Even by Apple's own reckoning, the 1600 hour figure could be a good deal more than 25% heavy.


Posted by: potchkeh | Link to this comment | 03- 1-16 8:06 PM
horizontal rule
158

Hesitantly delurking: Did I miss it, or has no one yet linked Apple's motion to vacate the judge's order compelling them to write fbiOS?

IANAL, but the judge apparently issued the order ex parte, without giving Apple any chance to argue.


Posted by: spork | Link to this comment | 03- 1-16 8:12 PM
horizontal rule
159

Also: the judge denied the FBI's motion yesterday.


Posted by: spork | Link to this comment | 03- 1-16 8:14 PM
horizontal rule
160

157: Ah, OK, thanks. All I'd seen was the 10x4 quoted upthread. Anyway, even their lowest estimate is 3X the sort of minimal effort contemplated in 149/150.


Posted by: JRoth | Link to this comment | 03- 1-16 8:19 PM
horizontal rule
161

158: Yes, the judge issued the order ex parte, but also expressly provided Apple an opportunity to apply for relief if it believed the order was unnecessarily burdensome. IANACriminalL but from what I understand there's nothing procedurally unusual about that (and aside from some innuendo, there's nothing in Apple's brief suggesting that the ex parte nature of the order rendered it improper, and you'd certainly expect them to raise it if it did).

159: Different judge, and pwned by 69.


Posted by: potchkeh | Link to this comment | 03- 1-16 8:22 PM
horizontal rule
162

161.1: "And more importantly, by invoking 'terrorism' and moving ex parte behind closed courtroom doors, the government sought to cut off debate and circumvent thoughtful analysis."

(The whole document makes for some good reading.)

161.2: Ah, whoops.


Posted by: spork | Link to this comment | 03- 1-16 8:30 PM
horizontal rule
163

162.1: sure, but that's just color commentary, it's not a legal argument.


Posted by: potchkeh | Link to this comment | 03- 1-16 8:36 PM
horizontal rule
164

I don't know much about cryptography, but Diffie and Hellman just won the Turing award.


Posted by: fake accent | Link to this comment | 03- 1-16 11:37 PM
horizontal rule
165

155 -- I'd say that's extremely unlikely to happen, except maybe to examine a specific phone held by a specific suspect who is already detained in custody, or something, and then only in connection with a court order. But the NSA isn't going to be taking this up to randomly unlock and surveil iPhone users.


Posted by: R Tigre | Link to this comment | 03- 1-16 11:52 PM
horizontal rule
166

The NSA pattern seems to be to unlock everything they can and claim they're only looking at targets. They'd prefer iPhones that can't be locked, rather than have to pick ones to unlock.


Posted by: fake accent | Link to this comment | 03- 2-16 12:48 AM
horizontal rule
167

I don't think the NSA is strictly relevant to this argument. Neither the legal restraints on the FBI nor some of the technical ones seem to bind the NSA. But we will never know.


Posted by: Mustapha Mond | Link to this comment | 03- 2-16 1:34 AM
horizontal rule
168

149 is a very clear statement of the practical drawbacks for consumers.


Posted by: Mustapha Mond | Link to this comment | 03- 2-16 1:35 AM
horizontal rule
169

To me the scary part is the misuse of the signing key. I see it as a form of coerced lying. To me the next step on the slippery slope is that Apple is asked to include some government-supplied trojan software in the next update of, say, iMessage that is sent to user 'BA'. When I click "update" I trust
(1) that the update summary is accurate and when it says "fixes numerous security vulnerabilities" that this is what the update is about; and
(2) that the phone verified (the cryptographic signature) that the software is genuinely from Apple.
In this example both of these trusts are violated. In the present case, the FBI is only asking for (2). But I don't see a big difference between (1) and (2).

It should be noted that this is exactly what GCHQ did (though not with Apple-owned software, as far as I know), according to various Snowden leaks. And not just against criminals, but for instance to infiltrate the communications infrastructure of allies.


Posted by: Ginger Yellow | Link to this comment | 03- 2-16 6:25 AM
horizontal rule
170

I will bet dollars to donuts that when people started to record their thoughts using tape recorders, instead of stenographers, some hipster said the cops have a right and need to look at my steno transcriptions but not the dicto tapes cause [insert mystical nonsense here] the tapes have some magical privacy associated with them.

One can only pray that 10 years from now, when this is as old as arguments about how to redo the limestone cover on the great pyramid, the silly fools who side with Apple will be at least a little embarrassed.


Posted by: Ezra Abrams | Link to this comment | 03- 4-16 4:00 PM
horizontal rule
171

This thread is dead, but:

I'd rather have (good, cake) donuts than dollars, but here's a hearty fuck you to whoever Ezra Abrams is. There's nothing remotely noble about sneering at privacy as a value. Ten years from now, apologists for the FBI will be as revered as McCarthyites. Who knows--maybe Exra thinks those guys were awesome, too.


Posted by: JRoth | Link to this comment | 03-16-16 9:30 PM
horizontal rule
172

Goddammit, -x +z


Posted by: JRoth | Link to this comment | 03-16-16 9:31 PM
horizontal rule