Apple vs. U.S. government: a big dilemma

Most of you know about this already: the FBI, investigating the December murders of 14 people in San Bernardino by two Muslim extremists, Syed Farook and Tashfeen Malik, has asked Apple to unlock Farook’s iPhone 5c, which may contain clues about the murders or other terrorists. A federal judge ordered Apple to create the software necessary to unlock that phone. Apple, however, is resisting on the grounds that this might compromise the privacy of its customers. It also argues that this would give the government sweeping powers to conscript technology companies into the business of fighting crime.

Current law says that firms or institutions must in general comply with such “unlocking” orders unless they pose an onerous burden on the company. And, it could be argued, asking Apple to create new software to unlock a phone (it would have to try gazillions of passcodes) could be seen as onerous. But Apple’s arguments are really intended to reassure its customers that it cares deeply about their privacy—and that’s important to iPhone users.

The New York Times has a brief explanation of the situation. What the FBI wants Apple to do is create code to bypass a feature that, after ten failed attempts to enter a phone’s passcode, erases all of the phone’s data. Only Apple can do this, since an iPhone will accept new system software only if it carries Apple’s cryptographic signature, a “tag” known only to the company.
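
For readers who want a concrete picture of the protection at issue, here is a minimal sketch, in Python rather than anything Apple actually ships, of how a wipe-after-ten-failures passcode guard behaves; the escalating delay schedule is an illustrative assumption, not Apple’s real one. Stripping the erase and the delays out of software the phone will accept is, in essence, what the court order asks for.

```python
import time

class PasscodeGuard:
    """Toy model of the protection at issue (illustration only, not Apple's code)."""

    # Assumed delays, in seconds, imposed after the Nth failed attempt.
    DELAYS = {5: 60, 6: 300, 7: 900, 8: 900, 9: 3600}
    MAX_FAILURES = 10

    def __init__(self, passcode: str):
        self._passcode = passcode
        self._failures = 0
        self._wiped = False

    def try_unlock(self, guess: str) -> bool:
        if self._wiped:
            raise RuntimeError("device wiped: data is unrecoverable")
        if guess == self._passcode:
            self._failures = 0
            return True
        self._failures += 1
        if self._failures >= self.MAX_FAILURES:
            self._wiped = True  # the auto-erase feature the FBI wants bypassed
            raise RuntimeError("ten failures: data erased")
        time.sleep(self.DELAYS.get(self._failures, 0))  # escalating lockout
        return False
```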

The judge agreed that Apple could retain and later destroy the new unlocking software, and that the phone would be “hacked” by Apple in its own secure facility, although of course the information would go to the FBI.

This is a real dilemma. What is to be done? Apple has until February 26 to respond. And this is a question of ethics, somewhat analogous to the dilemma of whether to torture someone who has information that could lead to saving thousands of lives by revealing the location of a time bomb. I am of course sensible of the difference between torturing someone and creating software, and between people dying from a ticking bomb versus having their data compromised; but both dilemmas instantiate a weighing of relative harms.

This, I think, shows the problem of arguing for an “objective” morality. On one hand we have the possible (but not certain) revelation of data about terrorist networks, with the “well being” being the possible saving of lives. On the other we have the creation of a precedent that could allow the government to act intrusively, on the merest excuse, to get people’s private data. The “well being” here is the protection of people’s private information, and the avoidance of a precedent that could be misused. Now how on earth can we weigh these different forms of “well being” against each other, even if we could know perfectly all of the consequences of both actions? (And, of course, we can’t.)

My own feeling is that Apple should comply with the government’s request, as this is tantamount to fulfilling a search warrant—with the exception that Apple has to create new software for the FBI. But there are ways to mitigate the harms of doing that. Apple could, as the court order allows, destroy the software so that nobody else can have it. It can extract the data without the government being present when it does so. And, in the future, Apple could, I’m told, even create an iPhone whose passcode could never be hacked by any software, something that seems perfectly legal.

This case is likely to go up to the Supreme Court, for Apple doesn’t want to be seen as compromising its interest in customer security.

I’m asking readers to weigh in below on this issue, as my own opinion, while leaning toward the government (after all, nobody is being tortured here), is susceptible to change. Are you on the side of Apple, or of the FBI?

592 Comments

  1. Frank Bath
    Posted February 20, 2016 at 12:38 pm | Permalink

    Sam Harris’s most recent podcast has sensible words to say on this. http://bit.ly/1oS5jCi Hope that works.

    • Scote
      Posted February 20, 2016 at 4:49 pm | Permalink

      Why did you use a link shortener? They make it hard to tell where a link is going without clicking on it, and they serve no purpose other than to track clicks, unless you are tweeting the link and need it to be short.

      The URL to the podcast “Meat Without Murder” (which includes a discussion of encryption) is:

      https://www.samharris.org/podcast/item/meat-without-murder

      The encryption monologue starts around 10 minutes in. Harris can’t even be bothered to sum up the Apple case accurately, falsely claiming,

      “Apple built the lock but they didn’t build the key. And now they are telling us that building the key would put us all at risk. And this is being spun as a defense of our Liberties…even at the end of a court, a fully responsible process, and a laborious one, that gets a court to sanction their search for information, people still believe the government has no right to search our smart phones.”

      Pretty impressive how few words it takes for Harris to get this so wrong. First, the passcode is the “key”, and Apple isn’t being asked to make a key, or to decode the passcode. Instead, the court has ordered Apple to write a custom, cryptographically signed version of its operating system that lacks the automatic lockout protections that keep anyone and everyone from brute-forcing the passcode on an iPhone. Brute-forcing short phone PINs with full electronic access is trivially easy for most PINs. And this security-busting tool would take all iPhones from being secure against access by the government, with or without warrants, to being insecure. This is an order to destroy the security of all iPhones.
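
      To put numbers on “trivially easy”, here is some back-of-the-envelope arithmetic. It assumes roughly 80 ms of key-derivation work per guess, a commonly cited ballpark rather than a measured figure, and shows why a numeric passcode falls quickly once the erase and the delays are gone and guesses can be submitted electronically.

      ```python
      # Worst-case time to brute-force a numeric passcode once the ten-try
      # erase and the escalating delays are disabled. 0.08 s per guess is an
      # assumed key-derivation cost, for illustration only.
      SECONDS_PER_GUESS = 0.08

      for digits in (4, 6):
          guesses = 10 ** digits
          hours = guesses * SECONDS_PER_GUESS / 3600
          print(f"{digits}-digit PIN: {guesses:>9,} guesses, ~{hours:.1f} h worst case")

      # 4-digit PIN:    10,000 guesses, ~0.2 h worst case
      # 6-digit PIN: 1,000,000 guesses, ~22.2 h worst case
      ```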

      Second, nobody, but nobody, is saying the government has no right to access this iPhone. The phone belongs to the suspect’s employer, a government health department. People are objecting to the government being able to force Apple to compromise the security of *all* of its phones by having to custom-create an insecure iOS that could be used (or modified to be used) on all iOS devices.

      Harris simply isn’t credible on this issue.

      • Diana MacPherson
        Posted February 20, 2016 at 4:59 pm | Permalink

        I also didn’t like Sam’s conclusion that no one should be allowed to have a room that no one can get into because terrorists could use this room too. I find that unconvincing given his stance on personal protection with firearms. Bad guys also use guns. Does that mean good guys shouldn’t have them? I think Sam would say no. If I want to protect myself in an impenetrable room, that’s probably a better way to ensure my safety than me using a gun.

        • Scote
          Posted February 20, 2016 at 5:14 pm | Permalink

          I think the issue of guns causes many gun-rights supporters to throw all logic and consistency out the window. When questions about guns are in play they are constitutional absolutists, but when it comes to terrorism, they seem to consider the Constitution optional, with no concession that the right to be free from unreasonable searches and seizures is also in the Bill of Rights.

          I would not be surprised to find that many of the people who support the FBI on this issue were against the FBI in the case of the Bundys’ armed occupation of government land. Many are against federal government power, unless *brown* terrorists are involved, in which case, the more power the better.

          • Jeff Ryan
            Posted February 20, 2016 at 5:53 pm | Permalink

            No, I think you’re wrong. Certainly every “leftie” I know was rooting for the FBI regarding the Bundys. If anything, many “lefties” thought the FBI acted too slowly.

            Given the result, though, I have come to agree that they handled it just right.

            • Scote
              Posted February 20, 2016 at 6:07 pm | Permalink

              I’m thinking of the many right wingers who support the FBI in this power grab, because *brown* terrorists, but opposed the FBI when armed *white* militants took over federal property. Their opinions on appropriate levels of government and FBI power swing wildly depending on who the government is investigating.

              • Jeff Ryan
                Posted February 20, 2016 at 6:21 pm | Permalink

                On this we are agreed.

      • Ken Kukec
        Posted February 20, 2016 at 5:29 pm | Permalink

        But, at least as I understand it, Apple will not be compelled to furnish the “custom, cryptographically signed version of [its] operating system” to the government. If Apple’s own possession of its creation itself puts its customers’ privacy at risk, then Apple can be compelled under penalty of criminal sanction to destroy its creation once the information has been obtained from the subject telephone.

        • Scote
          Posted February 20, 2016 at 5:44 pm | Permalink

          I don’t really see how that is possible given that the phone is evidence that will remain in the FBI’s custody. They’ll have the compiled code on the device itself, which they will have full access to once they brute force the passcode using the access granted by the modified OS.

          Apple may not have to give the *source code* to the FBI – this time. But next time, since they already have it, the FBI could argue it is not an “undue burden” to give it up. There will be precedent for companies having to compromise the security of their OS on demand.

          This is not an imaginary slope. Government overreach when it comes to cell phone searches is real, and documented (to the extent the government has been caught). Michigan State Police used a device to download cell phone contents at *traffic stops*. So, while they may be using “Scary brown terrorists!” as the excuse for this destruction of phone security, they will use it for anything and everything they can.

          http://abcnews.go.com/Technology/michigan-police-cellphone-data-extraction-devices-aclu-objects/story?id=13428178

          • infiniteimprobabilit
            Posted February 20, 2016 at 6:05 pm | Permalink

            Agreed. The fundamental problem, which will never go away, is that all policemen think they’re the ‘good guys’ and therefore anything they can do to catch the ‘bad guys’ is justified. Big Brother? Police state? That’s just collateral damage.

            And when I say ‘all’ policemen, I’m sure there were many Gestapo agents who thought the same way. (Someone had to stop those terrorists in the Resistance from blowing things up). And in the KGB.

              This is precisely why you have laws against ‘unreasonable search and seizure’. Not that it bothered Hoover much…

            cr

            (I’m not saying we should do away with policemen. Just watch them very carefully. And never ever let them do what they want just because they want to.)

            • Jeff Ryan
              Posted February 20, 2016 at 6:20 pm | Permalink

              Erm, if I understand correctly, the FBI is petitioning a court, not a dictator. Which means due process must be satisfied.

              Why is this different from any other search warrant case?

              • Scote
                Posted February 20, 2016 at 6:32 pm | Permalink

                Because this isn’t a search warrant case. This is an All Writs Act case.

              • Jeff Ryan
                Posted February 20, 2016 at 7:04 pm | Permalink

                In a court. Where due process applies.

              • infiniteimprobabilit
                Posted February 20, 2016 at 6:52 pm | Permalink

                I’m sure the KGB usually complied with due process. Probably even the Gestapo, mostly.

                But did Michigan State Police comply with due process when snooping on cellphones at traffic stops (as cited by Scote)?

                The problem is the mentality of (some) policemen who see ‘due process’ as a hindrance to be circumvented by whatever means possible.

                ‘All Writs Act’ looks to me like a way to circumvent existing laws on search and seizure. Or any other inconvenient law, in fact.

                cr

              • Jeff Ryan
                Posted February 20, 2016 at 7:13 pm | Permalink

                Ah, the day’s first Godwin!

                I am hardly saying that there aren’t abuses. But I note that if the system were so rigged, how would you know of them? You wouldn’t. For every legitimate use of governmental power you can find abuse. Is it your position that therefore such abuse justifies refusing to use lawful power? Because someone got beat up by bad cops when they were arrested, do we therefore ban arresting criminals? It seems that’s where your logic leads.

              • Scote
                Posted February 20, 2016 at 7:21 pm | Permalink

                “Ah, the day’s first Godwin!”

                It’s only a Godwin if the analogy is unwarranted.

              • Jeff Ryan
                Posted February 20, 2016 at 8:49 pm | Permalink

                I rather thought the point was implicit.

                If you think this is analogous in any way to Nazi behavior, you have lived an incredibly sheltered life indeed. And you dishonor those who actually suffered under the Third Reich. You are trivializing the horrific.

              • Scote
                Posted February 20, 2016 at 9:01 pm | Permalink

                “If you think this is analogous in any way to Nazi behavior, you have lived an incredibly sheltered life indeed. And you dishonor those who actually suffered under the Third Reich. You are trivializing the horrific.”

                As Donald Trump is currently running on a “We should ban all Muslims” platform, trying to claim that comparisons involving government compulsory power to defeat the security protections enabled on millions of iPhones are out of bounds is itself a way of trivializing reality.

                I used to wonder how good people could allow aspects of the Third Reich to happen. With the NSA disclosures by Edward Snowden and others, with the CIA’s in-house and external rendition and torture programs revealed, with our continued use of an extra-territorial, constitution-free jail for terrorism suspects, it is really no longer a question. We are seeing it happen right now. Where it will go will depend on where we allow it to continue to go, whether we elect despots like Trump or Cruz, whether we allow continued expansion of government power because of the fear of terrorism.

                Your claim that comparisons to the Third Reich are out of bounds is a call to shut down legitimate and vital conversation, a call for deliberate ignorance, and a way to allow history to repeat itself.

              • Jeff Ryan
                Posted February 20, 2016 at 9:32 pm | Permalink

                No, it’s merely asking you to grow up. Silly “Tyranny!” cries are no more convincing in this context than they were from the Bundys.

                I mean, really.

              • Scote
                Posted February 20, 2016 at 11:46 pm | Permalink

                Your blithe dismissal might be more convincing if the government weren’t already warrantlessly tracking all phone call metadata in the US, scanning and tracking the metadata of every postcard and envelope the USPS handles, wholesale optically tapping fiber-optic backbones around the US and sending the splits to the NSA, using cell-tower spoofers around the country without warrants and using NDAs to avoid telling anyone, using traffic stops as pretexts to search smart phones, warrantlessly cloning phones and hard drives without individualized suspicion at US border crossings, collecting all of the personal information, including credit card data and IP address, for every single plane ticket purchased in the US, requiring airlines to get permission from the government before allowing anyone to fly, and maintaining a secret list of people too dangerous to fly but not dangerous enough to arrest, a list there is no real way off of and that is created without due process. The US government since 2001 has used secret arrests, secret detentions of indefinite length without access to counsel, secret prisons, secret trials, secret evidence, and even secret laws and secret rationales.

                And that is just some of the crap we know about with certainty. So don’t give me this song and dance about silly “Tyranny!” cries when the government really is acting tyrannically. This latest All Writs Act order is just one more straw on the camel’s back.

              • Jeff Ryan
                Posted February 21, 2016 at 1:05 am | Permalink

                Actually, this case is quite different.

                But give my regards to Julian Assange and Ed Snowden. And tell Ed the Russians will surely find him a place in the gulag should his current gig not pan out. That is, after they stop laughing hysterically.

              • Scote
                Posted February 21, 2016 at 1:07 am | Permalink

                “Actually, this case is quite different.”

                Actually, this case is part of a *continuum* of government power.

              • Jeff Ryan
                Posted February 21, 2016 at 1:20 am | Permalink

                I disagree. In this case, the government is seeking to crack a phone whose owner is dead.

          • Ken Kukec
            Posted February 20, 2016 at 8:45 pm | Permalink

            So your solution is to put evidence of a crime or terrorist act contained on a cell phone forever beyond discovery by law enforcement, no matter what? Throw up our hands and say there is no solution that can reasonably accommodate any competing interest?

            • Posted February 20, 2016 at 8:56 pm | Permalink

              So you’re saying that the case is so weak that, if this phone doesn’t get unlocked, there’s no hope of gaining a conviction?

              And you expect us to even pretend to respect a law enforcement official who continues to hold a so-called “suspect” with so little evidence to back the accusations?

              b&

              • Jeff Ryan
                Posted February 20, 2016 at 9:26 pm | Permalink

                Maybe you’re just coming into this, but there’s no one to convict. They are trying to gather evidence to prevent further crimes.

              • Posted February 20, 2016 at 9:29 pm | Permalink

                …and you think that strengthens the FBI’s position?

                They have no crime, not even any suspects. They’re going on a fishing expedition. And we’re supposed to shit all over ourselves and everything we value just because they’re too lazy and / or incompetent to do their own jobs?

                b&

              • Jeff Ryan
                Posted February 20, 2016 at 9:44 pm | Permalink

                No, see, sometimes law enforcement tries to stop disasters before they happen. Maybe they’re just sensitive about all that “The cops aren’t there until the crime’s been committed!” stuff. Or maybe, they are trying to avoid waiting until the victims are actually dead to do something about it.

                It’s crazy, I know.

              • Helen
                Posted February 21, 2016 at 2:30 am | Permalink

                Ben, who is being held?

    • Les
      Posted February 20, 2016 at 5:08 pm | Permalink

      Although this link is fine, it is often hard to tell whether a shortened link is going to a safe place. To find out where Bitly is sending you, append a “+” to the URL and it will tell you the actual destination without taking you there:
      https://bitly.com/1oS5jCi+
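
      For anyone who prefers to check programmatically, here is a short sketch that reads the redirect target without visiting it. It assumes the shortener answers a HEAD request with a 301/302 and a Location header, which could change.

      ```python
      import http.client

      # Ask bit.ly where a short link points without following the redirect.
      conn = http.client.HTTPSConnection("bit.ly")
      conn.request("HEAD", "/1oS5jCi")
      resp = conn.getresponse()
      print(resp.status, resp.getheader("Location"))
      conn.close()
      ```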

      • Diane G.
        Posted February 21, 2016 at 2:43 am | Permalink

        Good to know about, thanks.

        But I wish it gave the original URL on a mouse-over, rather than having us click through to a bitly page.

      • Ken Elliott
        Posted February 21, 2016 at 12:56 pm | Permalink

        Thanks, Les. That is very nice to know.

  2. Posted February 20, 2016 at 12:45 pm | Permalink

    I hate terrorism and anyone who visits violence on others. I applaud most efforts to stem such activity.

    As far as I know, Apple has gone to great lengths to cooperate with the FBI and the ongoing investigation. However, producing malware against its own products is not a reasonable request.

    A search warrant authorizes law enforcement to conduct a search of a person, location, or vehicle for evidence of a crime and to confiscate any evidence they find. It cannot force Apple to produce software at its own expense to help defeat one of the key elements of its iPhones.

    What is done in fighting the bad can be turned around against the good. If we as a people think data encryption is a bad thing, say because it helps criminals or terrorists protect information we would like to get hold of, we should think about outlawing it altogether. But if we believe encryption makes society generally safer (and I would say it does), then we are going to have to come up with other ways of fighting the occasional misuse of encryption technology than forcing the companies that create it to defeat their own technology.

    Carl Kruse

    • Mark Sturtevant
      Posted February 20, 2016 at 2:16 pm | Permalink

      This case still seems to me much like having a search warrant, with just cause, only here it is like trying to compel a lock manufacturer to make a master key. If we lived in a world where master keys did not routinely exist, then trying to get a manufacturer to make one might seem an unwarranted intrusion by the government. But locks, keys, and master keys have been around for a long time, so we are used to that sort of court-ordered intrusion.
      Smart phones are a new technology, and we are just not used to having master keys around for unlocking these devices. I personally think it is OK, if having master keys and court-ordered search warrants are OK.

    • Scote
      Posted February 20, 2016 at 3:29 pm | Permalink

      The implications go beyond Apple and iOS devices though. If the FBI can use the All Writs Act to force 3d parties to do its bidding, no security promises by any company will have any merit. Between the All Writs Act and self-issued National Security Letters that come with built in gag orders, the FBI will be able to force any information out of any and all people, devices and transmission methods.

      The lock analogy seems valid, except with a physical lock, you actually are limited by the fact you have to visit those places to break into them with the master key. Not so with devices attached to the internet, as your cell phone and computer are. Massive, remotely executed surveillance on Americans is already a reality, as Edward Snowden already demonstrated. This would be one more massive hole in an already crumbling edifice of privacy.

      There is no question that the FBI has the right to get into the phone. The question is does the FBI have the right to destroy security nationwide to do it.

      • Jeff Ryan
        Posted February 20, 2016 at 3:41 pm | Permalink

        It is far from certain that it will destroy “security nationwide” to do this. And why is it remarkable to tell customers “We may have to provide information to the government when required by law”? Why is this such a giant step? It has only been true since the founding of the country.

        Don’t like it? Then don’t buy a smart phone. No one’s forcing anyone to do so. Kind of like driving a car: Your license plate is in plain view. Anyone can look it up. Violation of privacy? No.

        It astounds me that modern technology has bred such a self-centered mentality. Such a “You can’t make me” attitude that, frankly, is not a view one can sympathize with. If people want all these new toys, they have to take the bad with the good. And if the bad means risking one’s criminal behavior might be discovered, well, too bad.

        Otherwise, no one in government has the time to invade everyone’s privacy, nor the inclination, nor, actually, do they even care. These idiots killed a lot of people. The investigation is legitimate. And there are any number of remedies the court can craft to limit dissemination of what’s found.

        Anyone who thinks that the material they post, the calls they make, the photos they send, the texts they send will remain private is deluded. And it’s not the government that’s broadcasting it. It’s our fellow citizens, every minute of every hour of every day.

        • Scote
          Posted February 20, 2016 at 3:57 pm | Permalink

          “Don’t like it? Then don’t buy a smart phone. No one’s forcing anyone to do so. Kind of like driving a car: Your license plate is in plain view. Anyone can look it up. Violation of privacy? No.”

          Yeah, no. I’m really more of a “Don’t like it? Then don’t allow the government to have expansive powers in the name of terrorism” camp. But if you want to, you are free to leave your house, car, and business unlocked, and your WiFi, cell phone, and computers unsecured. I just don’t think you or the government should have the right to force the rest of us into your choices.

          • Helen
            Posted February 21, 2016 at 3:16 am | Permalink

            I really think that once you decide to kill your fellow human beings and then follow up on that thought and execute it, you deserve to have all of the expansive powers of government all up your business.

            • infiniteimprobabilit
              Posted February 21, 2016 at 4:42 am | Permalink

              It isn’t *about* the terrorists. Nobody here gives a fig for the rights of the terrorist who is, in any case, dead.

              It’s about the rights of other Apple users, and Apple itself, and whether what the FBI is demanding is reasonable or will be used as a precedent in other (non-terrorist) cases.

              cr

      • Ken Kukec
        Posted February 20, 2016 at 5:45 pm | Permalink

        “If the FBI can use the All Writs Act to force 3d parties to do its bidding, no security promises by any company will have any merit.”

        The FBI should not, willy-nilly, be able to do so without an adequate legal showing. The legal standard employed in analogous circumstances is “probable cause” as found in a court order.

        If that standard is insufficiently protective under the circumstances, perhaps a stricter standard can be fashioned. The court can also make compliance with its order subject to any other conditions that may be needed to protect the interests at stake.

      • Adam M.
        Posted February 20, 2016 at 8:27 pm | Permalink

        If the FBI can use the All Writs Act to force 3d parties to do its bidding, no security promises by any company will have any merit.

        No security promises by any (software) company have ever had any merit. There is no secure online email service, there is no secure cloud storage service, and there are no secure smartphones. Their promises are all false promises, and always have been, and the sooner people understand this the better. Even the famous Hushmail was forced to write custom software to bypass their encryption for the government, years ago. The only encryption that’s trustworthy is the encryption you do yourself with your own key using your own trustworthy software such as GPG.
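
        A minimal sketch of that do-it-yourself approach, using Python’s cryptography package as a stand-in for a GPG-style workflow (an illustration, not a replacement for GPG): the key is generated and kept locally, and only ciphertext ever reaches a provider.

        ```python
        # pip install cryptography
        from cryptography.fernet import Fernet

        key = Fernet.generate_key()   # stays on your machine; never uploaded
        f = Fernet(key)

        ciphertext = f.encrypt(b"draft email to keep private")
        # Hand `ciphertext` to any cloud or mail provider; without `key` it is opaque.
        assert f.decrypt(ciphertext) == b"draft email to keep private"
        ```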

        • Ken Kukec
          Posted February 20, 2016 at 8:53 pm | Permalink

          So if software companies are evil and untrustworthy, what keeps them from just turning around and selling the information on your phone to the highest bidder on the black market?

          Is your answer really to put a cell phone’s evidence of a crime, no matter how heinous, or evidence of an impending terrorist act, no matter how grievous, completely beyond the reach of law enforcement, no matter what? Really?

          • Posted February 20, 2016 at 8:59 pm | Permalink

            Oh, please. The government has all sorts of ways of investigating crimes. They can still get legal wiretaps. They can follow suspects. They can get court orders to plant microphones. They can examine crime scenes for evidence of all sorts. They can interview known and suspected associates. They can get wiretaps for those associates. And on and on and on.

            Do you really expect us to believe that it’s worth compelling Apple to corrupt their own products for a case so weak that it all hinges on one bloody phone?

            b&

            • Jeff Ryan
              Posted February 20, 2016 at 9:30 pm | Permalink

              What case? They are conducting an investigation to avert other attacks. There may be evidence of such on the phone. They don’t know who to wiretap.

              I mean, seriously? Cell phones now occupy an exalted status above telephones, bank records, your home? “Rip the walls out, boys, but don’t you dare touch my smart phone!”

              Good luck with that.

            • Ken Kukec
              Posted February 21, 2016 at 3:20 am | Permalink

              The discoverability of evidence isn’t determined by a balancing test of how badly the government needs it. If disclosure of the evidence would violate a party’s constitutional rights, that party can’t be made to forgo a fundamental constitutional right simply because the government really needs that evidence. Plus, any such balancing test would require a court to conduct a mini-trial to weigh the entire prosecution case before every evidentiary decision.

              And in any event, I don’t much care about the outcome of this particular case one way or the other. My concern is with those of you who say such information should never be discoverable, no matter what.

              It’s certainly feasible that circumstances could arise in which a suspect’s cell phone contains the best, most crucial, direct evidence in a criminal prosecution — indeed, it happens quite regularly. It’s also feasible that detection of such information could be needed to avert a national tragedy. Are you folks really prepared to take it all the way, even under those circumstances, with the no, never, no way, no how line? To tell the investigators, tough titty, try your luck elsewhere?

            • Helen
              Posted February 21, 2016 at 3:21 am | Permalink

              How does the FBI get evidence from dead killers, Ben?

          • Adam M.
            Posted February 20, 2016 at 10:54 pm | Permalink

            On the contrary, I think Apple should comply with the order. The FBI clearly has a legitimate case for a search warrant. I’m saying that ‘but if Apple complies then their security promises will be meritless’ is not a good argument, because their promises are already meritless. More people need to know that.

        • Jeff Ryan
          Posted February 20, 2016 at 9:15 pm | Permalink

          Agreed. The word “cloud” should have been enough to scare anyone with sense away. Those seeking an absolutely secure system are doomed to disappointment.

          Apple may not be able to figure it out. But there’s a 12 year old in Winnetka who already has. And he’s shared with his buds.

  3. Randy Schenck
    Posted February 20, 2016 at 12:50 pm | Permalink

    Apple is way over the top on this one and should save their fight for something far more important than this.

    We have legal procedures for wiretapping phones with something we call probable cause. You know, to catch the bad guys and all that. The search warrant is used all the time to get at email and inside people’s houses and cars. What makes Apple think this business with the cell phone is any different?

    I don’t hear anyone yelling and screaming at the government for bombing ISIS training camps in Libya; maybe they will, however. The airplanes that did it came from Lakenheath Air Base in Britain. Kind of an irony there, as I was at Lakenheath back when Gadhafi took over in Libya. Anyway, Apple picked the wrong horse on this one as best I can see.

    • Diana MacPherson
      Posted February 20, 2016 at 12:55 pm | Permalink

      The difference is Apple would need to create software to defeat its own encryption. Apple has already complied with handing over the type of information you described. In this case, Apple would create a very dangerous thing that governments (foreign ones who are not US allies) would love to have and would pay dearly for.

      • Randy Schenck
        Posted February 20, 2016 at 1:05 pm | Permalink

        You are overstating the issue. Why would the ability Apple creates to open the phone for the government automatically become the property of the government at all? It does not have to be. In fact, if you remember some of the data that the NSA wanted the phone companies to provide or allow the govt. to get — I believe they ended up saying that the phone companies would store the info and only give some of it to the government with proper warrants.

        • Diana MacPherson
          Posted February 20, 2016 at 1:36 pm | Permalink

          Because leaks happen. Employees risk their livelihoods and reputations to leak new products. These employees are usually fired with cause and sometimes sued for their indiscretions. I know of people fired simply for carrying a beta device in public. So if people are willing to risk that sort of punishment (which often includes never working in the industry again), what do you think they will do when they are given a jackpot incentive? And it doesn’t have to be an Apple employee. It could be someone else who gained access.

          This is what people arguing against Apple are missing. And you understand it clearly if you’ve worked in the industry.

          • Randy Schenck
            Posted February 20, 2016 at 2:29 pm | Permalink

            I understand what you are saying, but because people do bad things, or because companies fire them or sue them for indiscretions, we should say: U.S. govt., sorry, you cannot get the info on these terrorists or even look for it on this phone; it is simply too open to possibilities of human evil action? I just don’t think it works that way in the decision process of the greater good. If there is a likely chance of uncovering additional bad guys by looking at the history of a phone, then we do it.

            • Diana MacPherson
              Posted February 20, 2016 at 2:44 pm | Permalink

              I think the extremely likely chance of this code getting out into the wild and inflicting real damage vs. the slim possibility of the authorities finding anything on this person’s work phone that can’t be found more easily on external servers (especially when the terrorist destroyed two other phones but not this one) makes this request unjustified.

              • Jeff Ryan
                Posted February 20, 2016 at 2:50 pm | Permalink

                “The extremely likely chance of this code getting out”? Why don’t they just say, “We can’t trust our people.” ‘Cause that’s what it boils down to.

              • Diana MacPherson
                Posted February 20, 2016 at 3:05 pm | Permalink

                You think companies trust their people? If they do, why do they make them sign NDAs? Sometimes several.

              • Jeff Ryan
                Posted February 20, 2016 at 3:24 pm | Permalink

                Yep, and that’s why they tell their shareholders and customers their folks signed NDAs.

                But, again, try telling a court that. That is as pathetic an excuse as a company could come up with. And one easily addressed by a court. Usually, along the lines of “Well, Mr. Apple, let me make myself clear: If it leaks, then you and the board will go to jail. Immediately. Now, any other objections?”

              • Diana MacPherson
                Posted February 20, 2016 at 3:57 pm | Permalink

                Yeah, that makes sense. Order a company to make malware for its own product under duress, then throw them in jail if it gets out and compromises security.

              • Jeff Ryan
                Posted February 20, 2016 at 4:06 pm | Permalink

                I doubt very much that would be the result. Unless the court found that the company failed to keep its internal information secure. And that gets into agency law and any number of things.

                It’s just a piss-poor argument to say “We can’t possibly be expected to keep confidential information secure.” Good luck with that.

              • Diana MacPherson
                Posted February 20, 2016 at 4:30 pm | Permalink

                A piss-poor argument you are straw-manning. I never said that. I said there is a risk, and most likely a high risk, that such software could be leaked, given that it would be highly sought after. People don’t even have to be malicious to leak it – they could be threatened, for example. I find it amusing that you seem to suggest leaks have never ever happened in the history of software development.

              • Jeff Ryan
                Posted February 20, 2016 at 4:52 pm | Permalink

                If I misstated your point, even sarcastically, that was not my intent.

                What I am saying is that, to a court, it’s a speculative proposition. And courts hate that. And they usually should.

                The court really shouldn’t care. It isn’t a convincing argument to say, “Well, people will violate the law.”

                After all, if folks are going to require adherence by the government to court orders, then they have a right to expect a private corporation will do the same.

                Besides, as naive as it sounds, Windows inundates my PC every effing day with “updates” that are supposed to fix weaknesses that shouldn’t have been there to begin with. Apple can’t fix this? Maybe not, but it ain’t the fault of the American people.

              • Diana MacPherson
                Posted February 20, 2016 at 5:02 pm | Permalink

                But it ain’t the fault of Apple either. I can’t get behind a company being ordered to build malware for its own devices. Malware that can endanger all devices. There are things that just shouldn’t be built. And it isn’t as if the information isn’t available normally. There were iCloud backups of this phone. Apple provided those.

              • Jeff Ryan
                Posted February 20, 2016 at 5:13 pm | Permalink

                Apple created this monster. If they can’t fix it, maybe they should find another line of work.

                And I’m not just being snarky. For too long Silicon Valley has considered itself above citizenship. It is a place where shiny objects and the billions that can be made off them are far more important than real world consequences.

                I have zero sympathy for them. I certainly recognize their genius. But they embody the worst of what Reaganism brought us. Their price is too high.

              • Diana MacPherson
                Posted February 20, 2016 at 5:18 pm | Permalink

                So it seems you are against encryption in general. Apple didn’t create that. Encryption has been around for decades. The kind of encryption Apple uses has been around and deployed on handsets since the 90s. There was a hullabaloo over PGP as well not long ago because the US government didn’t like that it couldn’t crack it. So you’re saying it’s the fault of the company for securing their devices in the first place. They should have left them crackable. But of course, no one would want a device like that.

              • Jeff Ryan
                Posted February 20, 2016 at 5:52 pm | Permalink

                Please don’t put words into my mouth. I am not per se opposed to encryption. I am saying that any company that offers encryption (all of them I think) runs the risk that a government, in pursuit of a legitimate investigation, may demand access.

                The federal government has broad powers to regulate how cell phones operate. That it might demand means to defeat encryption should not be a surprise.

              • Diana MacPherson
                Posted February 20, 2016 at 6:12 pm | Permalink

                But if the government is de facto OK with encryption, why is the onus on Apple to suck it up when the government comes knocking, no matter the cost in money or risk?

              • Jeff Ryan
                Posted February 20, 2016 at 6:30 pm | Permalink

                They may well be ordered to pay costs. Or not. How much does Apple benefit from the government-supplied roads, bridges, tax breaks, law enforcement, etc.?

                Did Apple build those roads? All of them? I don’t think so. Have they decided not to pay this country its just due in taxes? Damn right. They are doing everything they can to avoid them.

                Allow me to bring out the world’s smallest violin…

              • infiniteimprobabilit
                Posted February 20, 2016 at 6:23 pm | Permalink

                “That it might demand means to defeat encryption should not be a surprise.”

                Nothing the FBI does should be a surprise.

                That’s not to say it should get its way.

                cr

              • Scote
                Posted February 20, 2016 at 6:46 pm | Permalink

                “Apple created this monster. If they can’t fix it, maybe they should find another line of work.”

                And there is the problem with your arguments. There is no “monster” created by Apple. Just a phone designed to protect user data from being easily accessed – something that is important because smart phones contain your email, stored passwords and address books – basically everything an identity thief needs to take over your accounts, steal your cash, and steal money using your identity. Not to secure that would be irresponsible. Phone theft and identity theft happen a lot more than terrorism and are real issues that affect people every day.

              • Jeff Ryan
                Posted February 20, 2016 at 7:08 pm | Permalink

                Which is why the Supreme Court held that the police must have a warrant to access the information. What’s changed? Nothing.

              • Scote
                Posted February 20, 2016 at 7:30 pm | Permalink

                “Which is why the Supreme Court held that the police must have a warrant to access the information. What’s changed? Nothing.”

                Again, this is an All Writs Act case, not a search warrant case. Everybody agrees the government has the right to search the phone – it is a government owned phone.

              • jsrtheta
                Posted February 20, 2016 at 8:52 pm | Permalink

                Were Farook alive, that would not be at all clear.

          • Matt
            Posted February 20, 2016 at 2:47 pm | Permalink

            What you’re describing is impossible. The reason the government can’t do what they’re asking Apple to do is because the phone will reject OS updates that aren’t digitally signed by Apple. If Apple creates an OS update that is tied to a specific phone (a technology they already have in place) it cannot be used on another phone due to that same protection.

            Apple could post the thing on the front page of their website and it wouldn’t matter.

            • Diana MacPherson
              Posted February 20, 2016 at 3:03 pm | Permalink

              Impossible to do again on another phone? Impossible to change? Impossible to share the source code to do that change?

              Usually these absolutes are not so absolute. That’s why jailbreaking exists.

              • Diana MacPherson
                Posted February 20, 2016 at 3:04 pm | Permalink

                And by jailbreaking I mean people thought that it would be impossible to do that too.

              • Matt
                Posted February 20, 2016 at 9:29 pm | Permalink

                I would be surprised if the source code changes were non-trivial. Even if they are, that isn’t the real barrier to somebody duplicating what the FBI is asking for.

                The problem for both the FBI and anybody else wanting to do the same thing is the signature check built into the hardware that prevents non-Apple OS updates from running. Apple signing a custom OS update that is tied to a single phone doesn’t make it any easier for someone trying to break through that protection. Their doing this has no impact on the overall security of iPhones.
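
                A toy sketch of the idea: sign over the image plus a device identifier, so the blob verifies only on the targeted phone. This illustrates the concept, not Apple’s actual signing scheme; the key names and device IDs are made up.

                ```python
                # Toy device-bound signing (concept only, not Apple's scheme).
                from cryptography.exceptions import InvalidSignature
                from cryptography.hazmat.primitives.asymmetric.ed25519 import (
                    Ed25519PrivateKey,
                )

                vendor_key = Ed25519PrivateKey.generate()
                vendor_pub = vendor_key.public_key()  # baked into each phone

                def sign_update(image: bytes, device_id: bytes) -> bytes:
                    # Signature covers the image *and* the target device's ID.
                    return vendor_key.sign(image + device_id)

                def accepts(image: bytes, sig: bytes, my_id: bytes) -> bool:
                    try:
                        vendor_pub.verify(sig, image + my_id)
                        return True
                    except InvalidSignature:
                        return False

                image = b"custom OS without retry limits"
                sig = sign_update(image, b"PHONE-A")
                print(accepts(image, sig, b"PHONE-A"))  # True
                print(accepts(image, sig, b"PHONE-B"))  # False
                ```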

      • Adam M.
        Posted February 20, 2016 at 8:30 pm | Permalink

        That’s true, but many companies have had to write software to bypass security and/or encryption in the past, so it’s nothing new. Hushmail is a famous example of a service that claimed not even they could decrypt your mail (because they used end-to-end encryption), until they were made to write a custom version of their Java applet that leaked the key.

      • Diane G.
        Posted February 21, 2016 at 2:55 am | Permalink

        The way I understand it, Apple would just have to write software that prevented its existing program from cutting off access after 10 attempts with incorrect passwords. That way the government could crack the passcode by brute computer force. Trying to remember where I read that…

        • Diana MacPherson
          Posted February 21, 2016 at 11:27 am | Permalink

          They need to subvert the software that both erases the phone after 10 tries and delays the wait time between retries. They also need to be able to access the phone remotely instead of entering the codes manually.

          And of course they need to do this securely and test it. It’s a lot of work and it compromises the security of the iPhone by making it hackable from an external source.

  4. Dave
    Posted February 20, 2016 at 12:50 pm | Permalink

    Apple should comply. And if they continue to refuse, then I hope their chief executives end up in jail.

  5. Stephen Barnard
    Posted February 20, 2016 at 12:50 pm | Permalink

    I’m with Apple. This case presents a slippery slope toward what law enforcement really wants: a backdoor into any device. In addition, if Apple caves to this, what’s to prevent, say, China from asking them to crack a dissident’s phone?

    • Randy Schenck
      Posted February 20, 2016 at 12:53 pm | Permalink

      I must say here that if Apple or any other American company wants to do business in China, or any other country…they must play by the rules of that country.

    • Diana MacPherson
      Posted February 20, 2016 at 12:58 pm | Permalink

      I guarantee you that this already happens wrt corrupt governments. Companies that value their encryption and reputation for a secure product typically resist as much as they can.

  6. Diana MacPherson
    Posted February 20, 2016 at 12:51 pm | Permalink

    I’m on Apple’s side on this one because software would have to be created to unlock the phone. Sure, it can be destroyed, but once created there is still a risk that it will be copied. Things have a way of leaking despite the best efforts of employees and the corporations they work for. Imagine the incentive to copy such software! Governments would pay you ridiculous amounts of money for such a thing.

    When I worked at BlackBerry, governments (usually corrupt ones) would pressure the company for information about encryption and the like, and usually such pressuring included threats to the livelihood of the company. Such capabilities are highly sought after by governments.

    I also don’t think there is going to be much information on the phone to justify the risk. The person destroyed two phones beyond recovery but not this one. Why would that be? It could be that there wasn’t time, that he so loved his iPhone 5c, or that there was nothing worth destroying on it.

    • infiniteimprobabilit
      Posted February 20, 2016 at 4:24 pm | Permalink

      I normally can’t stand Apple, for their control-freak approach to computing, but in this case I’m on their side. Essentially the FBI is engaging in a ‘fishing expedition’, or rather, demanding that Apple do so on their behalf. I thought such were illegal, or at least not accepted by courts.

      And while I usually dislike ‘slippery-slope’ arguments, I think one applies here. If Apple can be forced to unlock this phone, everyone and his dog is going to use it as a precedent to compel assistance in their snooping.

      cr

    • Ken Kukec
      Posted February 20, 2016 at 6:13 pm | Permalink

      Apple, and any other US-based company, should resist such requests unless compelled to comply by a final order from a US court of competent jurisdiction.

      As to the software at issue here, a court order can compel Apple to destroy the program after the information sought from the phone has been obtained, or to take any and all such other measures as are necessary to protect the privacy interests at stake.

      The US has a federal statute, CIPA (the Classified Information Procedures Act), that contains stringent procedures in cases involving classified government documents. (I’ve tried CIPA cases involving “top secret” documents and have seen first-hand how stringently those procedures are enforced.) There appears to be no reason why similarly stringent procedures cannot be fashioned under the circumstances present here.

      On the other hand, if Apple can establish at an evidentiary hearing that it is in fact impossible to protect the secrecy of the information at issue, I’m sure the US courts would be willing to revisit the matter to consider alternatives.

      • Diana MacPherson
        Posted February 20, 2016 at 6:22 pm | Permalink

        Apple states that “there is no way to guarantee such control” over limiting this to one phone. I suspect that destroying the software is not deemed secure enough by Apple for whatever technical reason, or Apple knows that if compelled to do this precedent-setting thing, it will either have to recreate and destroy the code each time it is requested (onerous because of the security required and the signing) or secure and maintain the software. Both have security risks of leaks. Both are costly.

        • Ken Kukec
          Posted February 20, 2016 at 9:04 pm | Permalink

          Apple can say anything it wants. If it is in fact impossible, let Apple demonstrate that to a judge in federal court. That’s why the Good Lord (and the Federal Rules of Procedure) made evidentiary hearings. 🙂

          I’m skeptical that the nation can safeguard its nuclear launch codes, yet not some cell-phone software. I’m pretty sure Hizzoner will be, too.

      • Jeff Ryan
        Posted February 20, 2016 at 6:47 pm | Permalink

        According to the post, a federal court entered just such an order.

    • FiveGreenLeafs
      Posted February 20, 2016 at 10:12 pm | Permalink

      I am with you 100% on this Diana, but I fear this will not end well.

      I don’t know if you follow Bruce Schneier’s blog, but he does not seem optimistic about Apple’s chances of withstanding the onslaught. He wrote that he would rather the headline for the debate had been: “National security vs. FBI access.”

      It is simply fascinating, in a horrifying sort of way, to read a lot of the arguments going around.

      I fear that the combination of technology on a level way beyond what most people have the resources and experience to grasp, paired with a highly charged emotional situation, has thrown any chance of careful, technically relevant and measured reasoning out the window.

      For example, as you say, the chance that Apple in the long run would be able to keep this secure is slim, and just the knowledge that it is possible, plus the information that will inadvertently and gradually “leak” out, will endanger absolutely everyone. There simply does not exist a way to build a “backdoor” (which this technically, by definition, is, since it weakens the device’s security system) that only the good guys can use.

      Bruce has a number of good links in his piece, among them this one (in case you have not seen it yet):
      View story at Medium.com

      The idea that this would be a one-off instance, or that Apple would be the only company affected if this is allowed to pass, is just ridiculous. And if not even the NSA and GCHQ can keep their own most prized information from “leaking”, what chance does Apple or any number of other regular companies have? I think I will go sit in a corner and have a bit of a cry… 😦

      • Diana MacPherson
        Posted February 20, 2016 at 10:23 pm | Permalink

        Thanks for the link. That was a good read.

        • FiveGreenLeafs
          Posted February 20, 2016 at 10:50 pm | Permalink

          You are most welcome. Very late to the party, but thought I would add (reiterate) a few thoughts and my moral support in any case 🙂

      • Diane G.
        Posted February 21, 2016 at 3:02 am | Permalink

        Agree with Stephen, Diana, infinite, FiveGreenLeafs, et al.

        (Makes me wonder how Apple can be sure none of its capable programmers are bribable, though…)

        • FiveGreenLeafs
          Posted February 21, 2016 at 9:17 am | Permalink

          Diane,
          They can’t, and in all probability, they aren’t.

          That’s one of the truly sad ironies in this debate. We know for a fact (from the Snowden documents) that the NSA has (had) a program that actively tries to place, recruit or coerce employees in tech companies to help subvert the security of their companies’ products, to assist the NSA’s capabilities…

          • Diane G.
            Posted February 22, 2016 at 1:28 am | Permalink

            Interesting–I guess I’ve not paid enough attention to what all Snowden uncovered!

        • Diana MacPherson
          Posted February 21, 2016 at 11:32 am | Permalink

          What keeps Apple’s secrets secret is no secret. 😀 NDAs, of course, are a big part of it, along with the threat of prison or other legal action (and companies do pursue this), fear of never working in the industry again (because word spreads that you were fired with cause for leaking secrets), and genuinely wanting your company to do well against the competition (leaks destroy the work of you and your colleagues).

          So, all these things add up to too much risk. Even so, stupid people still do it because they feel they won’t get caught or they want the glory.

          In a case where something was so desired, a person could be coerced more easily to leak, because the payoffs, or the threats for non-compliance, would be higher.

          • Diane G.
            Posted February 22, 2016 at 1:30 am | Permalink

            Thanks, Diana.

            I’m sure it’s no easy thing to go up against a mega-power like Apple!

            OTOH, if you’re promised indemnity by the government…

    • Posted February 21, 2016 at 3:50 am | Permalink

      /@

      • Diana MacPherson
        Posted February 21, 2016 at 11:39 am | Permalink

        It’s extra funny for me because I have a friend named Pandora. I wonder if she has an iPhone.

        • Diane G.
          Posted February 22, 2016 at 1:31 am | Permalink

          Interesting that someone would name their child Pandora!

          • Diana MacPherson
            Posted February 22, 2016 at 8:47 am | Permalink

            She’s Chinese so it’s likely her “English name”.

            • Wunold
              Posted February 22, 2016 at 12:04 pm | Permalink

              I wonder which Chinese name would equal a name from Greek mythology when anglicized. 😉

              • Diane G.
                Posted February 22, 2016 at 8:08 pm | Permalink

                Meta-meta!

  7. alexandra
    Posted February 20, 2016 at 12:52 pm | Permalink

    I side with Apple….. merely on the principle that government almost always over reaches and has at its disposal the FBI, CIA, NSA, all the Military secret services and who knows how many other even more secret agencies, overlapping, without accountability, with limitless funds. So why give it Apple, too!

  8. Posted February 20, 2016 at 12:52 pm | Permalink

    I can’t see a problem with expecting Apple to hack a phone, so long as (1) the request has the backing of a court order, and (2) the court order is public.

    • Scote
      Posted February 20, 2016 at 3:33 pm | Permalink

      The order essentially drafts Apple into doing work it doesn’t want to, that is bad for Apple’s business and customers, and into doing it for free. Should the government have the right to serve you with an All Writs Act order to do its bidding, even if the work you are forced to do is against your interests?

  9. Dave
    Posted February 20, 2016 at 12:54 pm | Permalink

    You make a good argument that is, sadly, based on fallacious assumptions. Even if the software is ‘destroyed’, the fact that it has been done will mean it can be recreated. Consider the problem an Apple employee living in China or Iran will face when the secret police ask them for the same service, and also mention that they know where their daughter goes to school.

    Also, the concept of making a perfectly unhackable device is naturally so desirable that one wonders why no one has tried to create one before. The evidence seems to suggest that perfection is actually unattainable, although the advent of quantum computing may change the situation for a while.

    • Randy Schenck
      Posted February 20, 2016 at 12:59 pm | Permalink

      The comparison on this issue to what some other country might want Apple to do just does not fly. What any company does in another country is totally dependent on that country. It has nothing to do with this fight in the States.

      • Diana MacPherson
        Posted February 20, 2016 at 1:06 pm | Permalink

        Governments can pressure the company to comply with their requests by threatening to block the sale of their device. In large populations like India and China, this is a very serious economic threat.

        • Randy Schenck
          Posted February 20, 2016 at 1:18 pm | Permalink

          Maybe you misunderstand what I am poorly trying to say here. If Apple goes to India or China to do business they do so based on the will of those countries. If they do not like it, then don’t go. It is not a matter of pressure. Have you not heard the phrase, when in Rome….

          Maybe many travelers do not understand something else. When you get to another country, you are subject to their laws, not the laws back home.

          • Diana MacPherson
            Posted February 20, 2016 at 1:43 pm | Permalink

            Typically what happens is a company goes into a country like China or India and then the government decides it wants to spy on its people who are using your device. There was no foreknowledge from the company. And the company has a mandate to protect its customers. So the government threatens the company, in some cases, with bankruptcy, by refusing to do business with them. The company typically doesn’t expect this to happen.

            Further, “when in Rome” doesn’t apply. Should companies give and take bribes when working in Russia and China? The law doesn’t think so. Companies in the US and Europe who have done so have found themselves in legal trouble at home. Further, though companies tend to be amoral, is it really ethical just to shrug and hand over the names of dissidents who use your product because, oh well, that’s life in China? How would ethics boards respond?

            • Randy Schenck
              Posted February 20, 2016 at 2:46 pm | Permalink

              I am sure you will correct me if this is wrong, but did not Google go into China with its product? And China required many things of them that Google had said they would not do (at least not here in the States). However, Google had to weigh the profit to be made in China against their own values/standards, and in the end they complied with China. Either that, or I think on a couple of issues they packed up and did not enter that part of the business.

              We have to do tons of business with China while at the same time they hack and steal from hundreds of American companies all the time. The ethics and culture of China are what they are. No company goes in there with eyes closed. The company I worked for was exporting lots of things from the friendly Philippines. We had American bases there. But when we tried to get stuff out, their customs stopped us cold. Certain people had to be paid or nothing happened. We were not allowed to pay these “bribes”. However, we could hire a local broker and pay him to pay the bribes so our goods would move.

              • Diana MacPherson
                Posted February 20, 2016 at 3:01 pm | Permalink

                Well, yes, companies are typically amoral. Their job is to make money. This is why there are laws and regulations. This is also why I don’t see Apple as pushing back on the FBI because they really believe in protecting your privacy. They are doing it to protect their product and their capital. And Google fighting with China is another example – Google didn’t want to risk their product but in the end decided they could accept that risk. The point is, though – they are still going to be pressured, and since everyone is working with countries like China and India, this code getting into their hands would be terribly bad for cybercrime.

              • Jeff Ryan
                Posted February 20, 2016 at 3:13 pm | Permalink

                Which assumes that Apple cannot control what happens with its work. It cannot segregate one project from the company at large.

                First, that’s rather silly. Second, if that were true, then maybe they shouldn’t be in business. It is hardly the government’s fault if Apple, under a court order, mind you, just can’t keep the work secret.

                Try telling that to a judge with a straight face. Or your shareholders.

              • Diana MacPherson
                Posted February 20, 2016 at 3:50 pm | Permalink

                It wasn’t so silly when their stuff got jailbroken or their betas got leaked. If Apple felt they could secure their stuff they wouldn’t bother with NDAs.

              • Adam M.
                Posted February 20, 2016 at 8:36 pm | Permalink

                Google loudly proclaimed in a press release that they wouldn’t cooperate with unjust Chinese government demands. Within a couple weeks, China called their bluff and moved to shut Google down. Google gave in and began complying. No press release about that for some reason…

        • Jeff Ryan
          Posted February 20, 2016 at 1:25 pm | Permalink

          Yes, and if India and China wish to damage their own economies by banning the sale of products their people want, well I suppose they can. It would be extremely costly for their own economy, and would only result in phones being smuggled in anyway.

          Economic arguments are all well and good, but I doubt anyone here, including me, could really game this out. One thing I do know is that governments are loath to lose money from tariffs and taxes. And if someone wants a product, there is always a way to get it to them.

          • Diana MacPherson
            Posted February 20, 2016 at 1:49 pm | Permalink

            You seem to think my explanations are just theoretical. I can assure you, India did not worry a bit about its economy when it threatened to do this very thing to BlackBerry (then known as RIM) in 2010. It was willing to cut off its citizens from using the phone and services the majority of its population used at the time. Such a threat, if carried out, would have had a monumental economic impact on the company.

    • Scott Draper
      Posted February 20, 2016 at 1:04 pm | Permalink

      “the fact it has been done will mean it can be recreated.”

      Any iOS developer can already do this, regardless of whether it’s been done before.

      • GBJames
        Posted February 20, 2016 at 1:48 pm | Permalink

        If this were true there would be no issue. The FBI would already have done it.

        • Diana MacPherson
          Posted February 20, 2016 at 1:59 pm | Permalink

          Agreed. It isn’t easy. It’s more complicated than commenting out code.

          • Scott Draper
            Posted February 20, 2016 at 2:34 pm | Permalink

            How do you know, based on the article that Jerry cited?

            • GBJames
              Posted February 20, 2016 at 2:50 pm | Permalink

              As someone who works in the software industry the idea that this is as simple as commenting out some code is laughable.

              Seriously… Check out some basics of how encryption works.

              • Scott Draper
                Posted February 20, 2016 at 2:53 pm | Permalink

                As someone who works in the industry, too, I know that it often is this simple.

                The encryption isn’t relevant, as far as I can tell from the article. Many systems encrypt data, but automatically decrypt it once the user is authenticated. From the article that Jerry cited, it looks to merely be an issue of authenticating a user, not decrypting data.

            • Diana MacPherson
              Posted February 20, 2016 at 2:52 pm | Permalink

              From my understanding of how devices are secured. Check comment 16 above.

              I could ask you how you know it’s as easy as commenting out a line of code.

              • Scott Draper
                Posted February 20, 2016 at 2:58 pm | Permalink

                The data encryption isn’t relevant, as far as I can tell. It only means that the operating system can’t be bypassed to get at the files directly.

                Many systems encrypt the data they use, but once a user is authenticated, the data is decrypted transparently to the user. From the article that Jerry cited, it appears the issue is solely one of authenticating the user.

                Now, the article may be wrong…..

            • Adam M.
              Posted February 20, 2016 at 8:41 pm | Permalink

              It may be as easy as commenting out a line of code (on the older iPhone 5c, anyway), but the phone will refuse to run the new software unless it’s cryptographically signed with Apple’s code-signing key, which is very well-protected. So practically speaking, it can’t be done by a random developer, or by the FBI.

              • Scott Draper
                Posted February 21, 2016 at 3:47 pm | Permalink

                I didn’t say that it could be done by a random developer. I know that it certainly can’t. It must be done by someone who works for Apple in the operating systems group.

              • GBJames
                Posted February 21, 2016 at 4:04 pm | Permalink

                Actually, you said “any iOS developer,” which rather includes them. So we don’t even need to put in the effort to crank up the random number generator.

        • Scott Draper
          Posted February 20, 2016 at 2:33 pm | Permalink

          I’m referring to developers that work on iOS, not developers that write applications for iOS.

    • Ken Phelps
      Posted February 20, 2016 at 1:27 pm | Permalink

      “Consider the problem an Apple employee living in China or Iran will face when the secret police ask them for the same service, and also mention that they know where her daughter goes to school?”

      If we are assuming that it is possible to create the required code, what is to stop foreign governments from blackmailing employees right now? How would Apple doing what everyone seems to believe they can already do change the current situation of an Apple employee?

      Given Apple’s ongoing intrusiveness into the products and services its customers purchase, this seems to me to be little more than hypocritical virtue signalling.

      • Ken Phelps
        Posted February 20, 2016 at 1:34 pm | Permalink

        …on Apple’s part.

    • Ken Kukec
      Posted February 20, 2016 at 6:34 pm | Permalink

      If that’s the case, Apple should perhaps decline to do business in renegade, outlaw countries where its employees are subject to coercion by the secret police — or, at the least, negotiate agreements with the countries where it does business prohibiting its employees from being subject to such coercion.

      In any event, I fail to see what the circumstances you hypothesize have to do with the case under consideration. Apple employees are, or are not, subject to such foreign coercion regardless how the district court rules in this case — and, if they are, what prevents their being compelled by those foreign governments to do precisely what the court order in this case proposes to compel them to do?

      • infiniteimprobabilit
        Posted February 20, 2016 at 7:09 pm | Permalink

        Because up till now, such an employee could say to the police “It’s impossible. It’s uncrackable by design” and Apple’s own specifications would support it.

        But now the foreign secret police will suspect otherwise. And if the FBI wins and Apple succeeds, they will know otherwise.

        cr

        • Ken Kukec
          Posted February 20, 2016 at 9:11 pm | Permalink

          And you suppose that the foreign secret police are going to accept the Apple employee’s first answer? How many fingernails is the employee prepared to give up before admitting that a work-around can be fashioned? And how is this circumstance different than if the employee tells the interrogator that, yeah, we did this one time before in the States, but we had to destroy the program pursuant to a court order?

          • infiniteimprobabilit
            Posted February 21, 2016 at 12:46 am | Permalink

            Foreign secret police have access to the Internet too, you know. And they’re not stupid.

            If all the information on the net says “iphone *cannot* be cracked” then they will know there’s no point trying to make some poor Apple employee do the impossible. And something the Apple employee ‘knows’ is impossible.

            But if they – and he – know it can and has been done then they are likely to try harder.

            cr

            • Ken Kukec
              Posted February 21, 2016 at 2:56 am | Permalink

              And you don’t think there is sufficient information on the internet right now, including the reports of this case, to at least suggest to our hypothetical brutal secret police interrogator that the decrypt is feasible enough to press forward with the torture?

              In any event, it makes for piss-poor US public policy to fashion US law based upon how some hypothetical brutal foreign dictator and the goons in his hypothetical secret police might react (especially inasmuch as we have no control over what they might do anyway).

              • infiniteimprobabilit
                Posted February 21, 2016 at 4:36 am | Permalink

                “And you don’t think there is sufficient information on the internet right now”

                Well, there is *now*

                Hopefully the secret policeman is intelligent enough to realise that some poor Apple tech doesn’t have a hope in hell of duplicating that.

                Since Apple is a global corporation, and what it does has influence in many other countries, it makes for piss-poor public policy to force it to do things based solely on domestic considerations and completely ignore the possible consequences for anyone else. That may not be a legalistic point of view, but IMO it’s a moral one, and ignoring the effect of your own policies on other countries is part of the reason why you get foreign terrorists wanting to ‘strike back’, IMO.

                cr

              • Ken Kukec
                Posted February 21, 2016 at 7:03 am | Permalink

                So what’s the crucial distinction you’re seeing between the secret police thug holding our hapless Apple tech hostage while demanding that the home office supply him with an original decrypt … and the thug holding the hapless tech hostage while demanding that Apple provided him a decrypt after Apple had once before conducted such a decrypt pursuant to a trial court order issued in California?

              • infiniteimprobabilit
                Posted February 21, 2016 at 5:57 pm | Permalink

                “So what’s the crucial distinction”

                Because now (thanks to the FBI) the SP will *know it can be done*. Whereas previously everyone ‘knew’ it couldn’t be done, therefore no point in leaning on Apple employees.

                Is there something peculiarly difficult to grasp about that point?

                cr

  10. Scott Draper
    Posted February 20, 2016 at 12:58 pm | Permalink

    I’m OK with the FBI’s request to Apple, but I think it should pay Apple for the cost of developing the software. Something more than the cost of labor, in order to discourage whimsical requests.

    I’m skeptical of the description of what they’d have to do, though. Merely removing the password attempt limit seems silly; as long as you’re modifying the code of iOS, you might as well remove the password checking feature completely. This might be as simple as commenting out a line of code. I do this all the time when working on software that requires a login. It really slows down debugging to have to keep typing in credentials.

    This probably would take less than an hour of a developer’s time, although you’d have to factor in how long it would take to generate a new iOS build, test and deploy it.
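
    To make the “comment out the check” shortcut concrete, here is a minimal, purely hypothetical sketch in Python — none of these names correspond to real iOS code, and the replies below explain why the shortcut doesn’t help when the passcode feeds the encryption key:

    # Generic sketch of disabling a login gate during debugging (not real iOS code).
    import hashlib

    def verify_passcode(entered: str, stored_hash: str) -> bool:
        return hashlib.sha256(entered.encode()).hexdigest() == stored_hash

    def open_session(entered: str, stored_hash: str) -> str:
        # Normal behaviour: refuse access unless the passcode verifies.
        # The debugging shortcut is simply to disable this gate:
        # if not verify_passcode(entered, stored_hash):
        #     raise PermissionError("wrong passcode")
        return "session opened"

    print(open_session("0000", hashlib.sha256(b"1234").hexdigest()))  # opens despite a wrong passcode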

    • Posted February 20, 2016 at 1:53 pm | Permalink

      That wouldn’t help. It’s not that there’s a password on the phone, it’s that the data is encrypted. Without the code, it’s impossible to read anything.

      • Scott Draper
        Posted February 20, 2016 at 2:33 pm | Permalink

        That isn’t what the article says. It says that the FBI wants Apple to remove the limitation on passcode guesses. The data encryption isn’t necessarily relevant. Many systems encrypt data, yet automatically decrypt it once the system is accessed by an authenticated user.

        The data encryption merely means that you can’t bypass the operating system to get directly at the data.

      • Adam M.
        Posted February 20, 2016 at 8:46 pm | Permalink

        The data is encrypted, but the key is generated based on the user’s PIN and/or password. Most people just use a 4-digit PIN to lock their phones, and it would be easy to generate all the keys corresponding to the possible PINs. The only problem is that the phone will erase the data after 10 incorrect tries, and that auto-erase feature is what the FBI wants disabled.
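
        A rough sketch of why a 4-digit PIN is trivial to brute-force once the retry/erase protections are gone. The key-derivation scheme below (PBKDF2 over a hypothetical per-device salt) is illustrative only, not Apple’s actual construction:

        import hashlib

        device_salt = b"hypothetical-per-device-salt"

        def derive_key(pin: str) -> bytes:
            # Stretch the PIN into an encryption key (illustrative parameters).
            return hashlib.pbkdf2_hmac("sha256", pin.encode(), device_salt, 100_000)

        target_key = derive_key("0042")   # pretend this key decrypts the data

        # Without a retry limit, the whole 4-digit space is at most 10,000 guesses.
        for candidate in (f"{n:04d}" for n in range(10_000)):
            if derive_key(candidate) == target_key:
                print("PIN recovered:", candidate)
                break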

    • chigaze
      Posted February 20, 2016 at 4:39 pm | Permalink

      The passcode is used as part of the encryption key for the device. They can not bypass it as it is needed by the OS to decrypt the data.

      • Scott Draper
        Posted February 20, 2016 at 5:15 pm | Permalink

        Ok, that was the only scenario I could imagine where the actual passcode was needed. The only reason I thought that unlikely was that everything would need to be re-encrypted when the passcode was changed.

        Even so, this is still a “comment out” solution, assuming the FBI really has the means to try all permutations of the passcode.

        • eric
          Posted February 20, 2016 at 8:20 pm | Permalink

          I don’t think this is an argument over who should pay for the procedure. Apple’s argument is they aren’t obligated to do it at all under current law, and they won’t do it until legally ordered by a court to do so. You are almost certainly right that the resources required to do the hack are in the noise for a company like Apple.

          • Jeff Ryan
            Posted February 20, 2016 at 9:10 pm | Permalink

            They HAVE been ordered to do so. Read the article. “A federal judge ordered Apple to create the software…”

            Just what is unclear about this? We are not talking about whether there is a court order. There is.

          • Scott Draper
            Posted February 20, 2016 at 9:58 pm | Permalink

            I was more addressing the issue of “What is the right thing to do?” rather than the specific legal arguments.

            I don’t buy Cook’s public objections. This doesn’t risk any other user’s data. As for the software “escaping” into the wild, it could easily be written to work on one particular phone or have an expiration date.

            After a crime has been committed, the authorities ought to have the ability to search their stuff. This just seems fundamental.

        • chigaze
          Posted February 22, 2016 at 8:14 pm | Permalink

          Even if removing the 10-try limit were a “comment out solution” – and not knowing all the implementation details means that’s unclear – there is also the issue of creating a version of the OS that will install into the firmware and bypass the installed OS. That is definitely not a “comment out solution”.

          • Scott Draper
            Posted February 22, 2016 at 8:35 pm | Permalink

            Close enough. Creating a deployment is part of all software development. Let’s not be pedantic.

            • chigaze
              Posted February 22, 2016 at 9:49 pm | Permalink

              No let’s be accurate.

              This is not a trivial operation and requires knowledge only Apple has. In particular they have to create a version of the OS that runs in the firmware rather than off primary storage.

              This is not a matter of commenting out a few lines of code and recompiling as you imply. This is about creating a specialized, stripped down version of iOS that loads into the firmware and bypasses the existing OS on the device.

              Further, this OS has to have Apple’s private keys that allow it to be loaded onto the device in the first place. Therein lies the rub as those keys are worth more than a mint if they ever leaked. Even a basic zero day hack of an iPhone sells for a million dollars plus on black markets.

              Granted they wouldn’t be easy to extract from the SIF but if the SIF ever leaked there’d be a huge expenditure of time and resources put to peeling it open. This is the risk Apple is unwilling to take.

              Do note Apple has aided law enforcement extensively up until this point. This is just the line they are unwilling to cross as it has the potential to compromise security on millions of devices.

              • Stephen Barnard
                Posted February 22, 2016 at 10:01 pm | Permalink

                You’ve put your finger on the problem. To do what the FBI asks, Apple would have to sign off on the firmware update with their digital signature, which is the most closely held secret in the company. It’s so closely held that it’s very unlikely that one person has access to the whole thing, and there’s probably an elaborate procedure to apply it. It’s equivalent to the magic master key to unlock nearly all of Apple’s products.
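
                A toy illustration of that gatekeeping role, using a made-up Ed25519 key pair rather than Apple’s real signing infrastructure — the only point is that whoever holds the private key decides what firmware a device will accept:

                from cryptography.exceptions import InvalidSignature
                from cryptography.hazmat.primitives.asymmetric import ed25519

                vendor_private_key = ed25519.Ed25519PrivateKey.generate()   # held only by the vendor
                vendor_public_key = vendor_private_key.public_key()         # baked into every device

                firmware_image = b"custom build with the retry limit removed"
                signature = vendor_private_key.sign(firmware_image)         # only the key holder can produce this

                def device_will_install(image: bytes, sig: bytes) -> bool:
                    try:
                        vendor_public_key.verify(sig, image)                # raises if the signature doesn't match
                        return True
                    except InvalidSignature:
                        return False

                print(device_will_install(firmware_image, signature))                    # True
                print(device_will_install(firmware_image + b" tampered", signature))     # False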

  11. BobTerrace
    Posted February 20, 2016 at 1:08 pm | Permalink

    I am with Apple on this one. The alternative I see is to write the code, decrypt the phone, delete the software and then eliminate all those involved so it can’t be reproduced.

    The government can not be trusted.

    • Jeff Ryan
      Posted February 20, 2016 at 1:26 pm | Permalink

      And Apple can?

      • BobTerrace
        Posted February 20, 2016 at 1:32 pm | Permalink

        So far, yes. The government, no

        • Jeff Ryan
          Posted February 20, 2016 at 2:15 pm | Permalink

          I wouldn’t trust Apple, or any other high-tech company. They have demonstrated time after time that they don’t give a crap about this country.

          • Stephen Barnard
            Posted February 20, 2016 at 2:22 pm | Permalink

            Apple designs products to be secure. It’s a selling point. They WANT the products to be secure. It’s something that differentiates them from Android and other products. The government is trying to force them to make the products less secure.

            I think your mistrust is misplaced.

            • Jeff Ryan
              Posted February 20, 2016 at 4:09 pm | Permalink

              I’m sorry, but I simply can’t accept that a private corporation’s selling strategy is more important than public safety. I don’t really give a toss about Apple’s bottom line. I do care about people being killed.

              Call me crazy…

              • Stephen Barnard
                Posted February 20, 2016 at 4:20 pm | Permalink

                Apple isn’t asking anyone to trust them. That would be the government.

              • Jeff Ryan
                Posted February 20, 2016 at 4:44 pm | Permalink

                So, I can vote out Apple’s leadership? I didn’t know that.

            • Adam M.
              Posted February 20, 2016 at 8:51 pm | Permalink

              Many companies claim their products are secure, sure, but almost no companies actually make secure products. iPhones are not secure – not even the latest ones. Apple doesn’t put effort into making truly secure products. They put effort into making the public believe they make secure products, and it’s that perception they’re trying to protect.

              • Posted February 24, 2016 at 8:58 pm | Permalink

                I’ve been going back and forth on this topic, and I’ve been thinking the same thing as you.

                With the advent of smart phones and the popular move away from traditional laptops and computers, smartphones now contain most people’s lives and secrets.
                Asking Apple to give up something they claim sets them apart from the ratpack would be tantamount to cutting them off at the knees.

              • Stephen Barnard
                Posted February 24, 2016 at 9:06 pm | Permalink

                Apple has done more than any other phone manufacturer — far more — to make their phones secure.

                Security is hard, but look at the extremes the FBI has to go to to crack an old-generation iPhone. They can’t do it. They need Apple’s help.

                If the FBI wins this case the new generation iPhones will become uncrackable even by Apple.

              • Diana MacPherson
                Posted February 24, 2016 at 9:26 pm | Permalink

                As a former BlackBerry employee, I have to disagree that Apple has done far more with security than any other manufacturer.

              • Posted February 25, 2016 at 4:33 am | Permalink

                “Than any other POPULAR phone manufacturer” then … 😬

                /@

          • jaxkayaker
            Posted February 20, 2016 at 2:58 pm | Permalink

            Apple can’t arrest me, the government can. Who should I trust?

            • Jeff Ryan
              Posted February 20, 2016 at 3:47 pm | Permalink

              The government must justify what it does publicly. It must follow the Constitution and federal and state law. It must allow you to take it to court, to challenge its actions, to seek redress, and provide due process.

              Must Apple? Yeah, right.

              • JBS
                Posted February 20, 2016 at 10:33 pm | Permalink

                “The government must justify what it does publicly.” It absolutely does not. I work in a law firm that regularly tries cases against the federal government. They do not play fair by any means. And do you know why that is? Because they’re the government and they know they can get away with it.

              • Jeff Ryan
                Posted February 20, 2016 at 10:35 pm | Permalink

                Yes, the fed can be a bitch. But there are remedies, and the truth usually comes out.

                I was a state prosecutor, and there’s little discouraging that you could say about the fed that I haven’t experienced before.

          • I.V.
            Posted February 20, 2016 at 3:22 pm | Permalink

            One can decide freely whether or not to buy Apple products, but generally not whether one is subject to a government’s jurisdiction.

            • Jeff Ryan
              Posted February 20, 2016 at 3:25 pm | Permalink

              Yeah, pity about that. Governments and all. You know, beholden to all the people, and not just the CEOs.

              Though I see it isn’t stopping Apple when it comes to tax evasion.

          • Ken Kukec
            Posted February 20, 2016 at 6:45 pm | Permalink

            I’m not seeing what “trusting Apple” has to do with the outcome of this case. Regardless what the district court does here, Apple could presumably create the software program in question all on its own, and then sell it to the highest bidder (or to all comers) if that’s what it was of a mind to do.

            Not trusting Apple may be a good reason why we shouldn’t own Apple phones, but it doesn’t seem relevant to this particular case.

            • Stephen Barnard
              Posted February 20, 2016 at 6:50 pm | Permalink

              Trusting Apple has nothing to do with the case. Jeff Ryan simply doesn’t like Apple and that colors his arguments.

              • Jeff Ryan
                Posted February 20, 2016 at 7:10 pm | Permalink

                You are quite wrong about that. I couldn’t care less what company is involved.

              • Stephen Barnard
                Posted February 21, 2016 at 7:35 am | Permalink

                Then why do you bring up the issue of taxes? That has nothing to do with this.

  12. Posted February 20, 2016 at 1:10 pm | Permalink

    Once Apple has made the software, it will receive orders to unlock phones from other criminal cases. So no, they won’t be able to destroy the software. And once the software exists, they can no longer argue that it’s a burden on them to help the police. They’d be stupid to develop the software without being forced to do it.

    • Ken Phelps
      Posted February 20, 2016 at 1:44 pm | Permalink

      “Once Apple has made the software, it will receive orders to unlock phones from other criminal cases”

      So what? Authorities can get a warrant to crawl under your bed, paw through your financial records, or read all your mail. As well they should, if they are to investigate criminal acts. Why should the existence of a new bit of technology derail the properly warranted exercise of that balance between rights and responsibilities?

      • BobTerrace
        Posted February 20, 2016 at 2:19 pm | Permalink

        “Authorities can get a warrant to crawl under your bed, paw through your financial records, or read all your mail.”

        No they don’t. Not without a reason and usually not without a judge’s order.

        • jaxkayaker
          Posted February 20, 2016 at 2:45 pm | Permalink

          This is a non sequitur because Ken already specified getting a warrant. Further, we know that the NSA and FBI warrant requests are rubber stamped by FISA courts, if they even bother to request that sort of permission. Other courts do much the same, but FISA courts are secret, and you don’t even get the opportunity to challenge the request for access.

          • Ken Kukec
            Posted February 20, 2016 at 7:01 pm | Permalink

            If our evil and corrupt government is in cahoots with evil and corrupt corporations, then no procedural safeguards or constitutional protections are worth the paper they’re written on — and we’re off into Milo Minderbinder territory. If that’s the case, it hardly matters what the district court or Apple decide to do here.

        • Ken Kukec
          Posted February 20, 2016 at 6:55 pm | Permalink

          And procedural safeguards at least as stringent as probable cause found in a court order can be required here. Why aren’t such safeguards adequate to protect the phone information at issue in this case?

          Is there something so special about such information that it warrants different treatment, let alone an absolute exemption from any type of disclosure under all circumstances for ever?

  13. rudolphpaul
    Posted February 20, 2016 at 1:14 pm | Permalink

    “in the future, Apple could, I’m told, even create an iPhone whose passcode could never be hacked by any software, something that seems perfectly legal.” If, as you say, a hack proof phone could be built in the future, that undermines the argument for government access now. If it can’t be hacked in the future why hack it now? It just avoids answering the question today. My worry is that China, Russia, Iran, etc. would use this hack to crush dissidents in those countries, leading to a far greater potential death toll than what happened in San Bernardino.

  14. alexandra
    Posted February 20, 2016 at 1:16 pm | Permalink

    I know how you feel about the New Yorker – even so, please read this and review it?

    Amy Davidson, “The Dangerous All Writs Act Precedent in the Apple Encryption Case,” The New Yorker, February 19, 2016.

  15. jaxkayaker
    Posted February 20, 2016 at 1:23 pm | Permalink

    What’s the point in decrypting the data of a dead terrorist? If there is any, is it worth the violation of privacy of untold Americans by the government that will inevitably result? I doubt it, since terror cells are, well, cellular. “Compartmentalized” if you prefer. The rule of law is great, the government should try following it sometime. Kudos to Apple from a libertarian-leaning Android user. By the way, what law exactly is the basis for the government’s demand and how do they square it with the fourth amendment?

    • Jeff Ryan
      Posted February 20, 2016 at 2:18 pm | Permalink

      Farook is dead, so there is no one with Fourth Amendment standing. So the Fourth Amendment is irrelevant.

      • jaxkayaker
        Posted February 20, 2016 at 2:41 pm | Permalink

        Everyone else who uses smartphone encryption and is alive has standing with respect to their own privacy, though. Also, I generally find the standing issue to be bogus, a pretext for the government to ignore constitutional restrictions on its authority.

        • Jeff Ryan
          Posted February 20, 2016 at 2:56 pm | Permalink

          “Standing” is as old as this country. It is, in fact, a constitutional mandate.

          And no, anyone who uses a smart phone does not have standing here. The information being sought is Farook’s. The argument you’re making simply doesn’t hold water as a legal fact. And if this were 1791, it still wouldn’t hold water.

          • jaxkayaker
            Posted February 20, 2016 at 3:03 pm | Permalink

            As I said, I don’t believe in standing. I don’t care how old it is. But feel free to point it out in the constitution.

            Even if other users don’t have standing in this case (which I’m well aware they don’t), that doesn’t mean this case is irrelevant to other users’ fourth amendment rights, in part because it sets a bad precedent, as well as forcing the development of the ability to violate privacy.

            • Jeff Ryan
              Posted February 20, 2016 at 3:09 pm | Permalink

              Article 3, sec. 2 refers to cases and “controversies.” A controversy is between parties, and it ain’t your phone what’s bein’ searched.

              You have no right to privacy in whom you communicate with if that person keeps a record of it. Which is exactly what happens when you call someone’s phone. You have no right to tell anyone you communicate with that they must delete a record of that, or that the phone company must do so just because you don’t like it. Don’t want anyone to know? Don’t call them.

            • Jeff Ryan
              Posted February 20, 2016 at 3:27 pm | Permalink

              You might not believe in standing, but it exists. And frankly, when it comes down to public safety, there are times it will have to yield to the national interest.

              • infiniteimprobabilit
                Posted February 20, 2016 at 4:35 pm | Permalink

                “when it comes down to public safety, there are times it will have to yield to the national interest.”

                ‘The national interest’? What? As interpreted by the FBI?

                Isn’t public safety (including the right to be safe from unwarranted search and seizure) in the national interest, then?

                What bloody use is the ‘national interest’ if it doesn’t incorporate the rights of its citizens?

                cr

              • jsrtheta
                Posted February 20, 2016 at 4:57 pm | Permalink

                Is that an interest that trumps not being killed?

                You want to make that argument to victims? To a court? ‘Cause you’ll lose.

                Go ahead. Ask people if Apple’s right to privacy is more important than people being shot. Let me know what they say.

  16. anon_tech_person_9
    Posted February 20, 2016 at 1:29 pm | Permalink

    I side with Apple, for a few reasons. The FBI lost access to the device by its own doing, by asking its owner to reset the online password. The device was owned by a local government agency and they already have backups of the device as recent as 6 weeks before the attack. All of the useful metadata about the device’s usage (who texted whom, etc.) is already available to the FBI from the cell carriers. The attackers are dead, so there’s no criminal case to prosecute, and being a work phone, the device most likely wasn’t used for planning the attack.

    So the FBI doesn’t really need this phone, it needs a poster child case to score easy points with the courts, legislature, and public. Supposedly dozens of other requests for Apple intervention are waiting on the outcome of this case. The ripple effects of the precedent of compelling a company to produce new technology to hack its own products will be huge, and aren’t even necessary given the metadata spying tools already in use by security agencies. Making this tool for the FBI is likely to cause a trickle-down effect where similar orders of assistance are eventually used in petty drug cases. Also, contrary to the suggestions of other comments, the global consequences of this case cannot be ignored. Apple is an international corporation, and the US economy is an international economy with a small minority of the total world population. We don’t want Apple to create technology that can later be used against a US citizen by a foreign government.

    From a technical view, the burden on Apple would be large. The phone’s data is almost certainly encrypted, otherwise the FBI could just take the device apart and copy the memory chips directly. So it’s not just a matter of commenting out a single line of code. Disabling the device reset feature after 10 attempts and disabling the delay between attempts also require creating new ways for the FBI to try passwords, and the behavior of that code needs to be verified carefully, which takes time. Making an OS version that only runs in RAM without writing to the phone’s long term storage would also be a massive burden to Apple (necessary to maintain forensic integrity). Making a version of their OS that only works on a single phone, in a way that prevents trivially extending it to other phones is not simple either. iOS updates need to be cryptographically signed, a slow process due to the need to prevent the signing key from being leaked. Finally, once a technology exists, it can’t really be made to cease existing.

    Sorry for the wall of text; typing this on a phone.
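
    For non-programmers, a rough sketch of the two software-enforced protections described above: escalating delays between guesses and a wipe after ten failures. The numbers and structure are illustrative, not Apple’s implementation:

    import time

    MAX_ATTEMPTS = 10
    failed_attempts = 0

    def wipe_device() -> None:
        # In reality the effective "wipe" is discarding the encryption key.
        print("erasing device data")

    def try_passcode(entered: str, correct: str) -> bool:
        global failed_attempts
        if entered == correct:
            failed_attempts = 0
            return True
        failed_attempts += 1
        if failed_attempts >= MAX_ATTEMPTS:
            wipe_device()
        else:
            time.sleep(min(2 ** failed_attempts, 3600))   # back off, up to an hour
        return False

    print(try_passcode("1234", "1234"))   # True; a correct guess resets the counter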

    • Diana MacPherson
      Posted February 20, 2016 at 1:58 pm | Permalink

      +1 especially the part about the availability of information on external servers. All texts etc. are fairly easy to obtain. This is where the “locked room” analogy Sam Harris makes misses the point. Most of that information already left the room and Apple has complied with court orders to provide that data.

      • Ken Kukec
        Posted February 20, 2016 at 7:15 pm | Permalink

        What’s so special about the information on this phone as to distinguish it from our “persons, houses, papers, and effects” or the information that has traditionally been stored on mobile phones and routinely subject to disclosure pursuant to a showing of probable cause and a court order?

        In particular, why does this information merit an absolute exemption from disclosure for ever and all time, under all conceivable circumstances, no matter how exigent the countervailing need for disclosure?

        • Jeff Ryan
          Posted February 20, 2016 at 8:47 pm | Permalink

          What makes it so offensive to them is that it’s an iPhone, and Apple is a hero to many.

          What also makes it offensive is a knee-jerk reaction to the government doing anything that might hurt their feels. They seem to think that the Fourth Amendment means they have immunity from government action of any kind. When it is in fact a blueprint for the government to legitimately conduct an investigation within the confines of the Constitution.

        • Diana MacPherson
          Posted February 20, 2016 at 9:39 pm | Permalink

          Good grief, I don’t know why the harping on the information. We who side with Apple oppose creating a program to compromise security that is used on all iPhones. Apple doesn’t have to create a program that compromises the security that protects the users of its products from identity theft when they comply with warrants to hand over email data from iCloud. This is what is special about this request. It is the government compelling Apple to write software that defeats the security of their devices.

          • Diane G.
            Posted February 21, 2016 at 3:14 am | Permalink

            Indeed, it’s not about “what’s so special about the information on this phone,” it’s about the precedent.

          • Ken Kukec
            Posted February 21, 2016 at 3:39 am | Permalink

            Because it’s the information on the telephone that ends up being made sacrosanct. And why? Apparently for the same reason a dog licks its privates — because it can. 🙂

            We’ve now apparently discovered how to make the information on our cell phones irreversibly non-decryptable, so apparently we must — and the law and the needs of the public be damned.

            That’s the result anyway.

            • Diana MacPherson
              Posted February 21, 2016 at 11:37 am | Permalink

              No. Again. I don’t know why you aren’t hearing this. It’s not the information. NOT THE INFORMATION.

              Apple has already handed over information on the phone from iCloud backups save for a few months.

              The mobile company has already handed over INFORMATION in the form of SMS texts and calls from the phone.

              Get it? NOT ABOUT THE INFORMATION!!!!

              What is different in this case is creating a program to weaken, nay, completely disable the security on the iPhone. A program that could easily be used to weaken, nay completely disable the security on all iPhones. And I might add, in a precedent setting way.

              • Posted February 21, 2016 at 12:20 pm | Permalink

                Indeed, if I may suggest…Apple should insist that it will not do anything for the FBI, but that, if the FBI can get a judge to authorize armed FBI agents to seize Apple’s development computers, including those which hold the master software signing keys and the like, Apple won’t shoot back at the agents.

                Because that’s really what the FBI is demanding: the software keys to the Apple kingdom. Sure, they just want to borrow the keys for a little bit and promise not to make any copies of them or sneak any picture of them so they can hand-carve copies later or anything like that. But, really, that’s what it amounts to.

                And if that’s what they’re demanding, the only quasi-legitimate way for them to do it is by actually seizing the physical property itself, rather than coerce Apple employees into doing their dirty work for them.

                And if the FBI doesn’t know what to do with the loot once they’ve hauled it away? That’s their problem — same as if they seize a safe maker’s manufacturing blueprints and still can’t figure out how to crack the safe.

                b&


    • Stackpole
      Posted February 20, 2016 at 2:06 pm | Permalink

      Encrypted, I trust…

    • Scott Draper
      Posted February 20, 2016 at 3:06 pm | Permalink

      “data is almost certainly encrypted, otherwise the FBI could just take the device apart and copy the memory chips directly. So it’s not just a matter of commenting out a single line of code”

      Many systems encrypt the data they use, but once a user is authenticated, the data is decrypted transparently to the user. From the article that Jerry cited, it appears the issue is solely one of authenticating the user. Commenting out code generally fixes authentication issues.

      • Posted February 21, 2016 at 3:45 am | Permalink

        If it’s so easy to just comment out code, let the FBI do it themselves.

        • Scott Draper
          Posted February 21, 2016 at 3:48 pm | Permalink

          It has to be done by someone who has access to the iOS source code and can generate a build that can be uploaded to an iPhone. Only Apple can do this.

    • Adam M.
      Posted February 20, 2016 at 9:03 pm | Permalink

      “So the FBI doesn’t really need this phone, it needs a poster child case to score easy points…”

      I agree that this is a part of it.

      “Making this tool for the FBI is likely to cause a trickle-down effect where similar orders of assistance are eventually used in petty drug cases”

      Apple used to routinely assist with unlocking phones in the past, including in petty drug cases. Recently they stopped cooperating, saying it would be ‘damaging to their brand’. Apple is resisting for the sake of their brand in this case, too.

      Technically it wouldn’t be hard. Disabling a 10-retry check is just commenting out a line. Making the update only run on one phone is just comparing two device IDs. Since the binary is signed, you couldn’t simply change the ID used for comparison, because that would invalidate the signature. Apple is lying when they say the software would allow the FBI to hack any phone.
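
      A toy version of that device-binding argument, again with a made-up Ed25519 key and a hypothetical device ID — editing the embedded ID after signing breaks the signature, so the image only runs on the phone it names:

      from cryptography.exceptions import InvalidSignature
      from cryptography.hazmat.primitives.asymmetric import ed25519

      signing_key = ed25519.Ed25519PrivateKey.generate()
      verify_key = signing_key.public_key()

      image = b"unlock tool; runs only on device_id=ABC123"
      signature = signing_key.sign(image)

      def accepted_by(device_id: bytes, image: bytes, sig: bytes) -> bool:
          try:
              verify_key.verify(sig, image)       # the image must be untouched...
          except InvalidSignature:
              return False
          return device_id in image               # ...and must name this particular device

      print(accepted_by(b"ABC123", image, signature))                                   # True
      print(accepted_by(b"XYZ789", image, signature))                                   # False: wrong device
      print(accepted_by(b"XYZ789", image.replace(b"ABC123", b"XYZ789"), signature))     # False: the edit broke the signature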

    • FiveGreenLeafs
      Posted February 20, 2016 at 10:32 pm | Permalink

      +1 from me as well.

      I would like to stress this,

      “Finally, once a technology exists, it can’t really be made to cease existing.”

      and,

      “So the FBI doesn’t really need this phone,”

      As is also argued in this article in Slate:
      http://www.slate.com/articles/technology/future_tense/2016/02/the_apple_fbi_encryption_battle_is_over_an_iphone_unlikely_to_yield_critical.html

      • Ken Kukec
        Posted February 21, 2016 at 3:49 am | Permalink

        “Finally, once a technology exists, it can’t really be made to cease existing.”

        Great. Can’t wait until nuclear proliferation leads to the first doomsday machine.

        You’ll find me in the screening room watching Kubrick’s Dr. Strangelove again. 🙂

    • JBS
      Posted February 20, 2016 at 10:37 pm | Permalink

      Bravo. Thank you for a great explanation.

    • Posted February 21, 2016 at 4:04 am | Permalink

      This is very helpful insight.

  17. Posted February 20, 2016 at 1:39 pm | Permalink

    Several thoughts, as usual:

    1. It may not be as easy as the government seems to imagine to create new software in a timely manner to overcome software attributes for a system built upon an existing platform. Once a system is built on certain established precepts, it can be extremely difficult to modify certain aspects of it. Having been a user specialist involved in the development of a new system, I heard this from programmers all the time, it seemed.

    2. The government has clandestinely been collecting information from landline telephone companies via equipment in central offices for many years before the populace finally became aware of it. If, now, both landline and cell phone companies must have mechanisms in place to provide private call records to the government for potential evaluation of hazardous content, what next? Will all forms of communication be required to have back-doors built in to permit access by the government during or after individual conversations about perceived acts of violence discussed over the phone?

    When will the government decide to listen to and/or observe everything done by all citizens before crimes are committed? (Like cameras on the streets, etc., that don’t prevent crimes but provide evidence after the fact for prosecution). I don’t want bombs exploding, but I don’t want greater erosion of our freedoms either. I still want us to be “innocent until proven guilty” and not have an even bigger “Big Brother” than we already have. I don’t want the government to grow out of all proportion to the populace in order to monitor all that citizens say and do.

    3. Apple is particularly noted for the security mechanisms on their phones that protect the user. That is one of the major reasons people buy Apple phones. It could cause a big loss of customers for Apple if this security is diminished or done away with.

    4. Our government already has too many sub rosa agencies, groups and individuals performing actions without our knowledge or approval that are purported to be for the safety of US citizens. We don’t get to have a say on these actions taken by such agents supposedly on our behalf. Many of them, we wouldn’t condone if asked for our opinions.
    Enough.

  18. Eli Hershkovitz
    Posted February 20, 2016 at 1:41 pm | Permalink

    Jerry and the FBI. Period.

  19. Jeff Ryan
    Posted February 20, 2016 at 1:42 pm | Permalink

    I think this needs to be looked at for what it is: A criminal investigation.

    What we have here is really nothing more than a compelling government interest v. a for-profit corporation that has shown little concern about any country’s best interests.

    Remember, Farook is dead, so there is no one with legal standing to object to a search of the phone.

    It is a locked room situation. Only one person (Apple) has the “key” (in this case, the ability to perfect entry). There may be extremely important evidence and leads in the locked room (the phone). Setting aside the standing issue, the government has sufficient probable cause and a compelling governmental interest in learning what’s on the phone. Don’t forget, we know one of Farook’s friends was also a homegrown jihadi. It is hardly unreasonable to suspect there are more. And if there isn’t, the government has invaded the privacy of a dead man.

    The legal remedy has been very narrowly and carefully drawn. Certainly more so than any corporation would bother with. Apple is simply not being a good citizen. Their argument hinges on a danger to their profits. The government’s hinges on the safety of the country. Forgive me if I don’t give a shite about Apple. And any lawyer will tell you that what is being asked is so fact-specific that the danger of creating a precedent is laughable.

    Oh, and if someone thinks abuse of encryption is occasional, think again. ISIS, al Qaeda, and myriad other terror groups exploit encryption every hour of every day. That, to me, is not “occasional.”

    • Randy Schenck
      Posted February 20, 2016 at 2:52 pm | Permalink

      Well said and plus one.

      • Jeff Ryan
        Posted February 20, 2016 at 2:56 pm | Permalink

        Ta.

  20. jay
    Posted February 20, 2016 at 1:44 pm | Permalink

    A good security package has NO BACKDOORS period. If it’s done right, the creator of the product has no more chance of ‘hacking’ it than a hacker. I am hoping Apple did that right. A backdoor for the ‘good guys’ is a back door for anyone, including state sponsored hackers which are becoming the real threat.

    There are a couple of problems here:

    1) Once such a ‘tool’ exists, assuming it were possible, then what about Egypt demanding information on terrorists, or China, or Saudi Arabia? It’s a slippery slope that could be disastrous.

    2) Back in the 90s, the government (then too under a Democratic administration) tried to restrict encryption to that which could be ‘broken’. The result was that white hat hackers created stronger and stronger packages to defeat this menace. That will certainly start to occur again if commercial entities are forced to surrender privacy. (Look up the history of the cypherpunks, and Phil Zimmermann.) Eventually PGP went mainstream when the government backed down, but it looks like the open source underground needs to come alive again.

    • Wunold
      Posted February 21, 2016 at 3:47 am | Permalink

      “A good security package has NO BACKDOORS period. If it’s done right, the creator of the product has no more chance of ‘hacking’ it than a hacker.”

      This. And the feature in question, erasing the phone after X tries, is superfluous if the encryption is solid. The best crypto algorithms today are open source. It’s the math that protects the data, not the code.
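
      Back-of-the-envelope arithmetic for that point, with assumed figures: a strong random key needs no retry limit at all, while a 4-digit PIN is protected only by the limit:

      guesses_per_second = 1e9            # a generous offline guessing rate (assumption)

      pin_space = 10 ** 4                 # every possible 4-digit PIN
      key_space = 2 ** 256                # a modern symmetric key

      print("4-digit PIN space exhausted in", pin_space / guesses_per_second, "seconds")
      print("256-bit key space exhausted in", key_space / guesses_per_second / 3.15e7, "years")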

  21. GBJames
    Posted February 20, 2016 at 1:51 pm | Permalink

    I side with Apple. It may seem at first blush to be an issue with just one particular phone. But inevitably another and then another “perfectly reasonable” cracking will be required. It is equally inevitable that the key will eventually be found in the hands of very nefarious characters, cracking into the phones of FBI and other federal officials.

    • Ken Kukec
      Posted February 20, 2016 at 7:22 pm | Permalink

      As I asked Diana above, what’s so special about the information on this phone as to distinguish it from our “persons, houses, papers, and effects,” or the information that has traditionally been stored on mobile phones, that has routinely been subject to disclosure pursuant to a showing of probable cause and a court order?

      In particular, why does this information merit an absolute exemption from disclosure for ever and all time, under all conceivable circumstances, no matter how exigent the countervailing need for disclosure?

      • Diana MacPherson
        Posted February 20, 2016 at 9:41 pm | Permalink

        And as I answered, it’s not about the device. It’s not about the information. If the phone password could be given, Apple would give it. It’s the writing of the program that defeats the security.

        • Ken Kukec
          Posted February 21, 2016 at 4:11 am | Permalink

          And yet the self-same phone sits there on the investigator’s table, its information now rendered forever inaccessible, no longer able to reveal its crucial evidence of a pending crime or terrorist act.

          While the law stands by toothless before Apple’s precious new technology.

          Think the next-of-kin will appreciate the fine distinction?

          • Diane G.
            Posted February 21, 2016 at 4:35 am | Permalink

            (I think I just posted this with a mistyped data field. But just in case it didn’t go through, I’m reposting. Sorry if we end up with duplicates.)

            …no longer able to reveal its crucial evidence of a pending crime or terrorist act.

            Can you attest to any evidence for the probability of that?

            • Ken Kukec
              Posted February 21, 2016 at 6:37 am | Permalink

              Such a showing would be required to establish “probable cause,” which my analysis is predicated on the government having shown.

              But I don’t really much care about the outcome of this particular case. The crucial question is whether those of you who oppose Apple ever being required to decrypt these cell phones are ready to follow that logic to its logical conclusion, however dire the consequences may be. It’s certainly feasible that a suspect’s cell phone will contain crucial evidence of a crime or terrorist act. Indeed it’s rather commonplace for a suspect’s cell phone to be the single most probative item of evidence in such cases.

              Whacha gonna do then? I keep seeing people in this thread say they stand firmly with Apple, which says no, never, ain’t gonna do it — and that once the decrypt genie is out of the bottle, game over. Are you truly prepared to stand steadfast with Apple all the way down that road? So if not now, when?

          • Diana MacPherson
            Posted February 21, 2016 at 11:46 am | Permalink

            Except that all the information prior to the crime has been handed over… Except that this was a work phone and the phones used were destroyed by the terrorist. Except that, given the above, it’s highly unlikely that this phone will yield any more information, except that the FBI could, with time, hack this phone.

            It’s not a matter of a ticking time bomb.

            • Ken Kukec
              Posted February 21, 2016 at 1:03 pm | Permalink

              This case may not be, but it’s more than feasible that a case down the road will be. It’s close to certain that a criminal investigation will arise where an encrypted iPhone contains information about an ongoing criminal enterprise that is unavailable from any other source.

              What then? You still gonna stand strong with Apple that a decrypt is out of the question, lest the encryption genie slip its bottle? Damn the consequences and full speed ahead?

              Given the inevitability of the issue coming to a head, and given the possibility that it may do so under circumstances that do not lend themselves to contemplative reflection, why not get it out on the table now?

              • Posted February 21, 2016 at 2:39 pm | Permalink

                What then? You still gonna stand strong with Apple that a decrypt is out of the question, lest the encryption genie slip its bottle? Damn the consequences and full speed ahead?

                If the FBI wants to crack the encryption, they’re more than welcome to do it themselves.

                What’s entirely off-limits is for them to demand that others do their dirty work for them. Or, even worse, as many are suggesting, to demand that nobody be permitted to lock their doors so that the cops aren’t ever inconvenienced when they want to snoop around your home.

                Anybody who knows some very basic math can encrypt communications in a way that nobody, not even hyperintelligent aliens, can even theoretically crack. And, of course, even such unbreakable encryption is useless if somebody’s looking over your shoulder as you encrypt or decrypt it. What’s at issue here is whether or not we’re going to let the FBI snoop over everybody’s shoulder as they try to have a private conversation with readily-available consumer products.
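
                The “very basic math” here presumably means something like a one-time pad: XOR the message with a truly random key that is at least as long as the message and is never reused, and the ciphertext is information-theoretically unbreakable. A minimal illustrative sketch in Python (the function name and sample message are made up for illustration, not anything a vendor ships):

                ```python
                import secrets

                def otp(data: bytes, pad: bytes) -> bytes:
                    """XOR each byte of data with the corresponding pad byte; the same call decrypts."""
                    assert len(pad) >= len(data), "the pad must be at least as long as the message"
                    return bytes(d ^ p for d, p in zip(data, pad))

                message = b"meet at the usual place"
                pad = secrets.token_bytes(len(message))  # truly random, used once, then discarded
                ciphertext = otp(message, pad)
                assert otp(ciphertext, pad) == message   # applying the same pad again recovers the plaintext
                ```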

                Do you really think it’s a good idea to trust the FBI with all your private communications, and those of everybody else? Do you really trust the FBI not to let the cat out of the bag in such a way that every other foreign country and drug lord and what-not gets their hands on the technology? Do you really want to have to resort to old-fashioned means of communication when you want a bit of privacy? For that matter, do you really trust the FBI not to spy on your teenaged daughter or to do some muckraking with social and political movements they don’t care for?

                The FBI has all sorts of means at their disposal to do their jobs. They can tap the phones, they can have agents surveil suspects, they can interview associates, they can put moles in place and so on. They don’t need to have Big Brother in all our pockets, too.

                Cheers,

                b&


              • Jeff Ryan
                Posted February 21, 2016 at 2:50 pm | Permalink

                No, the Fourth Amendment doesn’t discriminate. There is no asterisk in it that says the government can search anywhere with probable cause and a warrant EXCEPT (insert name of pet data storage device here). Nor should it.

                There may be legitimate reasons to bar forcing Apple to do what the FBI wants. I can’t think of any offhand, but I am open to suggestions.

                However, to elevate smart phones to the status you advocate will, among other things, merely enable criminal behavior and frustrate legitimate law enforcement. Rarely a good thing.

                And the Supreme Court has already spoken to this to some extent by saying a warrant is necessary (barring a recognized warrant exception) to get into a cell phone. The Court recognized the novel character of smart phones. But it didn’t say they were sacrosanct. Nor should they be.

                And I predict that if Apple somehow wins this fight, you will see legislation mandating some form of decryption ability for every phone. And that legislation would stand.

              • Posted February 21, 2016 at 3:32 pm | Permalink

                No, the Fourth Amendment doesn’t discriminate. There is no asterisk in it that says the government can search anywhere with probable cause and a warrant EXCEPT (insert name of pet data storage device here). Nor should it.

                The Fourth gives the State the right to search. It most emphatically does not give the State the right to find.

                b&


              • Jeff Ryan
                Posted February 21, 2016 at 4:09 pm | Permalink

                It kind of does. But more importantly, is that a good public policy? I suggest that it would be a disaster.

                It’s time that corporations start acting like citizens.

              • Posted February 21, 2016 at 4:23 pm | Permalink

                We already know that what you propose is an even worse disaster. Much, much worse, for you propose to put the police in charge of deciding what they should and shouldn’t be able to do, and every prior example of such is an over-the-top example of tyranny.

                Indeed, the very word “tyranny” derives from the ancient Greek term for an authoritarian ruler, originally without any pejorative connotation whatsoever. The reason both “tyranny” and even “authoritarian” are considered such evil terms today is the millennia of abuses of authority since then.

                It’s only been when authority has been restrained that civilization has flourished.

                So why are you so eager to side with the tyrants? You might fear the petty criminals, but it’s those in power who present the true danger.

                b&


              • Jeff Ryan
                Posted February 21, 2016 at 4:30 pm | Permalink

                You know, it’s obvious that you wouldn’t know actual “tyranny” were it to walk up and shake your hand. You dishonor those who have actually experienced tyranny. Your statement is ridiculous.

                You seem to argue that government cannot exercise its powers in a legitimate way.

                Or put it this way: Government isn’t forcing you to own a smart phone.

              • Posted February 22, 2016 at 9:42 am | Permalink

                Government isn’t forcing you to own a smart phone.

                Seriously?

                You honestly think that, in this day and age, it’s permissible for the State to tell me my choice is to withdraw from society or give them unfettered access to my communications?

                You know what?

                The Government isn’t forcing you to live in your own home, either. You think that makes it okay for them to storm in any time they feel like it? Does the fact that you have to pay property taxes give them free rein to watch you in the shower?

                b&


              • GBJames
                Posted February 21, 2016 at 4:44 pm | Permalink

                This isn’t really about smart phones or whether the government is forcing you to buy one. It is about computing devices. It is about the iMac I use to make my living. It is about the computers that run (nearly) all aspects of modern life. It is about much more than an iPhone.

              • Jeff Ryan
                Posted February 21, 2016 at 4:45 pm | Permalink

                Erm, but name one item the law says can never be searched. Just one.

  22. Posted February 20, 2016 at 1:54 pm | Permalink

    I don’t inherently trust either the FBI or Apple. My understanding of the technology involved makes me think Apple is pulling a PR stunt to get customer sympathy. This can apply to a single device. It’s just a warrant.

  23. Posted February 20, 2016 at 1:57 pm | Permalink

    PWPD?

    I view ISIS as being as horrible as the Nazis. If unlocking this iPhone could directly lead to the downfall of the top bad guys, I’d be all for it. However, if the FBI really intends for this to be a one-time thing, on this iPhone alone, why not save such extreme action for bigger fish? So I don’t buy the one-time thing, as it makes no sense.

    If the phone of a much bigger fish were acquired, I *wonder* if the unlocking could involve extraordinary measures, such as requiring biometric information from the top three executives at Apple. Once the phone is unlocked and the information extracted, then the program and the phone could be made to self-destruct.

    • jay
      Posted February 20, 2016 at 2:02 pm | Permalink

      There is NO SUCH THING as one time with the FBI. There will be more times. And more after that.

      Guaranteed.

    • Posted February 21, 2016 at 4:10 am | Permalink

      (I meant WWPD… What would Picard do?)

  24. Posted February 20, 2016 at 1:57 pm | Permalink

    Will any information on that phone have a major impact in preventing further similar acts?

    We don’t know, but in all probability I think any information retrieved will be of limited and temporary use, if it is of any use at all.

    • Mark Sturtevant
      Posted February 20, 2016 at 3:16 pm | Permalink

      It is pretty much a crap shoot, I agree. I am just surprised I have not heard about this sort of push to unlock phones before.

  25. Posted February 20, 2016 at 2:00 pm | Permalink

    I’m with Apple. And for this very simple reason: terrorists are people, and people have a right to privacy.

    If not here, where will we draw the line? Where will privacy no longer be respected in the name of national security? Is terrorism the line? Then what about murder?

    Why not access people’s private data from the beginning, to make sure no heinous acts are committed in the first place?

    This example is a good one, since it shows that (so far) Apple is on the right side of the moral divide.

    Who’s to say that law enforcement will find anything in the phone in the first place?!

    • Diana MacPherson
      Posted February 20, 2016 at 2:26 pm | Permalink

      I’m on the side of Apple too, but don’t be too quick to see them as moral. Most likely they are protecting their product and IP more than they are advocating for privacy. Recall how Apple behaved when a journalist got hold of one of their beta iPhones – going so far as to have him thrown in jail and to have police break into his home and toss it. Sure, it can be argued he was in possession of stolen property, but when have police acted that way over a stolen iPhone?

    • Jeff Ryan
      Posted February 20, 2016 at 2:28 pm | Permalink

      American terrorists may have Fourth Amendment rights. But not terrorists overseas.

      Fourth Amendment rights, in this context, would protect the phone’s owner, and that’s all. And all the amendment does is ensure that the state not conduct an illegal search and/or seizure in the absence of a warrant (though there are numerous exceptions to the warrant requirement). Accordingly, a terrorist is not immune from search/seizure where the state has probable cause and a warrant, or it has probable cause and there is an exception to the warrant requirement.

      As to “murder,” the Fourth Amendment applies just the same. If there is probable cause to search or seize an alleged murderer’s person or property, the state can do it (subject to the warrant requirement or an exception).

      You don’t seem to appreciate that this is happening in a court proceeding. In other words, due process is being followed. No one has a right to be immune from search or seizure where there is probable cause. And I frankly think that Apple’s objections have been answered reasonably.

    • Ken Kukec
      Posted February 20, 2016 at 7:48 pm | Permalink

      “… terrorists are people, and people have a right to privacy.”

      An absolute right to privacy, for ever and always, regardless of any exigencies, and no matter how strong the countervailing interest in disclosure? WTF?

      Now that’s a novel concept that runs counter to eight centuries of well-established Anglo-American law, dating back to Magna Carta. Since the founding of our nation, we the people have been entitled to have agents acting on our behalf search our fellow citizens’ persons, houses, papers, and effects pursuant to a warrant issued by a detached and neutral magistrate upon a showing of probable cause that a crime has been committed.

      Why do terrorists all of a sudden deserve such greater protection of their privacy?

      • jay
        Posted February 20, 2016 at 8:17 pm | Permalink

        They can search where they want. If they can’t read the content, that is their problem.

        Defenders of the feds are not answering the question… what happens when China, or the Saudis start demanding access?

        The only sensible option is to make a system with no back door whatsoever.

        • Ken Kukec
          Posted February 20, 2016 at 9:42 pm | Permalink

          What prevents China and Saudi Arabia from demanding that Apple create this software right now? I don’t recall the Chinese or Saudis ever hedging their demands on what some trial judge in California says in a court order, do you?

          Are opponents of the feds prepared for terrorists, organized crime, and narco-traffickers holding US citizens hostage by putting evidence of their crimes forever beyond detection by law enforcement, no matter what?

          Is that your “only sensible option”? Really?

          So we’re obliged to have our privacy expectations driven solely by what technology can accomplish — anything technology can make secret, must be kept secret? How far out are you willing to follow that logic — until it becomes a suicide pact?

      • Posted March 6, 2016 at 10:48 am | Permalink

        Because, as I said above, terrorists are people too. The same applies to murderers, adulterers, and random people on the street. They are all people first, everything else that they might be comes second.

        • Wunold
          Posted March 6, 2016 at 12:53 pm | Permalink

          But we’re talking of *dead* people here.

          The question is, therefore, what rights of privacy do the deceased have? Does anyone here know the legal situation in the U.S.?

          • Jeff Ryan
            Posted March 6, 2016 at 12:56 pm | Permalink

            Essentially none, and none of constitutional dimension that I can think of.

  26. jay
    Posted February 20, 2016 at 2:01 pm | Permalink

    Really the government cannot be trusted.

    One example: the ‘Stingray,’ a secretive device used to impersonate a cell tower and sweep up all mobile communications, in violation of the Fourth Amendment as well as FCC rules.

    The government was using this secretly for quite a few years. There was a non-disclosure clause in the contract for police departments; often only one or two cops would know and would hand ‘hints’ to other cops. Often even the prosecutor had no knowledge of this method of evidence gathering. In one case, when the ACLU attempted to get more information, the feds actually DROPPED the charges to prevent information from getting out.

    Now we know that the NYPD has used it more than 1,000 times since 2008, with no warrants and no oversight.

    • Jeff Ryan
      Posted February 20, 2016 at 2:34 pm | Permalink

      First, the abuse has become public, and is being dealt with.

      Second (and I’m no expert), my understanding is that StingRays were used by law enforcement to monitor which cell phones were operating at any given time, not to listen in on conversations. To listen in on a conversation would trigger the Fourth Amendment’s warrant requirement. (And I believe some agencies have held that even using the monitoring-only feature would still violate the Fourth Amendment.)

      • Randy Schenck
        Posted February 20, 2016 at 2:59 pm | Permalink

        And besides all of that… Where is the logic in “because the govt. can’t be trusted”? If that is the way it goes, I’ll never buy another car either.

      • jay
        Posted February 20, 2016 at 8:13 pm | Permalink

        No, it was not just which conversations were active. They could get that from the phone company with a warrant. They used it to get content, bypassing the rules for obtaining information.

        • Jeff Ryan
          Posted February 20, 2016 at 9:07 pm | Permalink

          I do not believe you are correct. First, that would be in violation of settled law, and even local law enforcement would know that.

          Second, you can get all sorts of information from a phone company. But not contemporaneous tracking of what phones are active.

          • Posted February 20, 2016 at 9:09 pm | Permalink

            But not contemporaneous tracking of what phones are active.

            Not directly, no. But the FBI and lots of local police forces are doing so surreptitiously and with dubious legality with so-called “Stingray” devices that perform a “man-in-the-middle” attack on the cellphone network.

            Cheers,

            b&


            • Jeff Ryan
              Posted February 20, 2016 at 9:38 pm | Permalink

              This was in a followup to a StingRay thread.

  27. Chemist
    Posted February 20, 2016 at 2:05 pm | Permalink

    This appears to be a manufactured case by the FBI or Department of Justice to establish a precedent.

    Consider: The phone was an asset of the City of San Bernardino. The Apple ID was changed after the phone came into the FBI’s possession. Apparently the iCloud password was reset by the County at the request of the FBI. That is a really strange set of circumstances if the data on the phone was as important to the investigation as claimed.

    Here’s the link to the story from BuzzFeed.

    http://www.buzzfeed.com/johnpaczkowski/apple-terrorists-appleid-passcode-changed-in-government-cust?utm_term=.sue2bA6l6#.tjqYBn989

    If this pans out as described, someone has some explaining to do to the judge. This certainly weakens the Government’s case. It also fits the pattern we have seen of how privacy is ignored by the US Government, and it shows that the framing of “privacy vs. security” is a fraudulent tactic (an appeal to emotions) to get something not needed.

  28. Mark Sturtevant
    Posted February 20, 2016 at 2:07 pm | Permalink

    I had been watching this story, and am thankful that you came to the same conclusion that I had. The intrusion by authorities into one’s personal space in this case seems very similar to executing a search warrant. Search warrants may have been abused, but that has been pretty rare and highly localized. I wish Apple would allow it.

  29. Posted February 20, 2016 at 2:09 pm | Permalink

    I have rather strong feeling on both sides of this subject. But I think I feel more strongly for Apple’s wanting not to establish a precedent.

    Don’t forget, though, that no encryption is unbreakable. It’s just a problem of computing power. If we ever do get quantum computers, beware, privacy.

  30. DiscoveredJoys
    Posted February 20, 2016 at 2:19 pm | Permalink

    From what I have read, there is only a very small chance that the phone will contain ‘new’ information about terrorist acts and communication. There doesn’t seem to be any critical need for such information. In which case, is asking Apple to break their own design and run the risk of commercial backlash reasonable and proportionate?

    It seems to me that the FBI are trying to create a backdoor principle in tiny steps.

    • Jeff Ryan
      Posted February 20, 2016 at 4:18 pm | Permalink

      What you have read here is pure speculation. I doubt anyone knows what the odds are that there is valuable information. As I noted before, the initial investigation led them to a third party who has since been charged, and showed, from his own statements, that he also believed in violent jihad. That doesn’t mean he posed a real threat, but one tends to be overcautious when many people have already been killed.

      Even so, that is not the standard. I deplore anyone who lumps Muslims, or dissidents, together without cause. But here you have some people who have, shall we say, proven they were serious about murder. And considering that this is a form of organized murder, I give the feds the benefit of the doubt here. But when we reach the point that private corporations are allowed to trump public safety concerns, they’ve lost me. And frankly, I’m a little disturbed when, in circumstance like these, Apple’s response is not “How can we help” but instead “Fuck you.”

  31. ThyroidPlanet
    Posted February 20, 2016 at 2:21 pm | Permalink

    posting before reading here:

    from what I have read, the news has been written to suggest that there is only one entity in the whole wide world in the entire time of the universe that can look at this … permanent resident’s? … phone.

    until I know more about that, ….

    • Jeff Ryan
      Posted February 20, 2016 at 4:10 pm | Permalink

      Now that’s a good point.

      But would such information be safer in a non-interested party’s hands? (I’m not saying I know.)

  32. Stackpole
    Posted February 20, 2016 at 2:23 pm | Permalink

    A technical question: On my iPhone 6s, the passcode is all of four numbers, but I can see that I could make it many more mixed letters, numbers, and symbols. Does anybody know (or is anybody saying) how long the Farook passcode is set to? “Gazillions” of brute force tries, really?

    Granted, a brute-force shot at my phone would immediately trip the 10-try limit and destroy all of Ms. MacPherson’s bird pictures, a substantial loss. But a smart attack (I have no idea how) might hit the exact one of the 10,000 possibilities that is my passcode.

    • Diana MacPherson
      Posted February 20, 2016 at 2:38 pm | Permalink

      Tell me your passcode and I’ll look into it. 😉 I wouldn’t want my bird pictures destroyed.

    • chigaze
      Posted February 20, 2016 at 4:51 pm | Permalink

      Without the 10-try limit, a four-digit code can be cracked in about half an hour on an iPhone. There is a hard 80 ms per-try limit that cannot be removed in software.

      A six-digit code would take a few days, and an alphanumeric one could take years.
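
      A back-of-the-envelope check of those figures, assuming a flat 80 ms per guess with the try limit removed and no other throttling; these are worst-case times for trying every code, and the 6-character example assumes a 36-character alphabet (lowercase letters and digits). Real-world overhead per attempt pushes the times somewhat higher:

      ```python
      # Worst-case time to try every passcode at an assumed flat 80 ms per attempt,
      # with the 10-try limit removed and no escalating delays.
      ATTEMPT_SECONDS = 0.080

      def fmt(seconds: float) -> str:
          """Pick a readable unit for a duration given in seconds."""
          for unit, size in [("years", 31_557_600), ("days", 86_400),
                             ("hours", 3_600), ("minutes", 60)]:
              if seconds >= size:
                  return f"~{seconds / size:.1f} {unit}"
          return f"~{seconds:.0f} seconds"

      for label, keyspace in [("4-digit PIN", 10**4),
                              ("6-digit PIN", 10**6),
                              ("6-char letters+digits", 36**6)]:
          print(f"{label:22}  {keyspace:>13,} codes  {fmt(keyspace * ATTEMPT_SECONDS)}")
      ```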

  33. tubby
    Posted February 20, 2016 at 2:31 pm | Permalink

    It doesn’t matter if there’s a stipulation allowing them to destroy the firmware. Once the deed is done they will be called upon to repeat it time and again because terrorism, because drugs, because won’t someone think of the children, because fear du jour. All it takes is the precedent to become routine.

    • Jeff Ryan
      Posted February 20, 2016 at 2:46 pm | Permalink

      And your point?

      There is no right to commit crimes, or terrorism. And if law enforcement has probable cause, they have the right to search. This is not a remarkable concept. It’s been around for a very long time.

      Feck Apple. If they want their sales pitch to be “We’ll help you be a criminal!”, then they should just say so. Otherwise, they should state the actual case: “We won’t guarantee you a way to be a criminal. If the government serves a warrant, we’ll comply. Kinda like you’re obliged to do.”

      Frankly, this is another example of the libertarian nonsense Silicon Valley specializes in.

      • Diana MacPherson
        Posted February 20, 2016 at 2:56 pm | Permalink

        Apple has already complied with other requests from the FBI to hand over data. They are hardly marketing themselves as the company for criminals if they’ve already complied. They have concerns about this particular request for reasons beyond handing over information.

        • Ken Kukec
          Posted February 21, 2016 at 1:18 pm | Permalink

          Louisville Slugger hardly markets itself as the company for loan sharks. That’s cold comfort to the deadbeat who’s taken a beating. 🙂

          And Colt, Remington, and Winchester don’t market themselves as the companies for murd… er, okay, maybe they do. Never mind about that one.

      • tubby
        Posted February 20, 2016 at 6:20 pm | Permalink

        I also assume you’re going to go rail against the notion that they should even bother to pretend that this is a one-time deal and that afterward they can just destroy the firmware and that’s that? No? I’m an easy target to jump at then, right? Or would you be less upset if it were a company other than Apple?

        • Jeff Ryan
          Posted February 20, 2016 at 6:57 pm | Permalink

          Well, first, I don’t care what company it is. Nor am I “jumping” at any target.

          It just seems that many are solicitous of Apple’s rights. And not so much those of the public.

          • tubby
            Posted February 20, 2016 at 7:41 pm | Permalink

            This isn’t about the rights of the public. It’s about the ability of the government to compel a business to go so far as to write a new operating system to allow them to break into a device. And about pretending that this is a once-and-done extreme circumstance when it’s little more than creating precedent. What will they find that’s so valuable that it needs this intervention, especially after the government intentionally borked its own easy way in via iCloud?

            • Jeff Ryan
              Posted February 20, 2016 at 8:59 pm | Permalink

              This is not a game of “last tag.” I’m not sure what you mean by “borked,” but then, “You keep using that word. I do not think it means what you think it means.”

      • Posted February 21, 2016 at 4:04 am | Permalink

        Yes they have a right to search. But I have no duty to help them do it. If the government wants to search a house and can’t get in, they knock the door down. They do this by themselves. They don’t go to the builder of the home and ask him for the key.

        All of this is brought about by our enormous fear of terrorism which is less likely to result in my or any other person’s death than lightning is. And Jeff Ryan seems absolutely terrorized by this minuscule threat.

  34. Geoffrey Howe
    Posted February 20, 2016 at 2:37 pm | Permalink

    I don’t really see this as all that different from other search warrants. The government can break down your door to get inside if it thinks there is sufficient need and less destructive methods aren’t available.

    This would just be adapting to a new age where you can hide things in your phone as well as your basement. So the only real question, I feel, is whether the government can be trusted with the ability to violate our privacy. But if you didn’t think they could before, this wouldn’t be stepping over a new line.

    Of course, there is still the burden on Apple. They have to pay to write a new piece of software. As far as I’m concerned, that’s the government’s responsibility to compensate them for. You don’t get to break down someone’s door and then demand that they pay for the repairs.

    How Apple wants to respond to all this is up to them, of course, and I do have some sympathy for them (that sympathy is hard to work up as I rather dislike Apple, but still). However, it’s no different than a locksmith being told he must pick a lock he installed on someone’s house.

    I wouldn’t exactly say I trust the government with this kind of power, but I do think the negatives are outweighed by the positives, and I don’t think the electronic component of this particular invasion of privacy makes it any different than a warranted home-invasion.

  35. Barry
    Posted February 20, 2016 at 2:55 pm | Permalink

    A lot of the comments supporting the FBI’s request are analogous to the argument, “I agree with free speech, but…” You either see privacy as, in principle, a human right, or you don’t. Just because it is the privacy of alleged terrorists doesn’t change the argument in principle. As with free speech, we defend those whose arguments are reprehensible, on the grounds that to deny them that right is wrong. I see it no differently with privacy.

    I find it appalling that so many on this site seem happy to trade that right in these circumstances.

    • Jeff Ryan
      Posted February 20, 2016 at 3:03 pm | Permalink

      So, if the government must, by virtue of probable cause and a warrant, violate someone’s “privacy,” then public safety be damned?

      That has simply never been the law. Ever. The “right to privacy” is no more absolute than any other right. The government has always been able to violate it with probable cause. Always. The right to privacy does not constitute a right to behave unlawfully. It merely mandates that the state comply with the constitution before acting. After all, the Fourth Amendment expressly allows such where the state has secured a warrant. (And in this case, I doubt they needed one.)

    • Randy Schenck
      Posted February 20, 2016 at 3:12 pm | Permalink

      However, others find it appalling that some folks seem to believe privacy applies to every human on the planet no matter what kind of criminal they may be. Never heard of probable cause for a search, I take it.

      Well, shucks – they killed lots of people, but please remember their privacy. That thing we did to bin Laden, terrible what we did to his privacy.

      • Barry
        Posted February 21, 2016 at 7:31 am | Permalink

        The right to Privacy does apply to every human being. That’s the point. You are playing straight into the hands of a government that has run roughshod over due process in its surveillance activities and you are a complete pawn to the “argument” the government will make for all future requests – terrorism. Indeed, you make their argument for them.

  36. Benjamin
    Posted February 20, 2016 at 3:07 pm | Permalink

    I’m not a huge fan of Apple, but I’m completely on their side in this matter, and applaud their stand against the overreaching government.
    Some thoughts:

    1. The government needs to learn that shouting ‘national security’ doesn’t (or shouldn’t) allow them to get their own way all the time. It isn’t an excuse to remove people’s rights, ride roughshod over laws and human rights, or treat innocent people like terrorists/criminals.

    Just look at organisations like the TSA! If we allow this kind of behaviour to spread, then we’ve effectively capitulated to the terrorists: we’re giving up our rights voluntarily before they take them from us.

    2. If the FBI get their way on this, it won’t stop. It’ll never stop. They’ll continue to demand access to everyone’s information, even without probable cause. They’ll probably pressure Apple into handing over the software to unlock the phones so that it can be used by them at will and without discretion.

    Just look at what the NSA has been up to if you want to see what government organisations will do if they think that they can get away with it.

    3. Encryption won’t go away. It’s key to far too many technologies, and trying to remove or undermine it will be disastrous for everybody.
    There’s nothing special about encryption. It’s just mathematics. If Apple remove/circumvent the encryption on their phones, then the “bad guys” will just use alternative software that’s widely available. The criminals will have perfect privacy, whilst the government will continue snooping on innocent civilians. The government must know this, so you wonder what they’re really up to in their war against encryption. Just a way to keep citizens in line, maybe?

    4. You can’t take away privacy, just because you want to stop things like crime/terrorism. You may as well install CCTV and microphones in every private residence. After all, you never know what’s being discussed behind closed doors. Selective bugging of people’s residences based on strong evidence and judicial oversight might be permissible, but what the FBI, CIA, NSA et al. want to do is far more akin to bugging everyone’s houses…just in case someone, somewhere might be a terrorist.

    • Jeff Ryan
      Posted February 20, 2016 at 3:18 pm | Permalink

      The government has ALWAYS been able to pierce privacy by presenting probable cause to believe there is evidence of a crime in the location specified, or, in cases of seizure, that a crime has been, or is being, committed.

      That’s exactly what the Fourth Amendment SAYS. The Constitution has never said the police must stand by helpless because someone claims “privacy.” Ever.

  37. Posted February 20, 2016 at 3:17 pm | Permalink

    To me, this seems to be a case where the long-term effects outweigh the immediate benefit. I agree with Apple unless I hear a good argument otherwise.

    Here is a good breakdown of the issue:
    https://stratechery.com/2016/apple-versus-the-fbi-understanding-iphone-encryption-the-risks-for-apple-and-encryption/

    • Jeff Ryan
      Posted February 20, 2016 at 3:20 pm | Permalink

      I’m sorry, I must have missed the part where Apple’s long-term health trumps law enforcement.

      I thought we lived in a nation. Not a board of directors.

      • Posted February 20, 2016 at 3:27 pm | Permalink

        I’m not talking about the long-term effects on Apple. I’m talking about the consequences for all consumers (and citizens) of creating such software, and the precedent that such an action creates.

        • Jeff Ryan
          Posted February 20, 2016 at 4:23 pm | Permalink

          To the extent I misunderstood your point, I apologize. And your concern is valid.

        • DiscoveredJoys
          Posted February 20, 2016 at 4:48 pm | Permalink

          According to the technical summary by the EFF (link in stuartcoyle’s post below) the All Writs Act requires that the technical assistance requested not be “unduly burdensome”. So while business does not trump law enforcement there is still a test of what is reasonable.

          The bar for “unduly burdensome” would be exceptionally high if it meant finding a nuclear bomb in New York, but a dead man’s work phone, when he had shown awareness of the ‘risk of exposure’ by destroying his personal phone?

          I guess the lawyers are going to get rich.

          • Ken Kukec
            Posted February 20, 2016 at 5:16 pm | Permalink

            If the information on the phone is likely to disclose the participation of others in the crime, or is likely otherwise to disclose information important to the government’s investigation of this matter, I think that would be a sufficient governmental interest to outweigh the burden on Apple — although, as I’ve said elsewhere in this thread, Apple should be entitled to reimbursement by the government for the reasonable costs associated with complying with the court order.

      • Posted February 20, 2016 at 3:34 pm | Permalink

        Wish it were so, but the age of corporate sovereignty is well and truly upon us; soon the TPPA and TTIP will make this brutal reality undeniable. 😦

        • Jeff Ryan
          Posted February 20, 2016 at 3:44 pm | Permalink

          I certainly take your point.

          Look, I’m an old fart, I understand. But if there’s one thing I learned it’s not to put stuff on your phone, your laptop, your Facebook, etc. with the expectation it won’t be accessible to everyone. The Internet (and I include smart phones with this) is great, but it must be used with eyes open.

  38. Posted February 20, 2016 at 3:24 pm | Permalink

    I haven’t read every post. Has anyone mentioned that starting with the 5S phone, even Apple cannot break the encryption or defeat the password?

    So if this fix got out, it would only affect older phones, and only if the hacker had custody of the phone.

    Encryption is pretty damn good these days. It cannot be defeated unless there is a bug in the OS or in the hardware.

    • chigaze
      Posted February 20, 2016 at 4:56 pm | Permalink

      This may not technically be true as Apple can patch the firmware of the Secure Enclave. This means they may be able to remove the 10 try limit even on A7 based phones.

  39. kevin7alexander
    Posted February 20, 2016 at 3:57 pm | Permalink

    I haven’t read all the posts, so I don’t know if this has been addressed, but what about the Constitutional protection against unreasonable searches? It seems pretty obvious by now that these people were one-offs, so what reasonable chance is there that there is some ticking time bomb there?
    We all criticize Sam Harris for creating outlandish scenarios to justify his violent fantasies, so why should we give the FBI a pass, especially when the FBI have so much more power?

    • Jeff Ryan
      Posted February 20, 2016 at 4:02 pm | Permalink

      I understand that you haven’t read all the posts.

      The Fourth Amendment protects the individual whose privacy rights are directly affected. The owner of the phone is dead. So his privacy rights no longer matter. That is the way a court will look at it. Since Farook’s privacy rights no longer exist, the Fourth Amendment isn’t really in play.

      The recent Supreme Court opinions forbidding the police to read what’s on cell phones without a warrant do not expand this in any way.

      • DiscoveredJoys
        Posted February 20, 2016 at 4:31 pm | Permalink

        The phone in question was a ‘work’ phone – owned by his employer, the San Bernardino County Department of Public Health.

        • Jeff Ryan
          Posted February 20, 2016 at 4:53 pm | Permalink

          I understand that. But it is one in which he had an expectation of privacy.

          More to the point: Is the Department of Health opposing this? If not, who cares?

          • Scote
            Posted February 20, 2016 at 5:22 pm | Permalink

            Everybody supports the FBI having access to this phone. There are no direct 4th amendment rights at issue.

            What is at issue is an All Writs Act order to force Apple to write a custom operating system that eliminates the security features of iOS that prevent people from brute forcing the passcode (systematically guessing and trying passcodes). The tool Apple would be forced to write could be used, or modified to be used, on all iPhones, mooting the secure architecture of every iPhone. Should the FBI’s desire to get into this phone be allowed to give them the power to force any tech company to write custom software to remove the security from any and all of their products on the FBI’s say-so? Because that is what is at stake.

            • Jeff Ryan
              Posted February 20, 2016 at 6:05 pm | Permalink

              I understand that. I also understand your concern. But if a company is going to operate in interstate commerce, as well as communications, it runs this risk. And whether I happen to like this or not (though I do side with the government on this) does not mean the policy is bad.

              There are few, if any, areas where, assuming the government interest is legitimate, a corporation’s “rights” (such as they are) will trump those of public safety or national security. Many here worry about abuse: The answer to that is to incorporate adequate safeguards. It is not to deny the overriding interest of the government in protection of its citizens.

      • Ken Kukec
        Posted February 20, 2016 at 5:05 pm | Permalink

        I take it you’re addressing the question of “standing,” Jeff.

        In general, you’re correct that the telephone’s owner(s) would be the party with standing. But if Apple or its customers have “a legitimate expectation of privacy” (the standard for Fourth Amendment standing) that would be compromised by compliance with the decryption sought, I think Apple would also have standing to litigate the matter in the courts. (Analogous to how an owner and a bailee of personal property might both have standing to contest a search.)

        I don’t think Apple would ultimately prevail on the merits, unless there are additional, as-yet undisclosed circumstances pertaining to this case.

        • Jeff Ryan
          Posted February 20, 2016 at 5:19 pm | Permalink

          But they don’t have a legitimate expectation of privacy. When you call someone else’s phone, you rarely have a legitimate expectation of privacy. Your incoming number will be recorded. And, for Christ’s sake, you are basically asking that there be a legitimate expectation of privacy in a radio.

          No one has any control over what the receiving phone owner will do with the information he receives. Once you let another person in on your communications, you lose that expectation.

          Apple certainly has no such expectation. They are neither the owner of the phone nor the recipient of the calls. If they have standing, then how come they can’t open it? What interest do they have in what calls he received or what information he preserved?

          • infiniteimprobabilit
            Posted February 20, 2016 at 5:54 pm | Permalink

            Apple’s customers (on whose behalf Apple is standing, not the dead terrorist) *do* have a legitimate expectation of privacy. Because they bought a phone with supposedly uncrackable encryption. And if the FBI get this one decrypted, they can get *any* iPhone decrypted the same way.

            cr

            • Jeff Ryan
              Posted February 20, 2016 at 6:17 pm | Permalink

              That is a specious, speculative argument.

              Apple is not standing in for software owners who may, at some future, undefined point possibly be affected by this action. “We can’t name them, they have not joined this action, but MAYBE…” doesn’t even come close to amounting to standing. Look at taxpayer-standing cases if you want to see why. I.e., as a taxpayer who opposed the Iraq war, I don’t have standing to challenge the use of my tax money on it.

              I don’t agree that the FBI will have carte blanche to access anyone else’s phone based on this case. If an individual phone owner can show the FBI illegally accessed their phone’s data, then that person would have standing. Until then, no.

              And look at what the FBI is going through to simply access this phone. Do you seriously think they are going to go through that for every cell phone user? They haven’t the time, money, or inclination.

              • infiniteimprobabilit
                Posted February 20, 2016 at 6:29 pm | Permalink

                I’m not talking about legal ‘standing’ but people who have a real personal interest in the outcome. Which surely includes many of Apple’s customers.

                “They [FBI] haven’t the time, money, or inclination.”

                And you know this how?

                Ever heard the word ‘precedent’?

                cr

          • Ken Kukec
            Posted February 20, 2016 at 10:02 pm | Permalink

            Presumably these phones are capable of storing information in addition to that which involves communications with third parties. Moreover, the expectation of privacy at issue here pertains not to the information stored, but to the capabilities of the phone’s software.

            In any event, all I’m suggesting is that Apple be given the opportunity to establish in court that it, or its customers, do in fact have such an expectation of privacy and that this expectation is one society is prepared to recognize as “reasonable.” That’s just standard, noncontroversial Fourth Amendment practice.

            I have my doubts whether Apple can meet this initial burden, and even if it does, even graver doubts that it could succeed on the ultimate merits of the case by showing a substantive Fourth Amendment violation.

  40. stuartcoyle
    Posted February 20, 2016 at 3:58 pm | Permalink

    I agree with Apple in this case. This could be because I have no interest whatsoever in the FBI gaining more powers, as I am not a US citizen. I support the EFF, who have a good technical overview of the case ( https://www.eff.org/deeplinks/2016/02/technical-perspective-apple-iphone-case ).

    • Posted February 21, 2016 at 4:55 am | Permalink

      This is an excellent analysis. Thanks for posting this.

  41. Ken Kukec
    Posted February 20, 2016 at 4:34 pm | Permalink

    If the government has probable cause to believe that the phone contains evidence of a crime (which seems to be the case), and if Apple can do the decrypt at its own facility without compromising either its proprietary intellectual property or the privacy interests of its other customers, Apple should be compelled to do so.

    The district court should stay its order so that Apple can seek review up the appellate chain to the highest court that agrees to hear this apparently novel issue. And if Apple is ultimately compelled to proceed, it should be reimbursed by the government for the reasonable costs it incurs in complying with the court order.

    Unless I’m missing something, this doesn’t appear to be a particularly hard case.

    • DiscoveredJoys
      Posted February 20, 2016 at 4:53 pm | Permalink

      Unless, of course, Apple argue that decrypting the phone would cause a loss of customer confidence in the brand. Even a 1% loss (difficult to prove, I know) of $60 billion per year is quite a lot of money.

      • Jeff Ryan
        Posted February 20, 2016 at 5:04 pm | Permalink

        Yeah. You want to be the judge that rules that Apple’s profits are more important than public safety? How quickly do you suppose impeachment proceedings would start? 1 day? 1 minute?

        • Posted February 21, 2016 at 4:18 am | Permalink

          Judging by the number of posts by Jeff Ryan, he has a lot of time on his hands today.

          But there is this. The judge agreed (I think) that Apple be compensated reasonably for their costs of developing this hack. If it costs them money in the way of consumer confidence (let’s say it’s 1% of $60B), are you all in favor of the government giving Apple $600M a year until they fix the issue?

          And judges have ruled in favor of corporations’ profits over public safety lots of times. The new carbon regulations, for one.

          And toddlers kill more people in a year than terrorists do.

          • infiniteimprobabilit
            Posted February 21, 2016 at 5:08 am | Permalink

            I’d *love* to be the lawyer who argued that loss of profits = costs. On either side of the case. And that’s even before we get around to trying to calculate what loss of profits is actually attributable to this.

            Entire law firms could get rich on that 😉

            cr

          • Ken Kukec
            Posted February 21, 2016 at 1:31 pm | Permalink

            And toddlers kill more people in a year than terrorists do.

            I knew it! How fast you suppose we can swap out our terrible two-year-olds for the prisoners left at Gitmo?

      • Ken Kukec
        Posted February 21, 2016 at 4:37 am | Permalink

        So let’s see: If Apple began equipping its phones with an app capable of discharging single deadly doses of anthrax, the law must stand by helplessly, because taking Apple’s app away could cost the company a couple points of market share?

        Or is your market-share test a one-off applicable only to the breach of software encrypting stored digital information the disclosure of which could prevent a crime or terrorist attack?

        How much market share — if any — do you suppose Apple might be made to part with under such circumstances?

        • infiniteimprobabilit
          Posted February 21, 2016 at 5:13 am | Permalink

          That’s stupid. You’d have to come up with a better analogy than that.

          Anthrax would be dangerous and completely pointless, with no legitimate application. Encryption has extremely valuable practical advantages for almost every user.

          cr

          • Ken Kukec
            Posted February 21, 2016 at 6:06 am | Permalink

            They call that a reductio ad absurdum in some parts, but you can go with “stupid.” 🙂

            • infiniteimprobabilit
              Posted February 21, 2016 at 6:25 am | Permalink

              A bit too absurdum. Nobody (sane) would want such an app.

              See if you can think of a feature that might also have some legitimate use. Like, say, a built-in taser for ‘self-defence’. Now if that were ruled ‘too dangerous’, and Apple were told to disable it on everybody’s phones, there could be an argument about how much compensation they were entitled to.

              But this is a little beside the point.

              I think DiscoveredJoys’ point – and it would be my point too – is that the ‘reasonable costs in complying’ (your words) could be argued to include loss of profits stemming from loss of customer confidence, and that would dwarf any actual programming costs.
              As I said to GreenPoisonFrog, lawyers could get rich arguing that one.

              cr

              • Ken Kukec
                Posted February 21, 2016 at 1:47 pm | Permalink

                Given that a taser-phone could be seen as inherently dangerous, Apple may have assumed the risk that they would be banned, such that it wouldn’t be entitled to compensation.

                Not saying it’s 100% analogous, but I seem to recall the US Department of Justice going round-and-round with Apple and other tech companies regarding how electronic devices equipped with unbreakable encryption would interfere with legitimate law enforcement functioning.

              • Posted February 21, 2016 at 10:18 pm | Permalink

                My point was more in relation to the earlier comments about what it is costing Apple to comply as part of the ‘burden,’ although I didn’t make that really clear.

                I think it’s fair to say that Apple has suffered a lot of burden already due to this request from the government. Regardless of where they came down on this issue, they have taken a lot of heat on the matter.

                For those of you that think Apple is doing this for publicity purposes, you should just view the comments section in almost any article from any newspaper. I’ve seen them condemned as throwing their lot in with terrorists and several people saying that they were throwing their iPhones away and/or never buying an Apple product. If this is a marketing/publicity attempt, it sure isn’t up to Apple’s usual standards.

    • infiniteimprobabilit
      Posted February 20, 2016 at 5:06 pm | Permalink

      “without compromising either its proprietary intellectual property or the privacy interests of its other customers”

      But if Apple complies, it can’t help compromising the privacy not only of Apple’s other customers, but of every cell-phone user. Because what the FBI has done once (to Apple), they can do again to any manufacturer…

      cr

      • Jeff Ryan
        Posted February 20, 2016 at 5:24 pm | Permalink

        Yeah, and?

        How is this different than any other search warrant situation? Or put another way, if the police must get a search warrant (as they must) to examine the information on a phone, can anyone who called that number, texted that number, object?

        Answer: No.

        This is a common misapprehension, and a colorable argument. But it’s not a winner. Because the riposte is: No one made you call that phone. And it’s not your phone. If you didn’t want anyone (including the recipient) revealing that info, you shouldn’t have called. Because it’s eminently foreseeable that the recipient may disclose your communication.

  42. chigaze
    Posted February 20, 2016 at 5:01 pm | Permalink

    I’m with Apple on this for a lot of the reasons already stated. A few technical points though:

    Apple is not being asked to decrypt the phone, only to remove the 10 try limit and provide a way for passcode attempts to be made through the data port.

    Apple cannot decrypt the phone, because with these phones the passcode is combined with a unique device key to create the encryption key. The unique device key is not readable, so it is also unknown.

    There is a hard 80 ms per-attempt limit that not even Apple can remove. This means four-digit codes will break in half an hour, six-digit codes in days, and alphanumeric codes in years.

    With current iPhones this method would not work, as the ten-try limit is not enforced by the operating system but by a dedicated chip called the Secure Enclave. However, Apple can write to the SE’s firmware. It is unclear exactly what changes they can make to the SE.
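
    To illustrate why the entangled device key matters, here is a generic sketch of the idea, not Apple’s actual construction (which runs inside the hardware crypto engine); PBKDF2 and the parameters below are stand-ins. Because the key is derived from the passcode together with a secret that never leaves the device, guesses can only be tried on the phone itself, at the hardware-enforced rate:

    ```python
    import hashlib, os

    def derive_key(passcode: str, device_uid: bytes, iterations: int = 200_000) -> bytes:
        """Illustrative only: stretch a short passcode together with a per-device
        secret so the resulting key cannot be computed without the device."""
        return hashlib.pbkdf2_hmac("sha256", passcode.encode(), device_uid, iterations)

    device_uid = os.urandom(32)           # stands in for the unreadable hardware UID
    key = derive_key("1234", device_uid)  # 256-bit key; useless to guess off-device
    # An attacker who copies the encrypted data but not the device UID cannot
    # reproduce `key`, so brute forcing is confined to the phone itself.
    ```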

  43. infiniteimprobabilit
    Posted February 20, 2016 at 5:01 pm | Permalink

    I guess it boils down to ‘who do I distrust most, Apple or the FBI?’

    Well, I don’t have to trust Apple, I don’t buy their products anyway, but it seems to me that in this case they’re refusing to snoop on an individual. (The individual’s an asshole, and dead, but that’s really kind of irrelevant). And I’m sure their motivation is commercial, not altruistic, but that doesn’t alter the issues.

    The FBI on the other hand – do they really expect to find evidence of some real and present danger on this phone? Or is it a fishing expedition? Or is it that they’re just using this highly-publicised case to establish a precedent so they can extend their snooping abilities for future use?

    So on this one I’m firmly with Apple.

    cr

    • Jeff Ryan
      Posted February 20, 2016 at 5:07 pm | Permalink

      Your principles will be a great comfort, I am sure, for any victims.

      I’m not saying there will be: I am saying that is what you risk.

      • infiniteimprobabilit
        Posted February 20, 2016 at 6:12 pm | Permalink

        The innumerable victims, you mean, of Big Brother, who find their privacy has been eroded little bit by little bit until it’s all gone?

        cr

        • Jeff Ryan
          Posted February 20, 2016 at 6:26 pm | Permalink

          This is the sort of thinking that is beneath contempt.

          The restrictions on search and seizure have grown almost exponentially over the years. Until Mapp v. Ohio, your local constabulary could search your home without a warrant. Until Miranda, no one had to advise you of your rights. Until Missouri v. McNeely, the police got away with drawing your blood, without a warrant, to get evidence in DUI cases.

          Arguing a horrendous loss of liberty is wingnut pablum. I keep hearing how the gubmint is taking away our rights. What rights has it removed? Only the ones in people’s heads. They certainly aren’t in the Constitution.

          • infiniteimprobabilit
            Posted February 20, 2016 at 6:57 pm | Permalink

            Ah well, since my thinking is beneath contempt, I’ll remove myself from this conversation.

            Keep digging…

            cr

            • Jeff Ryan
              Posted February 20, 2016 at 8:40 pm | Permalink

              A poor choice of words, and one for which I apologize.

              But seriously, we are “freer” now than at any time in U.S. history, and certainly freer than at the time of the founding.

              As an object lesson, just think of what the “Malheur Militia” was whining about. If that’s their idea of “tyranny”, then words truly are meaningless now.

              • infiniteimprobabilit
                Posted February 21, 2016 at 12:23 am | Permalink

                Apology accepted, and I do apologise for my huffy reply.

                I felt your original reply was a little bit tendentious. I think the chances the terrorist left anything significant on his work phone are negligible.

                But on the other issue, of encroachments on individual privacy, I’m not equipped to judge. All I can say is that the impression one gets from the news is that privacy is constantly under attack.

                By the way, I don’t like Apple at all, for numerous reasons which are irrelevant here. I think in this case their motivation is commercial (nothing wrong with that) in that a big selling point is that their phones are uncrackable, by anybody. Certainly a lot of dissidents overseas, from Syria to China, would have good reason to be worried if a hack existed.

                cr

              • Jeff Ryan
                Posted February 21, 2016 at 1:10 am | Permalink

                Yeah, I was a bit snarky, so sorry for that.

                As for Apple, why do I suspect that many, many corporations would respond the way they have?

                To the extent that privacy seems under attack, it can be difficult to remember that the same old rules largely apply. I am less worried about government intrusion on my privacy than I am about the intrusion of private companies, which is I think more widespread and with much less accountability. Whenever my computer is on, it’s talking to a lot of people I don’t know about. And I don’t mean the government.

              • Diana MacPherson
                Posted February 21, 2016 at 11:16 am | Permalink

                How do you feel about the intrusion of foreign governments? A bit off topic, but much data is housed outside the US. In places like Germany, the law protects citizens by forbidding their data from being housed outside the country. This is a PITA for companies with data interests there, since they have to comply by creating server farms in Germany…but there are reasons the German government protects its citizens this way.

                The rest of us aren’t so lucky. Our data could find itself in a foreign country with different data rules, and it could be hacked by foreign governments – like China, which has famously hacked the private information of US citizens in the past.

  44. Malcolm
    Posted February 20, 2016 at 5:18 pm | Permalink

    It would seem from this article: https://www.washingtonpost.com/news/volokh-conspiracy/wp/2016/02/20/or-is-apple-happy-to-build-a-backdoor-as-long-as-it-makes-money-from-it/

    that a backdoor existed but wasn’t implemented by the terrorist’s employer. While I think that, as this is a work phone, there will be little on it that is relevant, I wouldn’t trust Apple here either. Also, as already stated, the texts and numbers used would be available from the carrier.

    • chigaze
      Posted February 20, 2016 at 11:20 pm | Permalink

      MDM is not a backdoor any more than a user passcode is a backdoor. MDM just changes which passcode is used as part of the encryption key for the device. It allows the phone’s owner, the company supplying the phone to the employee, to manage and have access to the device.
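
      (A rough illustration for the curious – a hypothetical Python sketch assuming a generic PBKDF2-style derivation, not Apple’s actual scheme. The point is simply that an MDM-set passcode feeds the same derivation a user passcode does; there is no separate master key:)

          # Illustrative only: a generic passcode-based key derivation, not Apple's implementation.
          # Whichever passcode is enrolled (the user's, or one set via MDM) goes through the
          # same function; a per-device secret ties the derivation to the hardware.
          import hashlib, os

          device_secret = os.urandom(16)  # stands in for a hardware-bound, per-device secret

          def derive_device_key(passcode: str) -> bytes:
              return hashlib.pbkdf2_hmac("sha256", passcode.encode(), device_secret, 100_000)

          key_from_user = derive_device_key("1234")
          key_from_mdm = derive_device_key("mdm-set-code")  # same mechanism, different passcode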

  45. Craw
    Posted February 20, 2016 at 5:42 pm | Permalink

    I am with Apple. Libertarian Diana M has put the case well, and Scote too.

  46. rusty
    Posted February 20, 2016 at 5:50 pm | Permalink

    If Apple comply, they will be forced to do this with increasing frequency.
    “Apple could, as it will, destroy the software so that nobody else can have it.”
    What about next time and the time after that?
    How often is it reasonable for Apple to create and destroy the software? Every time it happens, the chance of a leak increases.

  47. Bob
    Posted February 20, 2016 at 5:53 pm | Permalink

    Those arguing that Apple should comply with a court order to unlock the phone would probably change positions if the court were in Russia or some other equally repressive country. Or should companies engaged in international commerce only have to comply with court orders in one country – all the court orders in other countries can be ignored? Or possibly, they can ignore all the court orders from countries we don’t like?

    • Posted February 20, 2016 at 6:22 pm | Permalink

      LOL, I bet that’s an uncomfortable consideration for those backing the FBI’s request.

    • infiniteimprobabilit
      Posted February 20, 2016 at 6:45 pm | Permalink

      Or if not Russia – how about Syria? Syria is fighting ISIS – and also, don’t forget, huge numbers of its own population who *aren’t* in ISIS.

      Don’t you suppose Assad’s secret police would just love to have access to everything on any cellphone they capture? Many of which are probably iPhones.

      cr

    • Ken Kukec
      Posted February 20, 2016 at 10:24 pm | Permalink

      That’s primarily a business decision for Apple to make. I would expect that Apple would reach an understanding regarding the applicable ground rules with the governments of the countries where it elects to do business. Failing that, Apple needs to make a decision regarding which governments’ laws it is prepared to comply with, and to do business solely in those countries. There’s nothing extraordinary about that. And Apple of course must make those types of business decisions regardless of what a federal trial court in California decides in this case.

  48. Posted February 20, 2016 at 6:13 pm | Permalink

    I assume they can access their own brand’s phones, but it isn’t beneficial to admit this. Right now it is even beneficial to take a hard stance and get dragged into court, for it makes excellent headlines for them – coverage that will invariably be mostly cheerleading from the tech and consumer electronics sites.

    Privacy concerns are the latest hip thing that is actually working against US corporations. I might verge on a mild conspiracy theory, but I find it curious that Apple gets handed such a great case just in time. Very interesting. I’m not really buying it. Data is a billion-dollar industry, and I don’t believe that corporations leave that money on the table voluntarily, especially when nobody watches over their shoulders. They can claim just about anything.

  49. gravelinspector-Aidan
    Posted February 20, 2016 at 6:30 pm | Permalink

    Apple could, I’m told, even create an iPhone whose passcode could never be hacked by any software, something that seems perfectly legal.

    And no-one would believe that there wasn’t a special FBI backdoor built in.
    Apple’s only way out of this bind is to open-source their security systems, at least. (NB Open Source does not mean relinquishing copyright.) But I don’t see them doing that.

  50. Steve Gerrard
    Posted February 20, 2016 at 6:38 pm | Permalink

    My understanding is that Apple helped with the wording of the order, so that the issue would be a legal one, not a technical one.

    Apple’s premise is that in the long run, it will be the case that either A) any law enforcement agency in the USA can get an order to have any given smartphone unlocked or made unlockable; or B) no law enforcement agency can do so for any phone.

    Apple wants this to be the legal case to decide that, rather than kicking the can down the road.

    Apple’s view is that phone security is important, especially in the global market, and that crippling it would be a blow to their success. There will be phones made in other countries that can’t be unlocked, and consumers who care will get those, so it will just be a big loss of market share for Apple.

    I think Apple is right, people are going to want secure phones, and Apple should be able to make them and sell them. It seems kind of obvious to me that these devices need to be secure.

    • Jeff Ryan
      Posted February 20, 2016 at 7:06 pm | Permalink

      It sounds like they want a special class of people to whom the Fourth Amendment doesn’t apply.

  51. Eli Hershkovitz
    Posted February 20, 2016 at 6:50 pm | Permalink

    Just curious what the casualty count would have to be before made up minds could be persuaded to side with the feds. Would 1,400 be enough? 14,000? 140,000? No ceiling to the body count?

    • Scote
      Posted February 20, 2016 at 6:53 pm | Permalink

      “Just curious what the casualty count would have to be before made up minds could be persuaded to side with the feds. Would 1,400 be enough? 14,000? 140,000? No ceiling to the body count?”

      You could ask that of any part of the constitution. At what point should the President be able to institute nationwide martial law to deal with terrorism? “Would 1,400 be enough? 14,000? 140,000? No ceiling to the body count?”

      How much liberty are you willing to give up for appearance of security?

      • Ken Kukec
        Posted February 20, 2016 at 10:35 pm | Permalink

        The constitution prohibits the imposition of martial law. It nowhere requires that the information stored on an Apple owner’s cell phone be kept hermetically secret forever under all circumstances. Indeed, any notion it does would be inconsistent with the Fourth Amendment’s express provisions regarding “reasonable searches and seizures.”

      • Eli Hershkovitz
        Posted February 21, 2016 at 7:01 pm | Permalink

        “How much liberty are you willing to give up for appearance of security?”

        In the interest of national security, quite a bit.

    • infiniteimprobabilit
      Posted February 20, 2016 at 7:02 pm | Permalink

      That presumes that what they *might* find on the cellphone *might* save one life. Otherwise it’s just a fishing expedition.

      I suspect the FBI have just picked on this one because it’s a nice dramatic case. Whereas some drug-runner’s phone wouldn’t have the same impact.

      cr

      • Jeff Ryan
        Posted February 20, 2016 at 8:42 pm | Permalink

        I would suggest that the FBI would be spectacularly negligent if they didn’t investigate the phone.

      • Ken Kukec
        Posted February 20, 2016 at 10:44 pm | Permalink

        The feds should be required to establish by competent evidence presented to the court that there is probable cause to believe that the information found on the phone will lead to the detection of evidence of a federal crime.

        You have a problem with this standard? If so, why, inasmuch as it’s served the Republic well for nearly two-and-a-half centuries?

        • Jeff Ryan
          Posted February 20, 2016 at 10:50 pm | Permalink

          Actually, I’m not sure that is true. Granted, it is Saturday night, and my brain has had enough.

          But someone has to object to the actual search itself. Given that it appears that the government agency that nominally owned the phone isn’t objecting to the search, and the only person who otherwise would have standing is dead, I don’t think the government needs probable cause on these bizarre facts.

          The argument that Apple has a right to demand probable cause is difficult, because the information on the phone does not belong to Apple. Rather, a whole separate analysis (what is required to force Apple to comply) comes into play, and things get dicey. Ultimately, I think Apple should lose.

          • Ken Kukec
            Posted February 21, 2016 at 4:48 am | Permalink

            Yeah, I’m not sure the 4th A applies by its terms either. If I were the federal district judge in this case, I think the first thing I would have done is request the parties to brief the issue whether the Fourth applies here, either directly or by way of analogy. Even if the Fourth doesn’t apply, it still might furnish a useful analytical starting point.

        • infiniteimprobabilit
          Posted February 21, 2016 at 1:40 am | Permalink

          I have no problem with ‘probable cause’ – and I actually doubt there is probable cause that this work phone would yield any useful data. Though I’m suspicious the court might be overawed by ‘national security’.

          But there’s also the fact that it’s *not* just a court order to hand over some data in a fairly painless fashion. It’s an order to Apple to do a lot of work to demonstrate that a very valuable feature of their phones is probably not so good at all. As an Apple hater I should love that, though I deplore the precedent. But I find the thought of forcing any person (or corporation) to do that, for data that is probably going to be worthless, quite distasteful. I think the ‘probable cause’ should need to be ‘probable beyond all reasonable doubt’ to justify that.

          cr

    • Posted February 21, 2016 at 4:28 am | Permalink

      Toddlers killed more people last year than the San Bernardino pair. How many have to die before you ban toddlers?

  52. Jimbo
    Posted February 20, 2016 at 7:14 pm | Permalink

    I’m with Apple.

    I do believe the FBI is overreaching and has an ulterior motive of wanting to penetrate iOS devices. As to the question of ‘who do you trust more: Apple or the government?’ this is silliness. Apple wants privacy for its customers, the FBI wants to spy on those customers aka citizens. Apple has spent its time and money building security and the govt spends our money to dismantle it.

    What if this incident did not involve 1 iPhone, but the FBI had collected 17 that they thought were involved in a terrorist network? Would Apple have to spend its time not making its next product but cracking all of these phones?

    For everyone here who lauded Krauss’s piece on the insanely improbable odds of being killed in a terrorist attack, why has this irrational fear induced so many to grant the FBI sweeping powers to erode what little privacy remains? So that they can make you safer from something as unlikely as being killed by lightning? How about this: the FBI protects us from terrorists the way they did it 20 years ago, before an iPhone existed. All those methods are still available to them.

    • Ken Kukec
      Posted February 21, 2016 at 2:13 pm | Permalink

      I’m with Apple [and] I do believe the FBI is overreaching.

      To paraphrase the immortal words of Mandy Rice-Davies of Profumo Affair fame, you would, wouldn’t you? 🙂

  53. Randy Schenck
    Posted February 20, 2016 at 7:23 pm | Permalink

    With all the good discussion around this, it’s odd that many of the tech set seem to think Apple has a case here. The telephone, in all of its lifetime, never had this much protection from the law. In the end, it is just another communication device, and some seem to give it much higher status. So high that we must not let the law force the corporation to invade the privacy of the known bad guys. How about the house – the home is a man’s castle, so let’s protect that as well, regardless of who lives inside? Not high tech enough, maybe.

    So privacy is more important than going after the murderers. I wonder if the murdered family members would think so? There was a pretty interesting show on HBO for about four years called The Wire. The gangs and bad guys of Baltimore were the main event on this show. One of the important things about it was that the bad guys used cheap throwaway phones so the cops could not get warrants to tap them. Two of the gang members’ only job was to go from store to store, buying the phones daily and distributing them to gang members. The cops set up all kinds of equipment to capture the identities of these phones and rushed to the judge for warrants. Hence the title, The Wire: it was the race between phone technology and the law trying to stop the drug sales and murders, which were beyond control in the city.

    • infiniteimprobabilit
      Posted February 20, 2016 at 7:28 pm | Permalink

      Thing is, it’s not just a ‘telephone’ any more. It’s a smartphone. Equivalent to a personal computer (in fact many people don’t bother to own a PC these days). Think phone plus diary plus credit cards plus ID plus bank account…. for some people, it’s their entire personal life that’s stored on there.

      So, NOT just ‘another communication device’

      cr

      • infiniteimprobabilit
        Posted February 20, 2016 at 7:30 pm | Permalink

        Oh, and this is why the idea of it being uncrackable is so important. Not just for people who are afraid of the FBI. If a program exists to crack it and it gets stolen…

        cr

      • Randy Schenck
        Posted February 20, 2016 at 7:49 pm | Permalink

        Okay, so you are saying you are just taken with this little computer in your hand. Your house, who cares; your desktop computer, forget about it. But this “smart” phone, boy oh boy. It should be the most protected item ever, and who cares about the bad guys.

        • infiniteimprobabilit
          Posted February 20, 2016 at 8:05 pm | Permalink

          Point is, it is NOT just a ‘communication device’. (Oh I already said that. Why is it so difficult to understand?).

          Would you leave your computer, diaries, bank statements, credit cards all packed in a nice handy box and sitting on the street for anyone who wants to stroll away with? Because, if you lose it, that’s what your smartphone effectively *is*.

          (Anyone who wants to steal all that from me (I don’t have a smartphone) is going to have to mount a serious burglary. Even then, good luck with finding any useful information among the piles of papers).

          So that’s why a smartphone needs far better security than Alexander G. Bell’s little gadget. And that includes not having any programs ‘out there’ that can crack it.

          cr

          • Randy Schenck
            Posted February 20, 2016 at 8:34 pm | Permalink

            Where did you keep all this stuff before the computer at home or in your hand? You had it in files in your house. Maybe locked up, or not. Makes no difference. The police could get a warrant and look at everything you got. They could get your phone records, bank accounts, everything. What is so special about your little cell phone? You act as if Madison had a cell phone at the Philly convention, or knew they were just around the corner and were so special that he slipped something in to protect them. Are we still looking?

            • Posted February 20, 2016 at 8:54 pm | Permalink

              The police could get a warrant and look at everything you got.

              Yes, but they can’t get a warrant to force you to give them the combination to your safe. Nor can they conscript the local locksmith into picking the lock for them. The police can use their own sledgehammers or cutting torches or lock picks or whatever to get into the safe. But if they lack the means to do so themselves and can’t find anybody willing to help them, the police are SOL.

              b&

              • Jeff Ryan
                Posted February 20, 2016 at 9:25 pm | Permalink

                No, you are wrong. The law doesn’t just throw up its hands and say, “Well, I guess if you won’t tell us the combination, we’re stuck!”

                Don’t want to give up the combination? Welcome to jail, pal. You’ll have lots of time to read up on “contempt of court.” In fact, an indefinite amount of time. Like until you decide to cooperate.

              • Posted February 20, 2016 at 9:32 pm | Permalink

                Sorry, but that’s simply not true.

                https://www.eff.org/press/releases/appeals-court-upholds-constitutional-right-against-forced-decryption

                You do have a Fifth Amendment right to remain silent, even if the Court wants your password.

                Cheers,

                b&

              • Jeff Ryan
                Posted February 20, 2016 at 9:46 pm | Permalink

                Thanks for the cite. I’ll read the case.

                And it certainly doesn’t matter that it comes from the most reversed federal circuit in the country…

                But I’ll read it.

              • Jeff Ryan
                Posted February 20, 2016 at 9:48 pm | Permalink

                Oh, and of course, the court here isn’t asking anyone to incriminate themselves. So it’s not really applicable.

              • Ken Kukec
                Posted February 20, 2016 at 11:23 pm | Permalink

                Ben —

                The law is not 100% clear in this area. A suspect can be compelled to provide blood and hair samples, handwriting exemplars, and to participate in a line-up. There was even a line of cases holding that a suspect could be compelled by court order to sign a waiver giving the feds access to his or her foreign bank accounts “if any.” What the Fifth Amendment’s self-incrimination clause prohibits is the compelling of a suspect to make testimonial-style disclosures of fact.

                Also, nothing in the Fourth Amendment prohibits the government from enlisting the assistance of outside experts to gain access to information contained in items seized pursuant to a warrant. I currently represent an IT guy whose computer was seized, and the feds are busy trying to crack his encryption.

              • Jeff Ryan
                Posted February 20, 2016 at 11:37 pm | Permalink

                That sounds like a case I would want to hear more about. Though obviously not on this list, as it’s very OT.

                What’s interesting in this case is that they’re not asking necessarily for codes. They are asking, at the least, that Apple hack the device, without disclosing just how. (Of course, this is their fall-back position.)

                Doubtless my computer illiteracy is showing, but I think you get the drift.

            • infiniteimprobabilit
              Posted February 20, 2016 at 9:02 pm | Permalink

              Are you missing the point deliberately?

              Information in my house enjoys a certain level of security. For it to go astray, someone would have to burgle my house. I can fit burglar alarms if I want. The police can get it with a warrant but some random guy in the street can’t without a fair bit of effort. And the cops are unlikely to raid my credit card balance afterwards.

              Now, put that all on a handy portable device of some intrinsic value in itself, and leave it in your pocket (to be picked) or carry it around with a fairly high risk of accidentally leaving it somewhere. The worry is *not* the cops, it’s who might pick it up.

              *That’s* why encryption is significant on a smartphone, and why uncrackability is important to users.

              That didn’t apply to old phones, and not even to my little old-style makes-phone-calls-&-texts-only Samsung.

              cr

              • Jeff Ryan
                Posted February 20, 2016 at 9:33 pm | Permalink

                And…? So…?

              • infiniteimprobabilit
                Posted February 21, 2016 at 1:02 am | Permalink

                So… Randy couldn’t see why a smartphone was any different from an old-fashioned telephone. But IMO it very obviously is, because it holds huge amounts of personal information. It’s hardly even recognisable as a ‘phone’ any more.

                This may not make much difference to the authorities, who could get a warrant for all of that, but it makes a huge amount of difference to the individual who is worried that some crook might steal it or find it. Hence the importance of good encryption on it, and why (I assume) Apple is very leery of admitting that it can be cracked.

                cr

          • Ken Kukec
            Posted February 20, 2016 at 10:57 pm | Permalink

            So by your logic, personal computers — indeed, all of an individual’s personal belongings — should be off limits to searches and seizures? Or why just Apple phones? The law isn’t bound to accept every form of secrecy technology is capable of providing.

            • infiniteimprobabilit
              Posted February 21, 2016 at 1:11 am | Permalink

              (sigh) NO!

              Smartphones are NOT (as Randy put it) ‘just another communication device’. They are also a storage device for, potentially, all your personal information.

              *THAT* was the point I was making.

              Hence the need for encryption – not (in most cases) to defeat the law, but in case it gets lost or stolen. Hence the significance to buyers of it being ‘uncrackable’. And – I presume – Apple’s reluctance to even contemplate cracking it, which would prove that it can be done and would be shooting themselves in the foot so far as their sales are concerned.

              cr

              • Jeff Ryan
                Posted February 21, 2016 at 1:27 am | Permalink

                But of course it can be done. They just don’t want to admit it.

        • eric
          Posted February 20, 2016 at 8:08 pm | Permalink

          The murderers in this case are already dead. The ‘crime happening as we speak’ concept doesn’t really apply.

          Moreover, AFAIK nobody is badmouthing the government for asking Apple to do this or seeking this power in court. Everyone understands it’s the government’s job to investigate murder. The question is really whether Apple should comply voluntarily, without a court ordering them to do so, because Apple does not have the job of investigating murders.

          • Jeff Ryan
            Posted February 20, 2016 at 9:05 pm | Permalink

            Read the article again. The court HAS ordered Apple to comply.

            Sheesh.

            Now that you know the court has in fact ordered what the FBI has asked, what objections do you have?

            • eric
              Posted February 20, 2016 at 9:37 pm | Permalink

              None, once all the legal appeals etc. get sorted out.
              Until then, Apple should defend its rights the same way you would, if you were appealing a legal ruling against you.

              • Jeff Ryan
                Posted February 20, 2016 at 9:48 pm | Permalink

                You are correct on that.

    • Diana MacPherson
      Posted February 20, 2016 at 9:45 pm | Permalink

      Again, it’s not about the phone. It’s not about the information. It’s about code that defeats the security in the device that keeps us safe. Landlines didn’t hold information. iPhones and iPads are computers. And they are secure.

      If Apple had the password, they would hand it over. They don’t want to defeat their security by creating malware that can be used to defeat the security of iPhones and iPads other than this one.

      • infiniteimprobabilit
        Posted February 21, 2016 at 1:15 am | Permalink

        Thanks Diana. You got the point, I think.

        cr

      • Ken Kukec
        Posted February 21, 2016 at 2:24 pm | Permalink

        I understand: it’s all about the codes (and, maybe, the benjamins); the info on the phone is just “collateral damage.” 🙂

  54. Don
    Posted February 20, 2016 at 7:31 pm | Permalink

    I think the Government should do all it can to get the information from the phone. But I’m not convinced that Apple should be compelled to make it easier for them to do it. If I have a safe and refuse a search warrant to open it, can the Feds force the manufacturer of the safe to make a master key to open it? Even if they reimburse the safe maker for the cost of making the key — do they really have the power to commandeer the company’s workforce and require them to create something that doesn’t exist? That just seems scary to me.

    That being said, I’m also not sure I believe Apple when they say the Feds are asking for something that they don’t already have (maybe they just don’t want the public to know about it.) But I could be a little too untrusting of big corporations.

    • Jeff Ryan
      Posted February 20, 2016 at 8:56 pm | Permalink

      In answer to your question, yes the government could do so. Or, alternatively, a judge could have you jailed for contempt until you produced your key. This is not even a close question.

      And yes, Apple can be forced to do it. The people and the government are sovereign. Not Apple. Yet, anyway.

      • Posted February 20, 2016 at 9:06 pm | Permalink

        Or, alternatively, a judge could have you jailed for contempt until you produced your key.

        In the States, only if the key is a physical key. Or you could be compelled to produce an existing piece of paper with a combination or password previously written upon it.

        But it’s a very clear violation of the Fifth Amendment protection against self-incrimination for the courts to coerce you into communicating secrets, such as the combinations to locks or computer passwords, which exist only in your head.

        Indeed, merely admitting that you know the key in question can be considered incriminating, so you don’t even have to admit that you have any knowledge of it.

        …and, in the context of iPhones, it’s worth noting that the police might be able to compel you to use your fingerprint to unlock an iPhone with TouchID enabled. So, if you’ve got anything on your phone that you don’t want the police to get to, either turn off TouchID…or power down the phone before the police (Customs, etc.) arrive. iOS requires your password after being powered off. iOS also requires your password after so many failed fingerprint attempts, so you might consider using your off-hand ring finger or the like for the fingerprint, so you can repeatedly “helpfully” try your thumbs and index fingers for the nice ossifer and “accidentally” lock it that way.

        Cheers,

        b&

        • Jeff Ryan
          Posted February 20, 2016 at 9:36 pm | Permalink

          Like they say, this is so wrong, it’s not even wrong.

          You can be compelled to stand in lineups, give fingerprints, provide DNA samples, all without offending the Fifth Amendment. Don’t even.

        • Ken Kukec
          Posted February 20, 2016 at 11:41 pm | Permalink

          Ben — You might want to check the Arizona regulations regarding practicing law without a license :), although you’re doing pretty good so far, counselor.

          Maybe tread a little lighter regarding the advice on how to go about obstructing a law enforcement investigation, though. 🙂

  55. eric
    Posted February 20, 2016 at 7:56 pm | Permalink

    I haven’t read the thread (just got back from a trip), but my own thought is that Apple is under no moral or ethical obligation to help, and it is probably both in their monetary interests and in their business ethics interest not to do so. If the government wants to break into someone’s safe, the safe-making company is not under any moral obligation to help them do so. Their job is to build good safes. That’s their business promise. Making them *unsafe* is not. …Unless there is a court order requiring them to help the USG break into one.

    Which is the other side of the coin. If the USG does get a court order compelling Apple to help, then IMO they should do so. IMO if this happens then their “customer problem” or business ethics dilemma largely goes away anyway, since the only thing this action tells their customers is that Apple will keep their info private to the extent the law allows them to do so.

    • infiniteimprobabilit
      Posted February 20, 2016 at 8:09 pm | Permalink

      Except – Apple is an international company. And their customers in China or Syria or Saudi Arabia or Myanmar may not have the same legal safeguards…

      cr

    • Jeff Ryan
      Posted February 20, 2016 at 9:02 pm | Permalink

      The government HAS a court order. Read the article again.

    • Ken Kukec
      Posted February 20, 2016 at 11:45 pm | Permalink

      This case doesn’t involve an appeal to Apple’s moral or ethical obligations; it involves an appeal to the court’s legal authority.

  56. keith cook + or -
    Posted February 20, 2016 at 8:00 pm | Permalink

    From what I’ve heard, when travelling by air in America, not even your testicles are private. All for the greater good, and in public!
    So if the travelling public have conceded to this indignity, are a corporation’s private parts any different… for the greater good? We shall see.

  57. Posted February 20, 2016 at 8:24 pm | Permalink

    Very late to the party. Haven’t even pretended to read all the comments, but I see I’m once again in cahoots with the Canuckistanian with the fetish for backwards toilet rolls — and that she’s got the situation pretty well under control.

    The FBI is welcome to do whatever they like with the phone. They’re not welcome to draft a private company into doing its dirty work for them. They can ask Apple for help, of course. But compel them? The precedent this would set is unthinkable.

    Would any of us be okay with the government compelling Boeing to make napalm-throwing drones that Boeing found morally offensive? No? Not even despite the fact that Boeing already makes bomb-throwing drones that aren’t all that much worse?

    So why is it okay for the government to compel Apple to make illicit security-busting software that Apple finds morally offensive, even though Apple already makes legitimate security software that’s the exact opposite of what the government wants them to do?

    Might as well order Apple engineers to go down to the local hardware store, buy some garden hoses, and beat the information out of the accused, if that’s what it’s going to take to get the passcode.

    Essential to the notion of civilization is that not only do the ends not justify the means, but that compromise is frequently necessary. As the saying goes, better to let ten guilty men go free than to hang one innocent.

    If the FBI’s case against this person really does hang on the contents of the smartphone, the case is so pathetically weak that they deserve to be laughed out of court and told to do a better job of investigating next time. What I want to know is why the FBI is so incompetent that they can’t get a conviction without shredding the Bill of Rights. Or maybe they care more about power and their own laziness than justice?

    Regardless, the proper answer to this mess is to fire everybody in the FBI from the person who suggested the idea all the way up. Bring in a fresh team of prosecutors, maybe from some friendly foreign country, hand them the FBI files, and see if they think there’s anything in there worthy of filing charges.

    Cheers,

    b&

    • Randy Schenck
      Posted February 20, 2016 at 9:08 pm | Permalink

      And someone just ahead of you said nobody was badmouthing the government here. I guess he spoke too soon, or just before you fired everyone.

      There are probably some on the high court that would buy some of your thoughts. They do think very highly of corporations treating them just like persons and letting them spend all the money they save from paying no taxes to hire the politicians they prefer. They don’t have to disclose much of anything because really – they look just like you and me.

      I think they outlawed napalm but I’m sure the government could come up with something worse to match the evil you say they are doing to poor Apple.

      Would they really be shredding the bill of rights by requiring access to a cell phone of a dead terrorist. I can hear the paper shredding now.

      There are a bunch of folks in this country, some running for president, who think waterboarding was just fine and would do more. Now that’s something you should be concerned about.

      • Posted February 20, 2016 at 9:13 pm | Permalink

        Would they really be shredding the bill of rights by requiring access to a cell phone of a dead terrorist.

        Most emphatically.

        They’ve got the phone. They can do whatever they want with it.

        But they can’t force me, you, Apple, or anybody else to do what they want with it.

        If they didn’t have the phone, they could get a subpoena to take possession of it. But they still couldn’t order you, me, Apple, or anybody else to do something to the phone before they took possession.

        b&

    • Jeff Ryan
      Posted February 20, 2016 at 9:12 pm | Permalink

      Just about everything you said is wrong.

      • Posted February 20, 2016 at 9:17 pm | Permalink

        So, let’s make it personal.

        Pretend the FBI needed your help in some investigation. And the help went beyond surrendering physical items for which the FBI had duly executed warrants.

        What’s the most morally reprehensible thing you’d be willing to do at the FBI’s command?

        Would you crack somebody’s safe for them?

        Would you play peeping tom for them?

        Would you eavesdrop on somebody’s phone calls for them?

        Would you intercept somebody’s mail for them?

        Would you torture somebody for them?

        Would you kill somebody for them?

        Where would you draw the line?

        And why should Apple draw the line anywhere other than where I’d hope you personally would: surrendering physical items for which the FBI has a reasonable and duly-executed warrant?

        b&

        • Jeff Ryan
          Posted February 20, 2016 at 9:41 pm | Permalink

          Well, that sounds like a nice blurb on the back of a subpar thriller, but none of what you’ve said is on point.

          BTW, you might want to look at the numerous statutes, rules of criminal procedure, and case law on the original point. It will save you time in the future.

        • Ken Kukec
          Posted February 21, 2016 at 5:08 am | Permalink

          If directed by a court, based on a finding that I could forestall the clear and present danger of a heinous crime being committed, I would do all but the penultimate (involving torture) — including the last, if I was certain that the crime being avoided involved a greater loss of life (and if I could screw my nerve to the sticking place).

          Not sure, though, what all that has to do with the topic under consideration. 🙂

          • Posted February 21, 2016 at 11:15 am | Permalink

            That’s really the heart of the matter.

            A state in which the police can expect to be able to trivially conscript the people to do its bidding is, by definition, a police state. Some police states are more or less benign than others, but I don’t want to live in any type of police state.

            If the police can’t do their own dirty work by themselves, then the dirty work is the type of dirty work that should not be done.

            I’m not advocating for the obstruction of justice. I’m advocating that the people are under no obligation to actively assist the police — and imposing such an obligation on the people is the line between a free people and a police state.

            b&

            • Ken Kukec
              Posted February 21, 2016 at 2:39 pm | Permalink

              But you’re begging what should be the real question here: should we allow electronic devices to be encrypted by means that put the information on them forever beyond the reach of all legal process?

              • Stephen Barnard
                Posted February 21, 2016 at 2:43 pm | Permalink

                I would say … emphatically YES!

              • Posted February 21, 2016 at 2:47 pm | Permalink

                Absolutely, unquestionably, without hesitation nor doubt.

                And, if you object…then you better be prepared to make not only decks of cards illegal, but paper-and-pencil plus a coin.

                You do know, do you not, that uncrackable encryption is as easy as tossing a coin?

                No?

                Here’s an essay I wrote many, many moons ago. Read it and tell me you still think you can order the tide to recede.

                http://trumpetpower.com/Papers/Crypto/OTP
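
                (If you’d rather see it than read the essay – here’s a minimal one-time-pad sketch in Python, just the coin-toss idea in code; assuming a truly random key as long as the message and never reused, the ciphertext is information-theoretically unbreakable:)

                    # Minimal one-time pad: XOR the message with random "coin tosses".
                    # The key must be truly random, as long as the message, and used only once.
                    import os

                    def otp_encrypt(message: bytes):
                        key = os.urandom(len(message))  # the coin tosses
                        ciphertext = bytes(m ^ k for m, k in zip(message, key))
                        return key, ciphertext

                    def otp_decrypt(key: bytes, ciphertext: bytes) -> bytes:
                        return bytes(c ^ k for c, k in zip(ciphertext, key))

                    key, ct = otp_encrypt(b"meet at dawn")
                    assert otp_decrypt(key, ct) == b"meet at dawn"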

                Cheers,

                b&

              • Jeff Ryan
                Posted February 21, 2016 at 2:52 pm | Permalink

                And the answer is: No.

              • Posted February 21, 2016 at 3:37 pm | Permalink

                Yes, because that puts them forever beyond the reach of criminals — thus allowing the digital economy to actually function.

                /@

              • Jeff Ryan
                Posted February 21, 2016 at 4:26 pm | Permalink

                Really? Read that again.

                How does this stop criminals again? I mean, handing them the technology to plot in secret, with no fear of discovery? This is a good thing…how?

              • Posted February 21, 2016 at 4:28 pm | Permalink

                No, it’s not a “good thing”, but it is a corollary of a better thing.

                /@

              • Jeff Ryan
                Posted February 21, 2016 at 4:39 pm | Permalink

                What is that “better thing”? The principle that allowing criminals to commit murder without fear of detection saves our vaunted “privacy” rights? Just a simple tradeoff?

                Why yes, Mrs. Jones, it’s terrible your child was slaughtered, but consider: we saved someone else’s right to privacy?

              • Posted February 21, 2016 at 5:00 pm | Permalink

                Where did I mention “privacy”?

                /@

              • Jeff Ryan
                Posted February 21, 2016 at 5:05 pm | Permalink

                Okay. Then what is that “better thing”?

              • Posted February 21, 2016 at 5:36 pm | Permalink

                The thing Diana, GB and I already mentioned. Pretty much every transaction online and in the “real world” (POS and ATM terminals are online computers too) depends on robust encryption.

                /@

              • Jeff Ryan
                Posted February 21, 2016 at 5:43 pm | Permalink

                Which in the appropriate case should be disclosable to law enforcement.

                Look, there’s no right to conceal criminal activity. There is a right to force the government to justify disclosure to a court.

              • Posted February 21, 2016 at 5:47 pm | Permalink

                But the point is that there is no way to do that in “robust” encryption. Once you give law enforcement a way in, the encryption is no longer robust.

                /@

              • Jeff Ryan
                Posted February 21, 2016 at 5:57 pm | Permalink

                Well, tough. Some sort of accommodation must be made.

                The state doesn’t have to throw up its hands and say, “Well, jeez, I guess that public safety has to take a back seat!”

              • Posted February 21, 2016 at 6:02 pm | Permalink

                Say goodbye to all the money in your checking account then.

                /@

              • Jeff Ryan
                Posted February 21, 2016 at 6:04 pm | Permalink

                Hmm. Seems like at least three times a year my bank tells me they’ve been hacked. And that’s true of every bank. This is new? Please.

                If you think government can’t keep a secret, then kindly provide me with today’s ICBM launch codes.

              • Stephen Barnard
                Posted February 21, 2016 at 6:13 pm | Permalink

                I’ve never been “hacked” at my bank or anywhere else. Get a Mac. 🙂

              • Diana MacPherson
                Posted February 21, 2016 at 6:03 pm | Permalink

                1970 here we come. No more working from home since no secure way to access the work networks, no more computers in cars, no more internet, no more transfer of files of any type. Back to paper for everything.

                This is the world of untrustworthy encryption.

              • Jeff Ryan
                Posted February 21, 2016 at 6:06 pm | Permalink

                Sure, because now everything is hack-proof.

                Oh look, a unicorn!

              • Diana MacPherson
                Posted February 21, 2016 at 6:14 pm | Permalink

                Because encryption is used for many things we do in the modern world including all the things I listed. Your bank gets hacked but not its secure transfers around the world because those aren’t hackable.

                I don’t want to sound like an ass saying this, but it is clear you don’t have the knowledge of how all this stuff works. You are making comments that are really out there. I’m not trying to be mean saying this or launch an ad hom but you really need this fundamental understanding.

              • Posted February 21, 2016 at 6:18 pm | Permalink

                Sure, not everything is hack proof. And that forces everyone to expend effort to keep fraud to a tolerable level (like “shrinkage” in retail). But without robust encryption the problem would be orders of magnitude worse; enough to cripple the economy.

                /@

              • Posted February 21, 2016 at 4:28 pm | Permalink

                Stopping criminals is not the be-all and end-all. Better to let ten guilty men go free than hang one innocent, remember?

                Only tyrants have an interest in killing ’em all and letting the gods sort ’em out.

                b&

              • Jeff Ryan
                Posted February 21, 2016 at 4:41 pm | Permalink

                That quote has pretty much nothing to do with the subject. It doesn’t say “Better to let 10 guilty men commit a preventable crime than investigate one innocent one.”

              • Stephen Barnard
                Posted February 21, 2016 at 4:30 pm | Permalink

                I’m pretty sure that was intended as irony.

              • Jeff Ryan
                Posted February 21, 2016 at 4:43 pm | Permalink

                If I missed that, I’m denser than I thought. Which is probably true…

              • Ken Kukec
                Posted February 21, 2016 at 4:45 pm | Permalink

                But it also puts a large share of those same criminals’ (and terrorists’) activities beyond the reach of law enforcement.

                Maybe that’s part of the deal, what’s absolutely required to prevent mankind from being driven from The Garden of High-tech back into the Dark Ages of Paper & Ink — maybe, though I’d still like to take a peek at something supporting the proposition that resembles, you know, proof.

                (And as for cards and coins (and dice) — outside the setting of casinos and cheat-joints, their encryption capabilities are rarely enlisted to shield criminal activities.)

              • Diana MacPherson
                Posted February 21, 2016 at 5:16 pm | Permalink

                Are you asking for proof of how information is secured when stored and transmitted using encryption?

              • Ken Kukec
                Posted February 21, 2016 at 6:07 pm | Permalink

                No, what I’m looking for is empirical support for the proposition that there is absolutely no way to simultaneously accommodate the public’s need for data encryption and law enforcement’s need to access the same data to investigate criminal and terrorist activities — whether it’s im-freakin’-possible ever to breach the encryption without unleashing the genie and bringing the whole kit-and-caboodle crashing to its knees, is what I’m asking about.

                (Not sure how many metaphors I managed to mash up there. 🙂 )

              • Posted February 21, 2016 at 6:13 pm | Permalink

                Hmm… start here: _Applied Cryptography: Protocols, Algorithms and Source Code in C_ by Bruce Schneier.

                /@

              • Diana MacPherson
                Posted February 21, 2016 at 6:17 pm | Permalink

                Oh I see. The position I take (and Ant, Ben, GB and a few others) is that providing a means to break the encryption weakens the encryption, and when you weaken the encryption it becomes unacceptable for many if not most transactions. So if you can crack the encryption used for email (as an example), and the same sort of encryption is used for bank file transfers, banks will no longer want to use that encryption because doing so is too risky.

              • Ken Kukec
                Posted February 21, 2016 at 7:10 pm | Permalink

                Ok, thanks. Here’s the thing: I’m not a huge tech guy, but [trigger warning: frightening aperçu to follow] it’s probably even-money that I’m as digital-savvy as the federal district court judge sitting on the Apple case.

                I’ll start with the Schneier; I’m also planning to follow this California Apple litigation closely (maybe Hizzoner and I can try a bit of buddy-system learning). My mind is open on this one — so open that, were it open any wider, I’d need a skull-cap and bandana to keep the gray matter in and the noxious GOP fumes out. Looking forward to engaging on this, and to my upcoming encryptification-edification.

    • Diana MacPherson
      Posted February 20, 2016 at 9:52 pm | Permalink

      Woohoo cahoots!

  58. Ken Kukec
    Posted February 20, 2016 at 8:34 pm | Permalink

    Armed with a search warrant issued by a detached and neutral magistrate upon a showing of probable cause that evidence of a crime will likely be discovered, law enforcement agents can break down our front door, enter our bedroom, rummage through our dresser drawers (or any other place specified in the warrant), and seize such property of ours as the warrant identifies. This has been the case since the founding of this nation 240 years ago (has, indeed, been the case under Anglo-American law for eight hundred years, since the signing of Magna Carta).

    What is it about the info on an Apple phone that makes it so much more precious than our “persons, houses, papers, and effects” such that it must be made completely exempt from disclosure, for all time, under any and all circumstances, no matter how convincing the showing that it contains evidence of a crime or terrorist act, no matter what exigencies may be present, and no matter how overwhelming the showing may be of the need for disclosure?

    What is the source of this new, nontextual right to absolute privacy that warrants all of a sudden abandoning eight centuries of well-settled law? And what are its limits, if any? What if GMC should develop a tractor-trailer rig that can be made search-proof? Are we all cool with that driving down 5th Avenue?

    Or does this new-fangled right to absolute, unbridled privacy apply only to items smaller than a breadbox? (And does anybody even remember WTF a “breadbox” is? 🙂 )

    • Posted February 20, 2016 at 8:52 pm | Permalink

      What is it about the info on an Apple phone that makes it so much more precious than our “persons, houses, papers, and effects” such that it must be made completely exempt from disclosure, for all time, under any and all circumstances, no matter how convincing the showing that it contains evidence of a crime or terrorist act, no matter what exigencies may be present, and no matter how overwhelming the showing may be of the need for disclosure?

      The difference is that the FBI is attempting to draft Apple into doing the FBI’s dirty work for it.

      The FBI has the phone in their possession. They’re welcome to do with it as they will — break into it if they can. Same as if the plans were in a bank vault and they had stormed the bank building. Break out the jackhammers, acetylene torches, plastic explosives, whatever. Or buy their own bank vaults and study them to learn how to open them easier.

      What’s pretty clearly out of bounds for the FBI is for them to kidnap the maker of the safe, put a gun to his head, and threaten to pull the trigger if he doesn’t let the FBI in.

      Wich is what the FBI might as well be doing to Apple. Well, maybe not the bullet to the brain bit, but certainly the kidnapping. Anybody here doubt that the Apple executives would wind up behind bars if they continued to refuse after the FBI got a court to issue such an unconscionable order? And how is that any different from kidnapping and enslaving them?

      b&

      • Randy Schenck
        Posted February 20, 2016 at 9:24 pm | Permalink

        If the FBI, with a warrant from a judge, told the bank you do all your business with to open up and give them all your records, why would that be different from a warrant to Apple to open up a cell phone and provide what it has?

        • eric
          Posted February 20, 2016 at 10:04 pm | Permalink

          AFAIK Apple has already complied with the FBI in that they have turned over the info they ‘hold like a bank holds your money.’ That’s the info the killers backed up on the iCloud servers. The info the FBI wants now is stuff Apple *doesn’t* have any legal or contractual access to. Instead they are, like Ben said, operating like the manufacturer of a safe who is suddenly told by the government to crack it or go to jail. It’s simply not their job or responsibility to do that. Once they sell the thing, it’s not their safe any more. If you sell me a house, the government can’t come to you a year later and insist you break into it for them; it’s not your house, you and I have no legal agreement any more, and you have no legal hold or rights over that house any more.

          This fight is not about FBI access to info Apple holds in some sort of legal guardianship for the owner. That’s already been handed over. It’s about the FBI demanding Apple create a new tool designed to break into people’s personal property, simply because Apple manufactured that property.

      • Ken Kukec
        Posted February 21, 2016 at 12:45 am | Permalink

        Cell phone companies are routinely required, merely by issuance of a subpoena, to compile user and cell-tower information and provide it to a litigant. They are also routinely compelled by subpoena to produce competent witnesses to appear in court to authenticate the records and to testify as to how the company conducts its business. Banks are also routinely compelled by subpoena to compile and provide similar account-holder information — including, in some circumstances, even foreign banks (which is why Swiss numbered bank accounts ain’t what they used to be).

        I’m inclined to see the difference here as one of degree rather than kind. I certainly don’t think Apple should be compelled to do what is being asked of it merely by subpoena. Additional well-thought-out protections are in order, such as a final, appealable court order issued upon a satisfactory legal showing (such as “probable cause,” or higher), setting out the safeguards needed to protect the privacy interests of third-parties and Apple, as well as providing reimbursement to Apple for its costs of compliance.

        The elephant in the room no one here is yet discussing is whether US law should allow electronic devices equipped with unbreakable encryption software in the first place. The law is not required to follow technology; technology’s required to follow the law — although the two must at length accommodate one another. This is to say, the law need not recognize as reasonable every new form of secrecy that technology is capable of providing. Are we as a society prepared to accept technology that permits malefactors to put evidence of their wrongdoing — no matter how dire the consequences — totally and permanently beyond detection by the government officials entrusted with our safety? If so, what are the outer bounds (if any) of this new doctrine?

        These are novel legal questions addressing new and evolving technology. They merit serious consideration by our courts and vigorous public debate.

        • Jeff Ryan
          Posted February 21, 2016 at 1:18 am | Permalink

          You are 100% right. For a poor analogy, you are not allowed to hide your license plate number. The same reasoning (though better put than mine) would justify a law mandating that no devices be manufactured with undefeatable (word?) encryption.

          I am not saying defeatable by any mope out there. I am saying defeatable by the software provider. I personally think such a law would have many benefits, but one has to consider whether there is the political will to do such a thing. Abstractly, though, I don’t see a problem with such a law. The rub is establishing when government should be allowed to breach encryption.

        • infiniteimprobabilit
          Posted February 21, 2016 at 1:56 am | Permalink

          “The elephant in the room no one here is yet discussing is whether US law should allow electronic devices equipped with unbreakable encryption software in the first place.”

          Good point, but surely that should be decided by Congress – at least in principle – rather than by random lawsuits.

          cr

          • Ken Kukec
            Posted February 21, 2016 at 5:35 am | Permalink

            Yeah, you’re right, ‘cept those congress folks don’t seem so good at the enacting-any-actual-legislation thing lately.

            • infiniteimprobabilit
              Posted February 21, 2016 at 5:59 am | Permalink

              I did say ‘in principle’ …

              cr

        • Diana MacPherson
          Posted February 21, 2016 at 11:10 am | Permalink

          The encryption debate is an old one that existed long before iPhones; back then the US outlawed encryption using PGP because it was not crackable. Read about the history of PGP in the 90s here.

          What people often forget is secure, uncrackable devices existed long before Android and iPhone. The early BlackBerrys were so and that is why they became popular with governments around the world. That technology is still heavily guarded and used by BlackBerry today in all sorts of communications as we move into the so called “Internet of things”.

          So, what I’m really saying is tread carefully when it comes to thinking encryption should be hackable. Right now information from our smart watches and exercise trackers as well as our Starbucks apps spew location information to third parties. Do we really want this to be easily cracked? How secure is it now? How about cars and the apps that talk to them?

          • Ken Kukec
            Posted February 21, 2016 at 3:23 pm | Permalink

            Yeah, I recall Main Justice going round-and-round with Apple and other tech companies about these so-called unbreakable encryption systems when they were first disclosed, complaining that the systems would interfere with the legitimate investigatory functioning of federal law enforcement agencies.

            I suspect this litigation is a direct outgrowth of those contretemps, hand-selected by the feds owing to its connection with a high-profile terrorism investigation.

            • Stephen Barnard
              Posted February 21, 2016 at 3:35 pm | Permalink

              You are right on the money with the comment.

              My belief is that the FBI isn’t very interested in the information on the phone. If they were, they would have acted sooner and been less inept (or possibly negligent on purpose) about changing the Apple ID password.

              My belief is that they’ve concocted this case to set a precedent for backdoors into all mobile devices (and ultimately into all digital systems), using terrorism as a motivating threat.

              • GBJames
                Posted February 21, 2016 at 4:06 pm | Permalink

                Actually, I think the FBI and Apple both participated in “concocting” the case in order to get the issue of principle settled by the Supreme Court.

              • Ken Kukec
                Posted February 21, 2016 at 5:11 pm | Permalink

                I see it as arranged, too, but don’t impute the same malign intent to that as you do. This is an important, novel issue likely to set a crucial precedent (and, as GBJ observes, likely to do so before SCOTUS) and deserves, therefore, to be raised in a context where all its deep ramifications can be considered.

      • Ken Kukec
        Posted February 21, 2016 at 3:04 pm | Permalink

        Wich [sic] is what the FBI might as well be doing to Apple.

        No, what the FBI is doing is asking a federal district court to exercise its equitable jurisdiction to direct Apple to decrypt one of its iPhones — you know, the way someone might ask a federal court to exercise its equitable jurisdiction to compel a homophobe to bake a cake for, or issue a marriage license to, a gay couple.

      • Ken Kukec
        Posted February 21, 2016 at 5:34 pm | Permalink

        Ben, what you’ve said about the FBI is akin to claiming that that nice gay couple in Kentucky snatched Kim Davis out of the Rowan County clerk’s office, slapped the cuffs on her, dragooned her to the federal Marshals’ lock-up, stuffed her into a jumpsuit, and tossed her in the clink.

        It’s one thing for a federal agent to coerce a private citizen into involuntary servitude, quite another to invoke the equitable jurisdiction of a federal district court.

        Look, I’m no fan of the G-Man, have spent the better part of a career drawing lines in the sand and daring them to cross, but fair’s fair, bubba.

        • Posted February 22, 2016 at 9:45 am | Permalink

          But Apple is being dragooned by the FBI — or, at least, the FBI is attempting to conscript Apple into writing software that they find morally reprehensible and commercially devastating. And Apple is a private entity, not an elected official that sought public office.

          b&


          • Ken Kukec
            Posted February 22, 2016 at 1:31 pm | Permalink

            The government is seeking to invoke the court’s equitable jurisdiction — essentially, in simplified explanation, a court’s power to compel people to do stuff. (Without this inherent power, the courts would be restricted to whacking up the dollars by entering money judgments.) It is through the exercise of this equitable authority that courts desegregated lunch counters, took down “whites only” signs on drinking fountains, and required permits to be issued for the march across the Edmund Pettus bridge.

            True, many of those cases involved government officials, but that’s because they were the parties to the underlying cause of action, not because it is a restriction on a court’s equitable power. Equitable jurisdiction extends as well to purely personal causes of action between private parties. If I contract with you to build me a thousand widgets over the next year and you breach our contract, I can (assuming satisfaction of the elements of my cause of action) have a judge compel you to spend the next 12 months making me widgets (or sitting in jail, as a civil contemnor, until you agree to do so).

            The courts’ equitable authority is a necessary component of our system of justice, not a first step on the road to tyranny.

            • Posted February 22, 2016 at 2:12 pm | Permalink

              If I contract with you to build me a thousand widgets over the next year and you breach our contract, I can (assuming satisfaction of the elements of my cause of action) have a judge compel you to spend the next 12 months making me widgets (or sitting in jail, as a civil contemnor, until you agree to do so).

              I’m sorry, but the legal system you’re describing is most emphatically and quite clearly not one that’s been in effect in the United States for well over a century. We got rid of indentured servitude and debtor’s jail looooooong ago — as has every other Western country I’m aware of.

              If you fail to comply with the terms of the contract, there will be some sort of penalty, expressed or implied, that will come into play. Most likely, you’ll have to pay some money. It may well be cheaper and easier for you to fulfill the terms of the contract than breach it — but breaching the contract is always an option. And, if you can’t afford to pay the penalty, bankruptcy is a further option.

              In no case will you wind up in jail for failure to fulfill the terms of a civil contract.

              Cheers,

              b&


              • Ken Kukec
                Posted February 22, 2016 at 5:23 pm | Permalink

                The preferred remedy for breach of contract is a money judgment, so that the parties are free to maximize their financial interests. But there are circumstances, such as an order for unique goods unavailable from any other source, where a court will still enforce a contract through “specific performance,” requiring the losing party to fulfill his end of the bargain.

                But my point here wasn’t to get bogged down in the filigree of contract law; it was (as I’m certain you realize) to show that the courts have equitable authority to compel parties to perform acts, against their will and even where they find the acts repugnant, and to enforce that authority through its powers of civil contempt.

                It is beyond question that courts of equity do have and exercise such authority — notwithstanding the specious objections, heard from those on the far fringes of both ends of the political spectrum, that the exercise of equitable jurisdiction results in “involuntary servitude.”

    • Jeff Ryan
      Posted February 20, 2016 at 9:20 pm | Permalink

      It is the attitude of infantile libertarians (sorry for the redundancy) who just can’t accept that we live in a society governed by law.

      It is akin to Joe the Plumber telling a parent “I’m sorry, but your dead kids’ lives don’t trump my Second Amendment rights.” Welcome to the U.S.A. of the 21st Century. “One nation, indivisible, as long as I get whatever I want.”

      • eric
        Posted February 20, 2016 at 10:13 pm | Permalink

        And not allowing the appeals process to work is IMO the attitude of infantile authoritarians.

        Personally I think that once all the legal machinations are said and done, Apple should comply with the law. But I’m very supportive of them legally fighting the current, lower court ruling and I hope they eventually win. I’m fine with the government having the phone, and the info on the iCloud, and the FBI’s best doing their best to hack the phone for more info. I’m not okay with them compelling by force of law a third party private corporation (or citizen) to do the hack for them.

        So, does that make me an infantile libertarian who can’t accept that the country is governed by law?

        • Jeff Ryan
          Posted February 20, 2016 at 10:32 pm | Permalink

          No, and those weren’t really the folks I was talking about. To clarify, I have no problem with Apple availing itself of its appellate remedies. I think they’re wrong on substance, but they have every right to appeal.

          But, yes, I do think the court can order Apple to comply, and I think the court should be able to do so.

          What I most object to is the attitude of some that smart phones are sui generis, and that they should be immune from government action no matter the facts. I object to the misconception that some “right to privacy” trumps law enforcement and/or national security no matter what. I could be mistaken myself, but I believe I saw exemplars of that mindset today, and it is terrifyingly wrong.

          But your position seems eminently reasonable to me.

          • Steve Gerrard
            Posted February 21, 2016 at 1:11 am | Permalink

            The issue with digital security is that it is all or nothing. If there is a way to break into phones, it will get out, and ruin the security of phones for everyone. It is the same as the “backdoor” issue for all encrypted data.

            We could go with no security on any phones or computers. No more paying bills with them, though. What we can’t have is “secure for most things, but the good guys can sometimes break into them, and never the bad guys.” There just isn’t any way to do that.

            • Jeff Ryan
              Posted February 21, 2016 at 1:26 am | Permalink

              You are better versed in the technology than I.

              But why do I suspect there is some kid in Marin County who has already figured out how to do it? When I worked in law enforcement, there were many times we sought court orders to access information we knew was readily accessible by private persons. I’m not suggesting that the government shouldn’t go through the necessary hoops. Far from it. I’m just noting that nothing is “hack-proof.” And if you say it is, then you are merely daring many, many people to prove you wrong. How long did it take for the anti-copying code for DVDs to be broken? A day? By a 17-year-old in Norway, if memory serves.

              • Diana MacPherson
                Posted February 21, 2016 at 11:24 am | Permalink

                The type of encryption used today is hack-proof unless you’ve got hundreds of years to spend trying. This is a good thing. It’s what keeps all kinds of things safe, including military assets like nuclear missiles.

                But this Apple case isn’t about Apple not handing over codes. Apple already handed over this dead terrorist’s iCloud back ups from this very phone. The mobile company also handed over the SMS text information. The issue is weakening the security of all iPhones by creating a program to do so.
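
                A rough back-of-the-envelope sketch, in Python, of the arithmetic behind “hundreds of years” (which actually understates it); the guess rate of a trillion keys per second is an assumed, generously optimistic figure, not a measurement of any real attacker:

                    # Worst-case time to exhaustively search a symmetric key space.
                    # The guess rate below is an assumption for illustration only.
                    SECONDS_PER_YEAR = 60 * 60 * 24 * 365

                    def brute_force_years(key_bits, guesses_per_second=1e12):
                        """Years needed to try every key of the given length."""
                        return 2 ** key_bits / guesses_per_second / SECONDS_PER_YEAR

                    for bits in (56, 128, 256):
                        print(f"{bits}-bit key: ~{brute_force_years(bits):.3g} years")

                    # 56-bit (old DES):  under a day at this rate, which is why DES is long dead
                    # 128-bit (AES-128): ~1.1e19 years, roughly a billion times the age of the universe
                    # 256-bit (AES-256): ~3.7e57 years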

            • Ken Kukec
              Posted February 21, 2016 at 5:28 am | Permalink

              Well, then, I guess the decision we as a society need to make is which poses the greater harm — that some bad guys could hack into our phones and commit fraud, or that some bad guys could plan and commit crimes, heinous and small, aided by encrypted secrecy, thereby putting some potentially crucial evidence beyond the reach of the authorities?

              That’s why we as a society get paid the big bucks, I guess. 🙂

            • Diana MacPherson
              Posted February 21, 2016 at 11:20 am | Permalink

              No security on computers, etc. is actually much worse. Not only would that destroy Internet commerce, it would destroy the modern economy — no more computers in cars, no more fitness bands or smart watches, no more paying with your phone, no more paying with apps, no more transactional data of any kind of the sort that is currently transmitted between banks and institutions. We’d go back to the technology of the 70s or earlier.

              If we are going to participate in the modern world, we need to accept that encryption is part of it. And that encryption is going to have its trade-offs, like not being able to force a company to weaken the security of its devices.

              • Ken Kukec
                Posted February 21, 2016 at 3:42 pm | Permalink

                Is there a sound empirical record supporting this “all or nothing at all” argument? I mean, something beyond what appears to be the tech version of Murphy’s Law: that anything that can get out, will get out?

                (PS – You’re welcome for the link to Frank. 🙂 )

              • Diana MacPherson
                Posted February 21, 2016 at 3:50 pm | Permalink

                You may take this as hyperbole but it really isn’t. The modern world requires uncrackable encryption to work. Simple. It’s a matter of understanding how devices communicate and how many different ways they need to communicate.

              • Ken Kukec
                Posted February 21, 2016 at 4:06 pm | Permalink

                You know, I (pace the major world religions) am prepared to embrace modernism, even if that means we must hide vast amounts of information behind an impregnable wall of unbreakable encryption. If that’s the way it is, so be it.

                But before I go down and break the news to the folks charged with crime-detection and keeping-the-world-safe-for-democracy that so much potentially crucial evidence has now been put totally and permanently beyond their reach, I’d like to have something more than mere folk wisdom to hang my hat on … just sayin’ 🙂 .

              • Diana MacPherson
                Posted February 21, 2016 at 4:11 pm | Permalink

                You don’t need to break anything to them. This debate about encryption happened decades ago.

              • Ken Kukec
                Posted February 21, 2016 at 5:41 pm | Permalink

                I’d say the litigation in CA is prima facie proof that the news hasn’t sunk in.

          • eric
            Posted February 21, 2016 at 6:43 pm | Permalink

            I think you’re drastically underestimating the government’s desire to interfere in legitimate, normal business in order to make its own job easier. In my opinion there is nothing special about cell phones that earns them extra-special protection, but there is also nothing special about corporations offering people encryption that warrants prevention by government – nor do I see any fundamental legal argument for forcing Apple to help. Can you explain the latter to me? The FBI has Bob in custody. Bob is no immediate threat (I add that to the analogy because these particular phone owners are already dead). I know Bob’s password, but it would be against my economic and personal self-interest to give it up; people would lose trust in me and my business, and my income would suffer. What is the legal basis for compelling me, a private citizen, at legal gunpoint, to give it? Are we now forced to incriminate our neighbors or suffer jail time?

            The second issue is of course what happens when Apple fixes this security weakness (because they can), and the government objects and tries to stop them from doing so (because they will; the NSA has tried to stop encryption from being legal before). Where will you stand then? Should Apple be allowed to prevent this from ever happening again, or not?

            • Ken Kukec
              Posted February 22, 2016 at 12:46 pm | Permalink

              The venerable principle of Anglo-American jurisprudence at stake in your “Bob” hypothetical is that a party is entitled to “every man’s evidence.” (We would update this today to “person’s,” of course.) This is the principle by which the Supreme Court compelled sitting president Richard Nixon to produce the Watergate tapes in Judge John Sirica’s courtroom, among other examples too numerous to mention.

              In the “Bob” case, you could be subpoenaed before a federal grand jury and called upon to testify under pain of the penalties for contempt of court. If your testimony would incriminate you personally, you could assert your privilege against self-incrimination (as you could any other applicable testimonial privilege: doctor-patient/attorney-client/priest-penitent, etc.) to refuse to testify. There is, however, no right to refuse to testify because your testimony would incriminate some other person. (If you think about it, any such exemption from the duty to testify would bring our criminal justice system to a screeching halt and render nugatory the constitution’s “compulsory process” clause.)

              If your testimony in “Bob’s” case involved trade secrets or other proprietary business information, a court would issue a protective order, or fashion some other appropriate remedy, to prevent the information’s dissemination. But no person has the right to refuse to testify, or to produce physical evidence, on the basis that it would be inconvenient, embarrassing, economically detrimental, or otherwise against his or her self-interest.

              • Posted February 22, 2016 at 12:53 pm | Permalink

                But no person has the right to refuse to testify, or to produce physical evidence, on the basis that it would be inconvenient, embarrassing, economically detrimental, or otherwise against his or her self-interest.

                Apple isn’t being asked to testify nor to produce physical evidence.

                Apple is being asked to create de novo something which does not already exist.

                Might as well have the courts order a maker of plowshares to craft the executioner’s sword.

                b&


              • GBJames
                Posted February 22, 2016 at 1:49 pm | Permalink

                Kind of like making the victim dig his own grave before being shot.

              • Posted February 22, 2016 at 2:14 pm | Permalink

                It’s pretty clear that Ken’s understanding of the law is at least centuries out of date. He thinks we still have indentured servitude and debtor’s prisons, and that courts will uphold civil contracts even if the terms are unconscionable.

                b&


              • Ken Kukec
                Posted February 22, 2016 at 1:57 pm | Permalink

                I understand but was responding to the specific question posed by eric. I have responded to your concerns above, here.

              • Ken Kukec
                Posted February 22, 2016 at 7:46 pm | Permalink

                Ben – I responded further to these assertions of yours here.

                You really might want to consider abandoning this
                “involuntary servitude” stuff. It’s a thoroughly cashed, discredited argument — indeed, one held in special disrepute owing to its execrable use by racist hoteliers, restaurateurs, and others offering public accommodation in their misguided efforts to resist desegregation. It was rejected out of hand then; it will earn you nothing but derision today.

  59. Posted February 20, 2016 at 8:38 pm | Permalink

    Sorry if this was posted already, but EFF have a very good overview here:

    https://www.eff.org/deeplinks/2016/02/technical-perspective-apple-iphone-case

  60. Adam M.
    Posted February 20, 2016 at 9:31 pm | Permalink

    Although I’m not clear about the All Writs Act, the FBI is within its rights to obtain a warrant and Apple should comply with it. This case is no different in principle from many uncontroversial cases from the past. Other companies have had to hack their own users. Other companies have had to write software to do so. Etc. See Hushmail for an example. I agree that the FBI also wants to use this case to further their argument against encryption generally, but that is legally irrelevant. They have sufficient evidence of a crime and sufficient probable cause to believe the phone contains useful information to justify obtaining a search warrant.

    For a long time, Apple quietly assisted the FBI in unlocking phones (and in cases of petty crimes, too), only recently ending cooperation with them, using the excuse that it would ‘damage their brand’. And that’s what this is really about. Apple wants their products to be perceived as secure. If they hack one of their own phones in a very public way, it may damage their brand.

    But the public’s perception of their phones as secure is an illusion. Their phones are not secure. There are no secure smartphones, no secure email services, and no secure cloud storage services. All phones have backdoors allowing the government to wiretap them and Apple’s are no different. Apple can push over-the-air software and firmware updates to its phones, and those updates can bypass any security.

    So in this specific case I’m with the FBI, even though I hate their role in general government surveillance, because they have a legitimate case and the request is not particularly burdensome. I also want Apple to “lose” because what they’re trying to protect is the illusion of security on their devices, for the sake of their own profit, and if that illusion is weakened, it would be a good thing.

    If you want security in your data and communications, use trustworthy software and hardware. Apple does not make either.

    • Posted February 21, 2016 at 4:36 am | Permalink

      Surely, the fact that the FBI has made this request indicates that the iPhone is, in respect of stored data, secure. Else they’d have just got the NSA’s cryptanalysts to do the job for them.

      /@

      • Adam M.
        Posted February 21, 2016 at 7:18 am | Permalink

        That’s true. The encryption is apparently good. But the government can wiretap them remotely, and Apple can install backdoors on them remotely at the government’s behest. If you only consider non-government, non-Apple attackers then it’s probably fairly secure, but if you include Apple as a potential attacker then the phones aren’t secure, so it’d be unwise to trust Apple or any other corporation to protect your data given that the government could easily twist their arm if they really wanted to.

        I guess not many people (in the US) worry about protecting their communications and data from the government, but if you do then you shouldn’t rely on corporations for security.

        • chigaze
          Posted February 21, 2016 at 10:17 am | Permalink

          You’re overestimating how much Apple can do in this case. They’ve deliberately engineered the phone to be difficult for even them to get into. The only thing they can bypass at this point is the 10-try limit, and that requires a fair amount of work and physical access to the phone. They cannot bypass the 80 ms delay between attempts, which is enforced by the hardware, meaning that if someone used an alphanumeric passcode, it could take years to unlock the phone.
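
          A quick sketch, in Python, of what that 80 ms floor works out to if only the ten-try erase limit is removed; the passcode formats below are illustrative assumptions, not a claim about what was actually set on the phone:

              # Worst-case brute-force time at ~80 ms per attempt (the delay the
              # comment above says is enforced in hardware), with the erase limit gone.
              DELAY_S = 0.080
              SECONDS_PER_YEAR = 60 * 60 * 24 * 365

              def worst_case_seconds(alphabet_size, length):
                  return alphabet_size ** length * DELAY_S

              for label, alphabet, length in [
                  ("4-digit PIN", 10, 4),
                  ("6-digit PIN", 10, 6),
                  ("6-char lowercase alphanumeric", 36, 6),
                  ("8-char mixed-case alphanumeric", 62, 8),
              ]:
                  s = worst_case_seconds(alphabet, length)
                  print(f"{label}: ~{s / 3600:.1f} hours (~{s / SECONDS_PER_YEAR:.2g} years)")

              # 4-digit PIN:                    ~13 minutes
              # 6-digit PIN:                    ~22 hours
              # 6-char lowercase alphanumeric:  ~5.5 years
              # 8-char mixed-case alphanumeric: ~550,000 years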

        • infiniteimprobabilit
          Posted February 21, 2016 at 6:03 pm | Permalink

          “Apple can install backdoors on them remotely at the government’s behest”

          Damn good reason never to buy an iphone, then.

          cr

  61. Posted February 21, 2016 at 3:49 am | Permalink

    Thank you all for these comments, it’s been educational. I’m still undecided.

    Regarding the climate of skepticism, a “yuge” responsibility lies with those in the government who have destroyed the trust of the citizenry. A problem much bigger than one phone. Shame on all of them.

    Mike

  62. Posted February 21, 2016 at 3:53 am | Permalink

    Apple may be in the right here, as a manufacturer of telecommunications equipment under the Communications Assistance for Law Enforcement Act (CALEA), despite what the FBI says.

    In the end, the government’s snark in its brief that “Apple has attempted to design and market its products to allow technology, rather than the law, to control access to data” is too clever by half because it is the law as Congress wrote it that permitted Apple to deploy secure phone technology in the first place and that precludes the government from requiring Apple to undermine it.

    /@

  63. Macha
    Posted February 21, 2016 at 6:03 am | Permalink

    I may have got the wrong end of the stick completely. However, my understanding is not that Apple is being asked to provide some kind of backdoor to the encryption on the phone, but that it is being asked to bypass the code in iOS which deletes the phone’s data after 10 wrong passcode guesses – thus allowing a brute-force attack to get access to the data.

    I would have thought that Apple could produce a version of iOS which does exactly that and also allows it to run only on that specific phone (keyed to a unique hardware identifier) and not on any other device, or self-destructs after a certain time period, or whatever.

    Irrespective of all the ifs and buts and the (unlikely) notion of Software Engineers getting upset about hacking code, I reckon they’ll end up doing it.

    • chigaze
      Posted February 21, 2016 at 9:53 am | Permalink

      Apple is not saying they cannot do it; they can, for this model of phone. They are contesting that they be required to do it.

  64. Mike
    Posted February 21, 2016 at 6:42 am | Permalink

    In this one case, unlock the phone.

  65. Posted February 21, 2016 at 8:09 am | Permalink

    JCC. “The problem of arguing for an “objective” morality.”
    “Now how on earth can you possibly weigh these different forms of “well being”, even if we could know perfectly all of the consequences of both actions? (And, of course, we can’t.)”

    Well, as far as I can see, everybody seems to be having a damn good try at doing exactly that: weighing these different forms of “well being”.

    A deluge of certainties, yet comments mostly of two polar opposite moral views, which cannot both be right, can they?

    What are Laws and Jurisprudence but attempts by a governing body to impose some objective morality? Without some formal Laws of property rights these arguments wouldn’t exist. To accept the right of property ownership one has tacitly already accepted a societal objective moral rule. The “Laws” are required to achieve some conditions and consistency in their observance, to avoid subjective moral opinions about property ownership for the sake of public “well being”. I own my phone not because I think so; it is so because, in my case, I bought it and hold a valid receipt for it.

    • Jeff Ryan
      Posted February 21, 2016 at 3:03 pm | Permalink

      Yes, and you own your car, too. Try driving it on the public way without registration.

      • Posted February 21, 2016 at 3:35 pm | Permalink

        …and let the police try to search it without neither your permission nor a warrant.

        Of course, if they can look in the window and see evidence of a crime, they’re then clear to pursue further. But, lacking actual evidence, the car is off limits.

        And if you’ve got a combination lock on the door, it’s up to the police to figure out how to open it. You might choose to help them so they don’t get the urge to smash in the windows. But, if you don’t give it to them, it’s up to the police to decide if they want to smash the windows. And if smashing the windows destroys what they thought was evidence? Tough luck; they should have been less ham-fisted.

        b&


        • Jeff Ryan
          Posted February 21, 2016 at 4:20 pm | Permalink

          The warrant is irrelevant here. Farook, were he alive, could object to the lack of a warrant. On the other hand, you, not being Farook, can make no such complaint. It’s not your phone.

          No offense, mate, but your argument reminds me of judges I knew who took perverse pleasure in letting an obviously guilty man go free in order to tweak the state. The FBI must act lawfully, make no mistake. But there are concessions to legitimate government powers that must be made if we are to have a society. The Constitution is not a suicide pact, as others wiser than me have noted. No constitution provides for its own destruction, which, frankly, you seem to be advocating here.

          Remember Sam Adams? Pretty radical guy, right? But what did he say about Shays’ Rebellion? To paraphrase, those who defy the laws of the Republic should be put to death. The Framers never intended to provide get-out-of-jail-free cards. They understood the legitimacy of government, even as they acted to stop its abuses. Many seem to buy into the anti-government tropes about the Constitution, missing the central point: The Constitution was written to expand federal power, not to shrink it. And when it comes to the protection of its citizens, it is a paramount duty.

          • Posted February 21, 2016 at 4:27 pm | Permalink

            But it is my phone. The FBI is demanding that Apple hand over the keys not just to one random schmuck’s phone, but to all phones. Sure, they’re saying they’ll only use it for the one phone, but you’ve got to be insane to trust them that they won’t manufacture every excuse imaginable to use it for “just one more thing.”

            The FBI has the schmuck’s phone. Let them do with the phone what they will. But, if they want to go after anything or anybody else, they’ve got to prove that said anything and / or anybody is actually evidence or a criminal (to whatever applicable standard applies). And, unless the FBI is proposing that Apple is a criminal organization, they’ve got nothing they can demand of them.

            b&


            • Jeff Ryan
              Posted February 21, 2016 at 4:33 pm | Permalink

              Yeah, and it’s YOUR car. Think the government can’t regulate its use? Think again.

              • Posted February 22, 2016 at 9:43 am | Permalink

                The government can regulate the operation of the car. But if they want to search its contents, they need a warrant. And, even then, they only have the right to search, not to find.

                b&


      • Posted February 22, 2016 at 7:53 am | Permalink

        Jeff: But of course I agree that laws, such as car registration, are necessary in any society for that society to work at all. I was arguing that there is objective moral behaviour over and above subjective moral opinions, and also that laws are a form of an objective (not absolute but conditional) moral code. Admittedly we can never, for certain, see what the future holds, but we can and must make empirically reasoned guesses and agree on some form of code of behaviour backed up, and imposed, by laws founded on that code. If a society continually imposes bad (i.e. immoral because consequentially damaging) laws to excess then the probability is that the society eventually fails. It’s Evolution at work. The inexorable elimination of the less fit.

        Ben: See my reply above to Jeff, and I think you meant “with neither…nor..”

  66. Filippo
    Posted February 21, 2016 at 1:16 pm | Permalink

    Someone (among the current 395 comments) may have already asked this:

    If the San Bernardino shootings had happened on the Apple campus, would Apple/Tim Cook still hold their position on the matter? How would surviving Apple employees feel?

    • Posted February 21, 2016 at 10:32 pm | Permalink

      Yeah, I’ve seen this BS argument elsewhere. It’s a simpleminded appeal to emotion. “How would you feel if it were your child?” is a popular question asked of people when discussing the death penalty, for example. And it’s why we don’t let the family of victims sit on the jury.

  67. Stephen Barnard
    Posted February 21, 2016 at 4:56 pm | Permalink

    I think a point that Ben was making is that unbreakable encryption is cheap and easy to acquire. A motivated terrorist or criminal can safely encrypt his or her nefarious plans and contacts, store them on an iPhone or anywhere else, and neither the FBI nor Apple nor anyone else could decrypt them.

    The security that Apple provides isn’t intended for terrorists and criminals. It’s intended for their normal customers, you and me, who might have sensitive and valuable information on their devices — information that, if stolen, could lead to disastrous consequences.

    By opening back doors to this information, the FBI would be putting at risk hundreds of millions of users, while driving terrorists and criminals deeper into unbreakable, private encryption.

    • Jeff Ryan
      Posted February 21, 2016 at 5:03 pm | Permalink

      So it’s not foreseeable to Apple that terrorists and criminals might exploit this? They didn’t have a clue?

      A little like saying, “Gosh, we had no idea that guy who bought that gun might use it for bad behavior! Because we only market to law-abiding citizens.”

      • Stephen Barnard
        Posted February 21, 2016 at 5:12 pm | Permalink

        Then the FBI should go after anyone selling or distributing encryption software, anywhere in the world. How would that work out?

        • Jeff Ryan
          Posted February 21, 2016 at 5:39 pm | Permalink

          No one’s “going after” anyone. But the government would have the right to outlaw distribution of such software, or require that the means of decryption be disclosed to it.

          • Posted February 21, 2016 at 5:44 pm | Permalink

            They’ve already lost that battle. See Diana’s comments about PGP.

            /@

          • Diana MacPherson
            Posted February 21, 2016 at 5:44 pm | Permalink

            I think what Stephen was attempting to convey is that it isn’t “software”. It’s encryption. Encryption is required for the majority of transactions we conduct in the modern world: driving our cars, using a bank machine, paying your tuition to your university, buying something at a point of sale, moving money between institutions (EFTs), voting, filing taxes, buying a coffee, recording the location of your hike; I could go on and on.

            It’s much bigger than a piece of software. We need this encryption to be unbreakable or we won’t be able to rely on it. The modern economy is as advanced as it is partly because we have encryption.

          • Stephen Barnard
            Posted February 21, 2016 at 6:06 pm | Permalink

            To do that effectively — “outlaw distribution of such software, or require that the means of decryption be disclosed to it” — the government would have to shut down the Internet and all other means of digital communication.

          • Stephen Barnard
            Posted February 21, 2016 at 6:09 pm | Permalink

            The government would also (by the way) have to gather, classify, and censor all publications (now in the public domain) describing the various methods of secure encryption.

      • eric
        Posted February 21, 2016 at 6:49 pm | Permalink

        Well, if you want to get into foresight, then how about this one: the murderers had multiple phones. They wiped the others, but not this one. So isn’t it foreseeable that there isn’t jack sh*t on this phone, and that any court weighing the potential benefit of unlocking it against the potential risks incurred by millions of other users should reasonably, foreseeably see that the latter is far bigger than the former?

  68. Kevin
    Posted February 22, 2016 at 9:04 am | Permalink

    If Apple has to make modifications to their technology for the FBI then gun companies should also make modifications to their technologies so that:

    1. The whereabouts of the gun are always available to the FBI (just a like a phone).

    2. Disarming the weapon can be done remotely by the FBI (just like a phone can be)

    I endorse these conditions, and, if met, then Apple should unlock their phone(s) to the FBI.

  69. Posted February 22, 2016 at 12:40 pm | Permalink

    In my view the exact parallel is to the manufacturer of a physical security system being asked to build something to compromise it. For example, to build another key to open a lock that someone else owns and uses. I’m not much of a fan of corporations, but it seems to me that there would have to be much more to the “required build” thing than “we said so and got an ordinary warrant”. Perhaps under extreme national security situations (e.g., nationalization of a chemical company to produce ammunition or explosives during an actual declared war) but …

  70. Chris G
    Posted February 22, 2016 at 2:37 pm | Permalink

    I don’t agree with Apple’s position, and I’m bemused by many of the arguments in support of it.
    As Sam Harris says in his recent podcast: “People are imagining they have a right which in fact never existed before the invention of the smart phone and could never exist in the real world”. “Apple is trading on a kind of paranoia here, that many people feel after Snowden.”
    We accept search warrants granting access to other forms of information and possessions, authorised by robust legal processes, so why should smart phones be treated differently? Or are supporters of Apple’s argument suggesting custodians of emails, bank transactions, photographs etc. should now follow this precedent and also create systems that deliberately lock the data without a key?
    As Harris concludes: “Apple is intentionally building a zone of privacy that is currently unthinkable in the real world, and they’re declaring that we’ll all suffer mightily if they build any key to the lock they’ve put on the door – they’re saying they can’t possibly keep the key to themselves.”
    I wish not to live in a society where people can knowingly hide incriminating data. Nor do I want to live in a society where people wish for that ‘privilege’.
    Not all requests to access such information will concern only prosecution cases – access to smart phone data could be essential to a person’s defence too e.g. proof of an alibi.
    Paranoia of tyrannical governments aside, if you don’t have anything to hide, why worry?

    • Posted February 22, 2016 at 2:45 pm | Permalink

      Any good lawyer will tell you, or should, that when the policeman asks to see inside your trunk the prudent answer is no. I am not presumed guilty. If you want to see what I have, state what you want to find in front of a judge. If you tell me with a warrant you are looking for heroin in my trunk and find marijuana instead you can take it away from me but not prosecute. If I didn’t know it was in there and volunteer to open the trunk then I can be arrested and charged. No thanks.

      • Chris G
        Posted February 22, 2016 at 2:49 pm | Permalink

        So we’re in agreement here: search only permitted when authorised by robust legal process.

        • Posted February 22, 2016 at 2:54 pm | Permalink

          But, for the umpteenth time, what the FBI is doing here is not performing a search. They are, instead, commandeering private citizens to make a universal safe-cracker. The FBI is welcome to search provided they follow the legal requirements for a warrant and the like, but the law guarantees them no right to find.

          And how could the law guarantee them the right to find? Until you find, you have no actual certainty that what you’re looking for actually exists. If they had a guarantee to find, they’d have to manufacture whatever they wanted to find in order to find it.

          Apple is free to help them search if Apple thinks it’d be a good idea for them to do so. But Apple also has a vital role to play in the protection of the safety of our society, and that’s to stand up against government overreach — and it’s a very clear overreach for the government to demand that Apple do the government’s dirty work.

          b&


          • Chris G
            Posted February 22, 2016 at 3:27 pm | Permalink

            I don’t think every search should necessarily have to specify what they hope to find.
            If a body’s discovered buried in a back-garden, the rest of the garden, the garden-sheds, the car parked out front, and the house are typically turned upside-down searching for anything that may be of help to the murder investigation – without having to anticipate what may be found.
            If that then also leads to smartphones being accessed, and computers at work-addresses also being searched (with the correct legal authority), where’s the problem?
            If during these searches personal information/photos/videos are discovered on the work PC, read/viewed, analysed, and dismissed, where’s the harm?
            I don’t think any technology should be built the way Apple smartphones have been.
            Ben – do you think all electronic systems should now follow Apple’s approach, e.g. email systems encrypted to be locks without keys? So that the custodians can state ‘We can’t access the data, and we refuse to help’?

            • Posted February 22, 2016 at 3:46 pm | Permalink

              Chris, police very often are required to get a second warrant to pursue a different investigation if unrelated evidence of a different crime shows up while executing some other warrant. If you discover a body buried in the garden, you better have had a warrant permitting you to dig in the first place. If the warrant was to dig the south lot looking for drugs and you find a body, you can use the body as evidence in a new murder investigation, but you now need a new warrant to dig the north lot to look for more bodies. And, if you’re at all clueful, you’ll immediately stop digging the south lot for drugs and call the judge to get a new warrant to dig the south lot (and every other lot on the property) for both drugs and bodies.

              Ben – do you think all electronic systems should now follow Apple’s approach, e.g. email systems encrypted to be locks without keys? So that the custodians can state ‘We can’t access the data, and we refuse to help’?

              Such is exactly how encryption has worked, quite literally, since the days of the Roman Empire. What’s so special about computers that we need to play King Canute with millennia of advancements in mathematics?

              If you don’t understand that this is so, you’re profoundly ignorant of the technical aspects of the case — quite literally at the level of asking why you have to bother stopping every few hundred miles to put more gas in the tank. After all, why can’t Ford make a car that never needs to be refueled? It’s just a technical limitation — let the engineers worry about it.

              b&

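              A minimal sketch of the generic ‘lock without a key’ design under discussion, using the third-party Python cryptography package; this is not Apple’s actual implementation (Apple’s scheme also entangles keys with the device hardware), just an illustration of a system in which the key is derived from the user’s passphrase, so the custodian holds nothing it could later be ordered to surrender:

                  import base64, os
                  from cryptography.fernet import Fernet
                  from cryptography.hazmat.primitives import hashes
                  from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

                  def key_from_passphrase(passphrase: str, salt: bytes) -> bytes:
                      # Stretch the passphrase into a 32-byte key; only the user knows the passphrase.
                      kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32,
                                       salt=salt, iterations=600_000)
                      return base64.urlsafe_b64encode(kdf.derive(passphrase.encode()))

                  salt = os.urandom(16)  # stored alongside the ciphertext; not secret
                  key = key_from_passphrase("correct horse battery staple", salt)

                  ciphertext = Fernet(key).encrypt(b"the data on the device")
                  # Knowing the salt and the ciphertext alone gets a custodian nowhere;
                  # without the passphrase there is no key to hand over.
                  print(Fernet(key).decrypt(ciphertext))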

              • Chris G
                Posted February 22, 2016 at 4:04 pm | Permalink

                Ben, with regard to the first point (searches after finding the body), we clearly agree that the correct legal processes must be followed and that police should justify the further actions they request to take. But why should they have to specify what they hope to find?
                On the second point, again I think you may have misunderstood me. I believe no systems/technology should be built such that it becomes a black-hole where nothing can escape.
                To repeat the quotes from Sam Harris: “People are imagining they have a right which in fact never existed before the invention of the smart phone and could never exist in the real world.” “Apple is intentionally building a zone of privacy that is currently unthinkable in the real world, and they’re declaring that we’ll all suffer mightily if they build any key to the lock they’ve put on the door – they’re saying they can’t possibly keep the key to themselves.”
                Are you arguing that all digital systems should be built with the ‘no key’ encryption found in Apple smartphones?

              • Posted February 22, 2016 at 4:14 pm | Permalink

                But why should they have to specify what they hope to find?

                Because, simply, “no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.”

                I believe no systems/technology should be built such that it becomes a black-hole where nothing can escape.

                That’s nice. I want a pony, too. A flying unicorn pony.

                http://trumpetpower.com/Papers/Crypto/OTP

                Now can we please stop pretending that guaranteed-crackable encryption is even a coherent concept in the first place?

                Are you arguing that all digital systems should be built with the ‘no key’ encryption found in Apple smartphones?

                Are you arguing that we should make coins and pencils and paper pads illegal? Because that’s all the technology you need to build a system such that it becomes a black-hole where nothing can escape.

                Because, if you are, better outlaw playing cards, dice, pieces of string, pebbles, lava lamps, campfires….

                b&

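                A minimal one-time-pad sketch in Python, standing in for the coin-and-paper-pad scheme described above; with a truly random pad as long as the message, used only once, the ciphertext is information-theoretically unbreakable, and there is no vendor key anyone could be ordered to produce:

                    import secrets

                    def otp_encrypt(message: bytes):
                        pad = secrets.token_bytes(len(message))              # the coin flips
                        ciphertext = bytes(m ^ p for m, p in zip(message, pad))
                        return pad, ciphertext

                    def otp_decrypt(pad: bytes, ciphertext: bytes) -> bytes:
                        return bytes(c ^ p for c, p in zip(ciphertext, pad))

                    pad, ct = otp_encrypt(b"meet at the usual place")
                    assert otp_decrypt(pad, ct) == b"meet at the usual place"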

            • Jeff Ryan
              Posted February 22, 2016 at 3:51 pm | Permalink

              Warrants do have to be specific, particularly with regard to locations to be searched. Though one could fill many, many rooms with Fourth Amendment case law. There are exceptions. There are also rules: There have been cases where the police get a warrant for a home. They find nothing, then notice a shed out back. Courts have held they would need a separate warrant for the shed.

              In a case like this, where there are no objections to the search itself, it’s a moot point. However, were it not moot, the government likely would put in the affidavit what the practices of other criminals/terrorists are with respect to what they keep in their phones. They don’t have to promise the court they KNOW what will be found. Just the claim that they are likely to find evidence relating to the shootings somewhere in the phone.

        • Posted February 22, 2016 at 3:25 pm | Permalink

          I’d agree that there needs to be strong probable cause, along with the other factors that usually apply to a search.

          My objection and response above was more to the “if you have nothing to hide” comment. Even if I don’t, it is far better to let them get a warrant stating specifically what they want to find. You aren’t allowed to come into my home or look in the locked trunk unless you have some idea what you are looking for.

          The Apple case is different in two key aspects from my point of view.

          One isn’t particularly germane. The government is “fishing”. They have no idea what they are looking for except that it may be evidence of a terror plot. I will presume that they are allowed to do this only because the owner isn’t the suspect; it’s the business (the county), which can consent to the search.

          What is more important is that I’m really stuck trying to think of an example where a citizen (and corporations are people too these days) has ever been coerced into aiding a search of something they have no security interest in (and by that I mean no ownership rights). And if they have been, I object to the idea of it.

          • Chris G
            Posted February 22, 2016 at 3:50 pm | Permalink

            You say: “The government is “fishing”. They have no idea what they are looking for except that it may be evidence of a terror plot.”
            Are you serious?
            The police have possession of the phone used by a person who murdered many civilians, and you trivialise potentially important information on that phone? Or are you saying we shouldn’t assume it could possibly be important information?
            If Apple already had a ‘key’ to access that phone i.e. no debate over ‘forcing’ Apple to help, do you still think we’d have no good reason to view every bit of data on that phone?

            • Posted February 22, 2016 at 3:59 pm | Permalink

              …but not only does Apple not already have a key, the perpetual nonexistence of the key is essential to the security of iPhones everywhere.

              b&


              • Chris G
                Posted February 22, 2016 at 4:33 pm | Permalink

                I disagree. Allowing Apple smartphones to become black-holes is not essential to anybody. There is a balance to be struck, where Apple hold a ‘spare key’, just like every other data-system and all possessions, so that the right authorities are able to access with the right authorisation.

              • Posted February 22, 2016 at 4:38 pm | Permalink

                So, write up your technological proposal to prevent me from turning a coin, a pad of paper, and a pencil into exactly the kind of cryptographic system you’re so terrified of.

                Tell me, in whatever level of detail you like, how you would prevent me from doing what you fear, using no more than what we send every young child to school with.

                Or, you could continue to demonstrate your complete ignorance of the subject by insisting that π = 3 if the legislature passes a law to that effect.

                Cheers,

                b&


            • Posted February 22, 2016 at 4:09 pm | Permalink

              1. Yes I am serious.
              2. Trivialize is a strong word. But I do think there is only a trivial expectation of finding anything on that phone. And even if not trivial, I’m fine with their not being able to search it. Remember that the terrorists crushed their phones beyond recognition and the PC hard drive still hasn’t been found. Yet they forgot to do anything with this phone? Not particularly likely although criminals can be dumb.
              3. Yes. If my next door neighbor has a key to my house and the police want it, I don’t think the police should be able to force the neighbor to hand it over.

              My biggest objection to all this is that it is primarily driven by us being scared of our own shadows, I mean terrorism. We take all sorts of actions designed to limit our freedoms just because we are so stupidly scared. And as I mentioned (way) above, toddlers kill either themselves or someone else about twice a week in this country, far in excess of any terrorist threat and that doesn’t seem to bother anybody.

              • Chris G
                Posted February 22, 2016 at 4:24 pm | Permalink

                Your neighbours wouldn’t have to hand over the key to your house: your house wasn’t deliberately built so that a key is the only possible way in. The police can kick the door down if they have good reason and the correct legal backing – and rightly so.
                As for trivialising the potential usefulness of the information on the locked phone, you didn’t answer my question: if Apple already had a ‘key’ to access the phone, do you still think we’d have no good reason to view every bit of data on it?

              • Jeff Ryan
                Posted February 22, 2016 at 5:36 pm | Permalink

                I don’t think those two had a brain between them. I don’t think that was their original target. And having done what they did, why would they go home? I also think they may have not had time to trash the phone that the FBI has. Because, face it, the way these things are usually planned, you are going down in a hail of bullets or an explosion at the target scene, or you have a plan to escape. Neither of those things happened. I have no idea what’s on the phone, but I would think the FBI crazy if they didn’t search it. Grossly negligent, in fact. They have to identify any fellow “jihadis” or targets. And one seemingly innocuous item, a picture, say, might lead to important info.

              • Posted February 22, 2016 at 10:04 pm | Permalink

                And Chris, I did answer your question. “3. Yes”

      • Jeff Ryan
        Posted February 22, 2016 at 2:55 pm | Permalink

        Actually, part of what you said is wrong. If the police have a search warrant for heroin, and they find marijuana (assuming you don’t live in a state where it’s legal), they absolutely can arrest you and charge you with possession and/or possession with intent to distribute (if it’s a lot of pot, or a lot of individually wrapped packages of it).

        They are allowed to search anywhere in the described location where heroin could be. If they find pot, they are hardly required to ignore it or just “take it away from” you. Oh no, you’ll be charged all right.

        • Posted February 22, 2016 at 3:08 pm | Permalink

          It’s a bit muddier than that.

          If the warrant to look for heroin doesn’t include searching your computer, they can’t search your computer to discover kiddie pr0n. But if they see kiddie pr0n on your computer screen, they can absolutely call the judge on the spot to get a warrant to seize the computer for further investigation. And if the original warrant included searching the computer for the financial records of the heroin business you were running and they come across kiddie pr0n while searching the computer, that’s now fair game.

          But it’s still up to the police to figure out how to access the computer. They can ask you for your password, but they can’t compel it. And if they can’t figure out how to get into the computer without the password, that’s tough luck for the cops.

          Cheers,

          b&

          • Jeff Ryan
            Posted February 22, 2016 at 3:40 pm | Permalink

            Well, I was merely correcting a legal misapprehension. I wasn’t talking about a computer search.

        • Posted February 22, 2016 at 3:42 pm | Permalink

          It’s outside my field so I won’t say you are wrong but maybe you can clarify how that would work in light of the 4A: “….and no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.” Now it seems to me that based on my example which you replied to, the warrant is for heroin. If the place described is my trunk, and the things to be seized is heroin, how does finding marijuana in my trunk not fall under the exclusionary rules of evidence? If the warrant says “illegal drugs”, then I agree I’m screwed.

          • Jeff Ryan
            Posted February 22, 2016 at 3:59 pm | Permalink

            Because the police are seeing it from a place they have a right to be. It’s in plain view. They have a warrant that will allow them to search everywhere in the trunk (using your example) the heroin could be. (Which is just about anywhere.) So in executing the warrant lawfully, they discover marijuana in a place they have a right to search for heroin. Because they are lawfully searching the trunk, any contraband found in the course of the search can be seized, and the owner charged.

            Now if they were looking for, say, a guitar, and they look in the glove box, they will probably be screwed, because they should know a guitar isn’t going to fit in a glove box, and now they’ve gone exploring.

            • Posted February 22, 2016 at 4:08 pm | Permalink

              …and it’s also worth noting that the Constitution prohibits overly broad warrants. The police might be tempted to ask for a warrant that says, “We want to search all of so-and-so’s possessions for evidence of any crime whatsoever,” but that wouldn’t hold up on appeal. So they have to be aware of the limits…search this one address, maybe even only one room, and only for illicit narcotics. See something else, don’t touch; get a new warrant.

              About the only exceptions are going to be imminent danger. Search for drugs, hear a child whimpering in another room…you don’t need a warrant to go comfort the child, and if there’s evidence the child is being abused that evidence will hold up because you discovered it as part of protecting the child from imminent danger. But if the child shows you a picture of the narcotics lab across town you’ve been looking for for years, you’ve got to get a new warrant to search the lab….

              b&


              • Jeff Ryan
                Posted February 22, 2016 at 5:29 pm | Permalink

                Well, that’s not the fact pattern I was working under. But a warrant for, say, a gun might limit a search. However, a warrant for “a gun, ammunition, proof of purchase of a gun, credit card and bank records which would corroborate purchase of a gun, etc.” will probably hold up and allow a broader search. Or say you’re looking for a stolen car. You wouldn’t expect to find that in the hall closet. You might, though, find the plates for the car there. So the warrant will be crafted to cover the car, its registration, tires, etc.

                And you certainly have to specify what criminal activity you’re investigating. But a drug warrant will usually not just mention drugs. It will also authorize a search for drug paraphernalia, scales, baggies, financial records, bank records…you get the gist.

                In your example, the police have a legitimate concern for child welfare, you go to find the child, and she’s sitting in a meth lab… (And some cops will, to be safe, apply for a new warrant with a different object of the search. And they can keep the homeowner out of the home while they’re getting it.)

              • Posted February 22, 2016 at 10:27 pm | Permalink

                So now Jeff has confused me. When I suggested that marijuana found under a warrant to search for heroin would fall under the exclusionary rule, Jeff said no. Now, Jeff, you are saying that a warrant for a gun might limit a search, so I’ve lost the distinction.

                I believe you also mentioned that there are roomfuls of case law on searches, so maybe this all depends, and it’s too much to understand in this limited forum.

              • Jeff Ryan
                Posted February 22, 2016 at 11:02 pm | Permalink

                I only mentioned a gun, because there are places/containers that would be too small to hold a gun, and thus not properly searchable. But it was a lousy analogy, so I apologize – I can see why any sane person would wonder just what my point was.

                More to the point, the “rooms full of case law” is not an exaggeration. Basically, searches get challenged all the time via motions to suppress, because if you can knock out the evidence then the state pretty much has to dismiss. (Or appeal. The state can appeal a court’s granting of a defendant’s motion to suppress evidence. Jeopardy hasn’t attached, so double jeopardy isn’t implicated, and the state can appeal.) And given the infinite universe of possible fact patterns, courts have addressed motions to suppress over and over again because applying a “bright line” rule is difficult. Every case is distinguishable, and it’s hard to generalize. Two cases may seem almost identical, yet get two different results.

              • Posted February 22, 2016 at 11:58 pm | Permalink

                Fair enough. Thanks for the insight on the matter. Much appreciated.

              • Jeff Ryan
                Posted February 23, 2016 at 12:06 am | Permalink

                I don’t know how much insight I provided. Rereading my post (before posting it!) I had trouble making sense of it myself.

                Suffice to say, when you ask a lawyer a question, the only safe answer he can give is “It depends.”

                Try dealing with that every day!

    • Posted February 22, 2016 at 2:49 pm | Permalink

      Paranoia of tyrannical governments aside, if you don’t have anything to hide, why worry?

      So…you’d be happy with us installing a webcam in your bedroom or over your toilet? In your children’s shower? Stapling a list of your medical prescriptions to your front door? Giving your bank account details, including security questions and answers, to the homeless guy on the street corner? Scanning and posting to the Web those letters you got from your high school sweetheart, including the one where she dumped you for certain explicit reasons?

      We all have lots of things to hide, and for some painfully legitimate reasons. And what I do and don’t choose to hide and from whom is nobody’s business but my own.

      The government has a very limited right to attempt to violate my privacy. But, even then, they can’t force me to testify against myself — and, let’s be perfectly clear, the Fifth Amendment is the only necessary answer to this bullshit of “if you don’t have anything to hide.”

      …and this is, once again, before we get to the whole matter of the police demanding that the maker of a safe craft a heretofore-nonexistent master key to all safes, a key whose very existence is anathema to the security of the system, while we simply trust the police not to abuse it.

      b&

      • Chris G
        Posted February 22, 2016 at 3:07 pm | Permalink

        Hey Ben. If the police convince a court of law that they have good reason to put me under surveillance, I have no argument against them installing video/listening devices in my home.
        We shouldn’t underestimate the security this provides for all of us.
        As for your rather bizarre extension of this, e.g. someone giving my bank details to a homeless person, how could that ever be warranted?
        As for personal information being accessed as part of an investigation, why assume it would then necessarily be made public?
        We do not all have incriminating things to hide – for those who do, don’t we want them to know it could be accessed?
        Potential abuse of police powers is not a good enough reason to be able to permanently lock information away.
        And yes, I think all safes, literal and metaphorical, should be accessible. Not accessible to you or me, but to the right authorities regulated by robust legal processes that we must ultimately trust. Even if that means occasional misuse of that power.

        • Posted February 22, 2016 at 3:13 pm | Permalink

          Your analogy again breaks down.

          The police can get a warrant to surreptitiously plant cameras and what-not on you. But they cannot require you to install a camera for them, and they cannot object to you discovering the camera and disposing of it. Nor can they mandate that you must remove the blinds from your windows to make it easier for them to use cheap cameras to spy on you, or prohibit the use of deadbolts and steel doors because they’re too hard for them to bust through.

          And, historically, it’s never an occasional abuse of power. Indeed, you’ve got it exactly bass-ackwards: we should be tolerating the occasional petty crimes as the price to pay for freedom from an intrusive government, rather than tolerate a totalitarian police state as the price to pay for safety from petty crime.

          b&

          • Chris G
            Posted February 22, 2016 at 4:28 pm | Permalink

            Ben, do you have any stats/data to back up your claim “it’s never an occasional abuse of power”?
            With regard to your claim that your ‘totalitarian police state’ only saves you from petty crime, do you have any idea of the number of major threats it thwarts through surveillance and monitoring?

            • Posted February 22, 2016 at 4:36 pm | Permalink

              I know the number of terrorist plots the TSA has thwarted: zero. And the entire security apparatus was utterly useless in stopping the attacks of September 11, 2001, despite members of the public telling the FBI about their suspicions of the individuals responsible.

              So it’s damned clear that my pet rock is doing just about as much to save me from tiger attacks as the security apparatus is doing to save me from terrorist attacks.

              And, really. How, exactly, do you plan on preventing, say, a disgruntled farmer from loading his pickup with fertilizer converted into explosives and blowing it up on a major bridge in the middle of rush hour? Terror is trivial; all you need is the intention and a lack of complete stupidity and incompetence. What saves us from the terrorists is that there actually aren’t any. At least, what few there are are less of a threat than slippery bathtubs — and how many of us freak out over a bit of shampoo dripping in the shower?

              Cheers,

              b&

              • GBJames
                Posted February 22, 2016 at 4:39 pm | Permalink

                “I know the number of terrorist plots the TSA has thwarted: zero.”

                Now how could you possibly know that?

              • Posted February 22, 2016 at 4:48 pm | Permalink

                Because there’ve been a number of terrorist plots thwarted, but none by the TSA; because every investigator, whether from inside or outside the agency, who attempts to smuggle contraband has no trouble getting lots of really scary stuff past the TSA; and because the TSA itself has never told us of such a success.

                Can you really imagine a bureaucrat of such an agency failing to justify all the billions of dollars otherwise wasted?

                b&

              • Jeff Ryan
                Posted February 22, 2016 at 5:46 pm | Permalink

                They let Coventry be bombed rather than reveal the cracking of the Enigma machine.

                So, yeah, I have no problem believing they have had successes we don’t know about.

              • Posted February 22, 2016 at 5:54 pm | Permalink

                So what super secret were they protecting such that they let the 9/11 hijackers kill thousands and cause billions in damages?

                Sorry, but, again, they have crowed from the rooftops when they’ve stopped attacks. Made sure it was front-page news. That the TSA isn’t even competent enough to manufacture a story of how they stopped somebody from trying to smuggle an oversized jar of real (not sham) poo onto the plane tells us how pathetic they really are.

                b&

              • GBJames
                Posted February 22, 2016 at 5:02 pm | Permalink

                You can only possibly know that if you assume that TSA successes would be made public. You can not know that, despite your suspicions about how bureaucrats behave.

              • Posted February 22, 2016 at 5:18 pm | Permalink

                It’s a slam dunk that the TSA would brag about such successes to at least the same extent as all law enforcement agencies that have had such successes have. Plus…there have been terrorists on planes that the TSA didn’t catch. Remember the shoe and underwear bombers? What’s the TSA’s excuse for failing to catch them? And basically everybody who’s flown has a story about how they forgot to unpack a pocket knife or some scissors or some other bit of contraband and it’s gone completely unnoticed — and plenty have the same stories to tell of guns and novelty fake grenades and the like. Not to mention all the investigative journalists and internal reviews that sneak all sorts of stuff through without trouble.

                b&

              • Jeff Ryan
                Posted February 22, 2016 at 5:53 pm | Permalink

                The shoe bomber, Richard Reid, boarded his flight in Paris. TSA wasn’t involved.

                The underwear bomber boarded his flight in Amsterdam. Again, no TSA.

              • Jeff Ryan
                Posted February 22, 2016 at 5:41 pm | Permalink

                Actually, the president himself was warned. And had airport security been beefed up, some rules changed, it might not have happened.

                It’s not like they weren’t warned. They were. And told it would likely involve commercial airliners.

              • Posted February 22, 2016 at 5:47 pm | Permalink

                …and the good it did to stop the attack?

                And if even the full security apparatus personally warning the President isn’t enough to stop such an attack, why are we even pretending that TSA gate rape is effective?

                b&

              • Jeff Ryan
                Posted February 22, 2016 at 5:57 pm | Permalink

                If you warn the president, and he does nothing (after being warned for months through different channels), that is not a very good argument that “nothing could have been done”. Something most definitely could have been done. The problem isn’t that “nothing could have been done”. The problem is that nothing WAS done.

              • Posted February 22, 2016 at 6:17 pm | Permalink

                You’re making my argument for me.

                We have all this gee-whiz security apparatus that was demonstrably useless for its stated purpose. We know further, thanks to Snowden and others, that it has been used to recklessly violate the trust placed in the government.

                So why is the argument that we should place even greater trust in the government’s insecurity apparatus, especially in a case where they’re continuing to demonstrate incompetence?

                Again, are we really to believe that this phone is the one-and-only way the FBI has to determine Farook’s associates or whatever it is we’re supposed to trust them is all they’re going to use this super ultimate master key to unlock?

                Please. Pull the other finger.

                b&

              • Jeff Ryan
                Posted February 22, 2016 at 5:47 pm | Permalink

                Terror is not trivial. Because it works.

              • Posted February 22, 2016 at 6:21 pm | Permalink

                It only works because we have forgotten that the only thing we have to fear is fear itself.

                Once upon a time, Americans had the courage of their convictions and embraced significant hardship and danger to achieve greatness. Today, merely hint at the threat of a perceived inconvenience, and in response we shit ourselves trying to hush people from even thinking thoughts the sensitivity police deem insufficiently embracing.

                What the hell happened?

                b&

      • Jeff Ryan
        Posted February 22, 2016 at 11:15 pm | Permalink

        You are absolutely right that the “If you don’t have anything to hide” line is beside the point. It alone can never justify a search. You can’t be penalized for exercising a constitutional right. After all, it’s not much of a right if they hold it against you.

        I’m not a tech wiz, so I’ll stick to what I am more conversant in. But I don’t think you understand how searches like this can work. I knew one investigation where the police were allowed to put a video camera in a suspect’s home. Now, this is EXTREMELY hard to get a court to agree to. You gotta have beaucoup evidence a crime will be committed to justify it. And if you get such permission, it has to be monitored 24/7 in real time. And if the target is just BSing with his buds, or having a heart to heart with his wife, you are required to stop listening and watching. And all of this has to be reviewed by the issuing court. I worked with wiretaps as a state prosecutor, and only the Chief Judge could authorize it, and I had to bring everything we got to that judge and show that the order was justified, or that there was a damned good reason to allow the wiretap to continue.

        So, without having read all the proceedings, I expect that the judge in the Apple case will keep a tight rein on the FBI. And that whatever might be developed will be subject to a protective order. And likely sealed by the court. And if you think it’s easy to access a sealed federal court file, you’re very much mistaken.

        • Posted February 23, 2016 at 10:43 am | Permalink

          I might not be conversant with the procedures courts propose, but you’re clearly not conversant with the technology itself. A signed copy of the OS that bypasses fundamental security…that’s something that can trivially crack basically any iOS device ever made. And it’d be trivial to copy the hacked operating system off the device itself because, again, the system security has been bypassed.
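
          To put rough numbers on that (a back-of-the-envelope sketch, not a measurement; the 80 ms per guess is an assumed figure for the phone’s key-derivation time, not something I’ve benchmarked), here’s why stripping out the retry delays and the ten-try erase turns a numeric passcode into a speed bump:

            # Rough brute-force estimate once the retry limits are gone.
            # SECONDS_PER_GUESS is an assumption, not a measured value.
            SECONDS_PER_GUESS = 0.08

            def worst_case_minutes(digits):
                guesses = 10 ** digits  # e.g. 10,000 possible four-digit passcodes
                return guesses * SECONDS_PER_GUESS / 60

            for d in (4, 6):
                print(f"{d}-digit passcode: about {worst_case_minutes(d):,.0f} minutes, worst case")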

          Everybody here is admitting that the court doesn’t fully understand the technology involved. What on Earth makes anybody think the court is therefore competent enough to guarantee that the FBI isn’t going to abuse the court’s incompetence? I daresay that there’re half a dozen regular readers here who, if they knew ahead of time that they’d have the kind of access that the FBI is demanding, could put together a successful theft of Apple’s crown jewels. (And “put together” would involve enlisting the assistance of associates…I couldn’t do it by myself, but I know who to approach.)

          …and, again, this is the same FBI whose internal investigations reveal not a single agent in its history has misused a gun, the same FBI that hounded MLK, the same FBI that ran COINTELPRO…they have repeatedly demonstrated that they are not trustworthy. Yet everybody is falling over themselves to trust them?

          b&

          • Jeff Ryan
            Posted February 23, 2016 at 10:59 am | Permalink

            I’m not a big fan of the FBI myself, but some things they do very well. And there are many agencies that have stepped out of bounds in the past. We don’t just toss them aside. Your brush is far too wide.

            As for the courts, while they take a long time with new technology, they usually end up getting it right. That was the case with DNA, certainly. A possible remedy would be to do what a court handling breast implant litigation did: Appoint its own panel of experts. They did a damned good job, and now that frivolous litigation has pretty much disappeared.

            But a word of caution: Do not sit so high on the horse. A lot of Silicon Valley “geniuses” have demonstrated total disdain for the national interest, and a glee at tax evasion Bernie Madoff would envy. To say nothing of an overweening sense of superiority and exclusivity. Is my brush now too broad? Sure. But tech types would do well to step back and consider how they look to others. And whether they really want to suggest that national security is SO 20th Century. If they really want corporations to be “people”, then they should start acting like citizens. (And no, I’m not talking strictly, or even mainly, about Apple.)

            • Posted February 23, 2016 at 11:06 am | Permalink

              A panel of experts would be an excellent idea, but only if the court were humble enough to trust the experts’ descriptions of the facts. But, again, in this case, you don’t need Ph.D.-level expertise to understand the snow job the FBI is putting forth in this naked grab for power; any decent introductory text, and Schneier’s is excellent, will give you that competence.

              It’s also worth noting that the courts are hardly without sin themselves. Far too many courts have given far too many executive agents far too much power that only tyrants could love. It’s entirely possible that a court could realize fully what it’s doing and still decide to deliver the people unto the FBI.

              b&

              • Jeff Ryan
                Posted February 23, 2016 at 11:26 am | Permalink

                I don’t know the text, so I hesitate to agree or disagree. But understand that for many reasonably bright people (I count myself as one), “introductory texts for lay people” can still be impenetrable.

                I picked up a highly recommended “introduction” to digital audio that was aimed at the lay person. By page 5 (still in the introduction to the book), I was hopelessly lost. And digital audio is extremely important to me.

                Also, I would not toss about accusations of “tyranny” so casually. It is an overworked, and now trivialized word, almost invariably used by people who have no conception of what tyranny actually is. I’ve known Holocaust survivors, people from the former Soviet Union and Yugoslavia, and others who can actually tell you what tyranny is. And native-born Americans, most thankfully, have not experienced it in their own country. Worried about governmental overreach? Then live in a country where your phone goes dead because the government has to start a new blank tape to record your conversations. A country where, to go from the equivalent of one county to another, you have to go through a government checkpoint. The FBI, and other agencies, have been guilty of unforgivable transgressions, and certainly bear watching. But tyranny? Not even close.

              • Posted February 23, 2016 at 11:45 am | Permalink

                Yes, it most certainly could be worse. It can always be worse.

                But, make no mistrake: the situation really is dire in the States these days.

                You yourself reference government checkpoints…ever try to board a plane without subjecting yourself to gate rape? Ever heard of VIPR? Ever tried to enter a sports stadium without being searched? Ever driven the highways near Tucson or other “border” cities? Ever think about walking down the street without your papers to show the nice officer with the shiny boots?

                The Stasi would have killed to have had the universal spying facilities that Snowden revealed the NSA deploys as part of their daily routine.

                The FBI has officially certified that no FBI agent has ever misused a firearm in the line of duty, an unbroken streak that continues to this day.

                The CIA has run and almost certainly continues to run torture operations indistinguishable from those we hanged WWII war criminals for.

                We continue to operate overseas prisons different only in scale from the Gulag.

                The President regularly issues orders for summary executions, including of American citizens, and uses the military and secret police to indiscriminately carry out those orders with little regard for so-called “collateral damage.”

                It is true that the American government presents a much kinder and gentler face, and that citizens are allowed a fair amount more slack in the ropes.

                But, make no mistrake: the main difference between the States today and the Soviet Union a generation ago lies in the average standard of living. Stuff we take for granted today “in the name of security” is indistinguishable from the stuff we used to make jokes about the Soviet subjects having to put up with.

                Even simple things, like photographing bridges without being accosted by armed thugs wearing official badges….

                b&

              • Jeff Ryan
                Posted February 23, 2016 at 12:24 pm | Permalink

                Much of what you complain of was instituted in response to 9/11. Do these things work? Some do, others don’t, and so frickin’ what?

                The overwhelming majority (and I mean really overwhelming) don’t face government control or intrusion as a matter of course. (The fact that you face government intrusion regarding, say, taxes, hardly rises to the level of tyranny.) And after the World Trade Center attacks, something was going to be done, and what was done is by no means unreasonable. Maybe not well executed, I’ll grant you, but that doesn’t make the idea unreasonable. Try flying El Al sometime.

                In the former Soviet Union, you had your kids being told to snitch you out. If someone came to visit you from another country, they were assigned a KGB man to follow and observe them. And you had the “knock on the door” where you could be hauled off to Lubyanka without a warrant, or even an explanation why. And you think this country is the equivalent of the Soviet Union? Are you serious? Have you even studied what happened there?

                Look, I don’t mean to be snide, I really don’t, but that is such a false equivalency as to be laughable.

              • Posted February 23, 2016 at 2:50 pm | Permalink

                Much of what you complain of was instituted in response to 9/11. Do these things work? Some do, others don’t, and so frickin’ what?

                I’m sorry, but dismissing such profound erosion of our most sacred foundational principles with, “so frickin’ what?”…well, really, how am I supposed to politely reply to that?

                The NSA has an espionage system vastly more intrusive than any seen before in human history, including those of the most repressive regimes. But so frickin’ what?

                The President regularly orders flying death robots be deployed to kill indiscriminately, including so-called “double-tap” operations, just because he says somebody is a bad guy. But so frickin’ what?

                If I want to travel anywhere, I have to subject myself to sexual assault and public humiliation. But so frickin’ what?

                And, yes. Again, of course. It’s easy to come up with examples of situations, past and present, that are worse, including much worse.

                But so frickin’ what?

                Is the fact that there are Bangladeshi bloggers having their limbs hacked off supposed to make me feel better the next time a TSA agent feels me up?

                Here’s an idea. How about we stop this insane race to be the next-to-worst shithole on the planet and stop giving the government our blessing to do whatever they want simply because it’s not as bad as it could be and, hey, so frickin’ what?

                b&

              • Jeff Ryan
                Posted February 23, 2016 at 3:01 pm | Permalink

                Except for technology, what you describe represents nothing new. What do you think the CIA has been doing since the ’50s? You think the Iranians really WANTED the Shah? Or that the Vietnamese were really in love with Madame Nhu? Where have you been? There is a whole lot that happened before the TSA was created. You should read up on it.

                And if you think the TSA is so “tyrannical”, you really don’t have any case at all. Look around you – not too many folks are upset with TSA, because its actions are vanishingly trite. Maybe you should just drive a lot more.

                Otherwise, you know who you sound like? Ammon Bundy at the Malheur Refuge. And he became a laughingstock.

              • Posted February 23, 2016 at 3:45 pm | Permalink

                Except for technology, what you describe represents nothing new.

                And this makes it good, or otherwise somehow acceptable? Even excusable?

                Plus, you’re falling into another very dangerous trap. “So-and-so was a fruitcake, and was in support of this-and-that. Therefore, this-and-that is a fruitcake position.” Even bin Laden had some very legitimate grievances against the States in the midst of his ranting and raving. The flip side of that would be me observing that you’re calling for even more spying on the general public, the kind of spying the KGB was notorious for, and that we all know what a laughingstock the KGB became. Or, you might conclude that kissing babies is evil because Hitler kissed so many babies.

                Address the ideas, and leave aside the personal characteristics of unrepresentative samples of those who support them.

                b&

            • Diana MacPherson
              Posted February 23, 2016 at 11:55 am | Permalink

              You do know most tech people are just working stiffs like Ben and me, don’t you? Not multi-millionaires like Larry Ellison?

              That’s as bad a non sequitur as calling all scientists immoral for not being more involved in civil rights, as that ranter’s email to Jerry did yesterday.

              • Posted February 23, 2016 at 12:00 pm | Permalink

                …and, for that matter, a judge who can’t make sense of an introductory text on the subject at hand has no business judging that case. Recuse and refer to somebody with more experience and / or inclination.

                b&

              • Stephen Barnard
                Posted February 23, 2016 at 12:12 pm | Permalink

                Apple is asking for an expert commission to examine this. I think that’s a good suggestion. Clearly, this isn’t a time-critical issue. The San Bernardino shootings happened over a month ago.

              • Jeff Ryan
                Posted February 23, 2016 at 12:29 pm | Permalink

                THAT is a reasonable request.

    • Diana MacPherson
      Posted February 22, 2016 at 5:26 pm | Permalink

      Sam, and it seems you as well, misunderstand our objections. He thinks we are arguing that breaking into the phone violates the terrorist’s privacy and that is not at all the case. In fact, Apple has already handed over the data on the phone that was stored on their iCloud servers.

      None of us has a problem with this. I put this sentence in a separate paragraph because I want it to stand out. We don’t have a problem with privacy being violated. We also don’t care that the smartphone’s data is being requested.

      What bothers us is that there is a request for Apple to write a program that will, once written, weaken security for all iPhone users.

      So, amused as you may be by our opposition to this request, you miss the point.

      • Chris G
        Posted February 23, 2016 at 5:34 am | Permalink

        Hey Diana. Firstly, just a minor point, but I said ‘bemused’ not ‘amused’. I think we can all agree this is no laughing matter.
        I’ve gone back and read the comments you’ve posted here, all 49 of them, which was very useful and informative.
        However, I’m still a little confused: you say “We don’t have a problem with privacy being violated”, but rather you object to the existence of “a program that will, once written, weaken security for all iPhone users.”
        But I presume your worry about weakened security is because of the risk of violation to privacy? You believe the current iPhone security should not be undermined because of a wider ‘right’ to privacy? Or have I misunderstood?
        I first read Jerry’s piece on this story soon after listening to the recent Sam Harris podcast where he expressed his initial thought on the issues too. Sam states he’s open to having his views changed on this, and I feel the same, hence why I’m engaged in this discussion.
        (Btw, Sam asked if listeners could suggest anyone suitable for a podcast conversation with him on this – sounds like you’d be a strong candidate? Or do you know of someone else you’d rather recommend?)
        Jerry wrote “this is a question of ethics, somewhat analogous to the dilemma of whether to torture someone who has information that could lead to saving thousands of lives.” Seems Jerry may have had Sam Harris in mind, what with Sam’s controversial views on the ethics of torture.
        Many have commented here that the slim likelihood of there being anything of importance on this phone strengthens Apple’s case. But Ken Kukec suggested, in his comment below, a hypothetical scenario where Mohamed Atta’s phone is found in a hotel room on 9/12, in a context of a known risk that it may contain information about an even bigger imminent attack than that of the day before i.e. a ticking time-bomb situation. Ken then asked if that phone should remain encrypted, which prompted the reply: “Yes. Because the alternative is worse.” This aligns with the ‘slippery slope’ argument often proposed against the use of torture in any circumstance.
        I’ve been trying to step back from the detail and clarify first principles involved here. For me, I think it comes down to whether we have any right to total privacy. And whether companies should be allowed to build and design technology that allows information to disappear into black-holes. I don’t think we should have that right; we should not be able to operate in total anonymity, assume any detail of our history can remain hidden, nor erase our footprints. We do not all have incriminating things to hide – for those who do, don’t we want them to know it could be accessed?
        As you’ve pointed out, Apple have been able to retrieve information and pass this to the FBI e.g. emails from the iCloud, text messages. I asked Ben Goren if he thinks all electronic-systems should now follow Apple’s approach, so that the custodians can state ‘We can’t access the data, and we refuse to help’? He didn’t give me a straight answer on this – can you? Do you think the emails and texts should also have been held in a way that would have prevented Apple from passing these to the FBI? If not, what’s so different about a smartphone?

        • Diana MacPherson
          Posted February 23, 2016 at 11:08 am | Permalink

          No, as I stated, I am not that concerned about privacy. Security, yes: keeping my social insurance number private, as well as bank accounts, GPS information, and a lot of other information that can be transmitted. There is a big difference between privacy and security, and the violation of security required to get at this information is a threat to all of us.

          The person I’d recommend to talk to Sam is someone from the EFF, or even a security expert like our very own Ant Allen! 🙂

          • Chris G
            Posted February 23, 2016 at 2:19 pm | Permalink

            Thanks for the clarification Diana.
            But you didn’t answer the question in my final paragraph?

            • Diana MacPherson
              Posted February 23, 2016 at 3:40 pm | Permalink

              In answer to your last paragraph, I don’t think the texts themselves are what we, who oppose this issue, are concerned with.

              It’s also not that it’s a smartphone. The medium is not what’s bothering us.

              What’s bothering us is Apple is being compelled to weaken their security.

              If the medium was the Internet and the process was sending information across it, I wouldn’t like a company being compelled to write software that would weaken that security either.

              So, in other words, I don’t care about the government getting a subpoena for the information and I don’t care that the information is on the smart phone. What I *do* care about is the writing of code aimed at weakening security.

  71. Ken Kukec
    Posted February 22, 2016 at 4:48 pm | Permalink

    This post and its commentary have proved interesting and enlightening.

    Among the Apple supporters here, there appear to be two schools of thought. The first is what I would call the Libertarian Purists. Their position is, in essence, that Apple has created this uncrackable encryption and that we, therefore, have an absolute right to its unbridled use. If Apple assists the government by decrypting the phone, the government will get its hot hands on the technology and — never mind its hollow assurances to the contrary — will use it against us, to snoop on our phones and electronic devices. So screw the government.

    The second (and, frankly, more interesting) argument is that of the Techworld Proponents. Their argument is essentially that uncrackable encryption is the sine qua non of the modern digital world we inhabit — that without it, our computers and devices cannot be made secure, and without this security, we’d be rendered unable to communicate or engage in commerce.

    Their position w/r/t the pending Apple litigation is essentially this (if you’ll pardon the artistic license): the government is tempting us to partake of the iPhone of the knowledge of good and evil. Should it succeed, the Edenic encryption covenant will be rent asunder, with Humankind banished from the Garden of Digital Paradise, left to roam in a Fallen World of pencil & paper, abacuses, limbless snakes, and women in the throes of labor pain.

    I have no problems dismissing the argument of the Libertarian Purists, for reasons set out above, as being contrary to eight centuries of Anglo-American law balancing personal privacy against societal interests in securing evidence to prevent and prosecute crime. (I can, nevertheless, relate to their sentiment; I mutter the phrase “screw the government” often, almost always when flipping through the C-SPAN channels.)

    The Techworld argument, on the other hand, appears to have merit — if, that is, its premise is correct and any breach of uncrackable encryption really will banish us from the digital Garden (a premise that must be supported by more than a Murphy’s-Law-like folk wisdom holding that, in Techworld, whatever can get out, will get out). That is to say: if the FBI is successful, or if any other inroads are ever made upon uncrackable encryption, then confidence in our computerized world will vanish, and digital commerce and communications will cease. (Sometimes when they say the sky is falling, the sky may really be falling.)

    Before any of us sign off against the FBI, however, I think we’d do well to consider a final hypothetical (one that is, I submit, more feasible than fanciful):

    Imagine that, rather than the Berdoo killers, the phone in question had belonged to Mohamed Atta, and had been found on 9/12/01 in the rented room he vacated just before boarding his ill-fated flight (AA Fl# 11, departing Logan International 07:45, arriving WTC Tower 1 08:46). Imagine also that the FBI has information suggesting Atta had used the phone to communicate recently with his Qaeda handlers in Frankfurt and Afghanistan, and that “chatter” and informant tips indicated that the 9/11 attacks were only the first wave, that bin Laden had something “even bigger” in the works.

    Must Atta’s phone even then sit un-decrypted to preserve our place in the Garden? That’s a question I’m sure judges will ponder as this case wends its way through the courts.

    • GBJames
      Posted February 22, 2016 at 5:00 pm | Permalink

      Yes. Because the alternative is worse.

    • Posted February 22, 2016 at 5:15 pm | Permalink

      Their position is, in essence, that Apple has created this uncrackable encryption and that we, therefore, have an absolute right to its unbridled use.

      With this sentence, you demonstrate a very profound lack of understanding of the basic technology in question — at the level of somebody who thinks it’s a conspiracy that the car companies are squashing 1,000 MPG carburetors for engines that run solely on tap water.

      The math for uncrackable encryption is trivial. It’s stuff that any high school student can master. Encryption that gets cracked comes in only a few varieties. First, people who don’t bother to understand the math are likely to implement it badly — that accounts for lots of cheap things out there, such as the password “protection” of Microsoft Office documents. Second, even those who know what they’re doing can make subtle mistrakes in implementation that aren’t readily apparent until after the fact — such as, for example, giving access to an address book from a lock screen but failing to account for the address book application’s “feature” to use any data source as an address book whether or not it really is an address book. And, lastly, of course, you can intentionally cripple it.

      But, if you don’t cripple it, if you’re careful in your implementation, and if you’re not an incompetent oaf…your encryption will either be truly uncrackable, or cracking it will depend on as-yet-undeveloped technology or unknown physics (such as time travel).

      And, again again again again again…all the technology you need to yourself create an uncrackable encryption system is a coin, a pad of paper, and a pencil. Are you going to outlaw office supplies?
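
      (If you want to see just how little machinery that takes, here’s a minimal one-time-pad sketch in Python; the coin flips are simulated with the secrets module, and the scheme stays uncrackable only if the pad is truly random, as long as the message, used once, and kept secret.)

        import secrets

        def encrypt(message: bytes):
            # The "coin flips": a pad of random bytes exactly as long as the message.
            pad = bytes(secrets.randbits(8) for _ in message)
            ciphertext = bytes(m ^ k for m, k in zip(message, pad))
            return pad, ciphertext

        def decrypt(ciphertext: bytes, pad: bytes) -> bytes:
            return bytes(c ^ k for c, k in zip(ciphertext, pad))

        pad, ct = encrypt(b"meet at dawn")
        assert decrypt(ct, pad) == b"meet at dawn"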

      All this is long before we get to, for example, Diana’s observations that a banking system that uses crackable encryption is one that anybody can steal from with reckless impunity. If your Web browser didn’t support in-practice unbreakable encryption, any random low-level flunky at any random Internet company could trivially use your banking information to buy anything from anywhere. You want to order stuff from Amazon and not worry about the kid at the table next to you in the café piggybacking on your order to ship a few hundred dollars worth of sex toys and a diamond ring to his girlfriend — all at your expense? You want the exact same category of encryption as Apple uses to secure their iPhones.
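
      (And that category of encryption is already a few lines of standard-library code away; here’s a sketch of opening a certificate-verified TLS connection in Python, with example.com standing in for your bank or Amazon.)

        import socket
        import ssl

        context = ssl.create_default_context()  # verifies the server's certificate chain
        with socket.create_connection(("example.com", 443)) as sock:
            with context.wrap_socket(sock, server_hostname="example.com") as tls:
                print(tls.version())  # negotiated protocol, e.g. TLSv1.2
                print(tls.getpeercert()["subject"])  # who the certificate says we're talking to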

      b&

      • Ken Kukec
        Posted February 22, 2016 at 5:45 pm | Permalink

        You clearly did not read past the second paragraph of my comment, Ben, which did not itself set forth my own views, but instead summarized those expressed by other commenters in this thread.

        The arguments made by Diana and others are summarized — accurately, I believe — in the third, fourth, and sixth paragraph of my comment.

        • Posted February 22, 2016 at 5:51 pm | Permalink

          If you don’t even understand the most elementary basics of the technology under discussion, what makes you think you understand the arguments that experts in the field are making about it?

          You do understand that Diana’s day job involves software quality assurance management, no? And that Stephen has an entire career in communications and artificial intelligence behind him? And that I paid off a mortgage in about three years doing database and Web development for secure internal corporate financial systems?

          Your summaries of the positions are not accurate; they are but a comic caricature of the actual positions. But explaining how your summaries go awry depends on explaining the technology, and you’re continuing to refuse to even acknowledge the most basic explanations of the technology we’re trying to offer you….

          b&

          • Ken Kukec
            Posted February 22, 2016 at 6:39 pm | Permalink

            I didn’t mention technology, let alone misstate it. I’m addressing here the relevant legal positions of the parties. This is, after all, a lawsuit we are talking about. I think I’ve accurately characterized those legal positions. I certainly don’t hear you setting out any way in which my characterization of those legal positions is invalid.

            Keep in mind this case will be decided on its legal merit, by a lawyer sitting on the bench as a judge (and will likely be reviewed by other lawyers sitting on the appellate bench) — not by a panel of technical experts (though the court may well consider testimony from technical experts, to the extent it finds that such testimony will assist in illuminating and defining the legal issues).

            BTW, congratulations on the prompt retirement of the debt on your mortgage loan.

            • Posted February 23, 2016 at 10:34 am | Permalink

              I don’t think I’m constitutionally capable of respecting a legal process that deliberately ignores reality. A legislature that insists that π = 3, a judge that decides a girl is just a little bit pregnant, a court that thinks you can break one phone without breaking them all…these people are incompetent and we are idiots to trust them to make wise decisions on our behalf.

              Especially in this case. Bruce Schneier’s Applied Cryptography is well within reach of anybody with a decent education, and it provides all the technical background necessary to make informed judgements in cases such as this. You’ll notice that everybody here recommending such remedial education is siding with Apple — and that’s no coincidence. It’s just like a room full of aerospace engineers and auto mechanics trying to shut down the idiot blathering about the 1,000 MPG carburetor that lets you fill your Honda Civic tank with tap water, the one Lee Iacocca has locked in a vault. Do you think technology would be irrelevant if the Army were suing VW to manufacture such a device for them? No? So why should legal fictions take precedence over basic math in the FBI / Apple case?

              b&

              • Ken Kukec
                Posted February 23, 2016 at 4:33 pm | Permalink

                … these people are incompetent and we are idiots to trust them to make wise decisions on our behalf …

                I dunno, Ben, our system of government and justice ain’t perfect, but they’ve served us reasonably well for the past couple centuries. You really want to scrap it? You trust a bunch of technocrats to protect free speech, civil rights, right to fair trial, etc.?

                These things have been tried before. Government made a partner out of industry and gave it authority over what had previously been state functions in 1930s Europe. Churned out a lot of cool new tech, helluva lot of materiel, too. And it was nothing if not efficient. Not so good on the human-rights front, though.

                But, hey, you wanna go ahead, give the technocracy thing another try yourself, you probably wanna follow the advice Dick the Butcher gave the rebel leader in Henry VI, Part 2: “first thing … kill all the lawyers.”

              • Posted February 23, 2016 at 5:24 pm | Permalink

                You’re leaping far ahead of anything I’ve suggested. Leapt so far you’ve sailed clear off the cliff.

                Indeed, I already suggested a very simple remedy to the problem of a judge who lacks the intelligence and / or dedication to make it through an introductory text of whatever subject is relevant to the case at hand: the judge should recuse and let somebody smarter and / or more diligent preside.

                But, if, as you’re implying, even that’s too much to ask — that there simply aren’t any intelligent hard-working judges in our system…then, yes, we should fire the lot of them and come up with a different method of appointing judges to the bench in the first place. If your indictment of the judicial system is on target, then, clearly, our law schools are in serious need of reform, and we should start by requiring a broad-based liberal arts education as a prerequisite to entry into law school.

                I had been under the impression that that’s what law schools already do, and my personal experience with a number of lawyers and a couple judges suggests that there really are competent people in the legal system who would be able to get themselves up to speed in a timely fashion. (And, to be sure, there are also a number of bad apples out there!) But, if you’re so sure that that’s not the case, that the whole lot of them is rotten to the core…then why aren’t you the one demanding we kill all the lawyers?

                Or, alternatively, if you think the judge in this case might be competent…then why are you arguing so vociferously that technological and educational incompetence shouldn’t be a barrier to judgement here?

                b&

      • Ken Kukec
        Posted February 22, 2016 at 8:56 pm | Permalink

        Look, you can accuse me of presenting a caricature of the arguments made by Diana and others, one lacking in subtlety — and I’d agree. But that is because their resulting legal position lacks subtlety. (I hasten to add that Diana, Stephen and, I’m sure, you have plainly devoted much subtle and sophisticated thinking to the analysis of the underlying technical questions, and, I’m sure, have given equally careful thought to the potential ramifications of a decision adverse to Apple in this case.) But their bottom line on what the legal outcome should be in this case is blunt, in a word: “No.” Apple shouldn’t have to do what the government is asking — not now, not ever.

        My own thinking on this matter has evolved, mostly in just the past 36 hours. Diana’s analysis in particular has brought me around — if not 180 degrees, then maybe 135, to where I’m not only undecided, but leaning toward Apple.

        If anything, I’m still trying to inject some subtlety into the legal analysis pertaining to the legal bottom-line here — in essence, asking in response to their “Never,” the rejoinder of the crew to the captain on the HMS Pinafore: “… Not ever?”

        My inquiry is two-fold — is there really no way for Apple to accommodate the government’s need for the information in question without blowing up encryption for good? And, really, never, no matter how dire the circumstances that might be presented in some later case?

        I’m confident that these will also be among the concerns of the court that decides this case — if not the district court judge (about whom I know nothing), then in the Ninth Circuit Court of Appeals, and for sure if the case reaches SCOTUS. (Bear in mind that, however justifiably you may mock my technological prowess, most judges are even worse. Just a few terms back one justice on the high Court inquired at oral argument if there was a difference between email and a beeper.)

        Cheers backatcha,

        • Jeff Ryan
          Posted February 22, 2016 at 9:25 pm | Permalink

          I hope I’m not intruding, but I’d like to comment, in a limited fashion.

          First, if I recall, the District Court has already ordered Apple to comply. The case is now in front of the Ninth Circuit Court of Appeals. Whichever way they vote, I can almost guarantee the losing side will appeal, either for an en banc rehearing or straight to the Supreme Court. The Supreme Court may also decline the case, leaving whatever the Ninth Circuit decides standing, though the Supremes rarely pass up an opportunity to reverse the Ninth Circuit, so that is a wrinkle that can’t be discounted.

          Last, let me agree with Ken on one important point: The courts (everywhere) rarely possess the technological knowledge that has just been raised throughout these posts. Courts almost always lag behind science and technology. I had the luck to wage war, as a prosecutor, over the admissibility of DNA evidence. Early on I perceived the attacks on such evidence to be largely specious. I am no scientist, but I think I can figure things out when I’m being beaten over the head with them, and I had no doubt as to the validity of the evidence. But it took courts years, literally, to accept what to me was not controversial. The courts will be similarly bewildered by the inside baseball that is unavoidable in this case. (And which I freely admit not understanding myself.) The side that wins this argument will be the side that can make the facts and issues understandable to the court.

        • infiniteimprobabilit
          Posted February 22, 2016 at 9:34 pm | Permalink

          I think, as the judge said, ‘circumstances alter cases’.

          This case is weighing two probable outcomes.

          And in this case the probability of there being anything useful on the guy’s work phone is, I think, slight.

          There’s also the fact that making Apple defeat their own ‘uncrackable’ phone – even if feasible – does have very significant consequences way beyond just handing over a password.

          So I think the actual (probable) consequences have to be taken into account. Different circumstances, maybe a different decision would be appropriate.

          Incidentally, I think the ‘ticking-bomb’ scenario (which doesn’t apply in this case) would also have its cogency reduced by the fact that writing an alternative OS, testing it, etc is going to take quite a lot of time. By which time the bomb would have gone off, and any miscreants who do happen to be on the suspect phone would have known to cover their tracks.

          Interestingly enough, a lot of the problems would have been obviated if the FBI had got Apple to do it in secret. But that raises its own can of worms, including what-if-(when!)-the-news-leaked-out.

          cr

          • Stephen Barnard
            Posted February 22, 2016 at 9:49 pm | Permalink

            “Interestingly enough, a lot of the problems would have been obviated if the FBI had got Apple to do it in secret.”

            Good point, but I don’t believe the FBI wanted this to be secret. They wanted a public, well-documented case to set a precedent, and that’s what Apple objects to. This is almost obvious, given the timeline and their previous positions on encryption and back doors and their inept handling of the Apple ID.

            • infiniteimprobabilit
              Posted February 23, 2016 at 1:49 am | Permalink

              I suspect you’re right, Stephen.

              cr

  72. Posted February 23, 2016 at 11:58 am | Permalink

    “Unbreakable” – I think that’s a red herring. Even if I just rot-13 my text, I don’t think anyone should be compelled to hand over the fact that I did that. Let the law enforcement always do their *own* work on these things, exactly like with the safes.
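
    (And rot-13 really is that trivial; a two-line sketch in Python, with a made-up message:)

      import codecs

      scrambled = codecs.encode("meet at dawn", "rot_13")  # 'zrrg ng qnja'
      print(codecs.decode(scrambled, "rot_13"))            # back to 'meet at dawn'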

    Or put another way: Suppose some bad guys (or anyone else) were communicating in Inuktitut. Would the FBI (or other agencies) be allowed to compel a reader of Inuktitut to translate for them?

    • Jeff Ryan
      Posted February 23, 2016 at 12:28 pm | Permalink

      Probably.

  73. Richard C
    Posted March 1, 2016 at 10:39 am | Permalink

    Apple’s 65-page legal response is a surprisingly easy and fast read, without any dense legalese to parse.

    https://www.documentcloud.org/documents/2722196-Motion-to-Vacate-Brief-and-Supporting-Declarations.html

    Yes, the government has a valid search warrant. I can’t find any statistics, but I’d guess courts issue hundreds of thousands of valid search warrants every year. There are over a thousand wiretap warrants issued every year in the US (uscourts.gov), must Apple also rewrite the iPhone’s operating system to fulfill them? The Government can get warrants to secretly install GPS trackers on a suspect’s car and microphones in a suspect’s office or house, but you already carry an Internet-connected GPS device, microphone, and camera in your pocket. Must Apple secretly enable them by remote every time the government claims probable cause? And what of requests by other governments without 4th Amendment-like protections?

    China has been trying to force Apple to install encryption backdoors for years, but backed off because even the US government doesn’t get such access.

    A hacked operating system signed by Apple’s cryptographic installation certificate doesn’t just get you access to (some of the) data on a single terrorist’s phone*. It gets you control of the scariest, most powerful mass surveillance device ever invented. There needs to be extra care taken to protect it.

    What would J. Edgar Hoover’s FBI do with this kind of power?

    * A lot of communications software, including those recommended by ISIS training videos, employ additional end to end encryption that even the FBI’s requested hacks would not unlock.

