UKC

Apple stand's up to FBI...But should they?

This topic has been archived, and won't accept reply postings.
 The Lemming 17 Feb 2016
The FBI are putting pressure on Apple to break into an encrypted apple phone for them so they can see the contents.

This phone also belonged to a mass murderer, but is that reason enough for a nation's government agency to demand that a commercial organisation break its own encryption designed to stop such governments gaining access to information?
2
Lusk 17 Feb 2016
In reply to The Lemming:

Yes. Who the F do Apple think they are, that they think they can prevent access to potential evidence in a criminal investigation?
14
 Tyler 17 Feb 2016
In reply to The Lemming:

> The FBI are putting pressure on Apple to break into an encrypted apple phone for them so they can see the contents.

> is that reason enough for a nation's government agency to demand that a commercial organisation break its own encryption designed to stop such governments gaining access to information?

Are they being asked to break the encryption or just being asked to provide the key for this one device?
 toad 17 Feb 2016
In reply to The Lemming:

presumably it's ok for a large multinational company to be able to decrypt an iPhone and read its contents, just not the police...
4
KevinD 17 Feb 2016
In reply to toad:

> presumably it's ok for a large multinational company to be able to decrypt an iPhone and read its contents, just not the police...

The theory is that the large multinational company isn't able to decrypt it either.
It wouldn't be so much beating the encryption as beating the 10-attempt limit. If that can be bypassed then the PIN can be brute-forced.
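The arithmetic behind that point can be sketched in a few lines of Python (the guess rate and delay schedule here are illustrative assumptions, not Apple's real parameters):

```python
# Back-of-envelope sketch: a 4-digit PIN has 10,000 possibilities.
# With the retry limit gone, even a modest guess rate cracks it quickly;
# with escalating lockout delays, the same search becomes impractical.

PIN_SPACE = 10 ** 4  # 0000-9999

def time_without_delays(guesses_per_second=12.5):
    """Worst-case seconds to try every PIN at a steady, hardware-limited rate."""
    return PIN_SPACE / guesses_per_second

def time_with_delays(delay_schedule=(0, 0, 0, 0, 60, 300, 900, 3600, 3600)):
    """Worst-case seconds if a lockout delay is inserted after each failure
    (illustrative schedule; once the list runs out, the last delay repeats)."""
    total = 0.0
    for attempt in range(PIN_SPACE):
        idx = min(attempt, len(delay_schedule) - 1)
        total += delay_schedule[idx]
    return total

print(f"no delays:   ~{time_without_delays() / 60:.0f} minutes")
print(f"with delays: ~{time_with_delays() / 86400:.0f} days")
```

With these made-up numbers the delay-free search finishes in minutes, while the delayed one takes over a year, which is the whole point of the limit KevinD describes.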
 wintertree 17 Feb 2016
In reply to The Lemming:

That this has come to court makes me seriously question the technical capabilities of various TLAs including the NSA. Are they too busy developing mass surveillance of soft targets (and sifting through the oceans of dross this yields) to maintain any technical competence with even relatively basic "grown up" encryption?

1
 jkarran 17 Feb 2016
In reply to Tyler:
> Are they being asked to break the encryption or just being asked to provide the key for this one device?

The story on the radio this morning was that they're being asked to help facilitate a brute-force attack on the screen lock having already refused to provide more fundamental support. Presumably the FBI believe this is a fudge whereby Apple can assist them in their hacking without being seen to provide an easily abusable 'backdoor'. I'm a little surprised they actually need to go through the user interface to access the (mostly encrypted?) data, I guess it's the safest option but not the last resort. It's also quite likely the radio news report was garbled!

You can see Apple's commercial interest in standing their ground.
jk
Post edited at 12:45
tom_in_edinburgh 17 Feb 2016
In reply to Tyler:

> Are they being asked to break the encryption or just being asked to provide the key for this one device?

Neither. They are being asked to create a custom version of iOS and tell the phone it is an update so it is installed. The custom version of iOS will turn off the mechanism that inserts delays after a bad PIN guess and makes it infeasible to guess a PIN by trying all possible combinations. It is also supposed to turn off the mechanism that wipes the phone after a certain number of bad guesses.

Apple's concern is that once this version of iOS exists it could be used to hack into any older iPhone the government can get possession of. It's not clear if the new iPhones with the fingerprint chip would be vulnerable, if Apple were clever they'd have put mechanisms like this into the hardware.

The whole thing is arguably a sham because there are agencies in government perfectly capable of carrying out this kind of attack without any help from Apple. The reason for doing it this way is to make a point in public and so they can explain how the evidence was obtained in court.

OP The Lemming 17 Feb 2016
In reply to tom_in_edinburgh:

> The reason for doing it this way is to make a point in public and so they can explain how the evidence was obtained in court.


Interesting thought.

However, how good is the encryption on iOS and Android phones?
2
 DancingOnRock 18 Feb 2016
In reply to tom_in_edinburgh:

I'd suspect it's more about reverse engineering Apple's code and where the FBI would stand if they succeeded.

Is it even possible for the phone to update itself? Mine doesn't. It downloads the update and waits for me to say yes. And how would Apple release an update to just one phone?

And who is going to pay them to create this update/work around?
In reply to DancingOnRock:

> I'd suspect it's more about reverse engineering Apple's code and where the FBI would stand if they succeeded.

There are lots of ways an agency like GCHQ or the NSA could hack the phone if they chose to, and there are commercial labs that could do it for the FBI. What the FBI wouldn't get this way is a simple, low-cost way of looking at lots and lots of phones which they can use as evidence in court.

If you just want to hack one phone, getting into a legal battle with Apple is an awfully expensive and time-consuming way to do it, but if you want to set a precedent that Apple has to open up any iPhone you can get a court order on, then maybe it's worth it. Once this backdoor software exists, every other country is going to demand the same thing.

> Is it even possible for the phone to update itself? Mine doesn't. It downloads the update and waits for me to say yes. And how would Apple release an update to just one phone.

Presumably there's a way for the download to be activated without needing the PIN or the whole thing would be pointless. I don't see a problem with updating just one phone, when they are developing and testing the software they'll need to do that all the time.

> And who is going to pay them to create this update/work around?

The court ruling says Apple can charge to do the work.


1
 wbo 18 Feb 2016
In reply to The Lemming: the encryption is very strong. My understanding is similar to Tom's re. the mechanism, but the FBI won't commit to only using it on this phone, plus once this version is out there it will inevitably end up 'in the market'.

1
 Indy 18 Feb 2016
In reply to wintertree:

> That this has come to court makes me seriously question the technical capabilities of various TLAs including the NSA.

Would they want to be advertising the fact that they could break it only for people to use even stronger encryption?

Good on Apple and all the other companies standing up to the Govt. on this
3
 GarethSL 18 Feb 2016
In reply to The Lemming:

Surely this is in some way related to 'withholding evidence' or 'obstructing an investigation'?

I understood from the news that they use the encryption by default to avoid this situation; surely, however, they have forced this situation by doing so.

I do believe however that in cases regarding national/international terrorism there should be transparency with regards to encryption and messages/communications between perpetrators.

But I don't agree, however, that a government should be allowed to force/demand the creation of software/a system by a company that can be used for these purposes. In a world ruled by tech it's only a matter of time before such software is in the public domain.

It's hard not to feel this could result in another blind step towards bigbrotherdom.

That said, I've done nowt wrong, honest guv.
 Tom F Harding 18 Feb 2016
In reply to The Lemming:

GCHQ and the NSA, according to the Snowden leaks, already have iPhone hacking tools called 'WARRIOR PRIDE', part of the 'smurf' suite of surveillance and hacking tools.

https://en.wikipedia.org/wiki/WARRIOR_PRIDE

As people have said though, they won't be able to use evidence from this hack in a court of law.

Apple should refuse to allow this new 'front door' to be added - The precedent is too large. No one can believe that this will be a one off and will never be used again, it's very much the thin end of the wedge.
 john arran 18 Feb 2016
In reply to The Lemming:

I'm surprised nobody has yet complained about the greengrocer's apostrophe in the title - which is curiously appropriate in a discussion about apple's
1
 Dave Garnett 18 Feb 2016
In reply to Indy:
> Good on Apple and all the othe companies standing up to the Govt. on this

Are they 'standing up to the government'? And, if so, by what right, given its democratic* legitimacy?

I've no idea how they handle these things in the US but isn't this exactly the reason why we have judicial and political oversight? Here, the police would apply for a court order, or maybe the Home Secretary would need to approve a warrant but, in any event, where there was such an obvious justification I don't think a commercial company just refusing to cooperate would be an option.

* Yes, I know, it's democracy, but not as we know it, Jim.
Post edited at 09:57
4
 Dave Garnett 18 Feb 2016
In reply to Tom F Harding:

> Apple should refuse to allow this new 'front door' to be added - The precedent is too large. No one can believe that this will be a one off and will never be used again, it's very much the thin end of the wedge.

So do you think that the police here should not have lawful access to the contents of a locked iPhone where they have reasonable grounds to be investigating a suspect?
4
 Dauphin 18 Feb 2016
In reply to The Lemming:

It's a bit odd from the FBI. Most, if not all, of the personal data will sit in iCloud; for the rest, they will have access to at the very least the metadata of any phone calls, and texts and data will be recoverable from ISPs and the NSA data centre if that fails. I suppose they may want to keep up the pretence that they need access to the individual device. Apple gets to protect its largely fake security integrity (fake as far as three-letter agencies with the motivation and means are concerned) and gets its share price boosted, while the idiots who failed to grasp what Snowden released think Apple is doing them a favour. I believe fake updates spoofing iPhones with malware were one of the main routes the NSA and GCHQ had into smartphones.

D
 JoshOvki 18 Feb 2016
In reply to Dave Garnett:

Of course they should, which is why they can tell the owner to give them the unlock code. If they don't hand it over, off they go to jail for 2 years; there the request for the keys is given to them again, and if they don't hand them over, back in prison for another 2 years. Rinse and repeat.

No need to effectively make something insecure - that is, if it is technically possible for the brute-force attack to work.
2
Removed User 18 Feb 2016
In reply to Dave Garnett:

You (and others in the thread) are misunderstanding what's happening in this case.

Apple has created a product and protects users' data with encryption. There's no backdoor built in by design, which means even Apple employees can't decrypt/read that data.

The phone is an iPhone and so has a 4-digit code. If you enter the code incorrectly 10 times it wipes the data and rewrites the memory so it can't be retrieved (don't be fooled by people saying 'the government could easily get the data back'; they can't, and have never been able to retrieve overwritten data in any meaningful amount). The FBI have requested that Apple create a version of iOS that allows them to enter unlimited attempts at the 4-digit combo electronically (rather than physically typing each one in) and removes the 10-guess limit, so that they can essentially crack the phone.
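A toy model of the behaviour described above might look like this in Python (the class name, retry limit and return values are all illustrative, not Apple's firmware):

```python
class ToyLock:
    """Toy model of a PIN lock that wipes the device after too many
    bad guesses (illustrative numbers, not real iOS)."""

    MAX_FAILURES = 10

    def __init__(self, pin):
        self._pin = pin
        self.failures = 0
        self.wiped = False

    def try_pin(self, guess):
        if self.wiped:
            return "wiped"
        if guess == self._pin:
            self.failures = 0
            return "unlocked"
        self.failures += 1
        if self.failures >= self.MAX_FAILURES:
            self.wiped = True  # data destroyed: brute force is now pointless
            return "wiped"
        return "locked"

# The order asks, in effect, for firmware in which the failure counter
# and wipe flag above never trigger, so all 10,000 PINs can be tried.
lock = ToyLock("1234")
for guess in ("0000", "1111", "1234"):
    print(guess, "->", lock.try_pin(guess))
```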

The issue at hand is, if such a version of iOS is created it creates a backdoor into the iPhone. No iPhone data would be safe for any person, anywhere. The FBI aren't asking for the data from a single phone, they are asking for a way to break the encryption on ALL iPhones. To suggest that this program wouldn't then be used in the future with no legal oversight is laughable and Apple knows this. Just look at the StingRay program (https://en.wikipedia.org/wiki/Stingray_phone_tracker), where it's used by regular cops with no warrant required.

This court battle will be one of the most important tech cases in a long time, and for a long time to come. Encryption should be available and uncrackable to everyone in the world (many schemes are, even to the NSA/GCHQ with all their clusters; no one can break even fairly basic 2048-bit encryption, let alone higher standards or multiple layers). Fortunately it still is, but if the court instructs Apple to backdoor a way around its encryption, it makes you wonder who will be next to be forced to do the same.
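The back-of-envelope arithmetic behind that claim is easy to check (the guess rate below is a deliberately absurd assumption, and 2048-bit RSA is actually attacked by factoring rather than raw key search, so a 128-bit symmetric key space is used for the illustration):

```python
# Why brute force against strong encryption is hopeless: even a 128-bit
# symmetric key space dwarfs any conceivable guessing rate.

keyspace = 2 ** 128
rate = 10 ** 18                      # a wildly generous 10^18 guesses/second
seconds_per_year = 60 * 60 * 24 * 365

years = keyspace / (rate * seconds_per_year)
print(f"~{years:.2e} years to exhaust a 128-bit key space")
```

The result is on the order of ten trillion years, which is why the FBI is asking for the retry limit to be removed rather than for the cipher itself to be broken.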

There's a reason big tech companies like Google and Facebook are supporting Apple in this case, and a reason tech consultants and security professionals are rallying behind Apple's stand. If you don't understand why Apple is in the right in this matter, you don't understand the full picture. If you did, I think you'd support them as well.

TL;DR Apple are 10000% right, the public doesn't understand what's good for them, it's not a simple case of 'help the government or you support terrorism'.


 zebidee 18 Feb 2016
In reply to Dave Garnett:

> So do you think that the police here should not have lawful access to the contents of a locked iPhone where they have reasonable grounds to be investigating a suspect?

"Reasonable grounds" is a very fluffy term though - there have been several cases (particularly in the US) where "reasonable grounds" have been shown to be fishing expeditions by law enforcement.

Smart phones have also been described as a "container" by law enforcement, trying to suggest that they are like the boot of your car or a box of stuff when in fact they are hand-held computers which can tell you far more about someone than just a box of stuff (who you speak to, what you speak about, likes, dislikes, etc. etc.).

Techdirt has a good discussion of the legal and technical aspects of this case - https://www.techdirt.com/articles/20160216/23254533618/apple-responds-to-or...

 Dave Garnett 18 Feb 2016
In reply to JoshOvki:

> Of course they should, which is why they can tell the owner to give them the unlock code. If they don't hand it over, off they go to jail for 2 years; there the request for the keys is given to them again, and if they don't hand them over, back in prison for another 2 years. Rinse and repeat.

But if the police (or God forbid, HMRC) have a warrant and want to see the contents of one of my locked filing cabinets how long do you think my refusing to give them the key would hold them up?

Why should my phone be any different?

4
 Tom F Harding 18 Feb 2016
In reply to Dave Garnett:

Does it not seem a little strange to you that this has gone public in this way and at this time? Up to this point there seem to have been murmurings about encryption, but no country, as far as I'm aware, has filed a public court order asking a manufacturer to specifically design software to break its own encryption. If this had happened before, wouldn't Apple have gone public the last time?

We know about the high-level tools used such as WARRIOR PRIDE and DROPOUTJEEP - they already have these, arguably illegal, tools at their disposal, which have the same function as what they are asking Apple to develop. But it seems pretty clear that governments now want to be able to use these tools to generate evidence in a court of law. When this appears in court and the defence lawyer asks what the source of the evidence is, the prosecutor can say it came from a classified/secret back-door that they are unable to give details about - it won't matter if it comes from the court-ordered Apple hack or the existing 'secret' tools.

Another interesting point: why pick this case for the court order? Why not any of the other 300+ mass-murder cases in the US each year? Do you not think it has something to do with getting the public behind the government - as, of course, how could Apple be so evil as to stop the lawful prosecution of a mass murderer?

 Dave Garnett 18 Feb 2016
In reply to Removed User:

> You (and others in the thread) are misunderstanding what's happening in this case.

You're right, and thanks for the explanation - very helpful.

> If you don't understand why Apple is in the right in this matter you don't understand the full picture. If you did I think you'd support them as well.

Well, maybe, maybe not. I assume that stuff on my phone is private, but not highly confidential. I assume that anything on the cloud is pretty much public access.



2
 off-duty 18 Feb 2016
In reply to Removed User:

> TL;DR Apple are 10000% right, the public doesn't understand what's good for them, it's not a simple case of 'help the government or you support terrorism'.

You do realise the irony in your summary ?
The public need a private international company to decide what's good for them....
5
Removed User 18 Feb 2016
In reply to Dave Garnett:

This happened recently in Liverpool. The kid didn't give up the password and he ate the jail time. In his case I daresay it was preferable to giving up the password given the charges against him. They won't break that password with current technology.

http://www.bbc.co.uk/news/uk-england-11479831
Removed User 18 Feb 2016
In reply to off-duty:

You're misreading. They are separate points. The public doesn't understand what this case is about and it's being incorrectly condensed down to terrorists vs civilians. It's a case about privacy and the right to privacy (and by extension, the right to encryption), something Apple is upholding. Those rights as I understand it are created by the government, something that other parts of the government are now trying to get around.
 Shani 18 Feb 2016
In reply to The Lemming:

This isn't about "getting in to a killer's phone". The FBI are fomenting an emotive debate on the wider issue of privacy and secrecy.

What would be the West's reaction if China asked for a backdoor access to media devices? Do we really trust governments with access to our personal data, never mind the profiling that can be undertaken from the metadata on phones (who you know, where you go, when you go there etc.... on a physical basis as well as that defined by your browsing habits).

Until there is transparency into government lobbying (that is where the REAL power lies), I'd be wary of who I let peep over my shoulder and intrude upon my daily life.

1
 Dave Garnett 18 Feb 2016
In reply to Tom F Harding:

> Another interesting point is why pick this case to put in the court order? Why not any of the other 300+ mass murder cases in the US each year. Do you not think it has something to do with getting the public behind the government as of course how could apple be so evil as to stop the lawful prosecution of a mass murderer.

Well, it's a good example of a case to show why law enforcement should, given the appropriate permission, be allowed access to encrypted data. Of course this should be approved on an individual case basis, and maybe this is a problem in the US but I don't have a problem with it in principle.

It seems odd to me that the people who most object to this seem to be the ones who most approve of Wikileaks!
3
 Shani 18 Feb 2016
In reply to Dave Garnett:

> Well, it's a good example of a case to show why law enforcement should, given the appropriate permission, be allowed access to encrypted data. Of course this should be approved on an individual case basis, and maybe this is a problem in the US but I don't have a problem with it in principle.

Because you cannot trust that the backdoor in to any device will be limited in use to ONLY those of whom you approve, and ONLY under the conditions agreed, for the purposes stated.
 JoshOvki 18 Feb 2016
In reply to Dave Garnett:
If the locked cabinet would take a billion years to break open, then a billion years.

Just because you have a rubbish filing cabinet that only takes 5 minutes to open, that is your own problem.

(Edit to say I didn't down vote your question)
Post edited at 11:17
Andy Gamisou 18 Feb 2016
In reply to john arran:

> I'm surprised nobody has yet complained about the greengrocer's apostrophe in the title - which is curiously appropriate in a discussion about apple's

Ha ha. Nice one.

 FactorXXX 18 Feb 2016
In reply to Shani:

> Because you cannot trust that the backdoor in to any device will be limited in use to ONLY those of whom you approve, and ONLY under the conditions agreed, for the purposes stated.

Why not give the individual device to Apple and have them retrieve the data?
4
OP The Lemming 18 Feb 2016
In reply to off-duty:

> You do realise the irony in your summary ?

> The public need a private international company to decide what's good for them....


Corporations, evil tax evaders that they are, are simple creatures whose goal in life is to make stuff, sell it and then make a profit.

National governments, on the other hand, are complex and duplicitous organisations who have a myriad of ulterior motives with the goal of staying in power.

It's sad that I trust Apple and Google to keep my personal life safe and private above any government professing to have my best interests at heart.

And if this involves 'Bad people' having the same abilities, then isn't that the price we pay to live in a free and democratic world?
1
 Shani 18 Feb 2016
In reply to FactorXXX:

> Why not give the individual device to Apple and have them retrieve the data?

It is a libertarian principle that Apple is trying to uphold. In the Snowden era, for a private business to do such a thing on behalf of the FBI would be corporate suicide - particularly in the US.
 creag 18 Feb 2016
In reply to The Lemming:

But in the movies, they just give it to the tech guy and after a tense 20 minutes or so, the data is produced from a computer that makes lots of beeping and some impressive graphics!
 Mike Stretford 18 Feb 2016
In reply to Dauphin:

> It's a bit odd from the FBI. Most, if not all, of the personal data will sit in iCloud; for the rest, they will have access to at the very least the metadata of any phone calls, and texts and data will be recoverable from ISPs and the NSA data centre if that fails.

Can you not turn off iCloud backup? I don't know, I've never had an iPhone. Could this be used to hide photos taken?

OP The Lemming 18 Feb 2016
In reply to Shani:

> It is a libertarian principle that Apple is trying to uphold. In the Snowdon-era, that a private business would do such a thing on behalf of the FBI would be corporate suicide - particularly in the US.

Also, if Apple win, then this creates publicity that money can't buy: that Apple's encryption is uncrackable.

This has got me wondering. My own Nexus phone is encrypted by default using both a pattern swipe and fingerprint reader to open it up.

Does all this mean that a simple 4 figure password is more secure than a fingerprint scanner?
My finger could easily be broken or cut off my hand to open the phone.
1
 SenzuBean 18 Feb 2016
In reply to FactorXXX:

> Because you cannot trust that the backdoor in to any device will be limited in use to ONLY those of whom you approve, and ONLY under the conditions agreed, for the purposes stated.

> Why not give the individual device to Apple and have them retrieve the data?

That doesn't make it safer - if Apple has an easy way to crack the device, then everyone has the same way to crack the device (once it's independently discovered or leaked).
 Dave Garnett 18 Feb 2016
In reply to JoshOvki:

> If the locked cabinet would take a billion years to break open, then a billion years.

> Just because you have a rubbish filing cabinet that only take 5 minutes to open that is your own problem.

But if they have a warrant they have a right to open it.
 Shani 18 Feb 2016
In reply to The Lemming:

> This has got me wondering. My own Nexus phone is encrypted by default using both a pattern swipe and fingerprint reader to open it up.

This just means your phone is locked - not that the data is encrypted.
 JoshOvki 18 Feb 2016
In reply to The Lemming:

I would like to think you would start yelling out the pattern way before they started to cut off your finger
 Dauphin 18 Feb 2016
In reply to Tom F Harding:
They would normally manufacture a scenario whereby the data has been collected by a third party if necessary - in a case like this one, where there is no defence to question the particulars of the data collection (which rarely happens in criminal cases in US courts, because by plea-bargaining early in the case the accused circumvents due process). The FBI have likely picked this case because of the individual involved; it's going to be hard for anyone to protect in law the privacy of a successful mass murderer and Islamist terrorist. Also, the phone was the property of the California government, his employer. The FBI likely already have all the data, recovered from both the iPhone and the data and voice networks.

D
Post edited at 11:28
 mullermn 18 Feb 2016
In reply to JoshOvki:

I realise you're joking, but the finger wouldn't actually help (unless it was cut off pretty immediately).

The data isn't encrypted using the fingerprint, the fingerprint scanner holds a copy of your passcode and when your finger is applied uses the code to unlock the phone. That is why you have to re-enter the code when the phone boots each time and why once you have failed with the fingerprint a few times you have to enter the code again to unlock the phone - the fingerprint sensor has ditched its copy of the code and no longer has the ability to unlock the phone even if it wanted to.

Obviously the correct thing to do if someone wants to cut your finger off to get the print is give them the wrong finger. A couple of wrong tries and none of the fingerprints will be any use AND you get to keep 7 of your fingers!
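The mechanism mullermn describes can be sketched as a toy Python model (the class, limits and method names here are invented for illustration; the real behaviour lives in Apple's hardware):

```python
class ToyFingerprintUnlock:
    """Toy model: the sensor holds a cached copy of the passcode and
    replays it on a matching finger; the cache is empty after a reboot
    and is dropped after a few bad finger reads, after which only the
    passcode itself unlocks the phone."""

    MAX_BAD_READS = 3

    def __init__(self, passcode, enrolled_finger):
        self._passcode = passcode
        self._finger = enrolled_finger
        self._cached_code = None   # empty at boot: finger alone can't unlock
        self._bad_reads = 0

    def enter_passcode(self, code):
        if code == self._passcode:
            self._cached_code = code   # cache refilled after passcode entry
            self._bad_reads = 0
            return True
        return False

    def touch(self, finger):
        if self._cached_code is None:
            return False               # cache empty: the finger is useless
        if finger == self._finger:
            return True                # sensor replays the cached code
        self._bad_reads += 1
        if self._bad_reads >= self.MAX_BAD_READS:
            self._cached_code = None   # ditch the code; passcode required
        return False
```

This also shows why the "wrong finger" trick works: a few mismatched reads empty the cache, and after that no fingerprint, severed or otherwise, is any use.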
OP The Lemming 18 Feb 2016
In reply to Shani:

> This just means your phone is locked - not that the data is encrypted.

I was under the impression that Nexus encrypted its data by default?
 JoshOvki 18 Feb 2016
In reply to Dave Garnett:

This is not about having the right to open something, they have the right to open the iPhone. They just can't, same with your metaphorical cabinet that takes a billion years to break into.
 Shani 18 Feb 2016
In reply to SenzuBean:

> That doesn't make it safer - if Apple has an easy way to crack the device, then everyone has the same way to crack the device (once it's independently discovered or leaked).

This is the crux. Apple is not being asked to build a key that would unlock one iPhone. They are being asked to build a key that would unlock millions of iPhones.

Also, bear in mind that there are more than 600 companies and groups selling encrypted products, and two-thirds of them are located outside the U.S., beyond the reach of congressional legislation. Backdoors open up your phone to people outside of any one jurisdiction. You are building in a weakness to something that should otherwise be VERY secure.
In reply to Dave Garnett:

> Are they 'standing up to the government'? And, if so, by what right, given its democratic* legitimacy?

They have been served with an order from a court which they have the right to appeal against. There is nothing undemocratic about that, they are just exercising their rights under US law. They have some strong legal arguments and a higher court may well agree with them.

 Dave Garnett 18 Feb 2016
In reply to JoshOvki:

> This is not about having the right to open something, they have the right to open the iPhone. They just can't, same with your metaphorical cabinet that takes a billion years to break into.

Yes, I see. That's a tricky one. Do we have the moral right to construct (and sell indiscriminately) something completely secure, knowing that it can and will be used to conceal horrible crimes? Do we have a less technically secure infrastructure but have faith in our political and legal systems to access it reasonably and responsibly?

1
 JoshOvki 18 Feb 2016
In reply to Dave Garnett:

Yes, that seems to be the crux of the matter. Personally I think we should focus on keeping things secure, if the government have a way in, so do the criminals. Also criminals will just use different methods of hiding their details, while we are more vulnerable.

Interestingly enough, if this was about an iPhone 5s or 6 then it wouldn't be technically possible to do, as they use a separate chip for the encryption instead of the operating system.
 SenzuBean 18 Feb 2016
In reply to Shani:

> This is the crux. Apple is not being asked to build a key that would unlock one iPhone. They are being asked to build a key that would unlock millions of iPhones.

> Also, bear in mind that there are more than 600 companies and groups selling encrypted products, and two-thirds of them are located outside the U.S., beyond the reach of congressional legislation. Backdoors open up your phone to people outside of any one jurisdiction. You are building in a weakness to something that should otherwise be VERY secure.

Yes but if they are able to easily update the OS on an iPhone without the key, then someone else can do that too. That is what people are not understanding.
 zebidee 18 Feb 2016
In reply to Dave Garnett:

> Yes, I see. That's a tricky one. Do we have the moral right to construct (and sell indiscriminately) something completely secure, knowing that it can and will be used to conceal horrible crimes? Do we have a less technically secure infrastructure but have faith in our political and legal systems to access it reasonably and responsibly?

Pretty much every technology ever invented can be used for both good and evil. It's a slippery slope if you start legislating which specific technologies can and can't be manufactured.

You then have a conflict between encouraging businesses to set up, operate (and pay tax) in your jurisdiction and having them relocate to less legally strict locations. The US screwed themselves back in the 90s by declaring encryption technologies a munition, preventing their export - https://en.wikipedia.org/wiki/Export_of_cryptography_from_the_United_States
 Dauphin 18 Feb 2016
In reply to zebidee:

They exported and continue to develop plenty of broken or hobbled crypto which is being used worldwide as we type. Makes life much easier for the trawlers and the crab lines.

D
1
 off-duty 18 Feb 2016
In reply to Removed User:

> You're misreading. They are separate points. The public doesn't understand what this case is about and it's being incorrectly condensed down to terrorists vs civilians. It's a case about privacy and the right to privacy (and by extension, the right to encryption), something Apple is upholding. Those rights as I understand it are created by the government, something that other parts of the government are now trying to get around.

Just commenting on your summary. I know how well I'd go down if I tried to explain a nuanced point of law with "the public just don't understand, it's for their own good."

In the meantime I'm off to make sure my kitten photos can't be hacked by the evil people...
4
 elsewhere 18 Feb 2016
In reply to Dave Garnett:
Do you regard access to face to face* communication as immoral knowing it has been, is and will be used to plan and conceal horrible crimes?

Why is access to private electronic communication morally different?

*face to face is probably more private and secure than anything encrypted
 SenzuBean 18 Feb 2016
In reply to off-duty:

> In the meantime I'm off to make sure my kitten photos can't be hacked by the evil people...

That's great for you...

But some people actually have important things that need to be kept secret. For example there are Chinese dissidents, people fighting for the freedom of West Papua who don't want Indonesia to be able to access their information, the much feted "Arab Spring" would not have been possible if their government had the ability to pre-emptively round up the protesters, etc.
1
 off-duty 18 Feb 2016
In reply to SenzuBean:

> That's great for you...

> But some people actually have important things that need to be kept secret. For example there are Chinese dissidents, people fighting for the freedom of West Papua who don't want Indonesia to be able to access their information, the much feted "Arab Spring" would not have been possible if their government had the ability to pre-emptively round up the protesters, etc.

To be fair your summary of "because dissidents" is also on a par with the state often being accused of trying to impose measures "because paedophiles".

My kitten pics are safe though, you'll be pleased to know.
2
 jkarran 18 Feb 2016
In reply to Dave Garnett:

> Yes, I see. That's a tricky one. Do we have the moral right to construct (and sell indiscriminately) something completely secure, knowing that it can and will be used to conceal horrible crimes?

Why wouldn't we? We have the right to grow/breed such a thing.

> Do we have a less technically secure infrastructure but have faith in our political and legal systems to access it reasonably and responsibly?

I'd say they haven't earned that trust of late, but also that the government, their spies and enforcement agencies aren't the problem if we decide to weaken encryption.
jk
1
Removed User 18 Feb 2016
In reply to off-duty:

> My kitten pics are safe though, you'll be pleased to know.

All that matters really!
 nutme 18 Feb 2016

Somehow I have very little trust in this kind of news.
What could have happened is that Apple gave access to the data in one way or another and asked to keep it quiet. The result: an NDA between the US government and Apple Inc. The FBI got the data and Apple got the publicity. Everyone is happy.
Post edited at 12:45
1
 Sir Chasm 18 Feb 2016
In reply to nutme:

Can I have the key to decrypt your message?
1
 Shani 18 Feb 2016
In reply to The Lemming:

> I was under the impression that Nexus encrypted its data by default?

This is possible. Not sure it applies to all phones. There is also the issue of backing stuff up to the Cloud - which may not necessarily be encrypted.
 SenzuBean 18 Feb 2016
In reply to off-duty:

> To be fair your summary of "because dissidents" is also on a par with the state often being accused of trying to impose measures "because paedophiles".

They're not equivalent however. It's a very similar argument to whether to presume innocence or to presume guilt, in which case the state should not be able to pre-emptively decrypt your data.
1
 off-duty 18 Feb 2016
In reply to SenzuBean:

> They're not equivalent however.

They are two extremes of the same argument.

> It's a very similar argument to whether to presume innocence or to presume guilt,

Don't follow your reasoning here.

> in which case the state should not be able to pre-emptively decrypt your data.

Which isn't what this case is about either. Carrying out a targeted search of a suspect's property under some form of legal process usually requires persuading someone that your specific suspect has involvement in an offence amounting to at least reasonable suspicion / probable cause.
 Mike00010 18 Feb 2016
In reply to JoshOvki:

> If the locked cabinet would take a billion years to break open, then a billion years.
> Just because you have a rubbish filing cabinet that only take 5 minutes to open that is your own problem.
> (Edit to say I didn't down vote your question)

I would liken this scenario more to: they can open up the cabinet and look at the data, but you've written it in a code that makes no sense to anyone unless they have the code book to translate it. This matches the iPhone scenario: they can see the data that's on the phone, they just can't translate it into anything meaningful because of the encryption, hence why they want the code book/password.
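The code-book analogy can be sketched in a few lines. This is a toy illustration only (not Apple's actual scheme, which uses AES and hardware keys): a key is derived from the passcode with a real key-derivation function, and a deliberately simplified XOR stream cipher stands in for the real one. The point is that the ciphertext is fully visible but meaningless without the right passcode.

```python
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    """Expand a key into a pseudo-random byte stream.
    Toy construction for illustration; real devices use AES."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def toy_encrypt(passcode: str, salt: bytes, data: bytes) -> bytes:
    # Derive the actual encryption key from the passcode plus a salt.
    key = hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 100_000)
    # XOR with the keystream; applying it twice with the same key decrypts.
    return bytes(p ^ k for p, k in zip(data, keystream(key, len(data))))

salt = b"per-device-salt"
ct = toy_encrypt("1234", salt, b"meet at noon")
assert toy_encrypt("1234", salt, ct) == b"meet at noon"  # right passcode
assert toy_encrypt("0000", salt, ct) != b"meet at noon"  # wrong passcode: gibberish
```

The investigators are in the position of holding `ct`: all the bytes, none of the meaning.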
 Shani 18 Feb 2016
In reply to The Lemming:
On a related note, the government's IP bill is getting a bit of a mauling ( http://opendemocracy.net/digitaliberties/julian-huppert/three-strikes-again... ):

...they savaged the bill, describing it as a "missed opportunity". They say that "the privacy protections are inconsistent and in our view need strengthening", and that some of the provisions - equipment interference, bulk personal data sets, and communications data - "are too broad and lack sufficient clarity". The proposals around communications data are described as "inconsistent and largely incomprehensible".
Post edited at 13:17
 mullermn 18 Feb 2016
As a slight aside I've been reading about this on more techie sites and it's interesting to note that this is only an issue because of the age of the phone in question (iPhone 5C). Anything more recent (5S onwards) has improved hardware that means it isn't possible for Apple to decrypt the phone even if they want(/are ordered) to.

The summary is that in the more recent phones the decryption is controlled by a dedicated processor and half of the key used to encrypt the data is created randomly at manufacture and never leaves that processor. The key cannot be read out of the processor and the chip is designed in such a way that the key is lost if it is physically tampered with. The time delay is enforced within this processor, and so is the wiping of the device (because wiping the phone actually just involves wiping the encryption key from within the processor). Even if you completely replaced iOS with a compromised version it wouldn't be able to force the security chip to decrypt the storage.

Apple have a 60 page document covering the security design of the iPhone here ( https://www.apple.com/business/docs/iOS_Security_Guide.pdf )
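The design described above can be modelled in miniature. This is a hypothetical sketch, not Apple's implementation: the data key is derived from both the passcode and a device secret that never leaves the "chip", the guess counter lives inside the same boundary, and wiping the phone just means forgetting the secret.

```python
import hashlib
import os

class ToySecureEnclave:
    """Toy model of a security coprocessor (illustration only)."""

    def __init__(self):
        self.__device_secret = os.urandom(32)  # created at 'manufacture', never exposed
        self.attempts_left = 10

    def derive_key(self, passcode: str) -> bytes:
        # The data key depends on BOTH the passcode and the hidden secret,
        # so guessing can only happen on-device, at the chip's pace.
        if self.attempts_left == 0:
            raise PermissionError("device wiped")
        self.attempts_left -= 1
        return hashlib.pbkdf2_hmac("sha256", passcode.encode(),
                                   self.__device_secret, 100_000)

    def wipe(self):
        # 'Erasing' the phone just forgets the secret; the data is then unrecoverable.
        self.__device_secret = os.urandom(32)
        self.attempts_left = 0

enclave = ToySecureEnclave()
real_key = enclave.derive_key("1234")
assert enclave.derive_key("1234") == real_key   # same passcode, same key
assert enclave.derive_key("0000") != real_key   # wrong passcode, wrong key
enclave.wipe()
# After a wipe (or 10 failures) even the correct passcode yields nothing.
```

Because the secret is entangled into every key, copying the encrypted storage off the phone buys an attacker nothing: the brute force still has to go through the chip that enforces the counter.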


OP The Lemming 18 Feb 2016
In reply to The Lemming:

Out of curiosity, how many people on this site routinely encrypt their personal stuff - not work stuff, but personal data?

As for me, only my phone and tablet are encrypted, but that's because it was done by the manufacturer and as yet there is no way to turn off that encryption in the current Android operating system.
 elsewhere 18 Feb 2016
In reply to The Lemming:
Never that I'm aware of as I want my kitten photos to be retrievable when I pop my clogs.
Post edited at 14:20
 jkarran 18 Feb 2016
In reply to off-duty:

> My kitten pics are safe though, you'll be pleased to know.

For someone so careful about their online privacy you seem rather blasé about this.
jk
1
 toad 18 Feb 2016
In reply to toad:

> presumably it's ok for a large multinational company to be able to decrypt an iphone and read it's contents, just not the police...

Actually, having read a bit more about this, it seems the FBI have been less than upfront about it; rather, they appear to have been waiting for a sympathetic case to push for a generic way into these phones. Whilst I'm not convinced this is Apple making an altruistic stand for Privacy (they have capitalised the word by default on this iPad), which suggests they are looking after their commercial interests, neither do I think this is particularly about the FBI vs the terrorists.
 JoshOvki 18 Feb 2016
In reply to The Lemming:

Yes, I keep encrypted some of my personal stuff that I would rather didn't end up in the public domain.
 Shani 18 Feb 2016
In reply to toad:

> Actually, having read a bit more about this, it seems the FBI have been less than upfront about it; rather, they appear to have been waiting for a sympathetic case to push for a generic way into these phones.

Yep!

http://theintercept.com/2016/01/12/apples-tim-cook-lashes-out-at-white-hous...
 Dave Garnett 18 Feb 2016
In reply to The Lemming:

> Out of curiosity how many people on this site routinely encrypt their personal stuff, not work stuff but personal data?

My laptop is encrypted for work reasons.

I've no idea how my phone works.
 Lurking Dave 19 Feb 2016
In reply to The Lemming:

I've routinely encrypted data since I first got PGP... I'd guess the late 90s. Things are quicker and almost foolproof now; at the least, use an encrypted vault for information that you "might" not want to be public.
Cheers
LD
 DancingOnRock 19 Feb 2016
In reply to The Lemming:

Locks only keep honest people honest.

The real question here is whether we should be so trusting of people/corporations/governments to keep our data private.

I can see how certain countries persecute some sections of society, but should you be planning uprisings and terrorist activities using the internet and mobile devices? It sounds a very silly thing to do from my point of view.

Bit like keeping all your tax evasion activities in a notebook.
In reply to DancingOnRock:
> I can see how certain countries persecute some sections of society, but should you be planning uprisings and terrorist activities using the Internet and mobile devices. Sounds a very silly thing to do from my point of view.

Agencies like the NSA and GCHQ can already hack phones. The terrorism thing, like the paedophile meme, is a red herring. What this is actually about is making it cheap and convenient to hack phones, so it can be done for far less serious crimes and provide evidence for use in court.

As soon as this software is available, every country in the world is going to compel Apple to provide access to it, and it will leak out of law enforcement into less sophisticated intelligence services in poorer countries and eventually criminals. All the litter and dog-poo cops working for councils are going to think about the Investigatory Powers Act and wonder if they can look at fly-tippers' phones to see where they have been from their 'favourite places' list. Divorce lawyers will start asking courts to seize phones.

There are larger things at stake. If people can't trust phones and cloud services they won't use them. They'll stick to face to face meetings which means more plane travel and working in an office in a big city rather than from home. We'll lose the opportunity to use phones to monitor health problems.
Post edited at 10:50
 DancingOnRock 19 Feb 2016
In reply to tom_in_edinburgh:
Possibly. But then we need a new, more serious law. The problem is that the technology is still in its relative infancy.

Something like electronic data theft.

I don't think it's the technology that is the problem, more the perception that people are not harshly prosecuted for hacking personal data.

Prison sentences for directors of insurance companies that distribute phone numbers would be first on my list.
Post edited at 10:57
 MG 19 Feb 2016
In reply to tom_in_edinburgh:

> Agencies like NSA and GCHQ can already hack phones.

You sure? I thought that wasn't the case with up-to-date encryption, although of course it will be difficult to say for sure.
 Dave Garnett 19 Feb 2016
In reply to DancingOnRock:

> Locks only keep honest people honest.

And dim, lazy dishonest people. Which is most of them.
KevinD 19 Feb 2016
In reply to MG:

> You sure? I thought that wasn't the case with up-to-date encryption, although of course it will be difficult to say for sure.

No one is quite sure.
Modern encryption is, in mathematical theory, fairly safe. The question is whether the implementation has flaws which can be taken advantage of.
In reply to MG:

> You sure? I thought that wasn't the case with up-to-date encryption, although of course it will be difficult to say for sure.

An intelligence service with the ability to run agents inside Apple/Google/Cisco, as well as hack into their computers and grab their internet traffic, is not going to have a problem getting all the code they need to make their own modified iOS build. The other approach would be to go after the physical implementation of the security using high-end test equipment; there are three or four obvious approaches, and probably all of them could be made to work with enough time, effort and money.
1
 MG 19 Feb 2016
In reply to tom_in_edinburgh:

I don't think that's correct - the whole point of the new iOS is that you can't access data, even with the code. Possibly of course Apple could be lying, or as KevinD points out there could be flaws with the encryption.
In reply to MG:

> I don't think that's correct - the whole point of the new iOS is that you can't access data, even with the code. Possibly of course Apple could be lying, or as KevinD points out there could be flaws with the encryption.

The attack the courts are trying to compel Apple to carry out is to build a version of iOS which lets them have as many tries as they like at guessing the PIN. I'm saying other agencies could easily do that attack with no help from Apple.
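What removing the retry limit buys is easy to show. A toy sketch (illustrative parameters, not Apple's: a tiny iteration count and a known salt): once an attacker can test guesses without a counter, a 4-digit PIN falls to simple exhaustive search.

```python
import hashlib

ITERATIONS = 100  # real devices use far more; kept tiny so the demo is quick
salt = b"device-salt"

def derive(pin: str) -> bytes:
    """Turn a PIN into the key the device would check against."""
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), salt, ITERATIONS)

# The attacker holds only the derived verifier, not the PIN itself.
target = derive("0042")

def brute_force(target: bytes) -> str:
    # With no retry limit and no forced delay, just try every 4-digit PIN.
    for n in range(10_000):
        pin = f"{n:04d}"
        if derive(pin) == target:
            return pin
    raise ValueError("not found")

assert brute_force(target) == "0042"
```

Ten thousand candidates is nothing; the only real defences are the attempt counter, the enforced delay, and the wipe, which is exactly what the requested iOS build would disable.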
 wintertree 19 Feb 2016
In reply to KevinD:

> Modern encryption in mathematical theory is fairly safe.

In published mathematical theory. The UK government kept their discovery of the theory of asymmetric key encryption secret for almost three decades and were using it long before Zimmermann came along. Even the Yanks they shared it with kept a lid on it.

Makes you stop and wonder what other proofs and theories they're sitting on. They still hire mathematicians...
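The idea GCHQ sat on fits in a dozen lines. This is textbook RSA with the standard toy numbers (hopelessly insecure; real keys run to thousands of bits): anyone can encrypt with the public pair (n, e), but only the holder of the private exponent d can decrypt.

```python
# Textbook RSA with toy primes -- for illustration only.
p, q = 61, 53
n = p * q                  # 3233: the public modulus
phi = (p - 1) * (q - 1)    # 3120
e = 17                     # public exponent, coprime with phi
d = pow(e, -1, phi)        # private exponent (2753), via modular inverse

def encrypt(m: int) -> int:
    return pow(m, e, n)    # anyone can do this with (n, e)

def decrypt(c: int) -> int:
    return pow(c, d, n)    # only the holder of d can undo it

c = encrypt(65)
assert decrypt(c) == 65
```

The security rests on how hard it is to recover p and q from n; a secret breakthrough in factoring would quietly break all of this, which is exactly why such a breakthrough would be held so tightly.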
 remus Global Crag Moderator 19 Feb 2016
In reply to tom_in_edinburgh:

> The attack the courts are trying to compel Apple to carry out is to build a version of iOS which lets them have as many tries as they like at guessing the PIN. I'm saying other agencies could easily do that attack with no help from Apple.

I'm far from convinced that's true. It's not just a case of knocking up a new version of iOS; the software needs to be digitally signed with an Apple certificate, and doing this should be very tightly controlled within Apple. Certainly some random programmer within Apple is not going to be able to sign a production software build without setting off a lot of alarms.

I also wonder about Apple's ability to remotely force software upgrades. Is this a capability they have? I don't know. It's pretty spicy territory, so I wouldn't be surprised if they've purposefully not designed this capability into iOS.
 MG 19 Feb 2016
In reply to tom_in_edinburgh:

OK, I see that.
 remus Global Crag Moderator 19 Feb 2016
In reply to wintertree:

Obviously hard to say for sure, but I find it hard to believe there'd be so much fuss around encryption if they'd found some mathematical workaround to the problems encryption poses. Surely they'd be much better off sitting back, happily deciphering anything and everything, and trying not to draw too much attention to themselves. Instead, we have a large, focused effort by the largest companies on the planet to improve software security.

My suspicion is that so far they've relied on weakening existing implementations (e.g. finding bugs in existing libraries, introducing backdoors into existing software libraries, providing software with hidden backdoors etc.), and as these start to run out (i.e. vulnerabilities being publicly discovered and fixed) they're starting to push more through the courts.
In reply to The Lemming:

McAfee (yeah, the one who makes the security software) says he'll hack it within 3 weeks or eat his hat on live TV, just so Apple don't have to set the precedent:
http://www.businessinsider.com/john-mcafee-ill-decrypt-san-bernardino-phone...

Surely if it was about this one case you'd take him up on it?
 jkarran 19 Feb 2016
In reply to KevinD:

> No one is quite sure.
> Modern encryption in mathematical theory is fairly safe. However its whether the implementation has flaws which can be taken advantage of.

Whether or not encryption is as secure as published theory suggests, or the implementation robust enough, the human-readable information pushed into and out of the device when it's used is always going to be a relatively soft target compared with breaking the cipher. Robust encryption is no good if it's implemented on a device with inadequate virus protection.
jk

 mullermn 19 Feb 2016
In reply to remus:

This attack will only work with an iPhone older than a 5S. For anything 5S or more recent the security is all done in firmware that iOS can't manipulate - it doesn't matter what software you put on the phone, it won't be able to decrypt the data, and the security processor is also the bit that controls the number of guesses/self destruct functionality.

Apple aren't kidding when they say they have made phones that they themselves can't break *

* Obviously, if you think they're lying or incompetent then this means nothing, but there are plenty of security researchers who could make a name for themselves overnight by proving them wrong.
 mullermn 19 Feb 2016
In reply to willworkforfoodjnr:

If he messes up the hack then the phone's contents are probably gone forever.

Also, from McAfee's side, he could prove he can do this with any iPhone 5C; he doesn't need this particular one to prove it. He's got a bit of a fruity reputation in tech circles (i.e. he's a nutter). He was wanted for questioning as part of a murder investigation, among other things, at one point.
OP The Lemming 19 Feb 2016
In reply to remus:
> I also wonder about apple's ability to remotely force software upgrades. Is this a capability they have? I don't know. It's pretty spicey territory so i wouldn't be surprised if they've purposefully not designed this capability in to iOS.


I thought that this was a standard option for Windows 10. I seem to remember reading somewhere that Microsoft could remove software and alter the OS at will over a remote connection.

This is one reason why I have opted not to install Windows 10 yet.
Post edited at 14:52
 wintertree 19 Feb 2016
In reply to remus:

> Obviously hard to say for sure, but I find it hard to believe there'd be so much fuss around encryption if theyd found some mathematical work around to the problems encryption poses

If there has been a breakthrough with regard to prime numbers and encryption, it will be held far tighter and closer than asymmetric key encryption was, because the consequences would be so disruptive for everyone but those in the know.

I doubt it would be judged worth revealing to more than a dozen people for anything short of pre-empting a nuclear exchange. It certainly wouldn't be disclosed for chasing run-of-the-mill murderous scum.

I'm not saying someone has done it, just that I wouldn't discount the possibility as suggested up thread, but don't expect to hear about it!
Post edited at 17:24
 Dauphin 19 Feb 2016
In reply to remus:


> I also wonder about apple's ability to remotely force software upgrades. Is this a capability they have? I don't know. It's pretty spicey territory so i wouldn't be surprised if they've purposefully not designed this capability in to iOS.

Why would you not? Do you even need the phone? I'm sure you could take the data image from the phone and run it on an emulator somewhere, in a virtual machine on a big powerful supercomputer node. And then, oh yeah, none of our suppositions about what is and what is not possible in terms of brute-forcing or breaking the crypto with a software update are in any way credible.

D
