thisacctforreal 6 years ago

Passphrases are always going to be the strongest, but you can have more than six digits in your PIN.

Select "Custom Alphanumeric Code" in Passcode Options[1], but only enter digits using the keyboard. iOS will display a pin pad on the lock screen that will accept any number of digits[2].

I picked this up from the delicious iOS 11 security whitepaper[3].

[1] https://i.imgur.com/KEEC71B.png

[2] https://i.imgur.com/YrgQA5s.png

[3] https://www.apple.com/business/docs/iOS_Security_Guide.pdf

  • Twisell 6 years ago

    This is plainly brilliant. I just did that, and the interface gives no clue about the expected number of digits, meaning an attacker has no way to even estimate the time needed to unlock. One could only infer that the passcode is longer after exhausting all attempts at the shorter lengths, which already takes a lot of time (see the rough numbers below).

    But of course an alphanumeric code would be even safer.
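
    For a rough sense of scale, assuming the ~80ms-per-attempt key-derivation cost cited in Apple's whitepaper (GreyKey's real guess rate is unknown), the worst-case search time by maximum PIN length works out like this:

        import Foundation

        // Worst-case time to exhaust variable-length numeric PINs when the
        // attacker doesn't know the length: all shorter lengths must be
        // tried first. Assumes ~80ms per guess (the whitepaper's figure).
        let secondsPerGuess = 0.08
        for length in 4...10 {
            let guesses = (4...length).reduce(0.0) { $0 + pow(10.0, Double($1)) }
            let days = guesses * secondsPerGuess / 86_400
            print("up to \(length) digits: ~\(Int(days.rounded())) days")
        }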

    • nerdponx 6 years ago

      I love it, but the XKCD wrench comic is fresh in my mind. When I was mugged, they just made me unlock the phone right there on the street.

      • jostylr 6 years ago

        It would be nice to have a panic code one could type in that makes the phone look unlocked, hides apps the user has marked as secret, and optionally sends a security/tracking alert. This could also thwart a cracking device by making the phone appear cracked at a shorter code while yielding no actually useful information.

        • martin_bech 6 years ago

          Many fingerprint systems have this for access. A "Duress" Finger. If you use that finger, the system can be set up for a silent alarm, full on alarm, lockdown or whatever configuration you need.

      • wil421 6 years ago

        I would gladly give up my passcode if I were mugged. The main issue is either smarter, more mischievous criminals (identity thieves and the like) or bad governments.

        It would be terrible if all airports had these and could crack your phone on demand. I can imagine them holding you until the device cracks it.

      • xoa 6 years ago

        That XKCD comic was wrong, and one of the rare truly stupid and actively misinformative ones. Security measures are always about responding to threat scenarios and economics, and must in turn be analyzed in context. The general use paradigm for an average device involves many different threats and, in turn, separate measures for each one. FDE, for example, is only about protecting against cold attacks; warm/hot ones are simply out of scope.

        Similarly, authentication systems (including passcodes and biometrics) are purely about ensuring that only authorized people may gain access, with whatever permissions they're authorized for and no more. Their threat scenario and usage does not involve intent. Authentication "primitives" can be used as part of a counter-intent security measure, but by themselves they only determine a "who", not a "why". Strong authentication backed by strong encryption is a standalone good and a required foundation for more complex measures, but that's it. The "wrench" attack is an intent-based threat scenario: the person performing access is in fact authorized, and there is nothing buggy, malfunctioning, misdesigned, weak, or wrong in any way with an authentication system allowing that access. It would take an intent-reactive security measure to deal with it.

        Regrettably, no mainstream smartphone implements this as far as I know (though at times it's been possible to roll your own to some extent via jailbreaking on iDevices, and I assume on rooted Android as well). That's really too bad, because Apple in particular has a lot of very good tools at this point to do a really, really good implementation. A classic measure would be coercion/distress codes, i.e., alternate passcodes an operator can enter that cause the device to perform different actions than a straight authentication (these can range from a full deletion to more subtle actions like a silent alarm). Apple could use TouchID/FaceID to make this even more user friendly: merely "use this finger vs. that finger" or "make this facial expression vs. that one" as triggers. They (and anyone else with a secure hardware stack) could also implement temporal and spatial trigger options, which would be very useful.

        Finally, the iOS design is decently placed to allow relatively easy and more selective view filters on apps and data, because everything is so heavily siloed. The silos have been intensely frustrating much of the time compared with standard computer access patterns, but in this instance they could be a real strength. Imagine being able to set a "travel view" before a trip so that sensitive apps simply vanish until GPS indicates we've arrived at our destination, or we've connected to a specific network, or whatever. Or there could be distress views that react to a facial expression or code or finger and then permanently hide everything but a clean, minimal data and app set, including your cloud data, until you get home and enter a special unlock code (or for a set amount of time, or whatever).

        I think intent systems and view triggers will (or at least should) be the next big leap forward in making our personal information devices not just more secure and private but more productive. I'd like to see that show up in future versions of iOS and Android more than nearly anything else.
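
        As a purely hypothetical sketch of the coercion-code dispatch described above (nothing like this exists in shipping iOS; every name here is invented for illustration):

            // Hypothetical duress-code dispatch. Invented for illustration only;
            // no such mechanism exists in iOS today.
            enum UnlockAction {
                case unlock            // normal passcode: full access
                case unlockFiltered    // "travel view": marked apps and data stay hidden
                case silentAlarm       // appear unlocked, quietly phone home
                case wipe              // destroy the encryption keys outright
            }

            // The SEP would hold this table; an unknown code still counts as a failure.
            let configuredCodes: [String: UnlockAction] = [
                "482913": .unlock,
                "482914": .silentAlarm,   // a near-neighbor code is easy to enter under duress
                "111111": .wipe,
            ]

            func action(forEnteredCode code: String) -> UnlockAction? {
                configuredCodes[code]     // nil: ordinary failed attempt, delay escalates
            }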

        • nerdponx 6 years ago

          Fine, but a wrench, or something analogous, is the most serious threat model for most smartphones. On Apple devices, at least, the Find My iPhone feature mitigates most others.

    • chiefalchemist 6 years ago

      But why not take it a step further and present the standard keyboard? That would be a slight inconvenience, with a massive gain in confusion for anyone trying to guess their way in.

      • Rychard 6 years ago

        I had a bug recently on my Windows Phone after adding my corporate email account to Outlook. I assume it was some sort of group policy being erroneously applied, but it turned out to be a massive inconvenience.

        Mostly because trying to unlock it displayed the standard keyboard[1], but simply wouldn't allow me to create a PIN with anything other than numeric characters[2], so every time I needed to unlock my phone I had to swipe the screen up, then change the keyboard over to the number/symbol mode, and enter my PIN using the small row of numbers there.

        In the end, all I had to do was change my PIN, and from that moment on, it only ever displayed the standard number pad for unlocking.

        [1]: https://i.imgur.com/YENnjtY.png

        [2]: https://i.imgur.com/OaSqLYO.png

      • chiefalchemist 6 years ago

        To clarify: the PIN pad tells you the code is undoubtedly numeric. The keyboard masks that fact while still allowing entry of a digits-only PIN.

        Thanks for the downvotes.

  • teilo 6 years ago

    It was rumors of this process that made me encourage everyone to use 12-digit or longer passcodes.

    • CydeWeys 6 years ago

      I have a 15+ character password that is a mix of alphanumeric and punctuation. I have to enter it at startup and also once per day, which isn't a huge deal.

      Naturally I use fingerprint/smartwatch authentication throughout my normal day, but once the phone is off, good luck getting access to its contents.

abalone 6 years ago

If this actually works, there has to be some huge, embarrassing vuln in Apple's Secure Enclave Processor, on par with the "CTS Labs" AMD secure coprocessor hoopla that hit the news just this week.[1][2]

The SEP is supposed to enforce a time delay between passcode attempts to prevent this sort of brute forcing. The timer could be defeated in older models by cutting power at just the right time, but Apple's whitepaper says it's supposed to survive restarts now.[3]
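
For reference, the escalation schedule described in the whitepaper looks roughly like the following (a paraphrase of the guide's table, which may differ by iOS version), and it is supposed to be enforced by the SEP rather than the lock-screen UI:

    // Escalating passcode-retry delays, paraphrased from the iOS Security
    // Guide. Enforced inside the Secure Enclave, not by the UI.
    func delayAfter(failedAttempts: Int) -> Int {   // seconds
        switch failedAttempts {
        case ..<5:  return 0          // attempts 1-4: no delay
        case 5:     return 60         // 1 minute
        case 6:     return 5 * 60     // 5 minutes
        case 7, 8:  return 15 * 60    // 15 minutes
        default:    return 60 * 60    // 1 hour from attempt 9 onward
        }
    }
    // With "Erase Data" enabled, the 10th failure wipes the class keys.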

Based on the screenshots it looks like it can load custom firmware on the iPhone. That's bad.

[1] https://www.anandtech.com/show/12525/security-researchers-pu...

[2] HN discussion: https://news.ycombinator.com/item?id=16597626

[3] p15: https://images.apple.com/business/docs/iOS_Security_Guide.pd...

  • TazeTSchnitzel 6 years ago

    It seems like they don't have the exponential delays, but they do have delays. Why else would it take 3 days to unlock the phone if it has a 6-digit passcode?

    Apple's security paper says it would take more than 50 years to brute-force an alphanumeric 6-digit passcode at 80ms per iteration. I suspect that's still correct here (if “3 days or more” is for 6 numeric digits).
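
    The whitepaper's arithmetic is easy to reproduce (assuming 80ms per attempt and a lowercase-plus-digits alphabet for the alphanumeric case):

        import Foundation

        let secondsPerGuess = 0.08
        let numericSpace = pow(10.0, 6)   // six digits
        let alnumSpace   = pow(36.0, 6)   // six chars, a-z plus 0-9

        print(numericSpace * secondsPerGuess / 3_600)       // ~22 hours to exhaust
        print(alnumSpace * secondsPerGuess / 31_557_600)    // ~5.5 years to exhaust

    A full sweep of six digits at 80ms is only about a day, so GreyKey's quoted "up to three days" suggests its effective guess rate is somewhat slower than that.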

    • TazeTSchnitzel 6 years ago

      er, that should say 5 years, not 50. typo.

  • slededit 6 years ago

    If this new time delay is via an off-chip RC circuit (one of the few ways for a timer to keep working with the power off, since the capacitor acts like a battery), then it could be defeated by swapping out the components. RC circuits can be made on-chip; however, large-value resistors and capacitors are very expensive to place on-chip.
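
    For intuition, a capacitor's discharge time is t = RC * ln(V0/Vth), so usefully long delays need component values that are cheap off-chip but costly on-die (the numbers below are purely illustrative; nothing is known about Apple's actual circuit):

        import Foundation

        // Time for a capacitor to discharge from V0 to a threshold Vth:
        // t = R * C * ln(V0 / Vth). Values below are purely illustrative.
        let R = 10_000_000.0            // 10 megohms
        let C = 0.000_1                 // 100 microfarads
        let t = R * C * log(3.3 / 1.0)  // 3.3V rail down to a 1V threshold
        print("\(Int(t)) seconds")      // ~1194s, about 20 minutes per cycle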

  • Scaevolus 6 years ago

    It's worse, since this attack doesn't require root privileges to run!

    • thisacctforreal 6 years ago

      Sure, but it does require physical access for at least a couple of hours, up to multiple days if you don't use a predictable PIN. And from what I gather it doesn't even threaten a passphrase.

      • abalone 6 years ago

        It supposedly threatens 6-digit passcodes, the default that 99.9% of iOS users will use. And the whole point of a passcode is to protect against physical access.

  • dheera 6 years ago

    Time delays only provide a false sense of security. In theory I could always cut open the casing and plug wires straight into the eMMC or whatever you have in there. Your time-delay UI is useless if I bypass the UI and wire straight into the hardware.

    Of course that's non-trivial EE work, but the point is that it's possible for someone with enough money and the right equipment. What would make it intractable is to ditch the idea that a 4-digit PIN is protecting you from anything. There's simply not enough entropy in that.

    Time delays are useful protection over a network, but not when the attacker has physical console access, e.g. to a phone. At that point proper cryptography and mathematics are the only good protection.
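
    The entropy arithmetic backing this up (assuming codes chosen uniformly at random; human-chosen PINs are far worse):

        import Foundation

        // Entropy of a uniformly random code: bits = length * log2(alphabet).
        func entropyBits(length: Int, alphabet: Int) -> Double {
            Double(length) * log2(Double(alphabet))
        }

        print(entropyBits(length: 4,  alphabet: 10))   // 4-digit PIN:  ~13.3 bits
        print(entropyBits(length: 6,  alphabet: 10))   // 6-digit PIN:  ~19.9 bits
        print(entropyBits(length: 15, alphabet: 72))   // 15-char mixed: ~92.5 bits

    Thirteen bits is nothing against an offline attacker; the counterpoint in the replies is that the SEP turns this into a rate-limited online attack, where the enforced delay does the work the entropy can't.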

    • jrockway 6 years ago

      I mean, I don't think they're dumb enough to put unencrypted data out on buses that you can just probe. All the encryption is done inside the same package as the CPU; if you read the RAM or flash directly, you'll just get garbage because it's encrypted.

      https://www.arm.com/products/security-on-arm/trustzone is the generic version of this, no idea what exactly Apple is doing.

      Anyway, breaking the encryption involves finding bugs in the software that runs in the secure zone (the same way you'd defeat a time delay in a networked application) or opening up the CPU die and figuring it out with an electron microscope (perhaps while the CPU is running).

      Ultimately, software is going to be the productive attack vector. While the TrustZone docs emphasize "small" for the amount of code that runs in the secure zone, no programmer can ever do "small" on a deadline, so I wouldn't trust it to provide any actual security. But the hardware infrastructure is there to be pretty secure against any adversary that doesn't have a lot of time or money, if the programmers do their job correctly. (I don't trust it because of the time schedules involved for these SoCs: one or two a year, with a lifetime of maybe a couple of years. Nobody is going to spend the money on hard work like hunting for bugs just so that someone can't rip Netflix video feeds. The engineering work would be more expensive than whatever their contract with the DRM vendors says they have to pay if they get hacked, which is the only incentive to be secure... and I doubt Apple would sign one of those.)

    • abalone 6 years ago

      You misunderstand the SEP. It contains an externally unreadable private key baked in at manufacturing time that encrypts protected data. Your "wires" would read garbage. The iOS security white paper is worth a read.

      Perhaps a nation-state actor could shave down the processor and read that key with a SEM or some crazy thing, but that's literally how far the design is supposed to have pushed iOS security. Which is what makes this hack so embarrassingly bad (if confirmed).
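
      The app-facing version of this property is visible in CryptoKit, where a Secure Enclave key is only ever handled as an opaque blob (a minimal sketch; the UID entanglement and passcode handling the whitepaper describes happen well below this API):

          import CryptoKit
          import Foundation

          // Minimal sketch: a P-256 signing key generated inside the Secure
          // Enclave. The private scalar never leaves the SEP; the app only
          // holds an opaque handle it can store and later reload.
          guard SecureEnclave.isAvailable else { fatalError("no SEP on this device") }

          let key = try! SecureEnclave.P256.Signing.PrivateKey()
          let sig = try! key.signature(for: Data("hello".utf8))  // signed inside the SEP
          let blob = key.dataRepresentation                      // opaque, not the raw key

          // Reloading round-trips through the SEP again; probes on flash or
          // RAM would only ever see blobs like this one.
          let restored = try! SecureEnclave.P256.Signing.PrivateKey(dataRepresentation: blob)
          print(restored.publicKey.x963Representation.base64EncodedString())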

      • bufferoverflow 6 years ago

        SEMs are not that expensive. The cheap ones on eBay are $12-14K. The more expensive Chinese ones are closer to $200K.

        A security company can easily afford either.

        • matthewmacleod 6 years ago

          I don't think the acquisition of a SEM is the barrier to performing this kind of attack. It's still extremely hard.

          • abalone 6 years ago

            I think I read somewhere that some secure coprocessors incorporate physical defenses that will destroy keys if you try to shave them down or physically tamper. So yeah. Hard.

        • whyenot 6 years ago

          Getting a SEM would be the easy part. The hard part would be sample preparation and interpreting the results.

      • dheera 6 years ago

        > Perhaps a nation-state actor could shave down the processor and read that key with a SEM or some crazy thing, but that's literally how far the design is supposed to have pushed iOS security. Which is what makes this hack so embarrassingly bad (if confirmed).

        Shaving down the processor is hard but still possible. A government with a billion-dollar budget can probably do it if they want, perhaps after practicing on a million throwaway phones. Apple's design is also a form of security through obscurity (https://en.wikipedia.org/wiki/Security_through_obscurity).

        I much prefer an open design, with security coming from high-entropy keys and well-established cryptographic practices. I'd feel much safer using open cryptography libraries and strong passwords than arbitrarily and blindly trusting some secretive engineers at Apple.

        Apple's design is super-hard to break but possible. My Linux laptop, on the other hand: the hard drive is standard SATA. Have at it. I don't care if you have a billion dollars, you'll have a hard time breaking into my data for probably at least a few decades.

        • abalone 6 years ago

          That is just totally false. Apple does not rely on security through obscurity. They may not publish the code for the SEP but they do have a whitepaper that lays out the key architectural features.[1]

          The only reason your Linux laptop would be more secure than an iPhone is if you were using a high-entropy key to unlock it every time you wanted to use it, and the iPhone wasn't. That's it. But remembering high-entropy keys without storing them in some less secure manner is so inconvenient and failure-prone for most people that it's actually a less secure design than Apple's SEP-assisted approach. And you could opt in to a high-entropy key on an iPhone if you wanted to, so even that is a false comparison.

          [1] https://images.apple.com/business/docs/iOS_Security_Guide.pd...
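
          The stretching half of that trade-off is easy to demo with CommonCrypto, which can calibrate PBKDF2 to a target per-guess cost like the ~80ms in the whitepaper (a sketch only: Apple's real derivation is additionally entangled with the SEP's per-device UID key, which is what rules out off-device brute force entirely):

              import Foundation
              import CommonCrypto

              // Calibrate PBKDF2 so one derivation costs ~80ms on this machine,
              // then stretch a short passcode into a 256-bit key. Apple's actual
              // scheme also mixes in the per-device UID inside the SEP.
              let passcode = "123456"
              let salt = [UInt8](repeating: 0xA5, count: 16)   // fixed salt, demo only
              var key  = [UInt8](repeating: 0, count: 32)

              let rounds = CCCalibratePBKDF(
                  CCPBKDFAlgorithm(kCCPBKDF2), passcode.utf8.count, salt.count,
                  CCPseudoRandomAlgorithm(kCCPRFHmacAlgSHA256), key.count, 80)

              let status = CCKeyDerivationPBKDF(
                  CCPBKDFAlgorithm(kCCPBKDF2), passcode, passcode.utf8.count,
                  salt, salt.count, CCPseudoRandomAlgorithm(kCCPRFHmacAlgSHA256),
                  rounds, &key, key.count)
              assert(status == 0)   // kCCSuccess; each guess now costs ~80ms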

    • greglindahl 6 years ago

      One of the points of having a secure enclave is that it can enforce things like time delays. I doubt that the time delay in question is only enforced by the UI outside the enclave.

userbinator 6 years ago

> However, it does mean that an iPhone’s security cannot be ensured if it falls into a third party’s hands.

That was and will always continue to be true. Even secure cryptoprocessors of the type used in smartcards and HSMs can be cracked with enough determination and time. There are companies in China who will read and clone them for surprisingly little money.

It has always somewhat amused me how scared of governments some people are (or at least that's the impression articles like this give), while at the same time they completely accept and trust being herded and controlled by the companies they purchase these locked-down computers from. Anything you truly want to keep secret should be encrypted by systems you have knowledge of, with a key that only you know, or even better --- not leaving your brain at all.

> Unfortunately, the IP-Box 2 became widely available and was almost exclusively used illegitimately, rather than in law enforcement

If by "illegitimately" you mean third-party repair shops... I know Apple doesn't like that, but the whole *-box series are aimed at the mobile repair industry (a huge business in China), not law enforcement.

  • SlowRobotAhead 6 years ago

    >or even better --- not leaving your brain at all.

    The faintest of ink will outlast the best of memory, or something like that.

    • Zhenya 6 years ago

      "A dull pencil is better than the sharpest mind."

      That's the way I've always heard it.

      • eltoozero 6 years ago

        I’ll have to write that down so I don’t forget it.

        • StavrosK 6 years ago

          Make sure you use a dull pencil; we have no data about how other types compare with minds.

  • throwaway2048 6 years ago

    The fact that the methods of accessing the device are so secret seems very prone to a rules-of-evidence challenge in court.

    The accused has a right to know exactly how evidence was obtained, and if the chain of custody was broken, just hiding behind an NDA isn't going to cut it.

jsizzle 6 years ago

Is it just me, or does the price point seem extremely low? They have a device that should be in high demand globally, and maybe one competitor. And they are charging $15-30k for basically unlimited usage?? You can't tell me federal law enforcement wouldn't pay at minimum ten times that amount for metered usage...

  • ashman5 6 years ago

    I bet they realize the lifespan of this device is very short and are trying to maximize ROI short-term.

    • lostapathy 6 years ago

      That was my thought as well - but on the flip side, by dealing in quantity they are a lot more likely to have one leak and be reverse engineered, and thus have Apple render them all useless.

      It's certainly an interesting problem of profit maximization!

      • SlowRobotAhead 6 years ago

        Unless the vulnerability is in the CPU/DMA/whatever and not easily patched. Everyone assumes that Apple has no idea what it is; maybe they are keenly aware and it's just not fixable.

    • djsumdog 6 years ago

      True. If Apple or a security firm gets hold of one of these devices, surely they'd be able to reverse engineer it and patch the vulnerabilities.

  • Mtinie 6 years ago

    That was my thought, too. Other than this report, has there been verifiable evidence the device even works?

    The staged photos remind me of ones I’d use on a pre-release marketing site for a vaporware product to test demand and a price point.

shawnz 6 years ago

> The cheaper model isn’t much of a danger if stolen—unless it’s stolen prior to setup—but at 4″x 4″x 2″, the unlimited model could be pocketed fairly easily, along with its token, if stored nearby. Once off-site, it would continue to work.

Presumably even the cheaper model could be reverse engineered to reveal the exploit used. But once it becomes known, it would be patched.

incresp 6 years ago

Can Apple sue the makers of this device under the DMCA's anti-circumvention provisions?

  • saagarjha 6 years ago

    I believe jailbreaking is a protected category for which an exception is made.

    • coldcode 6 years ago

      I find it hard to believe that jailbreaking for profit is a protected category.

      • jessedhillon 6 years ago

        > (e) Law Enforcement, Intelligence, and Other Government Activities.— This section does not prohibit any lawfully authorized investigative, protective, information security, or intelligence activity of an officer, agent, or employee of the United States, a State, or a political subdivision of a State, or a person acting pursuant to a contract with the United States, a State, or a political subdivision of a State.

NotSammyHagar 6 years ago

I hate this stuff. I want to secure my device and not have the govt or companies steal it; I want to control my device. Still, it's fascinating to learn about.

Did no one think, when they take someone's phone for 5 minutes at the border, that they could be doing this to it?

  • djrogers 6 years ago

    Well, as the article makes clear, it takes hours to days to crack, so no, they're not doing this in 5 minutes at the border.

    • tonyztan 6 years ago

      They can routinely keep you at the border for a few hours though.

    • enraged_camel 6 years ago

      From the article:

      >>It can take up to three days or longer for six-digit passcodes, according to Grayshift documents, and the time needed for longer passphrases is not mentioned.

      So yeah, up to three days for six-digit passcodes. If you have a longer passcode with letters and special characters, you could wait a long, long time.

nobeliefs 6 years ago

Can GreyKey or anything else really bypass the unlock attempt counter of an iPhone set to erase itself after 10 unsuccessful attempts? Have they found a way to replace the firmware that executes that erase procedure? In that case, only password complexity can save you. But no evidence is shown that proves they can accomplish this.

rphlx 6 years ago

Humans being abysmal PIN and password generators, a decent fraction of phones can probably be unlocked within 5 attempts by just trying 123456, 123123, 111111, 654321, 000000. Unless and until the phone forces the user to learn a PIN rather than select one, that's probably going to remain the biggest vuln.
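
A sketch of the kind of screen that would help (per the reply below, iOS already warns on common codes; the list and heuristics here are just illustrative):

    // Illustrative weak-PIN screen: a blocklist plus two cheap heuristics.
    let commonPINs: Set<String> = ["123456", "123123", "111111", "654321", "000000"]

    func isWeak(_ pin: String) -> Bool {
        commonPINs.contains(pin)
            || Set(pin).count == 1            // single repeated digit, e.g. 777777
            || pin == String(pin.sorted())    // non-decreasing runs like 123456, 112233
    }

    print(isWeak("123456"))   // true
    print(isWeak("739204"))   // false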

  • Tempest1981 6 years ago

    I think iOS warns you if you choose one of these. But it may not stop you.

verroq 6 years ago

How much of a bounty would Apple pay, say, if somebody steals one and sends it to them? Is it illegal for them to make such an offer?

  • craftyguy 6 years ago

    Well in most places, theft is illegal.

    On the other hand, perhaps Apple could secretly partner with a law enforcement team and purchase one for themselves. $15k and $30k are literally nothing to Apple with their warchest in the tens of billions.

    • retrac98 6 years ago

      Hundreds of billions

  • jlgaddis 6 years ago

    "Possession of stolen property" is a crime in many places.

    I imagine that telling someone, "Steal that TV and I'll give you $100 for it" would, additionally, make you a conspirator to the crime of theft.

  • djrogers 6 years ago

    The ‘offer’ isn’t illegal; going through with it would be, though, for both sides: grand theft and receiving stolen goods. Neither is great, plus you’d be actively working against the law enforcement system, which would ensure a zealous prosecution.

    • sgc 6 years ago

      Of course it is. You can't broadcast a bounty on someone's life, or their property, or any other illegal act. It is solicitation.

    • TheSpiceIsLife 6 years ago

      Would making the offer be considered conspiracy to commit crime, or some such?

  • ryanlol 6 years ago

    You would obviously just refrain from discussing the manner in which the device was acquired.

    • paulcole 6 years ago

      The cops have never heard that one before.

      • gh02t 6 years ago

        It's a realistic point though. Illegal industrial espionage definitely happens and with the resources of a huge multinational corporation it becomes easier to conceal behind a wall of secrecy and misdirection. I doubt Apple would do it, but it also wouldn't surprise me and they definitely could.

        All they have to do is discreetly obtain the device in question and have a few good engineers quietly pick it apart for a few weeks to figure out how it works. They then patch the vulnerability in a regular update, claiming they discovered it as part of normal procedure, and nobody takes notice.

        Edit: the legality of the device itself is kinda interesting to me. Like, even if it is doing something illegal (like using stolen code or something), how would Apple prove it as long as it was only sold to law enforcement? If the police aren't asking too many questions and Apple can't legally acquire one, how do they prove it? I suppose they'd have to gather enough circumstantial evidence to get a judge to issue a subpoena, but things get a bit dark and fuzzy.

        • robin_reala 6 years ago

          Is this not automatically illegal under the DMCA? Well, assuming the company in question is based in the US.

          • matthewmacleod 6 years ago

            The DMCA specifically criminalises the circumvention of copyright protection methods, not all access controls.

            • robin_reala 6 years ago

              I’m not a lawyer, but quoting from the Wikipedia article the DMCA “also criminalizes the act of circumventing an access control, whether or not there is actual infringement of copyright itself.” You could argue that you hold copyright on for example a photo you’ve taken, and the passcode is the access control method.

              • matthewmacleod 6 years ago

                I’m also not a lawyer, but I have no reason to believe that would be an arguable case; the fact that an access control method incidentally makes gaining access to copyright materials harder doesn’t make it a copyright protection method.

            • corrideat 6 years ago

              Not a lawyer, but copyright protections methods may be implemented as an access control system. For example, I could sell content to iPhone users under the assumption that third parties won't be able to copy or access said content. More to the point, things like Apple Music, the App Store or the iTunes Store all could be argued to rely on the access control mechanisms as part of the DRM system.

      • ryanlol 6 years ago

        That’s the idea, it’s harder for cops to hear things which you don’t say in the first place.

ams6110 6 years ago

> An iPhone typically contains all manner of sensitive information: account credentials, names and phone numbers, email messages, text messages, banking account information, even credit card numbers or social security numbers. All of this information, even the most seemingly innocuous, has value on the black market

My phone has no banking information, credit card information, Social Security numbers, or email accounts that can be used to recover or reset access to any online service. Why? Because I don't trust my phone.

But aside from all that, all that information is already on the black market. There have been too many breaches (Equifax, to name just one) to think otherwise.

  • laggyluke 6 years ago

    If you're not trusting your phone, is there a different kind of computing device that you trust?

mschuster91 6 years ago

I wonder: Apple has hundreds of billions in overseas cash. Why don't they go after Cellebrite and Grayshift and offer the owners something to the tune of 1-2 billion US$ in hard cash? Given the reputation hit once this knowledge becomes widespread, a couple billion dollars is pocket change.

  • nocobot 6 years ago

    I don't think this becoming common knowledge would have a noticeable impact on sales.

    Few people imagine themselves to ever be in a position where they would want to protect the info on their phones from LE.

  • Piskvorrr 6 years ago

    Perhaps this is a viable model for vaporware (or perhaps extortion); not sure how much it matches the present case.

Buge 6 years ago

>The cheaper model isn’t much of a danger if stolen—unless it’s stolen prior to setup—but at 4″x 4″x 2″, the unlimited model could be pocketed fairly easily, along with its token, if stored nearby. Once off-site, it would continue to work. Such a device could fetch a high price on the black market, giving thieves the ability to unlock and resell stolen phones, as well as access to the high-value data on those phones.

If this gets stolen and put on the black market, that would be a good thing. Because then Apple can buy one, figure out what vulnerabilities it's using, and patch them.

closeparen 6 years ago

This seems at odds with Apple’s claims about holding the device encryption keys in a secure coprocessor that only releases them in response to a valid passcode, and self-destructs the keys if too many passcodes are tried.

  • djrogers 6 years ago

    It’s not at odds with it - it’s pretty obviously using a vulnerability to run a crack against the passcode. Once the passcode is found, it is used to unlock the phone and thus the Secure Enclave.

    • matthewmacleod 6 years ago

      I agree it’s not at odds with it, but it’s not even that simple - the passcode is enforced by the Secure Enclave itself. It’s not a case of “try passcodes until you find the right one then tell the SEP” - it has to be exploiting a vulnerability in the SEP itself, assuming what we know of the design and attack is true.

      • rphlx 6 years ago

        Technically, this does not absolutely have to be a SEP vuln. It's possible that the PIN is being stored (not properly zeroed after use) somewhere outside the SEP, e.g. a pinpad entry buffer or something. Criminals often do not think to, or are unable to, turn off their phone when being arrested. Furthermore, the iPhone battery is not removable, and some portions of its DRAM, in theory anyway, could persist even when the phone is "off". Apple has (or at least, prior to public outrage, had) a fairly loose definition of "off" for other aspects of the system, such as Bluetooth.
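
        Illustrating the zero-after-use point (a sketch; in Swift this is best-effort, since value-type copies and optimizer behavior can leave other instances of the secret in memory):

            import Foundation

            // Scrub a passcode buffer once it has been consumed. Best-effort:
            // Data's backing storage is overwritten, but earlier copies
            // (string literals, keyboard buffers) may still linger elsewhere.
            var pin = Data("123456".utf8)
            // ... hand `pin` to the key-derivation / unlock call here ...
            pin.resetBytes(in: 0..<pin.count)   // overwrite the backing bytes with zeros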

    • hedora 6 years ago

      The enclave software has supposedly been formally verified (it is based on the L4 microkernel).

      This looks like a software flaw, not a hardware attack. It will be interesting to learn how Apple screwed this up.

  • matthewmacleod 6 years ago

    Nah, I don’t think so, but if true it would hint at a vulnerability in the implementation of the Secure Enclave.

uselpa 6 years ago

Would pair-locking prevent this kind of attack?

jokoon 6 years ago

I thought iPhones were electronically secure; it seems they are not. I thought the FBI had to resort to X-raying some chip to read out a ROM.

Sometimes I wonder whether real security is even theoretically possible, or whether engineers never manage to achieve it because designers want things to be usable for consumers.

Whatever the case, really secure consumer-oriented devices don't seem to exist. I wonder if there are Android devices that do a good job at this, and what the status of Android device security is in general; I would guess it's no better.

  • ozim 6 years ago

    What is real security? Is it absolute security?

    Are you willing to pay $500k for a phone? Is there a vendor willing to put in a $20mil R&D investment so you can buy one? How many more phones would they sell? If you were Ed Snowden, would you even trust that company?

    Are you going to buy a safe to keep family photos in it?

    What does it even mean for you to have absolutely secure phone if you are going to be hit by a bus tomorrow?

  • 0culus 6 years ago

    Yes it is, and it has been done, subject to [edit: a] security policy (NOT defined by implementation) and a meaningful statement of threat (what are you protecting against?).

    This paper might prove to be an enlightening read: http://www.mdpi.com/2078-2489/7/2/23

joering2 6 years ago

I bet there is a lawsuit in the works by Apple et al.!

I mean, if they truly broke an iPhone lock, then they had to be tampering with a true Apple device (not a dummy) in order to make their device work. Therefore they violate Apple's ToS, which I am sure forbid any sort of backdooring. I doubt Apple would go after a rogue Chinese jailbreaker sitting in mom's basement trying to make a name for himself or herself, but here we have a for-profit incorporated business that makes 100% of its money by breaking Apple's devices.

On the other hand, if this is all just some sort of marketing gimmick, or the device has never truly been tested on an iPhone, then I am sure Apple can go after them for attempting to shame iOS/iPhone by making users think their devices are less secure than they actually are, which could hit Apple's bottom line.