180 points by krtkush 80 days ago
I mean, why can't they just have a regular bootloader like a PC? It's a lot of effort for what is essentially 0 gain if you just have full device encryption.
The phone should at least always be unlockable by a single switch in the developer settings - none of this "oh go to our website and generate an unlock code/pay us money for an unlock code/phone us" which is a HUGE pain in the butt.
I swap between ROMs a lot, and the whole idea of a system partition and data partition etc. is just stupid. They are always the wrong size for what they need to be, and then I try to install gapps and it's like "oh your system partition is too small my dude." Why separate these instead of just having everything laid out like a regular Linux OS?
Maybe I'm missing something stupid? But I kinda just want a phone that just starts a linux based operating system like a PC, and lets me do everything I can on my linux PC in a touch-centric way.
These secure boot systems exist because of DRM protection to media that is mandated by the content companies. Their point is to verify that the whole OS from kernel to userspace can't be modified by 3rd parties (such as you, the owner of the device).
It boils down to "no secure boot, no DRM, no Netflix 4k for you".
At least this was the case when I was still working with consumer electronics a few years ago.
This. Embedded guy formerly involved with media devices here, and this is 100% correct. The secrecy involved in content protection is far beyond anything I've seen while working as a military contractor. At some shops, developers don't even get unlocked bootloaders. You get a device with a bootloader which must be cryptographically unlocked and is allowed to boot a limited number of times before needing to be unlocked again. You also have to sign your life away if you ever divulge any confidential details. Chances are, though, that you won't even be exposed to the juiciest details, because only a select group of senior devs get to see that stuff. Working with content protection companies like NDS or Nagravision is frankly maddening. I know a guy who had his design rejected (because they review your entire system design before certification) and they couldn't even tell him why.
DRM restrictions like this are always a joke - HDCP was a great one. Also, I doubt many people are going to be getting their Netflix captures/dumps from an Android device anyway, so it's a useless restriction; they'll be doing it on a PC where this cannot be enforced.
There are also things such as Google Pay and banking apps that don't like it if you are using a custom ROM (although this is root-based); it can be bypassed easily using Magisk.
There will always be a way around frivolous DRM attempts (even Denuvo), and it really just hurts consumer choice and device freedom.
>Also, I doubt many people are going to be getting their Netflix captures/dumps from an Android device anyway so its a useless restriction, they'll be doing it on a PC where this cannot be enforced
Not for Netflix 4K, which requires SGX capable CPUs to play.
And Windows 10 Anniversary Update, Microsoft Edge, and Direct Play 2.0 support (which can only be found on recent graphics cards or 7th-gen Intel integrated graphics, and only works over certain video outputs). For example, I have a 7th-gen Intel CPU; I can boot into Windows and use Edge, download the media extensions, and update my graphics drivers, but I still can't get 4K Netflix because my GPU is a year or so older than what AMF supports for Direct Play 2.0, and additionally I have my 4K 10-bit color monitor on DisplayPort for the higher bandwidth. From what I've read, DisplayPort isn't even supported with Direct Play 2.0 by AMD at this time.
Basically it's such a pain in the ass, you aren't going to have 4k Netflix on PC, possibly even if you try...
Yet there are 4K rips sourced from Netflix being shared on pirate sites.
It's a really bizarre situation that BitTorrent is actually the most convenient consumption mechanism if you want a lot of diverse, specific or old content. All you have to do is be behind the "new releases" by a month or two and wait a few minutes for the downloads to complete, but that's a minor issue.
I think it ultimately is caused by: a) content is not fungible, and b) content-delivery platforms (Netflix, HBO, whatever) cannot seem to agree on cross-licensing the content. This means that they simply cannot deliver the convenience of torrented content. For some reason this translates to "we must protect our content with even 'stronger' DRM" in the business-peoples' minds, but piracy isn't the root of the problem.
>All you have to do is be behind the "new releases" by a month or two and wait a few minutes for the downloads to complete, but that's a minor issue.
Well, even that can be a non-issue if you have access to the better indexers, be it P2P or Usenet.
I pay subscriptions to services like Netflix and Amazon Prime and still end up downloading lots of the same stuff from... alternative sources simply due to the convenience that confers.
Anything bad that happens to the companies or entities that are responsible for shoving this crap down customers' throats is their doing alone.
And you can play them on anything! :)
Maybe they are attacking the display.
Just wait. Human-made designs are susceptible to human-made hacks.
PC's get lower quality streams precisely because they're more open to recording.
Almost all high end release scene attacks on DRMed content services target Android or Tizen.
XBox DRM is no joke.
> These secure boot systems exist because of DRM protection to media that is mandated by the content companies.
That statement sounds too simplified. How about non-approved images pumping RF output power above the regulatory limit? Or overclocking the CPU? Or reporting all keystrokes to a place of your choice? Would you, as a manufacturer of the device, like to make that easily possible?
> Their point is to verify that the whole OS from kernel to userspace can't be modified by 3rd parties (such as you, the owner of the device).
What makes it impossible is not the secure boot but the fact that the _unauthorized_ 3rd parties don't have the keys for signing stages of the images. Some owners of some secure-boot devices do have keys and sign their own images. One example: a telecom operator that owns its fleet of set-top-box decoders does exactly that. Secure boot can be made to allow several/many authorized keys.
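As a sketch of the "several authorized keys" idea: a toy model where the bootloader boots an image only if it verifies against one of the keys in its trusted set. HMAC here is purely a stand-in for real asymmetric signatures (actual secure boot fuses public-key hashes into the SoC and verifies RSA/ECDSA signatures); all key names and key material below are made up for illustration.

```python
import hashlib
import hmac

# Toy model only: real secure boot uses asymmetric signatures whose public
# halves are burned into the device; HMAC just stands in for "verification
# against a fixed set of trusted keys". Key names/material are hypothetical.
TRUSTED_KEYS = {
    "oem":      b"oem-production-key",
    "operator": b"set-top-box-operator",  # e.g. a telecom signing its own images
}

def sign(image: bytes, key_id: str) -> bytes:
    """Produce a tag for an image with one of the authorized keys."""
    return hmac.new(TRUSTED_KEYS[key_id], image, hashlib.sha256).digest()

def boot_allowed(image: bytes, key_id: str, sig: bytes) -> bool:
    """Bootloader check: any authorized key may vouch for the image."""
    key = TRUSTED_KEYS.get(key_id)
    if key is None:
        return False  # unauthorized third party: no key, no boot
    expected = hmac.new(key, image, hashlib.sha256).digest()
    return hmac.compare_digest(expected, sig)

img = b"kernel+initramfs"
assert boot_allowed(img, "operator", sign(img, "operator"))
assert not boot_allowed(img, "oem", sign(img, "operator"))
```

The point of the model: "secure boot" and "only the manufacturer can sign" are separate decisions; the trusted set can just as well include the device owner's key.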
My statement above is simplified, thanks for your additions.
> How about non-approved images pumping power upstream above regulatory limit on RF?
This is possible through other means (you can buy RF chips and wreak havoc on the frequencies). And it's illegal to do so anyway.
>Or overclocking the CPU? Or reporting all keystrokes to a place of your choice?
I, the owner of the device, might want to do this and these are sometimes completely reasonable things to do (if you're a developer, for instance).
> Would you, as a manufacturer of the device, like to make that easily possible?
Probably not, but that goes against the interests of the customer. ("but most of them won't care anyways")
> What makes it impossible is not the secure boot but the fact that the _unauthorized_ 3rd parties don't have the keys for signing stages of the images.
I am completely fine with this if, and only if, I can add my own secure boot keys and sign my own firmware/OS images.
I, the owner of the device, should be an "authorized" party when it comes to my device.
I am willing to accept that I do not get 8k service from YouFlix Inc if I add my own keys.
I have no problem with secure boot (the technology) but I do take issue when it is used to limit the rights of the owner to their own device.
I actually replied for the sake of technical discussion (secure boot is ~1/10 of what a decent DRM should be, and adding the remaining 9/10 is a huge cost for the manufacturer).
But since you say:
> I, the owner of the device, ...
I am tempted to ask -- how much would you agree to pay for a phone that is working but will never be able to connect to the network (no 4G and no WiFi)?
That... is not a phone? It's a pocket computer, isn't it? I could absolutely see a value for such a device, although I don't see why "no WiFi".
Then -- back to my question -- would you agree to pay $600 for an iPad of the size of the iPhone, with no network capability?
> ... phone that is working but will never be able to connect to the network ...
If you're implying that devices without locked bootloaders can't connect to RF networks, that's just incorrect. There are lots of devices that aren't locked, and locking wasn't a common practice a decade ago.
Why no WiFi? This makes absolutely no sense at all.
4G is also a pointless restriction. It is already illegal to connect a device with the wrong specification to 4G networks; this does not need to be enforced by DRM, it is already enforced by law.
If I don't own the device then why am I paying for it?
Unclear what you had in mind. I'll assume you were speaking about your own secure-booting mobile phone that you paid full price for and would now like to be able to reflash?
I am not sure what to tell you. The situation pre-dates the secure boot. If I recall correctly, Iiyama monitors were known to be unrepairable because the manufacturer always refused to release schematics. It did not prevent their extreme popularity.
As an end-consumer, I support very much the freedom to change any products I own, the way I want. I once replaced ball bearings in the drum of my washing machine -- you get the idea.
As a person authorizing a product that can potentially emit unhealthy or unsafe levels of RF, or a parent of teenagers using social media, or a developer of medical applications intended for mobile phone... I am none of that, but I guess I would have a very different opinion.
> As a person authorizing a product that can potentially emit unhealthy or unsafe levels of RF,
This used to be, can be, and should continue to be limited at the hardware level. You can't flash a phone to emit "unhealthy or unsafe levels of RF" if the circuit itself won't let that much power flow to the radio. Of course, someone could still mess with the hardware itself, but that doesn't scale and (in the case of breaking RF limits) is illegal.
> or a parent of teenagers using social media,
This is interesting because, in a way, it's telling me that you the maker of the device are my parent. The argument sounds like, "to prevent your child from flashing your device, we'll prevent you from flashing your device". How about giving me the tools to limit my child's access, if I desire to do so, while not limiting my access?
> or a developer of medical applications intended for mobile phone...
This is a large topic, but to a first approximation, I don't believe software that, for health reasons, must not be modifiable by the end user should be allowed on a phone at all. I also don't believe most medical apps in the smartphone space are of this category. The way I see it currently: either accept that some users (like myself) want to have full read & write access to their data on their phones, or make your solution a separate, tamper-proof, black-box device (and get a regulator to rule that it is illegal for me to open it).
Maybe not unsafe per se from an emissions perspective, but there are spectrum considerations and other things, and with more and more done in software and less with discrete components, the trend is clear.
Safety is complex - what if the phone starts emitting on bands allocated to emergency services.
> As a person authorizing a product that can potentially emit unhealthy or unsafe levels of RF, or a parent of teenagers using social media, or a developer of medical applications intended for mobile phone... I am none of that, but I guess I would have a very different opinion.
Let's be honest and accept it's just about money and control for the makers of these devices. They will bring up reasons such as these in their defense, but really they just want to keep the user from being in control of their device (which would, e.g., make tracking them more difficult). I mean, there have probably been tens of thousands of ROMs flashed on different devices. How many of them have emitted unhealthy levels of RF? And parents could retain the ability to flash their devices without giving that ability to their children. The reasons really just don't hold any water imo.
It is not like precedents are not known.
> Let's be honest and accept it's just about money and control to the makers of these devices.
Agree. My point is that less control for manufacturers over the image presumably will lead to lesser profits. My illustrations might be more or less relevant, depending on the standpoint.
>I am not sure what to tell you. The situation pre-dates the secure boot.
That's an invalid argument. It doesn't matter how long this has been the case.
>As a person authorizing a product that can potentially emit unhealthy or unsafe levels of RF, or a parent of teenagers using social media, or a developer of medical applications intended for mobile phone...
You're presuming this is the only and the correct way of handling this. That is, you're arguing it is the manufacturer's affair and right to restrict one's basic rights to enforce lawful handling of their product.
I'd rather argue that in reality the manufacturer's main interests lie in reducing support costs and the enforcement of their patents and copyrights.
> You're presuming this is the only and the correct way of handling this. That is, you're arguing it is the manufacturer's affair and right to restrict one's basic rights to enforce lawful handling of their product.
This is not really what I said (somehow people focus on the second part), but I do believe that offering this freedom (to install any image) will lead to lesser profits for the manufacturers. Quite possibly because other potential customers might find this feature unacceptable (my illustrations may be good or bad). That is, your personal freedom runs into someone else's. Then the market should decide, right?
> I'd rather argue that in reality the manufacturer's main interests lie in reducing support costs and the enforcement of their patents and copyrights.
Yes. Add to that loss of face in case of hostile hacking, etc. Since when is this condemnable?
>Since when this is condemnable?
Do not confuse the tools businesses are provided by the law with the interest to profit. For instance, copyright terms lasting 10 to 80 years after death are not your inherent rights as a creator but tools the current legal infrastructure offers. Hence it is not the only solution.
>This is not really what I said (somehow people focus on the second part)
No, you did, and the market only decides a share of it, because it is already affected by numerous direct and indirect regulatory measures, e.g., patents. In the past, explicit enforcement of the right to repair was rarely an issue for consumers, just as net neutrality had not been a serious issue. This has obviously changed.
> That statement sounds too simplified.
Having worked with DRM for 10 years, I can confirm that while very simple, that statement is entirely correct. He says this is the reason the secure boot process exists, not that this is the only benefit.
> How about X, Y, Z
Those scenarios certainly benefit from a secure chain of trust but they are not what brought it about. The whole secure boot chain exists because of companies that require DRM to access their content.
Samsung wants Netflix to play the best content on their devices. That's the only motivation that has ever mattered to device manufacturers.
This point is moot.
Didn't the iPhone 3G do that? It was always a few watts over what the spec and regulations called for, to steal better reception than other devices around it. Nobody was punished, and it actually became the norm.
> These secure boot systems exist because of DRM protection to media that is mandated by the content companies
That might be an effect but I doubt it's the cause. Signed bootloaders have been around longer than Android and longer than the Netflix app. They were originally introduced to do things like protect the device subsidy lock, prevent customers from flashing their phone with invalid or buggy firmware, and prevent modification of radio functions (maybe required for FCC licensing?).
Nowadays they're useful for more than just DRM too. I'd guess that since only the owner can unlock the phone's functionality, they are a big disincentive to theft.
At least in the mobile space, I get the feeling that theft of the device factors in here. A lot of carriers advertise some sort of remote lock + wipe mechanism, which isn't guaranteed to work if the device's bootloader is unlocked, allowing an attacker to bypass the feature before it has a chance to run.
I have no idea if that's actually the case (I only worked tangential to a retail store's mobile department) so take that with a grain of salt, that's just what I observed the carriers making a big fuss about on the regular. I don't think that excuses the practice either; I'll always be in favor of being in full control of any hardware I'm supposed to trust with my data, but ofc I realize I'm in the minority there.
I'm totally fine w/ there being a disclaimer on bootloader unlocking, e.g: "Warning! Some security features of your device may not work properly w/ secure boot disabled. Visit foo.com/secure-boot to learn more." To a lesser extent I'm OK w/ bootloader unlocks voiding my warranty, esp. since many of these devices ship w/ permanent fuses whereby writing to the wrong MSRs may brick your device. After all that's how my Moto X (2013) worked: I needed a code from Motorola to unlock the bootloader, which in turn required agreeing to a warranty waiver.
What's not OK is having a permanently locked bootloader that guarantees my device will be useless after the carrier decides to stop delivering OTA updates in a year or two.
> no DRM, no Netflix 4k for you".
No drm is a feature, and fine I'll just pirate it, it's easier anyways.
> It boils down to "no secure boot, no DRM, no Netflix 4k for you".
I really can't say I'm worried about the theoretical inability to stream 4k video to a 5" screen. At that size, 4k is just wasting bandwidth.
Netflix streams are usually so compressed that there is a noticeable improvement if you use 4K over 1080p. What you're actually comparing is "blurry 4K" to "blurry 1080p".
And that's if they give you 1080p. I cancelled Netflix because they rolled out price hikes, and yet even dominant platforms like Chrome on Windows only get blurry 720p, which is completely unacceptable these days.
You maybe aren't, but manufacturers of that hardware will get negative PR if their Netflix only runs at 480p - so they comply to limitations.
Is it really that hard to think a bit farther than yourself?
I'm not so sure as many will notice it on a device with a 5" screen.
By my calculations, assuming an angular resolution of the eye of about 1 arcminute and holding the device at 300mm, I get about 700 lines as the upper limit of the human eye.  So really there should be no benefit for anything above 720p.
There may be other factors relating to scaling and compression that might make higher resolutions on small devices look better, but it's definitely of diminishing returns.
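The back-of-the-envelope calculation above can be reproduced like this (same assumptions: roughly 1 arcminute of visual acuity, 300mm viewing distance, a 16:9 panel; the numbers are illustrative, not a vision model):

```python
import math

def resolvable_lines(diagonal_mm, aspect=(16, 9), distance_mm=300.0,
                     eye_arcmin=1.0):
    """Roughly how many lines the eye can resolve across the short axis
    of a screen, given an assumed angular resolution and viewing distance."""
    # Smallest feature the eye separates at this distance
    spot = distance_mm * math.tan(math.radians(eye_arcmin / 60.0))
    w, h = aspect
    short_axis = diagonal_mm * h / math.hypot(w, h)  # short side of the panel
    return short_axis / spot

lines = resolvable_lines(5 * 25.4)  # 5-inch diagonal, in mm
print(round(lines))
```

With these assumptions it comes out at about 700, matching the parent's estimate that anything past 720p is wasted on a 5" screen at arm's length; move the phone closer or assume sharper eyes and the answer changes.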
Allowing for variable angular resolution, and holding the device closer than 300mm, seems to allow plenty of room where you _would_ notice the difference. Further, 1080p scaled to 720p is probably going to look better than bit-starved 720p, even if the resolution isn't directly a factor.
Depends on the eye.
I just happen to have some objects from my early childhood, and late teens. I now keep them just for an aging reference.
Until about age 20, I could focus right up to roughly an inch away from my eyes. However, at moderate distances, comparable to some phone viewing, there are features on those objects I know well and could resolve then which take a lens now.
Used to judge quality of LCD displays by the ratio of pixel to space between with naked eye into my early 30s.
Hard to do today. And for me specifically, because of those references, I know about how much harder, or if impossible without a lens.
My point here is I believe many younger people will appreciate 720p plus on modern phone displays. I know I would have.
You're still approaching this in a completely wrong way. No one CARES if you can actually see the difference.
It's a marketing issue - "Galaxy S9 can do Netflix in HD and OnePlus can't" news article is something people will change their purchasing behaviour on.
Not to mention other media providers which will fully refuse to allow video streaming to non-DRMed platforms. Don't get hung up on this one example case.
To be fair you're probably right given the way the capitalist system being developed defecates all over its own foundations, but I wonder how Netflix might feel about their trademark being used for marketing purposes in a way that makes it sound like they are deficient in the devices they support.
Don't you need at least 1400 (well, 1399) pixels to distinguish 700 lines? Otherwise they won't be lines anymore.
Or even 1401 if it's black on white. I was being a bit sloppy in my language. Lines was historically a common term for vertical resolution back when it would have been scan lines.
Resolution is historically and otherwise commonly expressed as line pairs, so I assumed you used "line" as shorthand for that.
Thanks for that clarification. Nice to learn something new :)
I was incorrect in my usage of the term.
EDIT: Actually, if I was referring to TV lines then I guess I wasn't wrong, but "lines" is ambiguous to the point of confusion if n lines can mean a vertical resolution of both n and 2n.
They're more worried about people ripping the content, not pumping 4k streams to your smartphone.
Or, _perhaps_, someone who doesn't own the device.
The PC was an accident, one which IBM repented quite heavily; they tried to regain control with the PS/2 and failed to do so.
If the current trend of UEFI and laptops without any expansion options is any indication, that open PC of yore will fade into a very tiny niche, as computers become appliances.
The previous accident of that kind was UNIX. That's striking, considering open source is mainstream today - a couple of accidents is enough to change the world.
Yes, had Bell Labs not been forbidden to sell it, and thus gone to market with a closed-source OS priced in the same range as VMS, OS/360 and similar systems, history's outcome would have been quite different.
Still anyone that has had the pleasure to write portable UNIX code knows that portability is relative.
Maybe UNIX and PC wouldn't be enough actually, without at least four points (half-accidents at least) of that history - RMS, BSD, Sun and Linus. That's quite a fragile chain of events.
PC bootloader is totally insecure. And no, full-device encryption doesn't protect you much in my opinion.
First, full-device encryption is much weaker if you can do offline bruteforce.
With a secure boot chain of trust, it is online, and is rate-limited by the TEE (and could be made to destroy the key after too many tries).
Even better, per Android documentation, the full-device encryption key is tied to the boot public key. So if someone flashes another OS, even if you enter the passphrase, no one will access your data.
Also, with a PC, if someone tampers with it and changes the OS to simply ask for your passcode and send it back to them, you won't notice it.
With a proper boot chain of trust (which PCs don't have, except for Librem BIOS), it is evident that the OS has been changed, so you know not to give this device your passcode.
Also, boot chain of trust makes it harder to do cold-boot attacks.
So globally, having a chain of trust, is much more secure for full-device encryption, because it forces online bruteforce.
Considering that passwords used on smartphones are usually much weaker (many people have 6 digits for their smartphone, vs 10 characters on their computers), the added security seems necessary to me.
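A rough illustration of why forcing online guessing matters. Both rates below are assumptions, not measurements: a commodity-GPU offline rate and a 50ms-per-attempt TEE delay are just plausible orders of magnitude.

```python
def time_to_search(space, rate_per_sec):
    """Worst-case time in seconds to exhaust a guess space at a given rate."""
    return space / rate_per_sec

PIN_SPACE = 10 ** 6       # a 6-digit PIN
OFFLINE_RATE = 1e9        # guesses/sec against a raw flash image (assumed)
ONLINE_RATE = 1 / 0.05    # TEE enforcing >= 50 ms per attempt (assumed)

offline_secs = time_to_search(PIN_SPACE, OFFLINE_RATE)
online_days = time_to_search(PIN_SPACE, ONLINE_RATE) / 86400

print(f"offline: {offline_secs:.3f} s, online: {online_days:.2f} days")
```

Offline, the whole PIN space falls in a fraction of a second; funneled through a rate-limiting TEE it takes days even before counting attempt caps or key destruction, which is the entire value of the chain of trust for weak smartphone passwords.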
Okay, but if my password is complex enough and there is no known data inside the encrypted data then I do not see how bruteforcing would be a viable option.
Even with a chain of trust, what is to stop someone desoldering the storage chip off my device and reading the data and bruteforcing that way?
I had a long think while writing this reply about different ways of stopping the issue you described of "change[ing] the OS to simply ask for your passcode and send it back to that someone, you won't notice it", and I couldn't think of one that would be entirely secure (eg. encrypting system, extra checks for this part of the OS) so I agree there - however:
- There are 100s of millions (billions?) of PCs that start and do not have a chain of trust or SecureBoot that store much more confidential information than who my mother last called.
- Lots of servers that store confidential information do not have a chain of trust or SecureBoot.
- SecureBoot is vulnerable.
- Users who get a little popup when starting their phone would likely just ignore it. (if you disagree with this then you severely overestimate the average phone user)
- The chain of trust becomes completely worthless after I decide to put a new operating system on my device. This limits my choice and freedoms as a user of a device.
“if my password is complex enough” - fine for you, the average consumer chooses weak passwords, especially on mobile platforms.
“what is to stop someone desoldering the storage chip off my device and reading the data and bruteforcing that way?” - the key will be 128-bit, not bruteforcable in the next hundred years. The TEE contains that key, and can only be unlocked by giving it your user password, online.
The remainder of your arguments seem to be “well it’s not perfect, so why bother?” Servers don’t get left on the subway or stolen. Modern PCs mostly do have forms of secure boot. I can’t speak for Android ecosystem, but users won’t get a popup on iPhone, a modded firmware will just not let the phone boot.
I agree, weak passwords are a big issue.
I see your point about bruteforcing that key, however I was more talking about bruteforcing the password.
Servers have been stolen in the past, and I am not just talking about a big rack server here.
In Android, all you tend to get is an unlock symbol on the bootloader logo, or a warning triangle.
I may be a bit confused. I apologize in advance if I misunderstood.
With desoldering the chip to read the data off, you’ll get an encrypted blob which is encrypted with a securely random 128-bit key, so it’s impossible to crack. The key and the user’s password are embedded inside the TEE and cannot be read out - the TEE only provides an interface where you can supply a candidate password, and if you get it correct it’ll provide you the 128-bit key to decrypt your data. It has an internal delay and counter, and will destroy the key if you get the password wrong X number of times.
So I’m not sure how you might go about brute-forcing the password, given you can only brute-force online, which is slow and will only give you a small number of attempts.
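The interface described above can be sketched as a toy class (real TEEs store a derived verifier rather than the password, and add escalating delays; this is only meant to show why an offline copy of the flash is useless):

```python
class ToyTEE:
    """Toy model of a TEE-gated key store. Illustration only, not a real TEE."""
    MAX_TRIES = 10

    def __init__(self, password, disk_key):
        self._password = password  # real TEEs keep a derived verifier instead
        self._disk_key = disk_key  # random 128-bit key encrypting the data
        self._failures = 0

    def unlock(self, candidate):
        """The only interface: supply a candidate password, online."""
        if self._disk_key is None:
            raise RuntimeError("key destroyed after too many failures")
        if candidate == self._password:
            self._failures = 0
            return self._disk_key  # correct guess is the only path to the key
        self._failures += 1
        if self._failures >= self.MAX_TRIES:
            self._disk_key = None  # wipe: the desoldered flash blob stays random
        return None
```

After `MAX_TRIES` wrong guesses the key is gone for good, so even an attacker with the physical chip never gets more than a handful of slow, online attempts.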
Well, I understand your opinion, though I don't agree with you.
> - The chain of trust becomes completely worthless after I decide to put a new operating system on my device. This limits my choice and freedoms as a user of a device.
Actually, no. If you just want to replace the operating system, you're good to go. Modern Android phones display the public key of the kernel when booting a non-OEM ROM with the bootloader re-locked.
(granted, pretty much no one does that)
I don't disagree with it limiting choice and freedom, but you are much more insecure.
It doesn't matter how good your encryption or password is if I can change your bootloader to have a program steal that password or read the encryption key out of memory.
See some of the thunderstrike attacks for examples of what a compromised bootloader can do.
With a PC it's easy for an attacker to modify the hardware, for example by inserting a recording device in the keyboard connector, so preventing an attacker from modifying the bootloader wouldn't add much security. Even with a typical laptop it's fairly quick and easy to replace or insert components.
In general, it's funny how security bods sometimes worry about attackers modifying the hardware when the attacker could achieve the same thing more easily by completely replacing the hardware.
(Is that really your phone that you just entered your passcode into, or is it an identical phone modified so as to transmit the passcode to the thief who just substituted one for the other?)
PC bootloader is totally insecure.
A good thing, IMHO.
"Insecurity is freedom."
"Those who give up freedom for security deserve neither."
Aren’t these quotes contextual?
I’m sure most people are prepared to give up the freedom of easy access to their house in order to require access with a key, as a fair trade off of security.
If you bought a house, you own the keys.
That isn't the case with these locked-down systems.
Using offline bruteforce at one hundred trillion guesses per second, https://www.grc.com/haystack.htm says you would need 15 thousand centuries to crack the password '7jk_nhAYTasd76'.
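For reference, the haystack-style arithmetic behind that figure, assuming a 95-character printable-ASCII alphabet and worst-case exhaustive search:

```python
def centuries_to_crack(password, charset=95, guesses_per_sec=1e14):
    """Worst-case exhaustive search time, GRC-haystack style:
    full printable-ASCII alphabet, one hundred trillion guesses/sec."""
    space = charset ** len(password)            # 95^14 for the example below
    seconds = space / guesses_per_sec
    return seconds / (3600 * 24 * 365.25 * 100)  # seconds per century

print(f"{centuries_to_crack('7jk_nhAYTasd76'):,.0f} centuries")
```

This lands around 15 thousand centuries, matching the site's estimate; the takeaway is just that 14 random-looking characters put offline search far out of reach.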
Yeah, if you can pick a secure password and the device is always on your person like a smartphone usually is, evil maid attacks are out as are most other blackbag techniques. The big problem is that entering a password like that is a real pain on mobile, especially on device unlock. You could potentially set a different password on boot vs on unlock - that's a tad trickier than the average user is likely to want to deal with, but entirely plausible for the paranoid power user.
That's exactly how fingerprint locks work on the Pixel. First unlock after boot requires a password, after that fingerprint can be used. It requires password entry about once a day or so as well for additional security.
For false additional security. More than that, it makes the password effectively useless, considering how easy it is to peep when the user is entering it, especially since the prompt appears at random, so the probability that the user happens to be in a public place at that time is high.
> But I kinda just want a phone that just starts a linux based operating system like a PC, and lets me do everything I can on my linux PC in a touch-centric way.
I wanted almost the same thing so I built Maru: https://maruos.com
Maru runs Linux in a container alongside Android so you can have the best of both worlds. Hook up to an external monitor with a BT keyboard and mouse and you have a full Debian-based desktop at your disposal.
Maru only supports older hardware (Nexus 5, Nexus 7) at the moment, but newer devices (Nexus 5X/6P, HTC 10, Moto Z2 Force, Galaxy S9) have working ports on our forum and should have official images available soon: https://groups.google.com/forum/#!forum/maru-os-dev
We're an open-source project too, so you can even attempt your own port if you own an unsupported device: https://github.com/maruos/maruos
>But I kinda just want a phone that just starts a linux based operating system like a PC, and lets me do everything I can on my linux PC in a touch-centric way.
Here you go: https://puri.sm/shop/librem-5/
>I mean, why can't they just have a regular bootloader like a PC?
Because there are several limitations with the PC boot process. See the heads project for what a reasonably secure PC boot would look like.
>The phone should at least always be unlockable by a single switch in the developer settings
True. That's a business thing, not a technical one. Buy Pixels in general for fewer hassles. Also, not just unlockable, but also re-lockable with custom keys. Again, only Pixels support this.
>the whole idea of a system partition and data partition etc is just stupid
It's not. System is read-only, which reduces the attack surface significantly. It also makes block-based updates simpler and allows other security features like dm-verity. If anything, traditional desktop OSes are trying to move in this direction. Look at Fedora's Atomic Workstation/Silverblue.
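The core idea behind dm-verity is a hash tree over the read-only partition: only the root hash needs to be trusted (it lives in signed metadata), and changing any block changes the root. A simplified, purely illustrative sketch of that mechanism, not Android's actual on-disk format:

```python
import hashlib

BLOCK_SIZE = 4096  # dm-verity hashes the partition in fixed-size blocks


def hash_tree_root(data: bytes) -> bytes:
    """Build a simplified Merkle tree over the blocks of `data` and
    return the root hash, as dm-verity does for the system partition."""
    blocks = [data[i:i + BLOCK_SIZE] for i in range(0, len(data), BLOCK_SIZE)]
    level = [hashlib.sha256(b).digest() for b in blocks] or [hashlib.sha256(b"").digest()]
    while len(level) > 1:
        # Pair up hashes and hash each pair to form the next level up.
        level = [
            hashlib.sha256(level[i] + (level[i + 1] if i + 1 < len(level) else b"")).digest()
            for i in range(0, len(level), 2)
        ]
    return level[0]


# The root hash is stored in signed metadata; any modification of a
# system block changes the root and is rejected at read time.
image = b"\x00" * (4 * BLOCK_SIZE)
trusted_root = hash_tree_root(image)

tampered = b"\x01" + image[1:]  # flip one byte in the first block
assert hash_tree_root(tampered) != trusted_root
```

Because verification happens per-block on read, the kernel never needs to hash the whole partition up front, which is why this only works for a partition that is guaranteed read-only.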
> unlockable by a single switch in the developer settings
Nexuses/Pixels, OnePlus, Xiaomi's Android One line (Mi A1/A2) do not require unlock codes.
Yeah, I know quite a lot of Android phones do support this, which is why I mentioned it; however, some phones still don't have this functionality, or require me to mess about with some online or phone service to receive an unlock code.
So just buy one that does support it.
Yup, I outright refuse to buy a device which doesn't support trivial firmware unlocking. Google's official "fastboot oem unlock" and a forced wipe of the device seems like the best compromise between freedom and security around.
OnePlus and Google still honor warranties on unlocked devices so I've been sticking with them, but may hop over to Xiaomi next time due to the high prices of newer OnePlus and Google devices - if only they had one with a 3.5mm jack and NFC, this would be a no-brainer. Still running my OnePlus One nearly 4 years in...
I am still using my Nexus 5 for this reason. I bought a Huawei phone last week, and discovered that they stopped releasing bootloader unlock codes a month prior. So I returned it and got a Sony XA2! Sony is surprisingly good in this domain, with an online unlock app and even raw AOSP builds from the manufacturer! I had my new phone running Lineage within a few hours.
You can unlock the bootloader on almost any smartphone nowadays, but every one of them requires the original firmware to be installed before it can be re-locked.
Google Pixel (and Nexus before it) can be locked with custom signing keys.
If you are talking about this https://mjg59.dreamwidth.org/31765.html does it still work on recent Pixel devices?
Self-signing is one of the main reasons to buy a Pixel. CopperheadOS (now dead) used that. For AOSP builds with self-signing, look at rattlesnakeos on GitHub.
The goal is control and to deny you ownership. The device is not meant to be yours. It is owned and takes its orders from its true owners. It is not to obey you, the physical owner.
I don't know about now, but for a long time MediaTek was the go-to for Android devices without locked bootloaders. My buying decision was almost entirely based on that (and the surprising availability of leaked documentation for the SoC on Chinese sites.)
Huh, could you elaborate? From what I know MediaTek has a terrible reputation for not publishing information about their chips.
They don't, yet Linux sources from them show up with good frequency on GitHub and other sites; and if you search hard enough you could find the full (thousands of pages) documentation, reference designs, schematics, etc. for their SoCs.
I've had far less (as in, zero) luck with Qualcomm and Broadcom; they publish sources, but there is basically no documentation beyond that.
There is a difference between 'published' and 'available', especially in China.
Xiaomi had to switch to such a system (delayed unlock request) because many of their phones were being altered by resellers to contain malware/spyware.
Not saying that I agree with the method, but I see why they're doing it.
What happens to all those locked devices when the server/company goes away?
Most consumers treat smartphones as disposable products that get replaced at least every couple of years or so, so if the company goes away, it isn't a big hassle. Unlike PCs or wifi routers, smartphones are not viewed as something people want to hang on to and use for years and years. (Plus, in Xiaomi's case, the build quality is so poor your phone might not last long anyway.)
Without some pretty drastic changes, I don't think I'm ever going to feel confident that a computer with a Qualcomm or Broadcom SoC can be made secure. These two companies are basically the worst at making their documentation / datasheets / code available for inspection.
The article suggests that the boot process has been simplified, but as an outsider it still looks astoundingly complex, and the "random oem additions that are likely to be insecure" part is absolutely terrifying.
Am I wrong?
>Motorola’s use of a single QFUSE that must be blown to unlock the device, permanently voiding the warranty.
If this isn't illegal (the warranty part), it should be.
Not that simple. Depends on what warranty is on.
Hardware warranty shouldn't generally be affected, unless it's about parts that can fail because of software malfunction - e.g. when failsafes are implemented in user-overridable software. It is fair that a hardware vendor shouldn't be responsible for a user overclocking something, for example.
Software warranty should be voided. Although I can't remember when it was the last time I saw a warranty for software, explicit or implied. Rather, I believe there's an explicit dismissal of any warranties in every single licensing agreement out there.
My phone's screen sometimes wouldn't come back on after a phone call. The sound quality transmitted was sometimes terrible. Both of those things present as hardware problems (prox sensor, microphone) but turned out to be software related. How would you prove that a hardware problem's not actually software?
As an aside, LineageOS seems to have been getting gradually more buggy on my Nexus 5X. 15.1 has the two issues above, plus Bluetooth music streaming is broken and crashes the music player. I reverted to an old 14.1 ROM and everything works perfectly.
> As an aside, LineageOS seems to have been getting gradually more buggy on my Nexus 5X. 15.1 has the two issues above, plus Bluetooth music streaming is broken and crashes the music player.
I have none of those problems with 15.1 on my nexus 5x, but I also make it a point to install the latest weekly updates and the latest vendor partition blob.
Maybe I set it up wrong, then? My basic install flow:
- Get latest nightly build from lineageos.org
- Get matching factory image from Google and extract vendor and radio images from it
- Install TWRP
- Do a full wipe of the phone
- Install LineageOS, vendor image and radio image
Then when an OTA update came out, I'd install it and update the vendor image manually if it complained.
What'd I miss?
That seems about right, and similar to what I do. Maybe it's a hardware issue? I also don't use the default music app for streaming music over bluetooth, so it could be an issue with that..
Yeah, I assumed it was hardware (there's a known issue with some 5Xes with poor mic quality) but I've gone back to 14.1 and all is fixed, so meh. The only thing I really miss is the new power saving features and I can live with charging every two days instead of every three days.
> I can live with charging every two days instead of every three days.
Yea, it's incredible how long the battery lasts on this device compared to previous Nexus devices (though I never owned a Nexus 5..). I've only ever run LineageOS on this thing and regularly get multiple days on a charge with light/moderate usage. I once took it on an 8-day backpacking trip where it was powered on but in airplane mode (to serve as camera and backup map), and got out of the wilderness with something like 40% battery left. IMHO that really shows how much of a negative impact Google's shit has on a mobile device, when it can last that long without it.
I believe Samsung have/had something similar, though it doesn't blow on unlocking but only if you insert an unsigned kernel.
It's also my understanding that rooting a Samsung phone will forever disable Samsung Pay from working on it.
Haven't installed a ROM on my Sony Xperia yet because it explicitly says in a few places that doing so voids the warranty. I still have around a year and a half until the mandatory warranty ends, after which I'll feel OK doing it.
It is illegal.
Yep and if the teacher doesn’t arrive in 10 minutes you can legally leave the class room.
Come on. If you modify the device to work in a way it wasn't intended (e.g. overclocked), you have clearly voided the warranty. This is what this is there to detect.
Most western countries already had this discussion.
There were car makers voiding warranties when users replaced a tire. They gave the same arguments: users modifying the car might do anything, so better to give them no chance, or nobody could ever walk safely on the streets again.
Wiser heads prevailed, and e.g. the USA created the Magnuson-Moss Warranty Act. It is today legal to replace the tires on your car. It is nevertheless illegal to make it non-street-worthy. If you stupidly damage your car while replacing a tire, you do in fact void your warranty. If reality isn't clear enough, a judge decides. While streets might not be that safe, it is extremely rare that modified vehicles are part of the problem. The system works.
Why do we have to rediscuss all of this again? Exactly the same idea is applicable to software.
I'd say this is more like remapping the ECU, not really like changing tires... if you remapped the ECU to run in an unsafe way (increased engine pressure, etc.) then you have voided the warranty.
You made 2 independent changes to this example: Replace tire with ECU, and add 'in an unsafe way'.
The important point is that the second of your changes is the one that voided the warranty: Replacing the tire to run in an unsafe way will void your warranty too. Modifying the ECU, on itself, doesn't do anything to the warranty.
And it is up to the manufacturer to prove your specific modification is unsafe. Claiming that some parts are inherently so complex that any change voids your warranty is legal bullshit for cars.
If I understand your argument well, it is that for cell phones the OS is so complex that any person on this planet should be declared unfit to make changes to it, except the manufacturer.
* First of all, the very audience of this site disproves this statement. Note the 'hacker' in hacker news.
* Second, I invite you to take a look at the garbage you find in most software powering our hardware. Someone going through the effort to examine a device and create a modification is almost guaranteed to produce better quality. There is something Darwinian going on here: software mods, when they are supported for some time, by their very existence say something about the persistence and skill of their creator (otherwise they would not have been published).
* Third, the nature of software is that it is hard to create and easy to copy. Only one person has to create the firmware modification; the rest of us can copy it. This is exactly the reverse of the tire example and makes firmware mods a lot easier.
UPDATE: After rereading, this post seems a bit harsh, like a personal attack. So let me add that I have respect for your two posts. You provide a valid viewpoint in a reasonably toned, adult post. I also respond to more than can be found in your posts alone. I disagree with people downvoting you. I do, however, respectfully believe your viewpoint is incorrect and dangerous.
You are moving the goalposts, because we were talking about "changing the ROM", not about "changing the ROM and then doing something that will fuck the hardware up"
I think it's fair for a device manufacturer to say you have voided the warranty if you load a third-party ROM onto the device: the manufacturer has no control over the third-party software, how safe it is to run, etc. If the third-party ROM bricks the device, why should the supplier pick up the bill when they haven't approved the software that did the damage?
The point is that changing the ROM allows the user to do something that will "fuck the hardware". Presumably the stock ROM doesn't allow overclocking.
Even if changing the ROM allows me to break the hardware, just by changing the ROM I don't lose my right to have the phone repaired for free under warranty. The manufacturer has to prove that I broke the hardware specifically because I changed the ROM.
By the way, if you fuck up the CPU by overclocking it I guarantee they won't notice it because they will just throw the entire motherboard away. The subcontracted company that fixes your mobile (it's almost never the manufacturer) won't spend their time doing some CSI on it.
Yep, in the EU it's definitely illegal.
If you modify your car in any way, have you voided the warranty?
This is a lie sold to you to convince you to give up control to the manufacturers, to make a warranty a flimsy and easily discarded thing. Their products are getting lower in long term quality and reliability each year, but they don't want to take responsibility for them.
If you purposefully bypass control measures that maintain the integrity of the cpu, then run the cpu outside of what it is supposed to do, you have voided the warranty.
If you modify your cars engine increasing the pressure in the cylinders beyond what they are designed to handle you have voided the warranty, and wouldn’t be covered when the seals fail etc.
Your warranty covers the intended use cases. It does not cover you to do whatever you want...
For those who haven't heard of it, LineageOS is essentially a fork of CyanogenMod. Cyanogen (the corporation) is now dead.
OnePlus' OxygenOS, which they ship on the 3, 3T, 5, and 6, is very close to CyanogenMod in functionality and its no-bullshit-added Android experience.
"No bullshit added" if you ignore the insanely detailed tracking stuff they have/had running in the background. I've been running LineageOS ever since because of this. They make very nice hardware at a good price but I don't think their OS can be trusted at this point.
And the OnePlus kernel and modem firmware are broken so much the maintainer of 3 and 3T LineageOS branches couldn't take it anymore.
As a LineageOS OP3 user this is fascinating. Although presumably someone else picked this up as I'm still receiving updates.
15.1 is officially supported by LineageOS but it's not as good as 14.1, that's why 14.1 is still getting security updates thanks to denser@xda https://forum.xda-developers.com/showpost.php?p=77567550&pos...
>QFUSE: Microscopic hardware fuse that is integrated into the SoC - Once physically blown, impossible to reset or replace
I understand that once set, it can't be restored.
My question: could some of the earliest privileged states decide to burn unburnt fuses of the public key? Or is there an extra special root fuse which only allows burning the key fuses if it isn't yet burnt, and which is burnt as soon as the root public key is burnt?
This may seem irrelevant if we don't have access to the privileged states, but perhaps with hardware vulnerabilities as we have been seeing more and more, it may conceivably be possible to force it to run code to burn an unburnt fuse of the key. I.e. perhaps we can set bits in the public key such that the new key is easier to factor? (in case of say RSA)
EDIT: if this is possible we could sign our own root bootloader
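The speculation above can be made concrete with a toy model. Assuming, purely for illustration, that the fuses store an RSA-style modulus and that burning a fuse can only turn a stored 0 bit into a 1, a tiny search shows how one extra burned bit can turn a modulus with no small factors into one that factors trivially (the toy modulus and bit encoding here are my own assumptions, not any real SoC's layout):

```python
def small_factor(n, limit=10**6):
    """Return the smallest prime factor of n up to `limit`, or None."""
    if n % 2 == 0:
        return 2
    f = 3
    while f * f <= n and f <= limit:
        if n % f == 0:
            return f
        f += 2
    return None


# Toy "fused" public modulus built from two Mersenne primes.
p, q = 2**31 - 1, 2**61 - 1
N = p * q
assert small_factor(N) is None  # honest modulus: no small factors

# Model the attack: burning a fuse can only set a 0 bit of the key to 1.
weakened = None
for i in range(N.bit_length()):
    if not (N >> i) & 1:            # an unburnt (0) bit of the stored key
        candidate = N | (1 << i)    # ...that an attacker burns to 1
        f = small_factor(candidate, limit=1000)
        if f:
            weakened = (i, candidate, f)
            break

# For this toy modulus, burning just bit 1 yields a multiple of 3.
assert weakened == (1, N + 2, 3)
```

This only shows that the attack idea is arithmetically plausible; whether any real SoC lets late-boot code reach the fuse controller at all is exactly the question the parent comment raises.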
There is a fuse-antifuse pair most likely.
Think of it this way:
You burn the bits "0001" into some fuse bank, which is mirrored in an anti-fuse bank that now equals "1110".
You can burn the unburned fuses in the normal bank, but can't unburn them in the antifuse bank.
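A minimal sketch of the fuse/anti-fuse scheme described above (a toy model of my own, not any vendor's actual circuit): each bank is write-once per bit, and because an attacker can set extra bits but never clear them, any later tampering shows up as a bit burned in both banks.

```python
class FuseBank:
    """Toy model of a fuse/anti-fuse pair. Each bank is OR-only: a
    burned bit can never be cleared. The anti-fuse bank stores the
    complement of the programmed value, making late burns detectable."""

    def __init__(self, width):
        self.width = width
        self.fuses = 0      # burned bits in the normal bank
        self.antifuses = 0  # burned bits in the complement bank

    def program(self, value):
        # Factory step: burn `value` into the fuse bank and its
        # complement into the anti-fuse bank.
        self.fuses |= value
        self.antifuses |= ~value & ((1 << self.width) - 1)

    def is_consistent(self):
        # A bit burned in BOTH banks proves someone burned extra
        # fuses after the original programming.
        return self.fuses & self.antifuses == 0


bank = FuseBank(4)
bank.program(0b0001)     # fuses = 0001, antifuses = 1110
assert bank.is_consistent()

bank.fuses |= 0b0100     # attacker burns one more fuse bit...
assert not bank.is_consistent()  # ...but can't clear the matching anti-fuse
```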
I actually thought of that, but it seems rather inefficient compared to a single fuse: unblown before the key is written, blown after the key is verified as correctly written; once blown, no fuses for this key can be blown...
But I don't know if they actually use such a system; perhaps they simply assume the attackers never get fuse-blowing privilege, and that a key in a blown set of fuses is no longer rewritable...
So... can somebody in SoC land explain to me whether any technology exists which can reliably un-blow a blown fuse on die without irrevocably ruining the entire chip?
A lot gets built into these 'cannot be undone' moments. The Cryptech people talked about building large capacitor circuits into their design, to be charged up and blow the secure keystore if the tamper switch went "bing", but I always wonder if you could e.g. use an electron microscope to reverse engineer the keystore state before it was wiped.
I actually did a bit of research on this before.
For easier chips like TPMs (Trusted Platform Modules), what I've seen people do when trying to steal secrets (keys) from a chip is buy multiples of the same chip and expose each layer under a microscope (a chip consists of multiple layers, which is why multiple chips are needed). Now you know everything needed to reverse engineer the secrets. There exist tools for drilling into and probing each trace on the chip. Here is an example.
By charging a huge capacitor and blowing away the secrets, I think you're referring to an HSM (Hardware Security Module). They typically have sensors to detect tampering and are much harder to hack, but there are always holes in each HSM. I think attackers use the same technique there as well: buy multiple HSMs to figure out the design, reverse engineer the traces, and try to probe the target device without being detected. To my knowledge there are currently no sensors that are 100% bulletproof; they can either be fooled or have weak spots.
> On each chip expose the layer under a microscope
Or use X-ray tomography:
Gemalto at least had that flaw: it had a lot of circuitry to detect probing, but the circuit that feeds the fuse blower is easy to disconnect/damage even under an optical microscope.
That's reading the chip. How about mending a blown fuse?
I have nipples, Greg. Can you milk me?
Not here, please.