That's not fair. Multiple of his books are award winning. Even if you only like one, the critics rate him. Other writers rate him.
That's the sequel to Ender's Game. It is good, but it is Orson Scott Card.
Linux's VFS is where the 255 limit is hard. Some Linux filesystems, like ReiserFS, go way beyond it. If it was a big deal, it would be patched and widely spread. The magic of Linux is you can try it yourself, run your own fork and submit patches.
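If you want to see the limit for yourself, here's a minimal Python sketch (the path and filename are just examples, on a typical Linux filesystem) that asks the kernel for the per-name limit and then trips over it:

```python
# Minimal sketch: ask the kernel what NAME_MAX is for a path, then
# show what happens when a name exceeds it. Assumes a normal Linux
# filesystem mounted at /.
import os

print(os.pathconf("/", "PC_NAME_MAX"))  # typically 255 (bytes, not characters)

try:
    open("x" * 300, "w")
except OSError as e:
    print(e)  # [Errno 36] File name too long on Linux
```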
LUKS is the one to talk about, as the others aren't as good an approach in general; it's the recommended one.
Edit: oh, and for NTFS that's 510 bytes. UTF16 = 16 bits = 2 bytes per code unit, and 255 * 2 = 510.
You prefer your monopolies to not be democratically accountable?
I prefer no monopolies, but if it's something that is a natural monopoly, I certainly don't want it run by a for-profit foreign company.
Maybe the answer is to split these guys up by country and each government decides what they do with their chunk. We'll see which works best.
Independent not-for-profits, straight-up nationalised, still private (baby Bells), publicly owned and privately run, etc. etc.
It's a good read, but he then went back on it all and went all Apple. So it's a bit bittersweet. Snow Crash is probably better.
Of course UTF8 is Unicode. The cool thing about UTF8 is that it is ASCII, until it isn't. It covers all of Unicode, but doesn't need any bloat if you are just doing Latin characters. Plus UTF8 will pass seamlessly through ASCII code: things that understand it do, the rest just see patches of gibberish but still work otherwise. It's a way better approach. Better legacy handling and more efficient packing for Latin languages, which is why it "won" out. UTF16 pretty much only exists in Windows because it's legacy that Windows will find hard to escape.
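To make that concrete, here's a quick Python sketch (the strings are just examples) showing that pure-ASCII text is byte-for-byte identical in UTF-8, while UTF-16 doubles it and adds a byte-order mark:

```python
# Pure ASCII encodes to identical bytes in ASCII and UTF-8,
# so old ASCII-only code keeps working unchanged.
text = "filename.txt"
print(text.encode("ascii"))        # b'filename.txt'
print(text.encode("utf-8"))        # b'filename.txt' -- same bytes
print(len(text.encode("utf-8")))   # 12 bytes
print(len(text.encode("utf-16")))  # 26 bytes: 2 per character plus a 2-byte BOM

# Non-ASCII characters still work fine in UTF-8,
# they just take more bytes where they are actually used.
print("naïve café".encode("utf-8"))
```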
LUKS is by far the most common encryption setup on Linux. It's done at the block layer and the filesystem doesn't know about it. No effect on filename length, or anything else.
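As a rough sketch of that layering (the device path and names are placeholders, you need root, and this would wipe the partition): the filesystem gets created on the mapped device, so it never sees the encryption underneath.

```python
# Sketch of the LUKS stack: encrypted block device -> dm-crypt mapping
# -> ordinary filesystem on top. Placeholder device /dev/sdX2; run as root.
import subprocess

def run(*cmd):
    subprocess.run(cmd, check=True)

run("cryptsetup", "luksFormat", "/dev/sdX2")         # set up LUKS on the raw partition
run("cryptsetup", "open", "/dev/sdX2", "cryptdata")  # plaintext view at /dev/mapper/cryptdata
run("mkfs.ext4", "/dev/mapper/cryptdata")            # normal filesystem, unaware of the crypto
run("mount", "/dev/mapper/cryptdata", "/mnt")        # used like any other block device
```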
NTFS also has a 255 limit, but it's in UTF16 code units, so for Unicode you will get more out of it. A high price to pay for UTF16, though. Windows is basically moving stuff between UTF16 and ASCII all the time: most apps are ASCII but Windows is natively UTF16. All other modernly maintained OSes do UTF8, which "won" Unicode.
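Here's a small Python sketch of where that difference actually bites (the Greek string is just an example): a 255-character non-Latin name fits NTFS's 255 UTF-16 code units, but comes out at 510 bytes once UTF-8 encoded, well past a 255-byte limit.

```python
# NTFS counts its 255 limit in UTF-16 code units; Linux filesystems
# count bytes of the (usually UTF-8) name.
name = "α" * 255                            # 255 Greek characters
print(len(name.encode("utf-16-le")) // 2)   # 255 code units -> fits NTFS
print(len(name.encode("utf-8")))            # 510 bytes -> over a 255-byte limit
```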
The fact that all major Unix (not just Linux) filesystems stick to 255 bytes says it's not a feature in demand.
I'd much rather have the COW subvolume snapshotting and incremental backup of btrfs or zfs. Plus all the other things Linux has over Windows, of course.
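For anyone who hasn't seen it, this is roughly what that looks like with btrfs (the paths are placeholders, you need root and an existing btrfs subvolume): snapshots are cheap COW copies, and send/receive only ships the blocks that changed between two of them.

```python
# Rough sketch of btrfs snapshotting plus incremental send/receive.
# Assumes a btrfs subvolume at /data/home and a btrfs backup mount
# at /backup -- both placeholders.
import subprocess

def run(*cmd):
    subprocess.run(cmd, check=True)

# Read-only COW snapshots: instant, and they share unchanged blocks.
run("btrfs", "subvolume", "snapshot", "-r", "/data/home", "/data/snap/home.1")
# ... some time later ...
run("btrfs", "subvolume", "snapshot", "-r", "/data/home", "/data/snap/home.2")

# Incremental backup: stream only the difference between the two
# snapshots and replay it on the backup filesystem.
send = subprocess.Popen(
    ["btrfs", "send", "-p", "/data/snap/home.1", "/data/snap/home.2"],
    stdout=subprocess.PIPE,
)
subprocess.run(["btrfs", "receive", "/backup"], stdin=send.stdout, check=True)
send.stdout.close()
send.wait()
```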
May I recommend watching "The YouTube Effect" if you don't see the problem with big tech companies.
The app will almost certainly mostly be just wrapping a web interface. But this dedicated browser can provide the site with all the access of an app. The idea will be that only this browser can be trusted to access this site, and it can check the run environment before it connects. I'm sure they'd do the same on the desktop, if they thought it would be swallowed.
I do blame Google. It's their platform. They could mandate upstream kernels.
They could define auto-discoverability for their platform hardware. Then it would be possible for generic ROMs to boot on any Android phone.
These patents seem like trivial, obvious ones. Hope they get knocked down during the case.
I have lived quite happily on pretty much only open source for over 12 years now, professionally and at home (longer at home). Debian I put with Wikipedia as an example of what humans can be.
There are no gatekeepers on who can do what where, only on who will accept the patches. Projects fork for all kinds of reasons, though even Google failed to fork the Linux kernel. If there is some good patch to extend the filename limit, it will get in. Enough pressure and maybe the core team of that subsystem will do it.
Open source already won, I'm afraid. Most of the internet, from IoT to supercomputers, runs open source. Has been that way for a while. If you use Windows, fine, but it is just a consumer end-node OS for muggles. 😉
If you set up a new install and say you want encryption, LUKS is what you get.