mdhughes

joined 2 years ago
[–] mdhughes@lemmy.ml 6 points 6 months ago

Even Nolan Bushnell's original Atari was bought by Warner Brothers, then (mostly) bought by Jack Tramiel after he left Commodore, so it's not an unbroken line. Infogrames Fr's new management has quit the NFT nonsense, and is making Atari-related stuff that isn't awful.

[–] mdhughes@lemmy.ml 2 points 8 months ago

Safari is fast, less crashy, has the highest privacy protections, and uses less memory per tab; I often have hundreds of tabs, so that's important. It also has the best inspector, much better than Firebug. Add in StopTheMadness and an ad blocker (currently Ghostery), and it's pretty great.

Degoogled Chromium is useful for sites that don't work in Safari, or as a sandbox I don't mind crashing during development.

I've given up on Firefox; it's too bloated.

[–] mdhughes@lemmy.ml 4 points 8 months ago (3 children)

I play a lot of MineTest, using the Asuna "game" (a big modpack) and a huge custom set of mods, and have a game that's like MineCraft but utterly different. Others play the MineClone2 game, and it's fine, roughly MC 1.12 plus some extras. Repixture is an adorable mini-minecraft-like. There are a lot of people who use it more for creative mode, and many servers running various games.

It's definitely a little harder to set up the specific thing you want, but it's incredible how much variety there is.

[–] mdhughes@lemmy.ml 3 points 8 months ago

I'm very interested in the "floating giant 4K screens" part, especially paired with a tiny MacBook Air, and some other uses seem fun. Real uses of AR passthru can be amazing, tagging everything around you with information. At $3500, it's half the price of a single XDR display.

But I'm waiting for gen 2 or later, there's no way the current weight & battery life are usable for my needs. It's a dev kit right now, and while I'm an iOS dev sometimes, it's too small a market to be profitable for me.

[–] mdhughes@lemmy.ml 5 points 8 months ago (1 children)

In the good old days, you had to learn assembly/machine language, C, and OS-level programming to get anything done. Even if you mostly worked on applications, you'd drop down a level and do something useful; at the time, that meant writing machine-language routines to call from BASIC. It's still a practical skill: for instance, I mostly work in Scheme, but use the C FFI to hook into native functionality, and debug in lldb.
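That drop-down-a-level idea survives in most high-level languages. As an illustrative sketch (using Python's ctypes rather than a Scheme FFI, since FFI syntax varies by Scheme implementation), calling straight into libc looks like:

```python
import ctypes
import ctypes.util

# Load the C standard library (name resolution is platform-dependent).
libc = ctypes.CDLL(ctypes.util.find_library("c"))

# Declare the C signature: size_t strlen(const char *s);
libc.strlen.argtypes = [ctypes.c_char_p]
libc.strlen.restype = ctypes.c_size_t

# Call native code directly from the high-level language.
length = libc.strlen(b"hello, world")
print(length)  # 12
```

The same pattern, declare the foreign signature, then call it like a native function, is what a Scheme FFI gives you for hooking into native libraries.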

Computer Science is supposed to be more math than practice, though when I took it we also did low-level graphics (BIOS calls & framebuffers), OS implementation, and other useful skills. These days almost all CS courses are job training, with no theory and no implementation.

Younger programmers typically have no experience below the application language (Java, C#, Python, PHP) they work in, and only those with extensive CS degrees will ever see a C compiler. Even a shell, filesystems, and simple toolchains like Make are lost arts.

The MIT Missing Semester covers some of the middle and high levels of that stack, but there's no real training at the digital-logic-to-OS levels.

[–] mdhughes@lemmy.ml 8 points 9 months ago (1 children)

RCS is not end-to-end encrypted, so their bubbles will remain green.

Google's proprietary extensions add E2EE, and Apple's not going to pull a Beeper on Google.

[–] mdhughes@lemmy.ml 6 points 10 months ago (1 children)

To misquote John Waters,

If you go home with someone and they have LinkedIn in their browser history, don't fuck them!

[–] mdhughes@lemmy.ml 2 points 10 months ago

Videogame companies literally did use "megabit" when the truth was "128 KiB", because it sounded better. Actual computer companies were still listing binary-power numbers, because their buyers had more invested and cared about accuracy.
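The marketing arithmetic is easy to check; a quick sketch of why "one megabit" is only 128 KiB:

```python
# A "megabit" cartridge, counted the binary way vendors used: 2**20 bits.
bits = 2 ** 20
bytes_total = bits // 8      # 131072 bytes
kib = bytes_total // 1024    # convert to kibibytes
print(kib)  # 128
```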

You say "sensible", but it's lying for profit.

[–] mdhughes@lemmy.ml 14 points 11 months ago (4 children)

It's a scam by HDD makers to sell less storage for more money.
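The gap being complained about comes from decimal versus binary units; a sketch of the arithmetic for a drive sold as "1 TB":

```python
# Drive label: 1 TB = 10**12 bytes (decimal, per SI prefixes).
labeled_bytes = 10 ** 12

# What an OS reporting binary units shows for the same byte count:
gib = labeled_bytes / 2 ** 30    # gibibytes
tib = labeled_bytes / 2 ** 40    # tebibytes
print(round(gib, 1), round(tib, 3))  # 931.3 0.909
```

So the "1 TB" drive shows up as roughly 931 GiB, about 9% smaller than the label suggests to anyone thinking in binary units.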