Looks like certified mail, not notarized. But still, ridiculous. https://retrofitness.com/faq/
The Switch is ARM and uses several components from FreeBSD and Android. It would not be surprising to learn that they have the ability to compile system components like Virtual Console for an ARM Linux target, with stubs for the Switch-specific stuff.
The SNES Classic is also ARM, and has much less going on than the full Switch OS (Horizon). That could be the core of what they use for the museum displays, considering there’s an ARM version of Windows too.
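To make the "stubs for Switch-specific stuff" idea concrete, here's a minimal sketch of how that kind of dual-target build often works. Everything in it (HOST_BUILD, platform_read_savedata) is made up for illustration, not anything from Nintendo's actual toolchain:

    // Hypothetical sketch: the same component source builds against either the
    // real platform layer or a no-op stub, selected at compile time (-DHOST_BUILD).
    #include <cstdint>
    #include <cstdio>
    #include <vector>

    // Declared once; the target build links the real, platform-specific version.
    std::vector<std::uint8_t> platform_read_savedata(const char* title_id);

    #ifdef HOST_BUILD
    // Host build (e.g. ARM Linux): a stub that lets the rest of the component run.
    std::vector<std::uint8_t> platform_read_savedata(const char* title_id) {
        std::printf("[stub] read_savedata(%s) -> empty\n", title_id);
        return {};
    }
    #endif

    int main() {
        auto save = platform_read_savedata("some-title");
        std::printf("loaded %zu bytes of save data\n", save.size());
    }

Compile with -DHOST_BUILD on a dev box and the component runs end to end without the console; leave it off and the linker pulls in the real implementation instead.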
Either way, devs gonna dev. If you can’t get feedback at your workstation and always have to deploy to your target platform to test anything, you’re gonna move too slow to catch and fix bugs or build flexible enough systems to prevent them.
So much of dev testing is about trade-offs between rapid iteration and thorough fidelity. You need access to both.
From my own experience, I’ve done stuff like:
- built mobile apps that can also be deployed as desktop apps or web apps for the sake of dev testing
- built testing tools for car systems that fake out sensor input
- built HTTP wrappers for cloud-deployed services to allow them to be run locally
- faked out camera feeds for AR apps
It can get janky, cuz not everything works the same way, but most of what you work on is not platform-specific anyway and a good architecture will minimize the portion of code that only works on the target platform.
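As a hand-wavy example of what that looks like in practice (the camera-feed case from the list above): the trick is usually just a small interface with a real implementation on the target and a fake one for dev builds. Names here are invented for illustration.

    #include <cstddef>
    #include <cstdint>
    #include <cstdio>
    #include <vector>

    // The only piece that genuinely needs the target platform.
    struct CameraFeed {
        virtual ~CameraFeed() = default;
        virtual std::vector<std::uint8_t> next_frame() = 0;  // raw grayscale pixels
    };

    // Dev-time fake: a synthetic gradient instead of real camera hardware.
    struct FakeCameraFeed : CameraFeed {
        std::vector<std::uint8_t> next_frame() override {
            std::vector<std::uint8_t> frame(64 * 64);
            for (std::size_t i = 0; i < frame.size(); ++i)
                frame[i] = static_cast<std::uint8_t>(i % 256);
            return frame;
        }
    };

    // Everything downstream depends only on the interface, so it runs anywhere.
    void run_tracking_loop(CameraFeed& feed) {
        auto frame = feed.next_frame();
        std::printf("processed a frame of %zu bytes\n", frame.size());
    }

    int main() {
        FakeCameraFeed fake;  // on target hardware you'd construct the real feed instead
        run_tracking_loop(fake);
    }

The fake never behaves exactly like the hardware, but most of the bugs you care about live in the code that only ever sees the interface, which is exactly the portion a good architecture keeps platform-agnostic.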
The article notes that they are likely using a proprietary in-house emulator.
Conker’s bad ferrous day
Good call, thank you.
Also: Referencing Wikipedia in this context is kinda funny.
So I put 2 and 2 together, and decide this whole thing is pissing me off.
Still waters run deep.
Unleashing generative AI on the world was basically the information equivalent of jumping headfirst into Kessler Syndrome.
“We can’t solve climate change by repeating our past behavior. Let’s ignore climate change and build a machine that regurgitates our past behavior.”
This just in: Author/professor/CEO whose books/classes/company are about manipulative technologies… voluntarily installs manipulative technologies.
Not sure anyone actually read the article, cuz y’all are talkin about apps vs. web sites, and data collection. Those two points are briefly covered, but ultimately shrugged off in favor of the larger thesis:
“Smartphones … meant [companies] could use their apps to off-load effort. … In other words, apps became bureaucratized. What started as a source of fun, efficiency, and convenience became enmeshed in daily life. Now it seems like every ordinary activity has been turned into an app, while the benefit of those apps has diminished.”
“I’d like to think that this hellscape is a temporary one. As the number of apps multiplies beyond all logic or utility, won’t people start resisting them? And if platform owners such as Apple ratchet up their privacy restrictions, won’t businesses adjust? Don’t count on it. Our app-ocalypse is much too far along already. Every crevice of contemporary life has been colonized. At every branch in your life, and with each new responsibility, apps will keep sprouting from your phone. You can’t escape them. You won’t escape them, not even as you die, because—of course—there’s an app for that too.”
It’s not simply the code delivery mechanism, and it’s not whether the data exchange is safe from prying eyes… It’s the fact that a digital UX has invaded every aspect of human interaction, including mourning.
This is where we need something other than copyright law. The problem with generative AI companies isn't that somebody looked at something without permission, or remixed some bits and bytes.
It's that their products are potentially incredibly harmful to society. They would be harmful even if they worked perfectly. But as they stand, they're a wide-open spigot of nonsense, spewing viscous sludge into every single channel of human communication.
I think we can bring antitrust law to bear against them, and labor unions are also a great tool. Privacy and a right to your own identity factor in, too. But I think we’re also going to need to develop some equivalent of ecological protections when it comes to information.
For a long time, our capacity to dump toxic waste into the environment was minuscule compared to the scale of the natural world. But as we automated more and more, it became clear that the natural world has limits. I think we're headed towards discovering the same thing for the world of information.