To be fair, I’ve heard the Wii U isn’t particularly pleasant for running Linux either. I’ve never run Linux on my homebrewed Wii U, but someone in my Linux Users Group brought theirs in once.
The Adobe installer doesn’t run on Wine; someone got a recent version of Photoshop running once, but it’s a pirated version and it’s super buggy.
You can’t use Windows as a Docker container. Docker containers don’t run full operating systems; they run software on top of the host kernel, isolated from the main userspace, so programs inside the container see what looks like a separate system. Anything that claims to be a “Windows Docker container” is just running a VM inside a Docker container, which has all the same pitfalls as any other VM.
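If you have Docker handy, a quick way to see that containers share the host kernel:

```bash
# Both commands print the same kernel release, because the container
# is just an isolated userspace running on the host's kernel.
uname -r                          # on the Linux host
docker run --rm alpine uname -r   # "inside" the container
```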
Contrary to what others are saying, a bog-standard VM might be the wrong choice depending on which features you use, because of the lack of graphics acceleration in said VM.
You might be able to get GPU passthrough working on a VM. I have, with both a Windows and a macOS VM that can use the GPU (not at the same time); however, this is really complex (it took me ages the first time, though I’ve since discovered tricks that make it a bit easier), and you need two GPUs. Single-GPU passthrough is technically possible, but then you can’t use your Linux DE while the VM is running. I will say, though, that once it’s set up, it’s a better experience than dual-booting: you can run graphics-intensive Windows apps quite snappily on one monitor (or monitor input) and use your Linux desktop on the other.
Honestly, it's just a search away; this is a pretty well-covered thing.
Here's this for starters: https://wii.hacks.guide/
It's basically jailbreaking the console, after which you can run pretty much anything that the console has the power for, including the Linux kernel.
The GPU driver issue would really only be a problem for Nvidia stuff.
I feel like it was more than the package manager whining; I think xorg literally wouldn’t start after the update, although it’s been so long now that I could be misremembering.
Honestly, I probably could have salvaged the install if I’d wanted to without too much difficulty, but it was just a VM for testing distro packaging rather than a daily driver device.
Still, what you say is good to know, and perhaps I should hold back on the Pacman slander. I’ve been using Debian for around 4 years now with pretty good reliability; then again, Debian (and most distros, given their pitiful documentation) would probably be very hard to use without the Arch Wiki.
Eh, I disagree with you on Pacman. It's possible I was doing something stupid, but I've had Arch VMs that I didn't open for three months, and when I tried to update them I got a colossally messed-up install.
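In hindsight, the usual culprit on installs that stale is an outdated archlinux-keyring (you get PGP signature errors on update). If that's what it was in my case, which I never verified, the standard first-aid is refreshing the keyring before the full upgrade:

```bash
# Pull in new packager keys first, then do the full system upgrade.
# (Plain `pacman -Sy <pkg>` partial upgrades are unsupported on Arch.)
pacman -Sy archlinux-keyring && pacman -Syu
```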
I just made a new VM, as I only really need one to make sure a package has the correct dependencies on Arch.
Eh; testing doesn't break THAT often. Having used it on many of my devices for almost 4 years, I can count on one hand the number of times it broke badly enough that I had to chroot in to fix it.
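For anyone wondering, the rough shape of that repair from a live USB looks something like the sketch below (the device name is a placeholder; yours will differ):

```bash
# Mount the broken root filesystem, bind the virtual filesystems,
# then chroot in and let apt finish what it started.
mount /dev/sda2 /mnt                 # placeholder: your root partition
for fs in dev proc sys; do mount --rbind "/$fs" "/mnt/$fs"; done
chroot /mnt /bin/bash
apt --fix-broken install && apt full-upgrade
```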
This is very unlikely to be because they are using testing.
Still, using Debian Stable is probably a smarter idea for this user.
I like using this on my desktop, but it's way too easy to trigger by accident on a laptop, so I disable it there.
My best guess is that it's not a Flatpak permissions issue as others are claiming; the software is just trying to use your iGPU (which is usually crappy) instead of your dGPU.
Try taking whatever command you use to start the program and tacking DRI_PRIME=1 on the front. This has often worked for me on applications regardless of whether they're native or Flatpak.
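Concretely, that looks something like this (Blender is just an example app here; any program will do):

```bash
# Check which GPU Mesa picks by default vs. with PRIME offload:
glxinfo | grep "OpenGL renderer"
DRI_PRIME=1 glxinfo | grep "OpenGL renderer"

# Native app:
DRI_PRIME=1 blender

# Flatpak; the variable has to be passed through the sandbox:
flatpak run --env=DRI_PRIME=1 org.blender.Blender
```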
iOS has been getting a bit buggier for me these past few years, but iOS 26 is a whole other level of bad.
With what Google's been doing to AOSP, I just hope GrapheneOS and LineageOS can hold on long enough for Linux phones to become a livable option.
It might be possible, depending on whether the screen is connected to the dGPU or the iGPU (I'd guess the iGPU). I wouldn't know for sure, since I did my setup on a desktop with two dGPUs; you might need an external monitor, as I don't know how Optimus laptops are wired.
Where I started for GPU passthrough, and what got me ~90% of the way there, is https://github.com/bryansteiner/gpu-passthrough-tutorial . It gives you the shell scripts, XML, etcetera needed to do it; I had to modify some bits (some of which you can see in the issues), but this is my preferred tutorial. Basically: try it, get really frustrated, take a break for a while, come back and keep tinkering (check permissions, logs, PCIe driver binds, etcetera), and eventually you'll figure it out.
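The usual first sanity check, which pretty much every tutorial (that one included, if I remember right) has you run in some form, is listing your IOMMU groups to see whether the GPU can be handed over cleanly:

```bash
#!/usr/bin/env bash
# Print each IOMMU group and the PCI devices in it. Your GPU (plus its
# audio function) should land in a group you can pass through as a unit.
shopt -s nullglob
for g in /sys/kernel/iommu_groups/*; do
    echo "IOMMU Group ${g##*/}:"
    for d in "$g"/devices/*; do
        echo -e "\t$(lspci -nns "${d##*/}")"
    done
done
```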
https://github.com/mysteryx93/GPU-Passthrough-with-Optimus-Manager-Guide is linked in one of the issues and specifically concerns your kind of laptop.
I might be able to send over some of my XML to get you started, but I don't know how helpful it will actually be compared to the tutorial, as our systems are completely different, and the AMD GPU I use has different bugs/quirks when doing this than Nvidia ones. The reason there's no single-click, easy way to do GPU passthrough is that each system is unique, from the motherboard's PCIe implementation to bugs in GPU firmware. That doesn't mean you shouldn't try, but it takes a bit of ingenuity.
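To give a flavor of the XML in question, the core of it is one hostdev block per PCI function in the libvirt domain definition. The addresses below are made-up placeholders; yours come from your own lspci output, and a GPU usually needs a second block for its HDMI audio function:

```xml
<!-- Illustrative only: PCI passthrough entry in a libvirt domain XML.
     Replace bus/slot/function with the values lspci reports for your GPU. -->
<hostdev mode='subsystem' type='pci' managed='yes'>
  <source>
    <address domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
  </source>
</hostdev>
```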