As we don't have much to work with (no error messages from apt/dpkg or anything), first try:
apt clean
apt -f install
and post results.
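If those run clean and the problem still persists, the output from these two (as root) is usually worth posting as well:
dpkg --configure -a
apt update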
Yes, Ethernet, not Wifi, which would have been understandable.
Back in the day there were 'software NICs' on the market which required separate (driver-ish) software to do anything. There were also RTL chips which required proprietary parts in the driver and all that fun stuff. With wifi it's still a thing now and then, but everything works far better today, and that's at least partially because the hardware is better too. Of course, even in the late 90s when ethernet started to gain traction you could just throw something like a 3c509 or an e100 into your box and call it a day, but standards were far less mature than they are today.
I've never really checked, but I'm pretty sure it doesn't.
Not that you asked, but you can get a .eu domain for ~9€/year (assuming you live in the EU). It's obviously more than 1 USD, but I still have a few for my hobbies as it's not that big of an expense.
The process is to go step by step. First, connect directly to the modem you have, bridged if possible, and test with multiple bandwidth measurements (speedtest, fast.com, downloading a big file from some university ftp...), then work your way downstream through the network. At every step test multiple scenarios where possible, preferably with multiple devices.
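If you want something more repeatable than the browser speed tests, the command line works too; iperf3 needs a second machine (or a server you control) on the other end, and the download URL below is just a placeholder:
iperf3 -s          # on the machine acting as the server
iperf3 -c 192.168.1.10          # on the client, pointed at the server's address
curl -o /dev/null https://example.com/big-file.iso          # rough single-stream download test; curl shows the average speed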
When I got a 1Gbit fiber connection a few years back I got a Ubiquiti EdgeRouter X with PoE options. On paper that should've been plenty for my network, but in practice, with NAT, DNAT, firewall rules and things like that, it capped at 600-700Mbps depending on what I used it for. With small packets and VPN it dropped even more. So now that thing acts as a glorified PoE switch and the main routing is handled by a Mikrotik device, which according to the manufacturer's tests should be able to push 7Gbps under optimal conditions. I only have 1/1Gbps, so there's plenty of room, but with very specific loads that thing is still pushed to its limit (mostly small packet sizes with other stuff on top), although it can manage full-duplex 1000Base-T. In normal everyday use it runs at around 20% load, but I like the fact that it can handle even the more challenging scenarios.
I'm pretty sure you've already checked, but the obvious things sometimes fly under the radar and go unnoticed: is the phone in file transfer mode in the first place? Another one (which has bitten me) is a USB hub; if you're using one, try a direct connection and/or different ports on the host computer.
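If the host is running Linux, a couple of quick checks help rule out the cable/port side before digging deeper:
lsusb          # does the phone show up at all?
dmesg -w          # watch the kernel messages (may need root) while unplugging and replugging the cable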
Personally I've spent far too long trying to hunt down something obscure while the fix was really simple, like some default option that changed with an update or whatever. In general, I've forgotten to check the simple things first way too many times, and that has wasted more hours than I want to count or admit.
Another type of “ports” were game engines made from scratch that used the level files of the original
ScummVM is one of these; it plays (some) LucasArts point'n'click adventure games, like Day of the Tentacle. But it's a bit newer than the 1990s; a quick search says it was released around 2001/2002.
There's already a ton of great examples I can relate to (I've been using Linux since 1998 or 99), but maybe the biggest difference today, apart from everything being SO MUCH EASIER now, is that the internet wasn't really the thing it is today. Especially the bandwidth. It took hours and hours over the phone line to download anything; on a good day you could get 100MB in just under 4 hours. Of course things were a lot smaller back then too, but it still took ages, and I'm pretty sure I now have more bandwidth on my home connection than most of the local universities had back in the 90s.
Back when CRT monitors were a thing and all this fancy plug'n'play technology wasn't around, you had modelines in your configuration files which told the system what resolutions and refresh rates your actual hardware could support. If you put wrong values there, your dumb analog monitor would just try to eat them as-is, with wildly varying results. Most of the time you just got a blank screen, but other times the monitor would literally squeal as it attempted to push components well past their limits. In extreme cases with older monitors it could actually physically break your hardware. And everything was expensive back then.
Fun times.
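For the curious, a modeline in XF86Config was literally one line of raw timing numbers: the pixel clock in MHz followed by the horizontal and vertical timings. The one below is roughly the classic 640x480@60Hz entry, shown purely as an illustration:
Modeline "640x480" 25.175 640 656 752 800 480 490 492 525 -hsync -vsync
Get any of those numbers wrong and the monitor was left to make sense of the signal on its own.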
I've used Seafile for years just for this. I haven't run it on a Pi, but in a virtual machine it runs pretty smoothly and the Android client is pretty hassle-free.
I want to prevent myself from reinstalling my system.
No even remotely normal file on the disk stops that, regardless of encryption, privileges, attributes or anything your running OS could do to the drive. If you erase the partition table it'll take your 'safety' file with it, no questions asked, because at that point the installer doesn't care about (or even see) individual files on the medium. And this is exactly what the 'use this drive automatically for installation' option does in pretty much every installer I've seen.
Protecting myself from myself.
That's what backups are for. If you want to block any random USB-stick installer from running, you could set the boot options in the BIOS to exclude external media and set a BIOS password, but that only helps against 'accidentally' reinstalling the system from external media.
And neither of those has anything to do with read/copy protection for the files. If they contain sensitive enough data they should be encrypted (and backed up), but that's a whole different problem from protecting the drive against an accidental wipe. Any software-based limitation on your files falls apart immediately (short of reading the data, if it's encrypted) when you boot another system from external media or another hard drive, because whatever solution you're using to protect them is no longer running.
Unless you hand system management over to someone else (root passwords, BIOS password and settings...) who can keep you from shooting yourself in the foot, there's nothing that will get you what you want. Maybe some cloud-based filesystem from Amazon with immutable copies could achieve it, but it's not really practical on any level, financially very much included. And even with that (if it's possible in the first place, I'm not sure), if you're the one holding all the keys and passwords, the whole system is at your mercy anyway.
So the real solution is to back up your files, verify regularly that backups work and learn not to break your things.
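If there's nothing in place yet, even a plain rsync to an external drive covers the basics; the paths below are placeholders, and note that --delete mirrors deletions too, so keep more than one copy of anything important:
rsync -a --delete /home/you/ /mnt/backup/home-you/
rsync -ani /home/you/ /mnt/backup/home-you/          # dry run afterwards; little to no output means the copy matches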
There's a handful of vendors and they indeed monitor a ton more than just viruses. The solution we're running at the office monitors pretty much all kinds of logs (DNS, DHCP, authentication, network traffic...) and it can lock down clients that misbehave badly enough. For example, every time I change a hosts file (for a legitimate reason) on my own laptop I get a question from the security team asking if that was intended. It also combines logs/data gathered from different systems to identify potential threats and problematic hosts, which is why our fleet feeds in data from all kinds of devices.
I haven't seen that many different solutions that do this, but the few I've worked with are a bit hit or miss with Linux. The current one has a funny feature where it breaks dpkg if the server doesn't have certain things installed (which aren't dependencies of the package itself). They also eat up a pretty decent chunk of CPU cycles and RAM while running. But apparently someone has done the math and decided it's worth the additional capacity; it's above my pay grade, so I just install whatever I'm told to.