Barx

joined 6 months ago
[–] Barx@hexbear.net 22 points 3 weeks ago

Really sticking it to those... friendly Russian kernel maintainers. Really doing your part for your individual Two Minutes Hate.

So presumably, as a consistent person who is outraged by invasions and death, you call for the expulsion of all Americans and Usraelis, right?

[–] Barx@hexbear.net 3 points 2 months ago (1 children)

As a start, follow the 3-2-1 rule:

  • At least 3 copies of the data.

  • On at least 2 different devices / media.

  • At least 1 offsite backup.

I would add one more thing: invest in a process for verifying that your backups are working. Like a test system that is occasionally restored to from backups.

Let's say what you care about most is photos. You will want to store them locally on a computer somewhere (one copy) and offsite somewhere (second copy). So all you need to do is figure out one more local or offsite location for your third copy. Offsite is probably best but is more expensive. I would encrypt the data and then store it in the cloud as my main offsite backup. That way your data stays private, so it doesn't matter that it sits on someone else's server.

I am personally a fan of Borg backup because you can do incremental backups with a retention policy (like Time Machine on macOS), the archive is deduped, and the archive can be encrypted.
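For reference, a Borg workflow along those lines might look like this (the repository path and source directory are placeholders; check `borg help` for the exact flags your version supports):

```shell
# One-time setup: create an encrypted, deduplicated repository.
borg init --encryption=repokey /mnt/backup/borg-repo

# Take an incremental, deduplicated snapshot of your photos.
borg create --stats /mnt/backup/borg-repo::photos-{now} ~/Pictures

# Apply a retention policy, Time Machine style.
borg prune --keep-daily 7 --keep-weekly 4 --keep-monthly 6 /mnt/backup/borg-repo

# Verify repository and archive consistency.
borg check /mnt/backup/borg-repo
```

Run the `create` and `prune` steps from a cron job or systemd timer and Borg handles the dedup and rotation for you.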

Consider this option:

  1. Your data raw on a server/computer in your home.

  2. An encrypted, deduped archive on that same computer.

  3. That archive regularly copied to a second device (ideally another medium) and synchronized to a cloud file storage system.

  4. A backup restoration test process that takes the backups and confirms that they restore the important files, with the expected counts and sizes.

If disaster strikes and all your local copies are toast, this strategy ensures you don't lose important data. Regular restore testing ensures the remote copy is valid. If you have two cloud copies, you are also protected against one provider screwing up and removing data without you ever knowing about it and fixing it.

[–] Barx@hexbear.net 2 points 2 months ago

Why does it need to be a scripting (by which I assume you mean interpreted) language? For your requirements - particularly lightweight distribution - a precompiled binary seems more appropriate. Maybe look into Go, which is a pretty simple language that compiles easily to native binaries.
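To sketch why that helps with distribution: Go can cross-compile self-contained binaries for other platforms from a single machine just by setting environment variables (assuming the Go toolchain is installed; `myprog` is a placeholder name):

```shell
# Build a native binary for the current platform.
go build -o myprog .

# Cross-compile for other targets by setting GOOS/GOARCH.
GOOS=linux GOARCH=amd64 go build -o myprog-linux-amd64 .
GOOS=windows GOARCH=amd64 go build -o myprog-windows-amd64.exe .
```

Users then get a single file to download and run, with no interpreter or dependencies to install first.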

[–] Barx@hexbear.net 17 points 3 months ago (3 children)

What in Marxism needs to be upgraded, specifically?

[–] Barx@hexbear.net 12 points 3 months ago (5 children)

What in Marxism do you think needs upgrades?

[–] Barx@hexbear.net 4 points 5 months ago

apt is good for most things.

Flatpak is good for applications where you want the people who write the software to be creating the releases and for closed source apps that you want to isolate a bit from your system.

For example, on a new system you might install everything using apt except for Zoom. Zoom isn't in the Debian repos; it's closed source and proprietary. But you can get the official Zoom application from Flathub. Zoom will also be fairly isolated from the rest of your system, so it has less access to your files and can be removed more cleanly later on if needed.
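Concretely, the Zoom case might look like this (assuming Flatpak is already installed; `us.zoom.Zoom` is Zoom's application ID on Flathub):

```shell
# One-time setup: add the Flathub remote.
flatpak remote-add --if-not-exists flathub https://flathub.org/repo/flathub.flatpakrepo

# Install the official Zoom flatpak.
flatpak install flathub us.zoom.Zoom

# Later, remove it cleanly, including its app data.
flatpak uninstall --delete-data us.zoom.Zoom
```

Everything else on the system stays managed by apt; Flatpak only handles the one proprietary app.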

[–] Barx@hexbear.net 14 points 5 months ago

tyranny.gov is just a proxy for tyranny.com

[–] Barx@hexbear.net 1 points 5 months ago

It's realistic if security is a priority.