MangoPenguin

joined 2 years ago
[–] MangoPenguin@lemmy.blahaj.zone 3 points 4 months ago* (last edited 4 months ago) (1 children)

Ease of use, mostly: one click restores everything, including the OS, which is nice. You can also easily move them to other hosts for HA or maintenance.

Not everything runs in Docker, either, so it's extra useful for those VMs.

[–] MangoPenguin@lemmy.blahaj.zone 2 points 4 months ago (1 children)

How do you handle backups? Install restic or something similar in every container and set each one up individually? And what about OS and Docker image updates? Watchtower on all of them, I imagine?

It sounds like a ton of admin overhead for no real benefit to me.
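To put the overhead in perspective, the per-container approach being asked about would mean repeating something like this in every single container (repo URL, paths, and schedule are placeholders, not anyone's actual setup):

```shell
# Inside each individual container: install restic and init a repo for it
apt install restic
restic -r sftp:backup@nas.local:/backups/this-container init

# Nightly cron entry for this one container's data
echo '0 3 * * * root restic -r sftp:backup@nas.local:/backups/this-container backup /data' \
  > /etc/cron.d/restic-backup
```

Multiply that by every container, versus one backup job at the hypervisor level.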

[–] MangoPenguin@lemmy.blahaj.zone 4 points 4 months ago (3 children)

A couple of posts down explains it: Docker completely steamrolls the host's networking when you install it. https://forum.proxmox.com/threads/running-docker-on-the-proxmox-host-not-in-vm-ct.147580/

The other reason is that if it's on the host, you can't back it up with Proxmox Backup Server alongside the rest of the VMs/CTs.

[–] MangoPenguin@lemmy.blahaj.zone 11 points 4 months ago* (last edited 4 months ago) (7 children)

Regardless of VM or LXC, I would only install Docker once. There's generally no need to create multiple Docker VMs/LXCs on the same host unless you have a specific reason, like isolating outside traffic by running a separate Docker setup just for public-facing services.

Backups are the same with VM or LXC on Proxmox.

The main advantages of LXC that I can think of:

  • Slightly less resource overhead, but not much (a minimal Debian or Alpine VM is already pretty lightweight).
  • Ability to pass-through directories from the host.
  • Ability to pass-through hardware acceleration from a GPU, without passing through the entire GPU.
  • Ability to change CPU cores or RAM while it's running.
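As a sketch of the pass-through points, on the Proxmox host it looks something like this (the container ID and paths are made-up examples):

```shell
# Bind-mount a host directory into LXC 101 at /media
pct set 101 -mp0 /mnt/tank/media,mp=/media

# GPU hardware acceleration without passing through the whole GPU:
# expose /dev/dri to the container (append to its config)
cat >> /etc/pve/lxc/101.conf <<'EOF'
lxc.cgroup2.devices.allow: c 226:* rwm
lxc.mount.entry: /dev/dri dev/dri none bind,optional,create=dir
EOF
```

With a VM you'd instead have to pass through the entire GPU with PCIe passthrough, making it unavailable to the host and other guests.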
[–] MangoPenguin@lemmy.blahaj.zone 4 points 4 months ago (6 children)

Docker's 'take over the system' style of network management will interfere with Proxmox networking.

[–] MangoPenguin@lemmy.blahaj.zone 3 points 4 months ago (1 children)

Ahh gotcha, 'selective sync' or 'virtual file system' are the common terms for that. Nextcloud supports it, ownCloud does too, and I think ownCloud Infinite Scale does as well, but it's not 100% clear.

When you say Owncloud couldn't keep files local without uploading, was that with VFS enabled on the client?

[–] MangoPenguin@lemmy.blahaj.zone 5 points 4 months ago (3 children)

Syncthing works great. If you want a web-based file browser, you can install one of the many available options on a server running Syncthing.
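For example, a minimal sketch of that combo using Docker (paths and ports are placeholders; File Browser is just one of many options):

```shell
# Syncthing keeps the folder in sync across devices
docker run -d --name syncthing \
  -v /srv/sync:/var/syncthing \
  -p 8384:8384 -p 22000:22000 \
  syncthing/syncthing

# File Browser serves a web UI over the same synced folder
docker run -d --name filebrowser \
  -v /srv/sync:/srv \
  -p 8080:80 \
  filebrowser/filebrowser
```

Syncthing's own web UI ends up on port 8384 and the file browser on 8080, both pointed at the same directory.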

[–] MangoPenguin@lemmy.blahaj.zone 2 points 4 months ago* (last edited 4 months ago)

The longest interval is every 24 hours, with some running more frequently, every 6 hours or so, like the ones for my game servers.

I have multiple backups (3-2-1 rule): one is a file-level backup of just the important stuff, the other is a full bootable system image of everything.

With proper backup software, incremental backups don't use any extra space unless files have changed, so there's no real downside to backing up more frequently.
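restic is one example of this (repo path and source directory below are placeholders): it deduplicates data between snapshots, so a repeat run over unchanged files adds essentially nothing to the repository.

```shell
# First run uploads the data
restic -r /mnt/backups/repo backup /home/me/documents

# Second run with no changes: restic dedupes against existing data,
# so the new snapshot costs almost no additional space
restic -r /mnt/backups/repo backup /home/me/documents

# Two snapshots now exist, sharing the same underlying data
restic -r /mnt/backups/repo snapshots
```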

[–] MangoPenguin@lemmy.blahaj.zone 3 points 4 months ago

802.11ac will hit 600-800 Mbps easily, and those APs are dirt cheap since it's older tech.

[–] MangoPenguin@lemmy.blahaj.zone 2 points 4 months ago (1 children)

That PC can stream basically anything; it sounds like your browser isn't properly using hardware acceleration, maybe.

[–] MangoPenguin@lemmy.blahaj.zone 5 points 4 months ago (1 children)

USB hard drive? If we're talking about a cold backup that's easy to access, a USB drive is reliable and simple.

[–] MangoPenguin@lemmy.blahaj.zone 2 points 4 months ago (1 children)

Yeah, pinning is great; you'll still need Watchtower for auto-updates too.
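Something like this, for instance (the image name is just an example): pin to a version tag instead of :latest, and Watchtower still pulls updated builds of whatever tag you've pinned.

```shell
# Pin a specific version tag rather than :latest
docker run -d --name nextcloud nextcloud:29-apache

# Watchtower handles automatic updates for running containers
docker run -d --name watchtower \
  -v /var/run/docker.sock:/var/run/docker.sock \
  containrrr/watchtower
```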
