Title is the TL;DR. More info about what I'm trying to do below.

My daily driver computer is Laptop, which has an SSD and no possibility to expand storage.

So for storing lots and lots of files, I have an old, low-resource Desktop with a bunch of HDDs plugged in (mostly via USB).

I can access Desktop's files via SSH/SFTP on the LAN, but it can be quite slow.

And sometimes (not too often; this isn't a main requirement) I take Laptop elsewhere. I do not plan to make Desktop available outside the network, so I need a copy of required files on Laptop.

Therefore, I sometimes move the remote files from Desktop to Laptop to work on them, as a sort of local cache. This could be individual files or directory trees.

But then I have a mess of duplication. Sometimes I forget to put the files back.

It seems like Laptop could be a lot more clever than I am and help with this. For example, could it always fetch a remote file that is being edited and save it locally?

Is there any way to have Laptop fetch files, information about file trees, etc. located on Desktop when needed, and smartly put them back after editing?

Or even keep some stuff around, like lists of files, attributes, thumbnails, etc. Even browsing the directory tree on Desktop can be slow sometimes.
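For the listing part specifically, the closest low-tech thing I can imagine is caching a recursive listing over SSH; the "desktop" host alias and the path below are made up:

```bash
# Snapshot a recursive listing (type, size, path) of one of Desktop's drives,
# so names and sizes can be browsed locally without touching the slow link.
ssh desktop 'find /mnt/hdd1 -printf "%y %12s %p\n"' > ~/.cache/desktop-hdd1.list

# Later, search the cached listing instantly:
grep -i 'holiday' ~/.cache/desktop-hdd1.list
```

But that does nothing for thumbnails or for keeping anything in sync.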

I am not sure what this would be called.

Ideas and tools I am already comfortable with:

  • rsync is the most obvious foundation to work from, but I am not sure exactly what the best configuration would be or how to manage it (see the sketch after this list for the kind of round trip I mean).

  • luckybackup is my favorite rsync GUI front end; it lets you save profiles, jobs, etc., which is sweet.

  • FreeFileSync is another GUI front end I've used, but I prefer luckybackup/rsync these days.

  • I don't think git is a viable solution here: there are already git directories inside the trees, there are many non-text files, and some of the directory trees are so large that git would choke trying to track all the files.

  • Syncthing might work. I've been having issues with it lately, but I may have gotten those ironed out.
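For reference, the rsync round trip I mentioned above looks roughly like this; "desktop" is a made-up SSH host alias and the paths are just examples:

```bash
# Pull a directory tree from Desktop into a local cache to work on.
# -a preserves attributes, -v is verbose, -P shows progress and keeps partial transfers.
rsync -avP desktop:/mnt/hdd1/projects/foo/ ~/cache/foo/

# ...edit files locally...

# Push the results back when done (this is the step I tend to forget):
rsync -avP ~/cache/foo/ desktop:/mnt/hdd1/projects/foo/
```

Doing this by hand per directory is exactly the duplication mess I described, which is why I'm hoping something can manage it for me.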

Something a little more transparent than the above would be cool, but I am not sure if that exists.

Any help appreciated, even just ideas on what to search the web for, because I am stumped even on that.

wargreymon2023@sopuli.xyz · 3 points · 6 months ago (last edited 6 months ago)
  1. Wi-Fi is less responsive than Ethernet.
  2. You don't even need encryption on a local connection (no need for SSH) if it is never exposed outside the network.
  3. If you have that many external HDDs, I would rather get a NAS and call it a day; it comes with everything configured for you as well.
  4. I don't know what big data you're accessing, but get a larger SSD to store whatever you visit often from your HDDs.
nickwitha_k@lemmy.sdf.org · 6 points · 6 months ago

Strong disagreement on №2. That kind of thinking is how devices on your home network end up joining a malicious botnet without your knowledge, and how identity theft happens. ALL network communications should assume that a malicious actor may be present, and anything remotely approaching private, identifiable, or sensitive should use encryption in transit.