Title is TLDR. More info about what I'm trying to do below.
My daily driver computer is Laptop, which has an SSD with no possibility to expand.
So for storage of lots and lots of files, I have an old, low-resource Desktop with a bunch of HDDs plugged in (mostly via USB).
I can access Desktop files via SSH/SFTP on the LAN. But it can be quite slow.
And sometimes (not too often; this isn't a main requirement) I take Laptop to use elsewhere. I do not plan to make Desktop available outside the network, so I need to have a copy of required files on Laptop.
Therefore, sometimes I like to move the remote files from Desktop to Laptop to work on them, making a sort of local cache. This could be individual files or whole directory trees.
But then I have a mess of duplication. Sometimes I forget to put the files back.
Seems like Laptop could be a lot more clever than I am and help with this. Like could it always fetch a remote file which is being edited and save it locally?
Is there any way to have Laptop fetch files, information about file trees, etc, located on Desktop when needed and smartly put them back after editing?
Or even keep some stuff around. Like lists of files, attributes, thumbnails etc. Even browsing the directory tree on Desktop can be slow sometimes.
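The "keep lists of files around" part can be approximated today with plain `find` piped into a local index file. A minimal sketch of the idea, demoed on a local directory so it is self-contained (against the real Desktop you would run `find` over SSH; the host name `desktop` and paths in the comment are assumptions):

```shell
# Sketch: snapshot a tree's metadata into a local index, then browse/search
# the index instead of making slow round trips to the remote filesystem.
# In real use the find would run on the remote box, e.g.:
#   ssh desktop 'find /mnt/hdds -type f -printf "%p\t%s\t%TY-%Tm-%Td\n"' > index.tsv
mkdir -p demo/music demo/docs
touch demo/music/song.flac demo/docs/notes.txt

# One line per file: path, size in bytes, modification date (GNU find).
find demo -type f -printf '%p\t%s\t%TY-%Tm-%Td\n' | sort > index.tsv

# Now "browsing" is a local grep, not a network round trip.
grep -c 'demo/music' index.tsv
```

Re-run the indexing step on whatever schedule keeps the listing fresh enough; searching and eyeballing the index stays instant regardless of how slow Desktop is.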
I am not sure what this would be called.
Ideas and tools I am already comfortable with:
- rsync is the most obvious foundation to work from, but I am not sure exactly what the best configuration would be or how to manage it.
- luckybackup is my favorite rsync GUI front end; it lets you save profiles, jobs, etc., which is sweet.
- freeFileSync is another GUI front end I've used, but I am preferring lucky/rsync these days.
- I don't think git is a viable solution here: there are already git directories included, there are many non-text files, and some of the directory trees are so large they would make git choke looking at all the files.
- syncthing might work. I've been having issues with it lately, but I may have gotten those ironed out.
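For what it's worth, the rsync-as-cache idea is essentially just a pull/push pair of commands. A minimal sketch using local directories so it is self-contained (the remote form in the comments, with host name `desktop` and the paths, are assumptions):

```shell
# Manual rsync "cache": pull a subtree, edit locally, push it back.
# Against the real Desktop the commands would use remote paths, e.g.:
#   rsync -a desktop:/mnt/hdds/project/ ~/cache/project/   # pull
#   rsync -au ~/cache/project/ desktop:/mnt/hdds/project/  # push
# -a preserves times/permissions; -u on the push skips files that are
# newer on the receiver, a cheap safety net against clobbering.
mkdir -p remote/project cache
echo "v1" > remote/project/file.txt

rsync -a remote/project/ cache/project/     # fetch into the local cache
echo "v2" > cache/project/file.txt          # work on the local copy
rsync -au cache/project/ remote/project/    # put the edits back

cat remote/project/file.txt
```

Saving each pull/push pair as a luckybackup profile would cover the "I forget to put the files back" problem with a one-click job per project.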
Something a little more transparent than the above would be cool but I am not sure if that exists?
Any help appreciated even just idea on what to web search for because I am stumped even on that.
NFS and ZeroTier would likely work.
When at home, NFS will feel similar to a local drive, though a bit slower. It is faster than SSHFS, and NFS is often used to expand limited local space.
I expect a cache layer on NFS is simple enough, but that is outside my experience.
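For a rough idea of the shape of it, the cache layer is FS-Cache: the client mounts the export with the `fsc` option and runs the `cachefilesd` daemon, which keeps recently read data on local disk. A sketch, with the export path, subnet, and host name as assumptions:

```shell
# On Desktop (the NFS server): export the data directory.
# /etc/exports -- path and subnet are assumptions:
#   /mnt/hdds  192.168.1.0/24(rw,sync,no_subtree_check)
# then reload the export table:
#   exportfs -ra

# On Laptop (the client): enable the local disk cache, mount with fsc.
sudo systemctl enable --now cachefilesd
sudo mount -t nfs -o vers=4.2,fsc desktop:/mnt/hdds /mnt/desktop
```

With that in place, files Laptop has already read are served from the local cachefiles store on re-read instead of crossing the LAN again.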
The issue with syncing, is usually needing to sync everything.
What would be the role of ZeroTier? It seems like some sort of VPN-type application, but I don't understand what it's needed for here. Someone else also suggested it, albeit in a different configuration.
Just doing some reading on NFS, it certainly seems promising. Naturally, the ArchWiki has a fairly clear instruction document. But I am having a hard time seeing what it is exactly. Why is it faster than SSHFS?
Using the Cache with NFS > Cache Limitations with NFS:
Which raises the question what is "direct I/O" and is it something I use? This page calls direct I/O "an alternative caching policy" and the limited amount I can understand elsewhere leads me to infer I don't need to worry about this. Does anyone know otherwise?
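As I understand it, "direct I/O" means an application opens a file with `O_DIRECT` so reads and writes bypass the kernel's page cache entirely, going straight between the application's buffer and the device; databases sometimes do this, ordinary desktop apps essentially never, which is why it's usually safe to ignore. You can see the flag in action with `dd` (the demo file name is arbitrary):

```shell
# dd can request direct I/O with oflag=direct; writes must then be
# block-aligned (hence bs=4096). Not every filesystem supports O_DIRECT,
# so the demo reports either outcome rather than failing.
tmpfile=$(mktemp ./odirect_demo.XXXXXX)
if dd if=/dev/zero of="$tmpfile" bs=4096 count=4 oflag=direct 2>/dev/null; then
    echo "wrote 16 KiB with O_DIRECT (page cache bypassed)"
else
    echo "this filesystem does not support O_DIRECT"
fi
rm -f "$tmpfile"
```

Unless something you run does this deliberately, the "direct I/O skips the cache" limitation should not affect you.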
Yes, this is why syncthing proved difficult when I last tried it for this purpose.
Beyond the actual files, it would be really handy if some lower-level stuff could be cached/synced between devices, like thumbnails and other metadata. To my mind, remotely perusing the Desktop filesystem from Laptop should be just as fast as looking through local files. I wouldn't mind dedicating a reasonable chunk of local storage to keeping this available.
If there is sufficient RAM on the laptop, Linux will cache a lot of metadata in other cache layers even without FS-Cache.
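You can watch that happening with a repeated metadata-only scan: the first pass populates the kernel's dentry/inode caches, and the second pass is served from RAM. A small local demonstration of the effect (exact timings will vary, and a bash shell is assumed for the `time` builtin):

```shell
# Build a small tree, then time two identical metadata-only scans.
# The second scan is typically served from the dentry/inode cache,
# so it should be noticeably faster than the first.
mkdir -p tree
for i in $(seq 1 500); do mkdir -p "tree/d$i"; touch "tree/d$i/f"; done

time find tree -type f > /dev/null   # cold-ish: populates the caches
time find tree -type f > /dev/null   # warm: served from RAM
```

The same effect applies to an NFS mount: once Laptop has walked a directory, re-listing it is answered from the client's caches (subject to the attribute-cache timeout) rather than another round trip to Desktop.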