this post was submitted on 05 Sep 2024

Selfhosted

So, I'm self-hosting Immich. The issue is we tend to take a lot of pictures of the same scene/thing to later pick the best, so we often end up with 5–10 photos that are basically duplicates, but not quite.
Some duplicate-finding programs rate those images at 95% or higher similarity.

I'm wondering if there's any way, probably at the filesystem level, for these near-identical images to be compressed together.
Maybe deduplication?
Have any of you guys handled a similar situation?
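For what it's worth, the "95% or more similarity" those tools report usually comes from perceptual hashing rather than from comparing file bytes. A minimal sketch of that idea, assuming Python with the Pillow and imagehash packages installed and a hypothetical photo directory:

```python
# Rough sketch: group near-duplicate photos by perceptual hash distance.
# Assumes `pip install pillow imagehash`; the directory path is a placeholder.
from pathlib import Path
from PIL import Image
import imagehash

photo_dir = Path("/photos/2024-09")  # hypothetical directory
hashes = {path: imagehash.phash(Image.open(path)) for path in photo_dir.glob("*.jpg")}

# Two shots of the same scene usually land within a few bits of each other,
# even though their JPEG bytes share almost nothing.
paths = list(hashes)
for i, a in enumerate(paths):
    for b in paths[i + 1:]:
        distance = hashes[a] - hashes[b]  # Hamming distance, 0..64 for phash
        if distance <= 8:
            print(f"near-duplicate: {a.name} <-> {b.name} (distance {distance})")
```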

[–] cizra@lemm.ee 8 points 2 months ago (1 children)

Highly unlikely to succeed. The tiny differences are spread out all over the image.
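Block-level dedup (as on ZFS or Btrfs) only merges blocks that are byte-for-byte identical, and two separately encoded JPEGs of the same scene share essentially none. A rough way to check that on your own files, sketched in Python with placeholder filenames:

```python
# Sketch: count identical fixed-size blocks between two "almost the same" JPEGs.
# With lossy-compressed images the overlap is normally ~0%, which is why
# block-level dedup gains nothing here. Filenames are placeholders.
import hashlib

def block_hashes(path, block_size=4096):
    hashes = set()
    with open(path, "rb") as f:
        while chunk := f.read(block_size):
            hashes.add(hashlib.sha256(chunk).hexdigest())
    return hashes

a = block_hashes("IMG_0001.jpg")
b = block_hashes("IMG_0002.jpg")
print(f"{len(a & b)} of {min(len(a), len(b))} blocks identical")
```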

[–] ptz@dubvee.org 3 points 2 months ago* (last edited 2 months ago)

That's what I was thinking, but I wasn't sure enough to say anything beyond "give it a shot and see".

There might be some savings to be had by enabling compression, though it would depend on what format the images are in to start with. If they're already in a compressed format, it would probably just be a waste of CPU to try compressing them further at the filesystem level.
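If you want to check that before flipping on filesystem compression, here's a quick sketch (Python standard library only, with a placeholder filename) that estimates what a generic compressor would save on one of the photos:

```python
# Sketch: estimate what a generic compressor would save on an already-compressed photo.
# JPEG/HEIC/WebP data is close to incompressible, so the ratio usually stays near 1.0.
import zlib

with open("IMG_0001.jpg", "rb") as f:  # placeholder path
    raw = f.read()

compressed = zlib.compress(raw, level=6)
print(f"original: {len(raw)} bytes, zlib: {len(compressed)} bytes "
      f"(ratio {len(compressed) / len(raw):.2f})")
```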