[–] lvxferre@lemmy.ml 1 points 1 year ago

It's a way to share files across seeders/peers of different torrents, as long as the torrents contain at least one identical file. For example, let's say that:

  • you're downloading torrent1. It shares the files A and B. You got A already, but since the number of seeders dropped considerably, you're having a really hard time downloading B.
  • there's also torrent2, sharing files B and C. Fairly active, with lots of seeders.

Without swarm merging, you're stuck waiting for new seeders or peers from torrent1 that have the file B. With swarm merging, your torrent program will get the file B from people sharing torrent2 too.
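
As a rough illustration, here's a minimal Python sketch of the matching step behind this, assuming per-file metadata with a length and a per-file hash is available (BitTorrent v2 metadata carries a per-file merkle root; a real client still verifies downloaded pieces against each torrent's own hashes). The names and fields are illustrative, not any client's actual API:

```python
# Toy sketch of swarm merging's matching step: find files that appear in
# both torrents, so pieces of a shared file can be fetched from either swarm.
from dataclasses import dataclass

@dataclass(frozen=True)
class TorrentFile:
    name: str     # path inside the torrent, informational only
    length: int   # file size in bytes
    root: str     # per-file hash (hypothetical field; e.g. a v2 merkle root)

def mergeable_files(torrent1, torrent2):
    """Return pairs of files present in both torrents, matched by (length, hash)."""
    index = {(f.length, f.root): f for f in torrent2}
    return [(f, index[(f.length, f.root)])
            for f in torrent1
            if (f.length, f.root) in index]

# Toy data mirroring the example above: torrent1 = {A, B}, torrent2 = {B, C}.
t1 = [TorrentFile("A.iso", 700_000_000, "hashA"),
      TorrentFile("B.iso", 1_400_000_000, "hashB")]
t2 = [TorrentFile("B.iso", 1_400_000_000, "hashB"),
      TorrentFile("C.iso", 900_000_000, "hashC")]

for a, b in mergeable_files(t1, t2):
    print(f"{a.name} can also be downloaded from the other torrent's swarm ({b.name})")
```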

I recall this feature from Vuze, but apparently BiglyBT also supports it.

[–] lvxferre@lemmy.ml 1 points 1 year ago

although why you would not want the latest stable version of an app for example is beyond me, like, it's a stable version, you should want the new features

Because most developers don't follow Torvalds' first rule of kernel development: "We don't cause regressions". They're completely fine releasing so-called newer stable versions that are less usable than the earlier ones - removing features, demanding more of the system, letting known bugs slip through because they make assumptions about the use case ("it's fine~").

And, contrary to what the guy in the video says, plenty of users know this: the latest "stable" version might cause a regression. But they usually don't have the time and/or knowledge to check every single new version of every single piece of software they might use. So it would be great if someone, or some group, did this for them, while taking into account that the difference between "this shit is broken!", "this shit is usable but worse" and "this is actually better" is subjective and depends on the use case. Right?

Well. That's what a distributor does. This is a critical role of distributions that the video does not address - they sort and trial software versions for the users, based on use case.

because they depend on all the versions of libraries that you would not be able to install on the distro because they would break your system or conflict with a newer version

If library developers did what the kernel devs do, this would not be a problem. So while the video guy is addressing a real problem, he's unable to pinpoint where it lies: not in the distros, but upstream.
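
To make the conflict concrete, here's a toy sketch in Python (hypothetical package names, not how any real package manager works): an app pinned to an exact older library version can't coexist with the single newer copy a distro ships, but if the library never broke compatibility, an "at least this version" requirement would be enough and the newest version would satisfy everyone.

```python
# Toy sketch of a dependency conflict, and why it vanishes when upstream
# keeps backward compatibility. Package names and versions are made up.

installed = {"libfoo": (2, 3)}          # the distro ships only libfoo 2.3
app_requirement = {"libfoo": (1, 8)}    # app_a was built against libfoo 1.8

def exact_match(requirement, installed):
    """Pinned dependency: only the exact version will do (2.x broke the API)."""
    return all(installed.get(lib) == ver for lib, ver in requirement.items())

def minimum_version(requirement, installed):
    """Compatible dependency: any version >= the one built against is fine."""
    return all(installed.get(lib, (0, 0)) >= ver for lib, ver in requirement.items())

print(exact_match(app_requirement, installed))      # False -> conflict on the distro
print(minimum_version(app_requirement, installed))  # True  -> no conflict if 2.x didn't regress
```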

duplication, storage, etc.

Is the increased amount of storage needed a real problem in 2023? I'm not sure, given that storage has become dirt cheap even for users, and the cost is usually spread out among the distro maintainers.

Regarding developers releasing multiple versions: usually the ones doing this are the distro maintainers.

I've stopped watching the video at 4:09.
