this post was submitted on 04 Feb 2024
Piracy: ꜱᴀɪʟ ᴛʜᴇ ʜɪɢʜ ꜱᴇᴀꜱ
Back in the day, people shared files on usenet news, which was similar to the forums you're seeing on Lemmy or Reddit.
You'd take a file, image, video, whatever, and turn it into text via a program called uuencode.
Posting size limits often meant having to split the encoded text into multiple pieces, each marked (1/34, 2/34, etc.)
The person downloading the file would then need to stitch all the pieces together, in order, making one large text file, then use uudecode to turn it back into a usable file.
Here's a sample of what a uuencoded file looked like:
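A sketch of the format, generated with Python's binascii module (the filename and payload here are made up, not from the original post): each body line begins with a character encoding how many bytes it carries, sandwiched between a `begin` header and an `end` trailer.

```python
import binascii

# Hypothetical payload standing in for a binary file (name and content
# are illustrative only).
data = b"Hello, Usenet! This is what binary data looked like as text."

# uuencode format: "begin <mode> <name>" header, then lines of up to
# 45 input bytes each (binascii.b2a_uu encodes one line at a time),
# a backtick line marking zero remaining bytes, and an "end" trailer.
lines = ["begin 644 sample.bin"]
for i in range(0, len(data), 45):
    lines.append(binascii.b2a_uu(data[i:i + 45]).decode("ascii").rstrip("\n"))
lines.append("`")
lines.append("end")

encoded = "\n".join(lines)
print(encoded)

# Round trip: decode each body line back to the original bytes.
decoded = b"".join(binascii.a2b_uu(l) for l in encoded.splitlines()[1:-2])
assert decoded == data
```

Splitting `encoded` into numbered chunks and concatenating them back in order on the receiving end is all the (1/34, 2/34, ...) convention amounted to.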
From a user's perspective: nowadays all of that is buried in, and handled by, the usenet client you use.
Downloading from usenet is very similar to torrenting in that you start with an index file (.nzb) that is effectively equivalent to a torrent file. You pass that to your usenet client, and it'll handle downloading each of the parts, called articles, then stitching them together into the actual files (even recovering missing or corrupted data via added parity data).
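As a rough sketch of what an .nzb is: a small XML file that lists, per file, the newsgroups it was posted to and the message-id of every article (segment). The example below is hand-written and minimal (poster, subject, message-ids, and byte counts are all invented), but parsing it shows what a client iterates over:

```python
import xml.etree.ElementTree as ET

# A minimal, hand-written .nzb sketch. Real files list one <segment>
# per usenet article; all values here are made up.
NZB = """<nzb xmlns="http://www.newzbin.com/DTD/2003/nzb">
  <file poster="poster@example.com" date="1707004800" subject="example.bin (1/3)">
    <groups><group>alt.binaries.example</group></groups>
    <segments>
      <segment bytes="768000" number="1">part1@example.com</segment>
      <segment bytes="768000" number="2">part2@example.com</segment>
      <segment bytes="512000" number="3">part3@example.com</segment>
    </segments>
  </file>
</nzb>"""

ns = {"n": "http://www.newzbin.com/DTD/2003/nzb"}
root = ET.fromstring(NZB)
for f in root.findall("n:file", ns):
    segs = f.findall("n:segments/n:segment", ns)
    total = sum(int(s.get("bytes")) for s in segs)
    # The client fetches each article by message-id, in segment order,
    # then concatenates the decoded payloads back into the original file.
    ids = [s.text for s in sorted(segs, key=lambda s: int(s.get("number")))]
    print(f.get("subject"), total, ids)
```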
The big difference is that you're downloading each of these articles from whichever usenet providers you've configured, rather than from random individual peers discovered through public or private trackers.
Usenet providers usually offer more consistent and faster speeds, typically saturating my disk write speed, whereas torrent peers are often slow or unreliable in comparison. Also, since it's a standard TLS connection between you and a private service, and you don't have to re-upload the data you download, you're not exposed to copyright claimants and don't need a VPN.
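On the parity point mentioned above: the repair data actually posted alongside usenet binaries is PAR2, which is Reed-Solomon based and can rebuild many missing articles. A single-loss XOR parity block illustrates the core idea in a few lines (the blocks and the loss scenario here are invented):

```python
from functools import reduce

# Illustration only: real usenet repair (PAR2) uses Reed-Solomon codes;
# plain XOR parity, shown here, can recover exactly one missing block.
blocks = [b"aaaa", b"bbbb", b"cccc"]

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

# Parity block posted alongside the data blocks.
parity = reduce(xor, blocks)

# Simulate losing block 1, then rebuild it from the survivors + parity.
lost = 1
survivors = [b for i, b in enumerate(blocks) if i != lost]
recovered = reduce(xor, survivors + [parity])
assert recovered == blocks[lost]
print(recovered)  # → b'bbbb'
```

XOR-ing everything that survived against the parity cancels out the known blocks and leaves the missing one, which is why a client can finish a download even when some articles have expired or arrived corrupted.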
Been like a decade since I touched usenet but I do recall that requests were pretty common. Especially since the content expires. With a 5 year old torrent there's a decent chance you'll find a couple of seeders even on a public tracker and get it eventually, but with usenet that stuff does eventually rot away and you'll have to request a reup.
I mean, usenet servers are running with insane retention... Omicron-backed hosts have 5648 days of retention, and other backbones are over 4500.
Unless it gets taken down, it doesn't go away anymore... providers just keep retaining. I suppose that will end eventually... maybe some day the cost of storage will prohibit archiving 15 year old binary usenet posts.
Usenet retention has been pretty much infinite the last 15 years.
That's down to the indexer you decide to use. The one I use (NZBGeek) does have a requests section where you can enter an IMDb ID, TVDB ID, or just a general description along with any other necessary/desired details like quality, and requests are filled by volunteers.
TBH not something I'd actually looked into until now. I'm gonna go drop a request or two in there right now. There's not much I'm missing, but the things I am missing I haven't been able to find regardless of source.