[–] vithigar@lemmy.ca 2 points 5 months ago

Literally the only reason I ever fire up a different browser. Come on guys.

[–] vithigar@lemmy.ca 4 points 5 months ago (1 children)

True. Until you responded I actually completely forgot that you can selectively download torrents. Would be nice to not have to manually manage that at the user level though.

Some kind of bespoke torrent client that managed it under the hood could probably work without having to invent your own peer-to-peer protocol for it. I wonder how long it would take to compute the torrent hash values for 100PB of data? :D
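Quick back-of-envelope in Python, for fun (the 2 GB/s hashing throughput is just an assumption; real numbers depend on hardware and how disk-bound you are):

```python
# Rough estimate: time to compute piece hashes over 100 PB of data.
# The throughput figure is an assumption, not a benchmark.

DATA_BYTES = 100 * 10**15        # 100 PB
HASH_BYTES_PER_SEC = 2 * 10**9   # assume ~2 GB/s of SHA hashing per machine
MACHINES = 1                     # piece hashing parallelizes across machines

seconds = DATA_BYTES / (HASH_BYTES_PER_SEC * MACHINES)
print(f"~{seconds / 86400:.0f} days")   # ~579 days on a single machine
```

So well over a year on one machine; in practice you'd split the piece-hashing across many machines.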

[–] vithigar@lemmy.ca 7 points 5 months ago

Yes, it's a big ask, because it's a lot of data. Any distributed solution will require either a large number of people or a huge commitment of storage capacity. Both 100,000 people and 1TB per node are a lot to ask for, but that's basically the minimum viable level for that much data. Ten million people each committing 50GB would be great, and would offer enough redundancy that you could lose 80% of the nodes before losing data, but that's not a realistic number of participants to expect.
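The arithmetic behind those figures, as a quick sketch (idealized whole-copy replication, no erasure coding):

```python
DATASET_TB = 100_000                         # 100 PB expressed in TB

copies_a = (100_000 * 1.0) / DATASET_TB      # 100k nodes x 1 TB  -> 1.0 copy
copies_b = (10_000_000 * 0.05) / DATASET_TB  # 10M nodes x 50 GB -> 5.0 copies

print(copies_a, copies_b)
# Five full copies means four of them (up to 80% of nodes, in the
# best case) can vanish before any piece of data is gone for good.
```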

[–] vithigar@lemmy.ca 3 points 5 months ago

Which Vivobook? It's a wildly varied brand that ranges from factory-made e-waste to decent.

[–] vithigar@lemmy.ca 26 points 5 months ago* (last edited 5 months ago) (5 children)

That wouldn't distribute the load of storing it though. Anyone on the torrent would need to set aside 100PB of storage for it, which is clearly never going to happen.

You'd want a federated (or otherwise distributed) storage scheme where thousands of people could each contribute a smaller portion of storage, while also being accessible to any federated client. 100,000 clients each contributing 1TB of storage would be enough to get you one copy of the full data set with no redundancy. Ideally you'd have more than that so that a single node going down doesn't mean permanent data loss.
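For a sense of how clients could agree on who stores what without a central coordinator, here's a minimal sketch using rendezvous (highest-random-weight) hashing. Not any real protocol, just the general idea:

```python
import hashlib

def owners(chunk_id: str, nodes: list[str], k: int = 3) -> list[str]:
    """Deterministically pick the k nodes responsible for a chunk.
    Every client computes the same answer with no coordination."""
    def weight(node: str) -> int:
        return int.from_bytes(
            hashlib.sha256(f"{chunk_id}:{node}".encode()).digest(), "big"
        )
    return sorted(nodes, key=weight, reverse=True)[:k]

nodes = [f"node-{i}" for i in range(1_000)]
print(owners("chunk-000042", nodes, k=3))   # same answer on every client
```

With k=3 you'd need 300,000 of those 1TB nodes instead of 100,000, but any two of the three holders of a given chunk can disappear without data loss.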

[–] vithigar@lemmy.ca 8 points 6 months ago* (last edited 6 months ago)

They don't even have to go down. Staying stable or even going up at a consistent rate are both considered failure states, or at least unfavorable. If the rate of growth is not itself growing then they start worrying.

It's insane.

[–] vithigar@lemmy.ca 3 points 6 months ago (1 children)

Do you use it in dark mode with a completely black background and white text? You can get pretty nasty retinal afterimages from closely clustered bright spots like that, which can make the center of your vision blurry/hazy for a few seconds to minutes. It's a harmless, temporary effect, but it can be a minor annoyance.

[–] vithigar@lemmy.ca 31 points 6 months ago (1 children)

No, because it's not poorly processing anything. It's not even really a bug. It's doing exactly what it's supposed to do: spit out words in the "shape" of an appropriate response to whatever was just said.

[–] vithigar@lemmy.ca 10 points 6 months ago (1 children)

But then I wouldn't have had the opportunity to dump an attaché case full of grenades on the head of the last boss in Resident Evil 4!

[–] vithigar@lemmy.ca -2 points 6 months ago (1 children)

I probably should've been a little clearer that I'm talking scales of thousands of km here.

I'm on an island in the North Atlantic. I don't hold it against my ISP if I can't get my full 1.5Gbps down from services hosted in California.

[–] vithigar@lemmy.ca 17 points 6 months ago (8 children)

If you're paying for 100mbps, and the person you're talking to is paying for 100mbps, and you're not consistently getting 100mbps between you, then at least one of you is getting ripped off.

That's only really true if you're relatively close to each other on the same ISP. The farther apart you are and the more hops you need to make, the less likely that becomes, through no fault of your ISP.
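A rough illustration of why: single-connection TCP throughput is approximately bounded by MSS / (RTT × √loss), the classic Mathis et al. approximation. The loss rate below is an assumed figure:

```python
import math

def tcp_mbps(rtt_ms: float, loss: float, mss_bytes: int = 1460) -> float:
    """Mathis approximation for steady-state single-flow TCP throughput."""
    return (mss_bytes / ((rtt_ms / 1000) * math.sqrt(loss))) * 8 / 1e6

print(f"same city (10ms RTT):      {tcp_mbps(10, 1e-4):.0f} Mbps")   # ~117 Mbps
print(f"transatlantic (120ms RTT): {tcp_mbps(120, 1e-4):.0f} Mbps")  # ~10 Mbps
```

Same link speeds, same tiny loss rate, and the long path gets roughly a tenth of the throughput on a single connection. Modern congestion control and parallel streams help, but distance still costs you.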

[–] vithigar@lemmy.ca 2 points 6 months ago

Totally agree that a lot of them are poor implementations. Or just have a terrible UX such that it's almost guaranteed that a layperson is going to set it up badly and have a degraded experience that they've convinced themselves is good. Obviously the "correct" thing to do is check every box for "enhancements", right?

Gaming peripheral software supplied by the OEM being bad is probably the least surprising thing I'm likely to read all day.

As for stereo sounding better, I think in the purest sense that's always going to be true. Any kind of processing is going to alter the audio to some degree away from the original "intent". A pure triangle wave from an NES isn't going to be a pure triangle wave after it goes through any HRTF, good, bad, or otherwise. If you want your sound to be clean then yes, avoid extra processing at all costs.
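To make that concrete, here's a toy numpy example: a pure triangle wave run through a short made-up impulse response (standing in for an HRTF or any "enhancement" filter, not a real one) is measurably no longer the original wave:

```python
import numpy as np

sr, freq = 48_000, 440
t = np.arange(sr) / sr
tri = 2 * np.abs(2 * ((t * freq) % 1) - 1) - 1   # pure triangle wave

ir = np.array([1.0, 0.4, -0.2, 0.1])             # arbitrary toy filter, not a real HRTF
processed = np.convolve(tri, ir)[: len(tri)]

rms_err = np.sqrt(np.mean((processed - tri) ** 2))
print(f"RMS deviation from the original: {rms_err:.3f}")   # nonzero -> altered
```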
