Ah, so you didn't actually answer OP's question; I misunderstood your intent.
Kazumara
What I said is still true according to his even newer follow-up; Fortinet really did tell that to the journalists:
https://cyberplace.social/@GossiTheDog/111895724464138614
I'm really glad Kevin got them to admit it was a fabrication. The way he asserted it was a made-up example before having anything concrete to back it up made him seem unreliable to me at first.
Oh, how does one use Signal as a Telegram client?
the original 1s and 0s
I think your issue starts there, you already have to decide how to build your sensor:
- If it's a CMOS sensor, how strongly do the MOSFETs amplify? That should affect brightness and probably noise.
- How quickly do you vertically shift the data rows? The slower you shift, the stronger the rolling shutter effect will be.
- What are the thresholds in your ADC? Affects the brightness curve.
- How do you lay out the color filter grid? Will you put in twice as many green sensors as blue or red, as usual? This should affect the color balance.
- How many pixels will you use in the first place? If there are many, each will be noisier, but spatial resolution should be better.
All of these choices will lead to different original 1s and 0s, even before any post-processing.
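As a toy illustration (entirely my own sketch, not how any real sensor pipeline works), here is how just two of the choices above, analog gain and ADC bit depth, already produce different "original 1s and 0s" from the exact same incoming light. The function name and noise model are made up for the example:

```python
import random

random.seed(0)

def capture(scene, gain, adc_bits, read_noise=0.01):
    """Turn ideal light intensities (0..1) into raw ADC codes."""
    max_code = 2 ** adc_bits - 1
    raw = []
    for photons in scene:
        signal = photons * gain                       # amplifier choice
        signal += random.gauss(0, read_noise * gain)  # noise grows with gain
        signal = min(max(signal, 0.0), 1.0)           # clip at "full well"
        raw.append(round(signal * max_code))          # ADC threshold choice
    return raw

scene = [0.1, 0.4, 0.8]  # the same light hits the sensor both times
print(capture(scene, gain=1.0, adc_bits=8))
print(capture(scene, gain=2.0, adc_bits=10))  # brighter, finer steps, clips highlights
```

Same scene, different raw numbers, and that's before demosaicing or any other post-processing even starts.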
I mean, we haven't seen any proof, but Stefan Züger of Fortinet told that story as a supposedly true event to journalists of CH Media. The very article Kevin Beaumont posts says that the scenario is a real event.
Btw did you know Swiss cheese has copy protection? I know the thought is pretty random, but I thought I'd share anyway.
Is this indicative of future potential bottlenecks? Maybe, but I wouldn't be so sure.
This is exactly what I expect. I have seen what happened to my friends with their GTX 970 when 3.5 GB of VRAM wasn't enough anymore. Even though the cards were still rasterizing quickly enough, they weren't useful for certain games anymore. That's why I now make sure to go for enough VRAM to extend the useful service life of my cards.
And I'm not just talking about buying AMD, I actually do buy them. I first had the HD 5850 with 1 GB, then got my friend's HD 5870, also with 1 GB (don't remember if I used it in CrossFire or just replaced). Then two of my friends each sold me their HD 7850 with 2 GB for cheap and I ran CrossFire, then I bought a new R9 380 with 4 GB when a game that was important to me at the time couldn't deal with CrossFire well. After that I bought a used RX 580 with 8 GB, and finally the RX 6800 with 16 GB two years ago.
At some point I also bought a used GTX 960 because we were doing some CUDA stuff at University, but that was pretty late, when they weren't current anymore, and it was only used in my Linux server.
That seems weird: it's called the Mother of All Breaches, but it isn't the result of any one breach. It's just a data collection from ordinary breaches, with perhaps some credential stuffing in the mix.
$600 for a card without 16 GB of VRAM is a big ask. I think getting an RX 7800 XT for $500 will serve you well for a longer time.
~~The easy way to get a clean copy is to download it directly from Microsoft. You can generate a valid download link using the following website (just make sure the domain of the generated link points to Microsoft and nothing bad can happen):~~
~~\https://tb.rg-adguard.net/public.php~~
As for the activation, I have found that the My Digital Life Forums usually have good activators available:
https://forums.mydigitallife.net/forums/kms-and-other-tools.51/
Edit: Sorry, it looks like TechBench is dead, but this site seems to work the same way:
https://massgrave.dev/msdl/