Trainguyrom

joined 1 year ago
[–] Trainguyrom@reddthat.com 2 points 2 weeks ago* (last edited 2 weeks ago)

I just accepted a job with a small MSP starting early next year. I kept a close ear out during the interview for signs of the classic MSP-hell stuff that chews through techs, and it does look like I got a good one (a small shop of eight or so), but check in with me in about 3 months and we'll see how I'm feeling haha

My longer-term plan is to use this as a stepping stone to move on to an in-house role, then figure out my exit strategy before burnout takes me. I'm thinking I'll either aim for IT management or possibly a business analytics or cloud administration type role. Technical sales probably wouldn't be too bad either.

[–] Trainguyrom@reddthat.com 3 points 2 weeks ago

even if there was 1000+ mg of pure THC per slice, I’m still not worried cause it is impossible to overdose

See, this is the conventional wisdom, but I'm skeptical, because we've hit a similar point with capsaicin. After a few years of an arms race to make the hottest peppers in the world, making peppers hotter by orders of magnitude than anything that previously occurred in nature, we now have a couple of individuals who have died as a result of eating extremely spicy food (granted, it exposed underlying health conditions, but they would very likely still be alive today if they hadn't eaten overly spicy food). So now there's some question about the conventional wisdom that spicy food can't kill you, and I seriously suspect we'll see the same with THC sooner or later.

[–] Trainguyrom@reddthat.com 3 points 2 weeks ago (2 children)

Nobody knows the dosage in the pizza for sure, since it was a cooking accident that it was dosed with THC to begin with. You don't really measure olive oil when you cook with it, and the distribution wouldn't be even either. So even if you make a guess based on roughly how much oil you used and the concentration of THC in the oil, the oil might have simply pooled more on one side of the pan if it was used as a non-stick coating, or mixed unevenly into the dough if they normally mix olive oil in.

With the quantities involved it's just impossible to reliably guess the dosage that any affected product might have, and with any kind of drug, recreational or not, the dosage absolutely matters a ton.
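To make that estimation problem concrete, here's a back-of-the-envelope sketch. Every number in it is made up purely for illustration; nobody actually knows the real figures in this case, which is the point:

```python
# Rough per-slice dose estimate under the (unrealistic) assumption of
# perfect, even mixing. All input values are hypothetical.
def estimate_dose_per_slice(oil_ml: float, thc_mg_per_ml: float, slices: int) -> float:
    """Total THC in the oil divided evenly across slices."""
    return oil_ml * thc_mg_per_ml / slices

# e.g. 30 ml of oil at 20 mg/ml spread across 8 slices:
even_split = estimate_dose_per_slice(30, 20, 8)  # 75.0 mg per slice

# Pooling breaks the even-split assumption: one slice might carry a small
# fraction or several times the average, so an honest estimate is a wide range.
low_guess, high_guess = even_split * 0.2, even_split * 3.0  # 15 mg to 225 mg
```

Even with generous assumptions, the plausible range spans more than an order of magnitude, which is why "guess the dose" just doesn't work here.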

[–] Trainguyrom@reddthat.com 5 points 2 weeks ago* (last edited 2 weeks ago)

Especially with how normal memory tiering is nowadays, particularly in the datacenter (Intel's bread and butter), now that you can stick a box of memory on a CXL network and fill it with the memory from the last-gen servers you just retired, giving you a third or fourth tier of memory before swapping. And that's before the fun non-tiered stuff CXL enables. Really, CXL enables so much cool stuff that it's going to be incredible once it starts hitting small single-row datacenters.

[–] Trainguyrom@reddthat.com 4 points 2 weeks ago

The main reason Intel can’t compete is the fact CUDA is both proprietary and the industry standard

Funnily enough, this is actually changing because of the AI boom. Would-be buyers can't get Nvidia AI cards, so they're buying AMD and Intel and reworking their stacks as needed. It helps that there are also translation layers available now which translate CUDA and other otherwise vendor-specific stuff to the open protocols supported by Intel and AMD.

[–] Trainguyrom@reddthat.com 12 points 2 weeks ago* (last edited 2 weeks ago) (1 children)

He’s not wrong that GPUs in the desktop space are going away because SoCs are inevitably going to be the future. This isn’t because the market has demanded it or some sort of conspiracy, but literally we can’t get faster without chips getting smaller and closer together.

While I agree with you on a technical level, I read it as Pat Gelsinger intending to stop development of discrete graphics cards after Battlemage, which is disappointing but not surprising. Intel's GPUs, while incredibly impressive, simply face an uphill battle with desktop users, and particularly gamers, who expect every game they want to play to generally run without compatibility problems.

Ideally Intel would keep their GPU department going, because they have a fighting chance at holding a significant market share now that they're past the hardest hurdles, but they're in a hard spot financially, so I can't say I'd be surprised if they're forced to divest from discrete GPUs entirely.

[–] Trainguyrom@reddthat.com 7 points 2 weeks ago

Seriously putting a couple gigs of on-package graphics memory would completely change the game, especially if it does some intelligent caching and uses RAM for additional memory as needed.

I want to see what happens if Intel or AMD seriously let a generation rip with on-package graphics memory for the iGPU. The only real drawback I can see is if the power/thermal budget just isn't sufficient and it ends up with wonky performance, which I have seen on an overly thin-and-light laptop in my personal fleet. If I remember right it's got a Ryzen 2600, and it's so horribly thermally limited that it leaves a ton of performance on the table.

[–] Trainguyrom@reddthat.com 11 points 2 weeks ago* (last edited 2 weeks ago) (1 children)

To be fair, the Arm SoCs in phones use big.LITTLE cores, where the system enables/disables cores on the fly and moves software around so it's either running on the big high-performance cores or the little low-power cores, based on the power budget needs at that second. So effectively not all of those 6+ cores are available and in use at the same time on a phone.
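On Linux you can actually watch this happen: the kernel reports which cores are currently online as a range string in `/sys/devices/system/cpu/online` (e.g. `0-3,6`). A minimal sketch of parsing that format, with hypothetical example strings rather than reads from a real phone:

```python
def parse_cpu_list(s: str) -> list[int]:
    """Parse the kernel's CPU list format, e.g. '0-3,6' -> [0, 1, 2, 3, 6]."""
    cpus = []
    for part in s.strip().split(","):
        if "-" in part:
            lo, hi = part.split("-")
            cpus.extend(range(int(lo), int(hi) + 1))
        else:
            cpus.append(int(part))
    return cpus

# Under a tight power budget, the online set can be a subset of all cores,
# e.g. only a 4-core little cluster on a hypothetical 8-core big.LITTLE SoC:
print(parse_cpu_list("0-3"))    # [0, 1, 2, 3]
print(parse_cpu_list("0-3,6"))  # [0, 1, 2, 3, 6]
```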

[–] Trainguyrom@reddthat.com 2 points 3 weeks ago

I have to disagree. When I tried out a VR headset at a con, I spent 2 hours in Space Pirate Training Simulator thinking it had only been 20 minutes. This was on the $250 Meta Quest 2, while I had a heavy backpack on my back because I didn't have anyone with me to leave my bag with. I was trying to be conscious of not taking too much time with the headset so others could have a chance, and figured about 15-20 minutes would be appropriate, but apparently I was completely in the zone!

I can count on one hand how many times a game has warped my sense of time like that, so I'd say VR is a pretty dang cool experience, and once hardware costs come down (or headsets become more ubiquitous) it'll probably be a pretty big market for gamers, much like how consoles are now.

[–] Trainguyrom@reddthat.com -1 points 3 weeks ago (9 children)

They have a slim chance if they keep subsidizing VR headsets to hold a large and lucrative chunk of the VR market when it actually takes off. VR is genuinely cool enough that plenty of people will get hooked once they experience a headset on their face with a VR experience that clicks with them.

[–] Trainguyrom@reddthat.com 3 points 3 weeks ago

New business idea!

[–] Trainguyrom@reddthat.com 3 points 3 weeks ago

You raise a good point

Honestly, for me it's muscle memory from the Windows 95 days of "It's now safe to turn off your computer," but I also don't trust the OS to correctly interpret the ACPI signal sent by the power button 100% of the time. Obviously I'm not an average user, but I could see an average user consistently single-pressing the power button to turn off a computer.
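For what it's worth, on Linux at least, how the OS reacts to that ACPI press is itself a policy knob: systemd-logind decides what a short power-button press does. A sketch of the relevant setting (the value shown is just one of the supported options, not a recommendation):

```ini
# /etc/systemd/logind.conf
[Login]
# What a short power-button press does; supported values include
# ignore, poweroff, reboot, suspend, hibernate, lock
HandlePowerKey=poweroff
```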
