this post was submitted on 15 Sep 2024
263 points (89.7% liked)


Hard to believe it's been 24 years since Y2K (2000). It feels like we've come such a long way, but this decade started off very poorly with one of the worst pandemics the modern world has ever seen, and technology in general is looking very bleak in several ways.

I'm a PC gamer, and it looks like things are stagnating massively in our space. So many gaming companies are incapable of putting out a successful AAA title, because people are either too poor or don't want to play yet another live-service AAA disaster like every single one released lately: Call of Duty, Battlefield, and almost anything Electronic Arts or Ubisoft puts out either fails outright or undersells. So many gaming studios have been or are being shuttered, and Microsoft is basically one member of an oligopoly with Sony and a couple of other companies.

Hardware is stagnating. Nvidia is putting on the brakes for developing its next line of GPUs; we're not going to see huge gains in performance anymore, because AMD hasn't caught up yet and Nvidia has no reason to innovate. So they'll just sell the top cards in their next line for $1,500 a pop with a 10% increase in performance, rather than the 50 or 60% we really need. We still don't have the capability to play games in full native 4K at 144 Hz. That's at least a decade away.

Virtual reality is on the verge of collapse. Meta is basically the only real player in that space, holding a near-monopoly alongside the Valve Index. Pico from China is on the verge of developing something incredible as well, and Apple just revealed a mixed-reality headset, but the price is so extraordinary that barely anyone has it, so use isn't very widespread. We're again a decade away from seeing anything really substantial in terms of performance.

Artificial intelligence is really, really fucking things up in general, and the discussion around AI looks almost as bad as the news about the latest election in the USA. It's so clowny, ridiculous, and over-the-top hearing any news about AI. The latest news is that OpenAI is going to go from a non-profit to a for-profit company, after promising they were operating for the good of humanity and breaking countless laws stealing copyrighted material, supposedly for the public good. Now they're just going to snap their fingers and morph into a for-profit company. So they can basically steal anything copyrighted, claim it's for the public good, and then swap to a for-profit model. It doesn't make any sense, and it just looks like they're going to be a vessel for widespread economic poverty...

It just seems like there are a lot of bubbles about to burst all at the same time. I don't see how things can possibly get better for a while now.

[–] frezik@midwest.social 54 points 2 months ago (6 children)

. . . with 10% increase in performance rather than 50 or 60% like we really need

Why is this a need? The constant push for better and better has not been healthy for humanity or the planet. Exponential growth was always going to hit a ceiling. The limit on Moore's Law has been more to the economic side than actually packing transistors in.
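The "exponential growth was always going to hit a ceiling" point is easy to see with a bit of arithmetic. Here's a minimal sketch of Moore's-Law-style scaling, assuming the classic roughly-two-year doubling period; the starting transistor count is a made-up illustrative figure, not real fab data:

```python
# Moore's Law sketch: transistor counts doubling roughly every two years.
# The starting count below is an illustrative assumption, not real fab data.

def projected_transistors(base_count: int, years: float, doubling_period: float = 2.0) -> int:
    """Project a transistor count forward under idealized Moore's Law scaling."""
    return int(base_count * 2 ** (years / doubling_period))

# Starting from an assumed 10 billion transistors on a flagship die:
start = 10_000_000_000
for years in (2, 10, 20):
    print(years, projected_transistors(start, years))
# 2 years -> 20 billion, 10 years -> 320 billion, 20 years -> 10.24 trillion
```

Twenty years of uninterrupted doubling would mean a thousandfold increase, which is exactly the kind of curve that has to flatten out somewhere, whether the wall is physical or economic.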

We still don’t have the capability to play games in full native 4K 144 Hertz. That’s at least a decade away

Sure you can, today, and here's why:

So many gaming companies are incapable of putting out a successful AAA title because . . .

Regardless of the reasons, the AAA space is going to have to pull back. Which is perfectly fine by me, because their games are trash. Even the good ones are often filled with microtransaction nonsense. None of them have innovated anything in years; that's all been happening at the indie level, which is where the real party is.

Would it be so bad if graphics were locked at the PS4 level? Comparable hardware can run some incredible games built on 50 years of development, and we're not even close to exhausting the new types of games that could run on it. Planet X2 is a recent RTS that runs on a Commodore 64. The genre didn't really exist in that machine's day, and the control scheme is a bit wonky, but it's playable. If you can essentially backport a whole genre to the C64, what could we do with PS4-level hardware that we just haven't thought of yet?

Yeah, there will be worse graphics because of this. Meh. You'll have native 4K/144Hz just by nature of pulling back on pushing GPUs. Even big games like Rocket League, LoL, and CS:GO have been doing this by not pushing graphics as far as they can go. Those games all look fine for what they're trying to do.

I want smaller games with worse graphics made by people who are paid more to work less, and I'm not kidding.

[–] barsoap@lemm.ee 2 points 2 months ago* (last edited 2 months ago)

The limit on Moore’s Law has been more to the economic side than actually packing transistors in.

Those economic limits exist because we're reaching the limit of what's physically possible. Fabs are still squeezing more transistors into less space, for now, but the cost per transistor hasn't fallen for some time; IIRC somewhere around 10nm is still the most economical node. Things just get difficult and exponentially fickle the smaller you go, and at some point there's going to be a wall. Notably, current progress is more about things like backside power delivery and die-on-die packaging than actually shrinking anything.

Long story short: node shrinks aren't the low-hanging fruit any more. They haven't been since the end of planar transistors (if it had been possible to just keep shrinking back then, nobody would have engineered FinFETs), but the trend has really picked up speed since the start of the EUV era. Finer and finer pitches don't matter much if you need more and more lithography/etching/coating steps because the structures you're building are getting ever more involved in the z axis; every additional step costs additional machine time. On the upside, newer production lines can spit out older nodes at pretty much printing-press speed.
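The "more steps eat the density gains" argument can be sketched as a toy cost model: wafer cost scales roughly with the number of process steps, while transistors per wafer scale with density. All numbers below are made up purely for illustration, not real foundry data:

```python
# Toy model of why finer pitches stop paying off: density gains can be
# offset by extra lithography/etch/coating steps. All figures here are
# hypothetical illustrations, not real foundry data.

def cost_per_transistor(relative_density: float, steps: int, cost_per_step: float = 1.0) -> float:
    """Wafer cost grows with process steps; transistor count grows with density."""
    wafer_cost = steps * cost_per_step
    return wafer_cost / relative_density

# Hypothetical mature node: baseline density, 60 process steps.
old_node = cost_per_transistor(relative_density=1.0, steps=60)
# Hypothetical newer node: 1.8x denser, but 120 steps (multi-patterning, more layers).
new_node = cost_per_transistor(relative_density=1.8, steps=120)

print(old_node, new_node)
# The "shrink" packs transistors tighter yet costs MORE per transistor.
```

With these assumed figures, the newer node is denser but more expensive per transistor, which is the shape of the problem the comment describes: the shrink still happens, but it stops being the economical choice.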
