this post was submitted on 27 Mar 2024
131 points (93.4% liked)

Technology


Grumbles about generative AI's shortcomings are coalescing into a "trough of disillusionment" after a year and a half of hype about ChatGPT and other bots.
Why it matters: AI is still changing the world — but improving and integrating the technology is raising harder and more complex questions than first envisioned, and no chatbot has the magic answers.
Driving the news: The hurdles are everything from embarrassing errors, such as extra fingers or Black founding fathers in generated images, to significant concerns about intellectual property infringement, cost, environmental impact and other issues.

[–] knightly@pawb.social 64 points 8 months ago (5 children)

It's only a matter of time 'til the "AI" bubble really pops and all those tech companies that fired too much of their workforce have to start hiring back like crazy.

[–] Docus@lemmy.world 39 points 8 months ago* (last edited 8 months ago) (3 children)

There are some bubbles that need popping, especially in boardrooms, but I work for a large tech company that has not fired anyone because of AI. Rather the opposite: we have been expanding our AI team over the last 5+ years and have delivered successful AI products. There is a lot more to AI than ChatGPT, which, while impressive as a proof of concept, is not actually useful to business.

[–] agressivelyPassive@feddit.de 18 points 8 months ago

I'm very skeptical that AI will deliver large-scale economic value.

The current boom is essentially fueled by free money. VCs pump billions into start-ups, and more established companies get billions in subsidies or get their customers to pay outrageous amounts based on promises. Still, I have yet to see a single AI product that is worth the hassle. The results are either not that good or way too expensive, and if you couldn't rely on open models paid for by VC money, you wouldn't be able to get anything off the ground.

[–] knightly@pawb.social 12 points 8 months ago

It's the same at my employer, which has wasted untold thousands on subscriptions to ChatGPT and Copilot. All we've gotten out of it so far is a script that takes in transaction data and spits out "customer loyalty recommendations"... as if we don't already have a marketing department for that. XD
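
For a sense of how little is going on under the hood, the whole thing boils down to roughly this (a hypothetical sketch: the file layout, column names, prompt, and model name are made up for illustration, not what we actually run):

```python
# Hypothetical sketch of the "customer loyalty recommendations" script:
# read some transactions, hand them to a hosted LLM, print whatever comes back.
# Requires: pip install openai, plus an OPENAI_API_KEY in the environment.
import csv
from openai import OpenAI

client = OpenAI()

# Illustrative CSV layout; real column names would differ.
with open("transactions.csv", newline="") as f:
    rows = list(csv.DictReader(f))

summary = "\n".join(
    f"{r['customer_id']}: {r['item']} (${r['amount']})" for r in rows[:100]
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # example model name, not an endorsement
    messages=[
        {"role": "system", "content": "You are a marketing analyst."},
        {
            "role": "user",
            "content": "Suggest customer loyalty recommendations "
            f"based on these transactions:\n{summary}",
        },
    ],
)
print(response.choices[0].message.content)
```

That's the kind of thing the marketing department could have written in an afternoon, which is rather the point.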

[–] CosmoNova@lemmy.world 4 points 8 months ago

I think most people don't understand the one fundamental thing about AI: ChatGPT, DALL-E and whatnot are just products produced by machine learning, not AI themselves. Machine learning is already doing a lot of work for science, and it's utterly unthinkable not to utilize it in fields like chemistry, for example. We only read about media-producing LLMs because we just consume so damn much media. Maybe that's something we should think about.

[–] agressivelyPassive@feddit.de 8 points 8 months ago (2 children)

Nobody fired workers because of AI; that's just the narrative so they don't have to say "we're running out of money".

[–] knightly@pawb.social 11 points 8 months ago

It's also downsizing from when tech companies went on a hiring spree during the early years of the pandemic.

[–] SharkAttak@kbin.social 3 points 8 months ago

A lot of top-level big brains thought they could fire people and replace them with AI, because to them they're like the robots they see in movies.

[–] the_ocs@lemmy.world 6 points 8 months ago

So changing the name to corp.ai wasn't clever?

[–] FaceDeer@fedia.io 6 points 8 months ago (1 children)

Not every new technology or shift in the economy is a "bubble" that's inevitably going to "pop" someday.

[–] knightly@pawb.social 16 points 8 months ago (1 children)

But this one definitely is.

It's like watching the Blockchain saga in fast-forward.

[–] FaceDeer@fedia.io -3 points 8 months ago* (last edited 8 months ago) (2 children)

But this one definitely is.

Such confidence. Why do you think so?

Many of the shifts that have happened in the economy are a result of capabilities that existing AI models demonstrably have right now, rather than anticipation of future developments. Even if no further developments happen, those existing capabilities aren't going to just "go away" again somehow.

Also worth noting, blockchains are still around and are doing just fine.

[–] NaibofTabr@infosec.pub 2 points 8 months ago* (last edited 8 months ago) (1 children)

blockchains are still around and are doing just fine.

[–] FaceDeer@fedia.io -3 points 8 months ago (1 children)

They are, though. The total market cap across cryptocurrencies right now is about $2.75 trillion.

[–] knightly@pawb.social 7 points 8 months ago (1 children)

You misspelled "Unlicensed Securities", and taking crypto scammers at their word when they tell you how much their bits are worth is an easy way to lose actual money. XD

[–] FaceDeer@fedia.io -5 points 8 months ago (2 children)

I'm just pointing out that they're still there. If it's a scam then at this point it's one of history's biggest and longest-running.

And whether any particular cryptocurrency qualifies as a security in any particular jurisdiction is a complicated question, some do and some don't. This is about cryptocurrency as a whole so calling them an unlicensed security would not be accurate.

[–] knightly@pawb.social 5 points 8 months ago (1 children)

Calling them a "currency" wouldn't be accurate either.

And the fact that they still exist, as a fraction of a shadow of their former hype, doesn't change the fact that they have accomplished none of their stated goals.

Not as an untraceable currency, not as a store of value, not as a medium of exchange, and most especially not as a thing to make government-issued money obsolete.

Cryptocurrency as a whole isn't worth the disk space it occupies.

[–] FaceDeer@fedia.io -3 points 8 months ago (1 children)

It isn't a "shadow", its current market cap is near its historical high point right now.

If you have no use for it then by all means ignore it. But calling it a bubble that has popped is simply factually inaccurate. Other people evidently find value in it.

[–] knightly@pawb.social 3 points 8 months ago (1 children)

And that's their problem. The price is up because anybody with sense and no morals wants to sell off their holdings for as much as possible before the next "market adjustment" leaves them hodling the bag.

[–] FaceDeer@fedia.io -2 points 8 months ago (1 children)

I have no interest in the specifics of why the price is up or down; I'm not a speculator. That's not the point of all this. The only point is that it's still there. Which it is.

[–] knightly@pawb.social 2 points 8 months ago (1 children)

But it isn't. It was never there in the first place. Even the tulip bubble still had tulips at the end, but crypto is just creaking along like a zombie on the inertia of hardware investments.

[–] FaceDeer@fedia.io 0 points 8 months ago* (last edited 8 months ago) (1 children)

By "hardware investments" I take it you mean mining rigs? You're two years out of date, Ethereum moved to proof-of-stake in 2022. It doesn't depend on special-purpose hardware any more, or any hardware of particularly significant quantity. Bitcoin still does does but it's a shrinking share of the total cryptocurrency ecosystem.

Again, if you aren't interested in cryptocurrencies then feel free to ignore them. But making confident pronouncements about them when you're not familiar with how they work or how they're used is not ignoring them. Poetic and emotional language is no substitute for knowledge.

[–] knightly@pawb.social 2 points 8 months ago* (last edited 8 months ago) (1 children)

No, I'm talking about "hardware" in the sense of the corporate infrastructure: exchanges, wallet operators, customer service, etc.

Almost nobody does their own crypto; they just make an account at Coinbase or something. This defeats the purpose of crypto by re-centralizing it, making the network susceptible to another Mt. Gox situation.

[–] FaceDeer@fedia.io 0 points 8 months ago (1 children)

People are only using crypto because so many people are using crypto? Okay.

[–] knightly@pawb.social 1 points 8 months ago (1 children)

Right?

If you're effectively forced to go through a corporate bank equivalent anyway, then what even is the point? You might as well invest through a local credit union; at least it will still be around when the hodlers finish divesting and let the remains of the crypto market collapse into insolvency.

[–] FaceDeer@fedia.io -1 points 8 months ago (1 children)

You're not forced to, though. It's an option.

I have no idea what you're trying to argue here. On the one hand you say cryptocurrency is moribund, but on the other hand now you're complaining about how there's a huge infrastructure for people using it.

This whole thing is a weird offshoot from a discussion on AI, for that matter. What's the relevance at this point?

[–] knightly@pawb.social 1 points 8 months ago (1 children)

An option that nobody uses isn't really an option.

It's clear that you're not getting my point, and I'm not sure how else I could explain it. Both of those things are true: crypto is dead, and the zombified corpse that remains is still trying to sell worthless tokens to what few gullible marks remain before the market gets wise and the bottom falls out.

The relevance to AI is that large language models are following the same trajectory. The whole market is propped up by big tech investments, and the firehose of money they have pointed into that bonfire will only last as long as it takes them to realize that language models aren't useful enough to justify their expense.

[–] FaceDeer@fedia.io 0 points 8 months ago (1 children)

And it's clear that you don't know how cryptocurrency is actually being used, so it's not a useful analogue to AI. Except to the point that you don't know how AI is being used either.

The things that LLMs are being used for are already justifying their expense; otherwise they wouldn't be used in the first place. There's no "bubble" to pop on the usage side of things where jobs have been replaced by it; AI isn't going to "go away." It's possible that some of the big providers like OpenAI or Anthropic might go out of business or get bought out, since that's always a risk for first movers in a field like this, but if they fail it will be because others have stepped up to provide those services even more cheaply.

[–] knightly@pawb.social 1 points 8 months ago* (last edited 8 months ago) (1 children)

No, they aren't. The things LLMs are being used for aren't significant enough to justify the costs. OpenAI is the most capital-intensive startup in Silicon Valley history and burns through almost a million bucks a day in data center operations alone. Its net income was -$540 million in 2022 and 2023 looks to be closer to losing a whole billion despite nonupling their income. They'll need to double their revenue again without raising costs just to start breaking even.

That kind of money-bonfire never lasts long.

[–] FaceDeer@fedia.io 1 points 8 months ago (1 children)

I literally addressed this in the comment you're responding to. The individual service providers don't matter, especially not first movers. This is an entire industry, it's not just one company.

[–] knightly@pawb.social 1 points 8 months ago (1 children)

Precisely. It isn't just one company blowing billions.

[–] FaceDeer@fedia.io 1 points 8 months ago (1 children)

You only mentioned OpenAI. That's one.

[–] knightly@pawb.social 1 points 8 months ago

If that's what you actually want, you can ask Google's LLM to list them for you.

[–] livus@kbin.social 3 points 8 months ago (1 children)

@FaceDeer

If it's a scam then at this point it's one of history's biggest and longest-running.

Kind of an overstatement. It hasn't even been 20 years. If it were a scam it'd be nowhere near the scale and timeframe of, say, Papal Indulgences.

This isn't anything to do with your wider argument btw, just me nitpicking. Didn't know you'd relocated to fedia.io, was that during the downtime?

[–] FaceDeer@fedia.io 1 points 8 months ago (1 children)

Heh, I suppose I can grant papal indulgence as a scam. Were I feeling edgy I could one-up that and label the church as a whole as a scam. But since the usual accusation leveled against cryptocurrency is "Ponzi scheme," I looked that up and noted that Madoff's the current record holder for one of those, at a mere $65 billion.

Yeah, the kbin.social week of downtime was the final nudge I needed to set up an alternative account here. But honestly, I was getting very frustrated with kbin.social's flakiness already before then. I appreciate Ernest's work, but something like kbin can't be a single-person show in the long run. I hope he does well, but now I don't have to reload the page every time I want to vote or comment.

[–] livus@kbin.social 1 points 8 months ago* (last edited 8 months ago)

I suppose I can grant papal indulgence

Heh, love your wording there. I actually started that train of thought with various "relics" like the vial of Christ's blood, statues weeping tears, etc., some of which were eventually debunked, iirc.

Ponzi schemes are quite possibly as old as granaries (or the pyramids), but I think a better contender for scope is something like the Mississippi Bubble, which was absolute madness...

Kbin really is a work in progress. I'm sticking with it, but I have backup accounts for the times when it's too buggy.

[–] KevonLooney@lemm.ee 2 points 8 months ago (1 children)

Blockchain is over 10 years old and still not used for its primary purpose: a currency for legal transactions. It's way too volatile and very few institutions accept it.

AI can't reach its promised capability of doing everything for us automatically because it isn't actually AI. It's just advanced Clippy and autocomplete. It can't replace anyone senior. It's just a crappy intern.

[–] FaceDeer@fedia.io 0 points 8 months ago* (last edited 8 months ago) (1 children)

Why do you think that's its primary purpose? It has lots of uses. The point is that it's doing fine, it hasn't "gone away." And if you need a non-volatile cryptocurrency for some purpose there are a variety of stablecoins designed to meet that need.

AI can't reach its promised capability of doing everything for us automatically

Your criterion for a "bubble popping" is that the technology doesn't grow to completely dominate the world and do everything that anyone has ever imagined it could do? That's a pretty extreme bar to hold it to; I don't know of any technology that's passed it.

It's just advanced Clippy and autocomplete. It can't replace anyone senior.

So it can replace people lower than "senior?" That's still quite revolutionary.

When spreadsheet and word processing programs became commonplace whole classes of jobs ceased to exist. Low-level accountants, secretarial pools, and so forth. Those jobs never came back because the spreadsheet and word processing programs are still working fine in that role to this day. AI's going to do the same to another swath of jobs. Dismissing it as "just advanced Clippy" isn't going to stop it from taking those jobs, it's only going to make the people who were replaced by it feel a little worse about what they previously did for a living.

[–] knightly@pawb.social 4 points 8 months ago (1 children)

Why do you think that's its primary purpose? It has lots of uses. The point is that it's doing fine, it hasn't "gone away."

Sure, that's why I only ever hear about unregulated securities when a scam makes the news. XD

Your criterion for a "bubble popping" is that the technology doesn't grow to completely dominate the world and do everything that anyone has ever imagined it could do? That's a pretty extreme bar to hold it to, I don't know of any technology that's passed it.

My criterion for a bubble pop is the sudden and intense disinvestment that occurs once the irrational exuberance wears out and the bean counters start writing off unprofitable debt.

Given that so-called "AI" is falling into the trough of disillusionment, I'd expect that disinvestment to begin in earnest within a few months.

So it can replace people lower than "senior?" That's still quite revolutionary.

No, it can't, because it isn't and cannot be made trustworthy. If you need a human to review the output for hallucinations then you might as well save yourself the licensing costs and let the human do the work in the first place.

[–] FaceDeer@fedia.io 2 points 8 months ago* (last edited 8 months ago) (1 children)

You've switched from saying that a cryptocurrency's "primary purpose" is as a currency for transactions to saying that they're securities; those are not remotely similar things.

Anyway. Are you aware that, assuming the Gartner hype cycle actually does apply here (it's not universal) and AI is really in the "trough of disillusionment", beyond that phase lies the "slope of enlightenment" wherein the technology settles into long-term usage? I feel like you're tossing terminology around in this discussion without knowing a lot about what it actually means.

No, it can't, because it isn't and cannot be made trustworthy. If you need a human to review the output for hallucinations then you might as well save yourself the licensing costs and let the human do the work in the first place.

If you think it can't replace anyone then why say "It can't replace anyone senior"?

Also, what licensing costs? Some AI providers charge service fees for using them, but as far as I'm aware none of them claim copyright over the output of LLMs. And there are open-weight LLMs you can run yourself on your own computer if you want complete independence.

[–] knightly@pawb.social 0 points 8 months ago (1 children)

You've switched from saying that a cryptocurrency's "primary purpose" is as a currency for transactions

That was someone else's argument, but it can't be denied that the original use-case envisioned blockchain securities as a currency.

Anyway. Are you aware that, assuming the Gartner hype cycle actually does apply here (it's not universal) and AI is really in the "trough of disillusionment", beyond that phase lies the "slope of enlightenment" wherein the technology settles into long-term usage? I feel like you're tossing terminology around in this discussion without knowing a lot about what it actually means.

I'm well aware, it's the slope down into the trough that will pop the bubble of "AI" overinvestment. I wouldn't be surprised if some application of LLM tech finds a profitable niche, but I'd be very surprised to see it in common use outside of automated copywriting for scammers.

No, it can't, because it isn't and cannot be made trustworthy. If you need a human to review the output for hallucinations then you might as well save yourself the licensing costs and let the human do the work in the first place.

If you think it can't replace anyone then why say "It can't replace anyone senior"?

That wasn't me.

Also, what licensing costs? Some AI providers charge service fees for using them,

Those ones.

but as far as I'm aware none of them claim copyright over the output of LLMs.

Hasn't it already been ruled that LLM outputs cannot be copyrighted, or was that just patents and I'm misremembering?

And there are open-weight LLMs you can run yourself on your own computer if you want complete independence.

Ah yes, because rolling your own unreliable text generator is so much less expensive. XD

[–] FaceDeer@fedia.io 0 points 8 months ago

That wasn't me.

Apologies, you're right. I took care to double check the wording but neglected to spot the different username.

Hasn't it already been ruled that LLM outputs cannot be copyrighted, or was that just patents and I'm misremembering?

No, there have been a lot of misleading news articles with headlines like that, but nothing like that has been decided in any jurisdiction that I'm aware of.

The most popular news story to get headlines like that is the Thaler v. Perlmutter case; if you do a Google search for that you'll find an endless stream of "U.S. court holds that AI-generated works cannot be copyrighted" headlines. But that's not remotely what the case was actually about. What actually happened was that Thaler generated an image using an AI and then went to the US Copyright Office to register the copyright *in the name of the program that generated it*. That is, he went to the Copyright Office and told them "the copyright for this work is solely held by the computer that generated it. Nobody else was involved in its creation." The Copyright Office responded "that's silly, copyright can only be held by a person (human or corporate). A computer is not a person." Since the list of copyright holders Thaler was claiming was therefore empty, the Copyright Office ruled that the work must be in the public domain.

Thaler sued, and in the subsequent court case he tried to add himself to the list of copyright-holders. The judge said "no, that's not what this suit is about, knock it off. You told the copyright office you didn't hold a copyright to that work, and as a result their ruling that the work was uncopyrighted was correct."

If he'd tried to claim copyright for himself from the start there wouldn't have been any problem. There have been other instances where humans have registered copyrights for works that they used an AI to generate. The only reason Thaler failed was because he specifically and explicitly said that he wasn't claiming copyright over it. This has unfortunately turned into one of those "suing McDonalds for making their coffee hot" semi-urban-legends.

And even if a U.S. court did make a ruling along those lines, the U.S. isn't the whole world. There are plenty of countries out there that would be happy to take the lead instead if the U.S. decided it didn't want to be supportive of local AI-driven industry.

Ah yes, because rolling your own unreliable text generator is so much less expensive. XD

It really is. I run LLMs on my home computer myself, for fun, using a commodity graphics card. The models it can run don't quite reach ChatGPT's level of sophistication, but they're close, and they have the advantage that I can control them much more precisely to perform the tasks that I want them to perform. If I wanted to use a more sophisticated open model, there are cloud providers that could run it for me for pennies; I just like having the hardware completely under my control.
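
If "running an LLM at home" sounds exotic, it isn't. Here's a minimal sketch using the Hugging Face transformers library; the model name is just one example of a small open-weight model, not a specific recommendation:

```python
# Minimal local text generation with an open-weight model.
# Requires: pip install transformers torch
from transformers import pipeline

# Any small open-weight chat model will do; TinyLlama is one example
# that fits comfortably on a commodity GPU (pass device=0 to use it;
# the default runs on CPU, just slowly).
generator = pipeline(
    "text-generation",
    model="TinyLlama/TinyLlama-1.1B-Chat-v1.0",
)

prompt = "Explain proof-of-stake in one paragraph."
result = generator(prompt, max_new_tokens=200, do_sample=True, temperature=0.7)
print(result[0]["generated_text"])
```

Tools like llama.cpp or Ollama make it even more of a one-liner, but the point is the same: the weights are downloadable and the hardware requirements are modest.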

[–] romp_2_door@lemmy.world 4 points 8 months ago (1 children)

hiring back to do what?

they don't even need that much staff

[–] knightly@pawb.social 1 points 8 months ago

hiring back to do what?

Generate revenue for the shareholders.

they don't even need that much staff

There's more work to be done than there are people to do it all, lol~