this post was submitted on 25 Jul 2024
1005 points (97.5% liked)

Technology

The new global study, conducted in partnership with The Upwork Research Institute, surveyed 2,500 C-suite executives, full-time employees, and freelancers worldwide. The results show that optimistic expectations about AI's impact are not aligning with the reality faced by many employees. The study identifies a disconnect between the high expectations of managers and the actual experiences of employees using AI.

Despite 96% of C-suite executives expecting AI to boost productivity, the study reveals that 77% of employees using AI say it has added to their workload and created challenges in achieving the expected productivity gains. Not only is AI increasing the workloads of full-time employees, it's hampering productivity and contributing to employee burnout.

[–] Womble@lemmy.world 2 points 4 months ago* (last edited 4 months ago) (1 children)

AI is a rounding error in terms of energy use. Creating ChatGPT-4 plus a whole year of worldwide usage comes out to less than 1% of the energy Americans burn driving in one day.

[–] FlyingSquid@lemmy.world 3 points 4 months ago (1 children)

I think I'll go with Yale over 'person on the Internet who ignored the water part.'

https://e360.yale.edu/features/artificial-intelligence-climate-energy-emissions

From that article:

Estimates of the number of cloud data centers worldwide range from around 9,000 to nearly 11,000. More are under construction. The International Energy Agency (IEA) projects that data centers’ electricity consumption in 2026 will be double that of 2022 — 1,000 terawatts, roughly equivalent to Japan’s current total consumption.

[–] Womble@lemmy.world 2 points 4 months ago* (last edited 4 months ago) (2 children)

Forgive me for not trusting an article that says that AI will use a petawatt within the next two years. Either the person who wrote it doesn't understand the difference between energy and power, or they are very sloppy.

ChatGPT took 50 GWh to train (source).

Americans burn 355 million gallons of gasoline a day (source), and at 33.5 kWh/gal (source) that comes out to roughly 12,000 GWh per day burnt as gasoline.
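
A quick back-of-the-envelope check of that multiplication (a sketch in Python; the 50 GWh and 355 million gallons/day figures are from the linked sources, not my own):

```python
# Rough check of the gasoline-vs-training comparison (figures from the thread's sources).
gasoline_gal_per_day = 355e6          # US gasoline burnt per day, gallons
energy_per_gal_kwh = 33.5             # energy content of gasoline, kWh per gallon

gasoline_gwh_per_day = gasoline_gal_per_day * energy_per_gal_kwh / 1e6
print(f"Gasoline energy: {gasoline_gwh_per_day:,.0f} GWh/day")        # ~11,900 GWh/day

training_gwh = 50                     # claimed one-off energy to train ChatGPT
print(f"Training vs one day of gasoline: {training_gwh / gasoline_gwh_per_day:.2%}")  # ~0.4%
```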

Water usage is more nuanced: depending on where the data centres are, it can either be a significant problem or not a problem at all. The water doesn't vanish, it just goes back into the air, but that can be problematic if it is a significant draw on local freshwater sources. E.g. using river water just before it flows into the sea: no issue; drawing from a ground aquifer in a desert: big problem.

[–] FlyingSquid@lemmy.world 2 points 4 months ago (1 children)

Training is already over. This has nothing to do with training, so that is irrelevant. This is about how much power is needed as it is used more and more. I think you know that.

Also, I'm not sure why you think that just because cars emit a lot of CO2, other sources that emit a lot of CO2, just less than cars, are somehow a good thing.

The water doesn't vanish, it just goes back into the air,

Cool, tell that to all the people who rely on glaciers for their fresh water. That only includes a huge percentage of people in India and China.

But really, what you're telling me is that studies and scientists are wrong and you're right. Cool. Good luck convincing people of that.

[–] Womble@lemmy.world 2 points 4 months ago* (last edited 4 months ago) (1 children)

This New Yorker article estimates ChatGPT usage at 0.5 GWh a day, which comes out to roughly 0.004% of the energy burnt just in vehicle gasoline per day in the USA (and that is for worldwide ChatGPT usage).
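
The percentage is just the ratio of those two daily figures (a minimal sketch; 0.5 GWh/day is the New Yorker estimate, ~12,000 GWh/day is the gasoline figure from above):

```python
# Daily worldwide ChatGPT inference energy vs. daily US gasoline energy.
chatgpt_gwh_per_day = 0.5             # New Yorker estimate: ~500,000 kWh/day
gasoline_gwh_per_day = 12_000         # from 355M gal/day * 33.5 kWh/gal (see above)

print(f"{chatgpt_gwh_per_day / gasoline_gwh_per_day:.4%}")   # ~0.0042%
```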

I'm not asking you to trust me at all; I've listed my sources. If you disagree with any of them, or with multiplying three numbers together, that's fine.

Cool, tell that to all the people who rely on glaciers for their fresh water. That only includes a huge percentage of people in India and China.

Yes, if you read my last reply, I answered that directly. Water usage can be a big issue, or it can be a non-issue; it's locale-dependent.

[–] FlyingSquid@lemmy.world 0 points 4 months ago* (last edited 4 months ago) (2 children)

What New Yorker article? You didn't link to one. I, however, linked to Yale University which has a slightly better track record on science than The New Yorker.

And, again, you are arguing that emitting less CO2 is a good thing. It is not.

And if water can be a big issue, why is AI a good thing when it uses it up? You can say "people shouldn't build data centers in those locations," but they are. And the world doesn't run on "shouldn't."

Edit: Now you linked to it. It's paywalled, which means I can't read it and I doubt you did either.

[–] Womble@lemmy.world 2 points 4 months ago* (last edited 4 months ago) (2 children)

Apologies, I didn't post the link, it's edited now.

If you want to take issue with all energy usage, that's fine; it's a position to take. But it's quite a fringe one, given that harnessing energy is what gives us the quality of life we have. Thankfully, electricity is one of the easiest forms of energy to decarbonise, and that is already happening rapidly with solar and wind power; we need to transition more of our energy usage to electricity in order to reduce fossil fuel use. My main point is that this railing against AI energy usage is akin to the plastic straw ban: mostly performative, and a distraction from the places where truly vast amounts of fossil fuels are burnt and need to be tackled urgently.

You can say “people shouldn’t build data centres in those locations,” but they are. And the world doesn’t run on “shouldn’t.”

I'm 100% behind forcing data centres to use sustainable water sources or other methods of cooling. But that is a far cry from AI energy consumption being a major threat; the vast majority of data centre usage isn't AI anyway, it's serving websites like the one we're talking on right now.

[–] rekorse@lemmy.world 1 points 3 months ago

Why can't we analyze AI on its own merits? We don't base our decisions on whether an idea is more or less polluting than automobiles. We can look at what we are getting for what's being put into it.

The big tech companies could scrap their AI tech today and it wouldn't change most people's lives.

[–] FlyingSquid@lemmy.world -1 points 4 months ago (1 children)

Apologies, I didn’t post the link, it’s edited now.

Yes, and it's paywalled, so I can't read it. I think you knew that. It could say anything.

I'm 100% behind forcing data centres to use sustainable water sources or other methods of cooling.

Cool, good luck with that happening.

But that is a far cry from AI energy consumption being a major threat,

A different subject from water. You keep trying to get away from the water issue. I also think you know why you're doing that.

Also, define threat. It contributes to climate change. It gets rid of potable water. I'd call that a threat.

By the way, there is nowhere in the U.S. where water is not going to be a problem soon.

https://geographical.co.uk/science-environment/us-groundwater-reserves-being-depleted-at-alarming-rate

But hey, we can just move the servers to the ocean, right? Or maybe outer space! It's cold!

[–] Womble@lemmy.world 1 points 4 months ago (1 children)

OK, you just want to shout, not discuss, so I won't engage any further.

[–] FlyingSquid@lemmy.world -1 points 4 months ago* (last edited 4 months ago)

That's a nice cop-out, since nothing I said could remotely be considered shouting, and your New Yorker article in no way supported your point.

[–] Womble@lemmy.world 2 points 4 months ago (1 children)

Whole article for reference, since you can't access it for whatever reason (it's not very nice assuming bad faith like that, btw):

In 2016, Alex de Vries read somewhere that a single bitcoin transaction consumes as much energy as the average American household uses in a day. At the time, de Vries, who is Dutch, was working at a consulting firm. In his spare time, he wrote a blog, called Digiconomist, about the risks of investing in cryptocurrency. He found the energy-use figure disturbing.

“I was, like, O.K., that’s a massive amount, and why is no one talking about it?” he told me recently over Zoom. “I tried to look up some data, but I couldn’t really find anything.” De Vries, then twenty-seven, decided that he would have to come up with the information himself. He put together what he called the Bitcoin Energy Consumption Index, and posted it on Digiconomist. According to the index’s latest figures, bitcoin mining now consumes a hundred and forty-five billion kilowatt-hours of electricity per year, which is more than is used by the entire nation of the Netherlands, and producing that electricity results in eighty-one million tons of CO2, which is more than the annual emissions of a nation like Morocco. De Vries subsequently began to track the electronic waste produced by bitcoin mining—an iPhone’s worth for every transaction—and its water use—which is something like two trillion litres per year. (The water goes toward cooling the servers used in mining, and the e-waste is produced by servers that have become out of date.)

Last year, de Vries became concerned about another energy hog: A.I. “I saw that it has a similar capability, and also the potential to have a similar growth trajectory in the coming years, and I felt immediately prompted to make sure people are aware that this is also energy-intensive technology,” he explained. He added a new tab to his blog: “AI sustainability.” In a paper he published last fall, in Joule, a journal devoted to sustainable energy, de Vries, who now works for the Netherlands’ central bank, estimated that if Google were to integrate generative A.I. into every search, its electricity use would rise to something like twenty-nine billion kilowatt-hours per year. This is more than is consumed by many countries, including Kenya, Guatemala, and Croatia.

“There’s a fundamental mismatch between this technology and environmental sustainability,” de Vries said. Recently, the world’s most prominent A.I. cheerleader, Sam Altman, the C.E.O. of OpenAI, voiced similar concerns, albeit with a different spin. “I think we still don’t appreciate the energy needs of this technology,” Altman said at a public appearance in Davos. He didn’t see how these needs could be met, he went on, “without a breakthrough.” He added, “We need fusion or we need, like, radically cheaper solar plus storage, or something, at massive scale—like, a scale that no one is really planning for.”

Last week, the International Energy Agency announced that energy-related global CO2 emissions rose, yet again, in 2023, to more than thirty-seven billion metric tons. The increase comes at a time when the whole world is supposedly striving to reach net-zero emissions, and it indicates that global efforts are, to put it mildly, falling short. Much of the increase in emissions came from China, and most of it was driven by century-old technologies, such as the internal-combustion engine. So data centers are, for now at least, a small part of the problem. Still, as the use of A.I. ramps up and bitcoin prices reach new heights, the question is: How can the world reach net zero if it keeps inventing new ways to consume energy? (In the U.S., data centers now account for about four per cent of electricity consumption, and that figure is expected to climb to six per cent by 2026.)

Mining cryptocurrencies like bitcoin eats up electricity owing to the way the system was set up. To acquire bitcoin (and other currencies that rely on a similar scheme), miners compete to answer cryptographic riddles. Winning the competition takes a lot of computing power. As a result, server farms devoted to crypto mining tend to be situated in parts of the world where electricity is cheap. China used to lead the world in crypto mining, but it imposed a ban on the practice in 2021, and now the U.S. is No. 1. A few months ago, the U.S. Department of Energy tried to compel mining concerns to report their energy use, but in February a Texas judge issued a temporary restraining order blocking the effort. (According to the White House Office of Science and Technology Policy, crypto mining in the U.S. uses almost as much energy as all the nation’s home computers combined.) Meanwhile, the higher the price of bitcoin rises—it reached a record of sixty-nine thousand dollars on March 5th—the bigger the financial incentives for mining it, and the more energy consumed.

Artificial intelligence requires a lot of power for much the same reason. The kind of machine learning that produced ChatGPT relies on models that process fantastic amounts of information, and every bit of processing takes energy. When ChatGPT spits out information (or writes someone’s high-school essay), that, too, requires a lot of processing. It’s been estimated that ChatGPT is responding to something like two hundred million requests per day, and, in so doing, is consuming more than half a million kilowatt-hours of electricity. (For comparison’s sake, the average U.S. household consumes twenty-nine kilowatt-hours a day.)

A.I. could potentially be used to alleviate some of the problems it is exacerbating. For instance, it might be used to improve the efficiency of renewable-energy systems, which could reduce emissions from server farms. But it seems unlikely that such gains will keep up with A.I.’s growing electricity demands; this, presumably, is why Altman argues that a technological breakthrough is needed.

De Vries, for his part, is dismayed by what he sees as a lack of human learning in the face of so much machine learning. “I think the only thing that’s realistic in terms of policy, at least in the short to medium term, is disclosure requirements,” he said. “It’s taken a very long time before we got there with regard to cryptocurrencies, and I’m disappointed that we haven’t gotten there sooner with A.I. It’s like we saw what cryptocurrency mining could do, and we totally forgot about it.” ♦

[–] FlyingSquid@lemmy.world 0 points 4 months ago* (last edited 4 months ago) (1 children)

Your link is just about Google's energy use, still says it uses a vast amount of energy, and says that A.I. is partially responsible for climate change.

It even quotes that moron Altman saying that there's not enough energy to meet their needs and something new needs to be developed.

I have no idea why you think this supports your point at all.

[–] Womble@lemmy.world 2 points 4 months ago (1 children)

Artificial intelligence requires a lot of power for much the same reason. The kind of machine learning that produced ChatGPT relies on models that process fantastic amounts of information, and every bit of processing takes energy. When ChatGPT spits out information (or writes someone’s high-school essay), that, too, requires a lot of processing. It’s been estimated that ChatGPT is responding to something like two hundred million requests per day, and, in so doing, is consuming more than half a million kilowatt-hours of electricity. (For comparison’s sake, the average U.S. household consumes twenty-nine kilowatt-hours a day.)

That was the only bit I was referring to, as a source for the 0.5 GWh per day energy usage for GPT. I agree what Altman says is worthless, or worse, deliberately manipulative to keep the VC money flowing into OpenAI.
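
For reference, the 0.5 GWh/day figure follows directly from the numbers in that paragraph (a sketch; the per-request and household comparisons are my own arithmetic from the article's figures):

```python
# Numbers taken from the quoted New Yorker paragraph.
kwh_per_day = 500_000                 # "more than half a million kilowatt-hours" per day
requests_per_day = 200e6              # "something like two hundred million requests per day"
household_kwh_per_day = 29            # average US household consumption per day

print(kwh_per_day / 1e6)                          # 0.5 GWh/day
print(kwh_per_day * 1000 / requests_per_day)      # ~2.5 Wh per request
print(kwh_per_day / household_kwh_per_day)        # ~17,000 households' daily usage
```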

[–] FlyingSquid@lemmy.world -1 points 4 months ago

I see, so if we ignore the rest of the article entirely, your point is supported. What an odd way of trying to prove a point.

Also, I guess this was a lie:

Ok, you just want to shout not discuss so I wont engage any further.

Although since it was a lie, I'd love you to tell me what you think I was shouting about.

[–] rekorse@lemmy.world 1 points 3 months ago

They aren't just taking water no one was using.