this post was submitted on 26 Aug 2024
168 points (93.8% liked)


As we all know, AC won the "War of the Currents". The reasoning behind this is that AC voltage is easy to step up and down with just a ring of iron and two coils (a transformer), and high voltage lets us transmit power over long distances with less loss.
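
To make the "less loss" part concrete, here's a minimal sketch (the power, voltage, and line-resistance figures are made up) showing why stepping the voltage up cuts resistive line loss:

```python
# Minimal sketch with made-up numbers: delivering the same power at a higher
# voltage means less current, and resistive line loss scales with current
# squared (P_loss = I^2 * R).

def line_loss(power_w: float, voltage_v: float, line_resistance_ohm: float) -> float:
    """Resistive loss in the line for a given delivered power and voltage."""
    current_a = power_w / voltage_v               # I = P / V
    return current_a ** 2 * line_resistance_ohm   # P_loss = I^2 * R

P = 1_000_000   # 1 MW to deliver (assumed figure)
R = 5.0         # total line resistance in ohms (assumed figure)

for v in (10_000, 100_000):
    loss = line_loss(P, v, R)
    print(f"{v/1000:>5.0f} kV: loss = {loss/1000:.1f} kW ({100*loss/P:.2f} % of delivered power)")
# Stepping the voltage up by 10x cuts the current by 10x and the loss by 100x.
```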

Now, the War of the Currents happened around 1900, and our technology has improved a lot since then. We have useful diodes and transistors now, we have microcontrollers and buck/boost converters. We can convert DC voltages efficiently today.
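
As a rough illustration of how modern DC-DC conversion works, here is an idealised buck (step-down) converter model, V_out ≈ D · V_in, with losses ignored and the 48 V bus purely an assumption:

```python
# Rough illustration of an *ideal* buck (step-down) converter: the output
# voltage is roughly the input voltage times the switching duty cycle D (0..1).
# Real converters add losses, but modern ones are typically well above 90 % efficient.

def ideal_buck_output(v_in: float, duty_cycle: float) -> float:
    """Ideal buck converter: V_out = D * V_in (losses ignored)."""
    assert 0.0 <= duty_cycle <= 1.0
    return duty_cycle * v_in

v_in = 48.0  # e.g. a hypothetical 48 V DC bus
for d in (0.10, 0.25, 0.50):
    print(f"duty cycle {d:.2f}: {ideal_buck_output(v_in, d):.1f} V out")
# A microcontroller adjusts the duty cycle to hold the output steady as the load changes.
```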

Additionally, photovoltaics naturally produce DC. Whereas a traditional generator has an easier time producing AC, a photovoltaic plant has to convert its output to AC, which, if I understand correctly, incurs a massive loss.

And then there's the issue of stabilizing the frequency. When you have one big producer (a single large hydroelectric dam or coal power plant), stabilizing the frequency is trivial, because you only have to talk to yourself. When you have 100,000 small producers (say everyone in a larger area has photovoltaics on their roof), stabilizing the frequency suddenly becomes more challenging, because everybody has to work in exactly the same rhythm.

I wonder: would it make sense to change our power grid from AC to DC today? I know it would obviously be a lot of work, since every consuming device would have to change what power it accepts from the grid. But in the long run, could it be worth it? Also, what about insular networks (island grids)? Would it make sense there? Thanks for taking the time to read this, and also, I'm willing to go into the maths if that's relevant to the discussion.

[–] Ebby@lemmy.ssba.com 51 points 2 months ago (9 children)

I heard it said many years ago that if DC had won the battle, we'd have power stations every 10 miles and power lines as thick as your wrist.

Converting local power is fairly easy, with AC inverters added for universal compatibility.

But take note of how many DC voltages you use in your house. Devices in mine range from 3 V to 25 V, plus some weird ones like 19 V for a laptop. You'd still have adapters all over the place.

[–] gandalf_der_12te@lemmy.blahaj.zone 0 points 2 months ago (6 children)

Okay, these are short-term problems. "Power lines as thick as your wrist" depends on the voltage. If voltage conversion works well enough, that issue disappears.

But take note of how many DC voltages you use in your house. Devices in mine range from 3 V to 25 V, plus some weird ones like 19 V for a laptop.

Yeah, that's why we need some kind of standard for these things.

[–] Ebby@lemmy.ssba.com 16 points 2 months ago (1 children)

Ha! Yes! Even today, USB's 5 volts is pretty sweet for low-power stuff. USB PD re-complicates things, but it's not user-dependent, so that's a plus.

And you need a loooot of copper to prevent voltage drop, especially when a grid of 100 houses stretching half a mile draws 20-80 amps each. The math adds up real quick.
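
To put rough numbers on that, here's a crude sketch (feeder length, current, and conductor sizes are all assumed) of how voltage drop scales with copper cross-section at low distribution voltage:

```python
# Back-of-the-envelope voltage drop with made-up but plausible numbers:
# a feeder run of roughly half a mile, and how much copper cross-section it
# takes to keep the drop reasonable at a given current.

RHO_COPPER = 1.68e-8       # resistivity of copper, ohm*meter
LENGTH_M = 800.0           # ~half a mile one way (assumed)
ROUND_TRIP_M = 2 * LENGTH_M  # current flows out and back

def voltage_drop(current_a: float, cross_section_mm2: float) -> float:
    area_m2 = cross_section_mm2 * 1e-6
    resistance = RHO_COPPER * ROUND_TRIP_M / area_m2
    return current_a * resistance  # V = I * R

for area in (35.0, 150.0, 500.0):    # conductor cross-sections in mm^2
    drop = voltage_drop(100.0, area)  # assume 100 A flowing on the feeder
    print(f"{area:>5.0f} mm^2: {drop:6.1f} V drop at 100 A ({100*drop/240:.1f} % of 240 V)")
# At low voltage the drop eats a big fraction of the supply unless the copper
# gets very thick - and a real feeder serving 100 houses carries far more than 100 A.
```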

[–] bastion@feddit.nl 3 points 2 months ago* (last edited 2 months ago) (1 children)

I mean, you need a lot of voltage to make voltage drop irrelevant. Like, 120 or 240 volts. If the distribution voltage is the same for DC as for AC, we could use the same wiring (but different breakers, and everything else).

So the wiring argument doesn't really hold up - the question is more about efficient converters to reduce voltage once it's at the house.

I.e., for typical American distribution, it's 240 in the neighborhood and drops to 120 in the house. If DC does the same, the same amount of power can be drawn along the existing wires.
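
As a quick sanity check of the "same wires" idea (idealised purely resistive load, numbers assumed): AC power is reckoned with RMS voltage, so DC at the same voltage as the AC RMS value stresses the conductor the same way:

```python
# Sanity check of the "same wires" idea for an idealised purely resistive load:
# AC power is computed with RMS voltage, so DC at a voltage equal to the AC RMS
# value pushes the same current and dissipates the same heat in the same conductor.

import math

V_RMS = 120.0                  # US household RMS voltage
V_PEAK = V_RMS * math.sqrt(2)  # ~170 V peak for that AC waveform
LOAD_OHM = 10.0                # assumed resistive load (e.g. a heater)

ac_current = V_RMS / LOAD_OHM  # RMS current
dc_current = V_RMS / LOAD_OHM  # DC at 120 V gives the identical current
print(f"AC 120 V RMS (~{V_PEAK:.0f} V peak): {ac_current:.1f} A, {V_RMS*ac_current:.0f} W")
print(f"DC 120 V:                   {dc_current:.1f} A, {V_RMS*dc_current:.0f} W")
# Same current, same I^2*R heating in the wire, so the conductor sizing carries
# over; what changes is breakers, arc interruption, and the end devices.
```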

[–] Quatlicopatlix@feddit.org 1 points 2 months ago (1 children)

Yeah, have fun transmitting a decent amount of power at 240 V over a meaningful distance. Also, most generators produce AC anyway, so why would you rectify it at the generator instead of at your device, after a transformer? You still need all kinds of different voltages everywhere in your electronics, which means you still need to regulate it.

I am not sure how the American wiring works out, but to get from 240 to 120 you still need a transformer... or is it 240 V between the different phases and then 120 from phase to neutral?

[–] bastion@feddit.nl 1 points 2 months ago (1 children)

240 in the neighborhood - i.e., that's enough to distribute from the pole to a few houses. Of course you have higher voltages to go longer distances. This is equally true for AC vs DC. Thus, the idea that it takes a looot of copper for DC is erroneous.

In fact, where conductor size is relevant, DC comes out ahead: you can use smaller conductors, because DC doesn't suffer from the skin effect.
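
For reference, skin depth in a conductor is δ = sqrt(ρ / (π f μ)); a small sketch with standard copper constants gives roughly 9 mm at mains frequency:

```python
# Skin depth in copper at mains frequency: delta = sqrt(rho / (pi * f * mu)).
# AC current effectively flows only in the outer "skin" of thickness delta, so
# the effect matters once conductor radius approaches or exceeds delta.

import math

RHO_COPPER = 1.68e-8        # resistivity of copper, ohm*meter
MU_0 = 4 * math.pi * 1e-7   # permeability of free space (copper is ~non-magnetic)

def skin_depth_m(freq_hz: float) -> float:
    return math.sqrt(RHO_COPPER / (math.pi * freq_hz * MU_0))

for f in (50.0, 60.0):
    print(f"{f:.0f} Hz: skin depth ~ {skin_depth_m(f)*1000:.1f} mm")
# ~9 mm at 50 Hz: negligible for house wiring, but it shaves a few percent off
# the effective cross-section of the thick conductors used on transmission lines.
```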

Wiring: split phase, which is also usable as 240 for large appliances. So, the latter.

[–] Quatlicopatlix@feddit.org 1 points 2 months ago

Skin effect at 50 Hz? Yeah, no, not much.

Ok, so every time you change the voltage level you still need an inverter to create AC and a transformer, so no, it doesn't make any sense.
