LostXOR

joined 1 year ago
[–] LostXOR@fedia.io 1 points 1 day ago

Either works fine; I just use screen out of habit lol.

[–] LostXOR@fedia.io 7 points 1 day ago (2 children)

I've always just run the server directly from the JAR in a screen session. If you're running a simple server and don't need the features of Pterodactyl, it's definitely the easiest option. Just download the server JAR from Minecraft's website into a new directory and run it with java -Xmx4G -Xms4G -jar minecraft_server.1.21.6.jar nogui (the page says 1 GB of RAM, but I'd recommend more if you have it available).
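For reference, here's the whole screen workflow as a sketch. The directory name and session name ("mc") are arbitrary, and the actual launch line is left commented since it needs the JAR downloaded first:

```shell
# Set up a directory for the server.
mkdir -p ~/mcserver && cd ~/mcserver
# Download minecraft_server.1.21.6.jar from minecraft.net into this directory,
# then accept the EULA (the server writes eula.txt and exits on first run otherwise):
echo "eula=true" > eula.txt
# Launch in a detached screen session named "mc" so it survives logout:
#   screen -dmS mc java -Xmx4G -Xms4G -jar minecraft_server.1.21.6.jar nogui
# Reattach with: screen -r mc    Detach again with: Ctrl-A then D
```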

[–] LostXOR@fedia.io 19 points 3 days ago (5 children)

Minecraft before the Microsoft acquisition.

[–] LostXOR@fedia.io 8 points 4 days ago
[–] LostXOR@fedia.io 23 points 4 days ago (2 children)

Isn't a million calories like 40L of gas? Anon's more gas than human lol.

[–] LostXOR@fedia.io 3 points 6 days ago

> it’s very likely actual electricity use will soon not be an issue, we are already flooded with too much solar power during the day, in maybe as little as 10 or so years we may have so much solar and battery we have essentially unlimited renewable power

That could be true in the future, but right now a large chunk of our energy comes from fossil fuels.

> on top of all of this ai is in its infancy, who knows where it will be at in 100 years, maybe jobs are a thing of the past and we just spend all day socialising and spending time on our hobbies

Any AI that can take over a significant amount of work from humanity is going to have an architecture fundamentally different from LLMs. They're not thinking; they're just spitting out plausible-sounding sentences.

[–] LostXOR@fedia.io 82 points 6 days ago (5 children)

Wtf did I just read

[–] LostXOR@fedia.io 38 points 6 days ago (2 children)

Natural selection is essentially just a massively parallel Monte Carlo optimization algorithm that's been running for billions of years: random variation generates candidates, and selection keeps the ones that score well. It's so simple yet produces such amazing complexity.
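The analogy fits in a few lines of Python. This is a toy mutate-and-select loop (all names and parameters here are made up for illustration), not a model of real biology: random Gaussian mutation plays the role of variation, and keeping the fittest half plays the role of selection.

```python
import random

def evolve(fitness, genome, generations=2000, pop=50, mut=0.1, seed=0):
    """Toy evolutionary loop: random variation + selection, nothing else."""
    rng = random.Random(seed)
    population = [genome[:] for _ in range(pop)]
    for _ in range(generations):
        # Variation: each offspring is a noisy copy of a random parent.
        offspring = [[g + rng.gauss(0, mut) for g in rng.choice(population)]
                     for _ in range(pop)]
        # Selection: keep only the fittest half of parents + offspring.
        population = sorted(population + offspring,
                            key=fitness, reverse=True)[:pop]
    return max(population, key=fitness)

# Fitness peaks at (3, -2); blind mutation plus selection climbs there
# from (0, 0) with no gradient information at all.
best = evolve(lambda g: -((g[0] - 3)**2 + (g[1] + 2)**2), [0.0, 0.0])
print(best)  # should land near (3, -2)
```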

[–] LostXOR@fedia.io 65 points 6 days ago (8 children)

This article estimates that GPT-4 took around 55 GWh of electricity to train. A human needs maybe 2000 kcal (2.3 kWh) a day and lives 75 years, for a lifetime energy consumption of about 63 MWh (roughly 870x less than just training GPT-4).

So not only do shitty "AI" models use >20x the energy of a human to "think," training them uses the lifetime energy equivalent of hundreds of humans. It's absolutely absurd how inefficient this technology is.

[–] LostXOR@fedia.io 27 points 6 days ago (1 children)

Compared to what, >50% for a 4090 in a PC?

[–] LostXOR@fedia.io 2 points 1 week ago

They raise the barrier to entry for creating spam accounts from "make a bunch of API calls" to "set up some kind of AI captcha solver/pay someone in India to do it for you." It doesn't stop spammers, but it makes it harder for them.

[–] LostXOR@fedia.io 12 points 1 week ago (2 children)

I've had a great experience here on fedia.io. It's a smaller instance, and it is running Mbin instead of Lemmy, but everything federates over so you get the same content. Might feel a bit weird switching from Lemmy, but if you feel like it I'd recommend giving it a try. :)

We're also defederated from Hexbear, lemmy.ml, and Lemmygrad if that's a factor.
