bobburger

joined 8 months ago

Biden looked completely different at this rally compared to the debate last night.

Maybe it was the prepared script and notes, maybe he was really sick and had some COVID head fog, or maybe it was sundowners. I don't know what changed, but he seems like an actual fully functioning human here.

[–] bobburger@fedia.io 7 points 5 months ago (1 children)

This Git repo has a list of hidden ESPN APIs that might be useful for automatically pulling data.
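
Rough sketch of what pulling from one of those endpoints looks like. The scoreboard URL and response fields here are the ones commonly listed in repos like that, so treat them as assumptions rather than an official, stable API:

```python
# Minimal sketch: fetch the NFL scoreboard from an unofficial ESPN endpoint.
# The URL and JSON shape are assumptions based on community-documented APIs;
# ESPN can change or remove them at any time.
import requests

URL = "https://site.api.espn.com/apis/site/v2/sports/football/nfl/scoreboard"

resp = requests.get(URL, timeout=10)
resp.raise_for_status()
data = resp.json()

for event in data.get("events", []):
    # Each event usually carries a date and a human-readable name like "X at Y"
    print(event.get("date"), "-", event.get("name"))
```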

[–] bobburger@fedia.io 2 points 5 months ago

Right? I want a PHEV so range isn't something I really need to think about.

[–] bobburger@fedia.io 5 points 5 months ago (1 children)

Fun fact: the person you're replying to had absolutely no idea that a desalination plant was involved in this process.

[–] bobburger@fedia.io 10 points 5 months ago (1 children)

Since a lot of people seem to be jumping to extreme conclusions about this based on specious assumptions, here's how the process works according to the article:

Magrathea — named after a planet in the hit novel The Hitchhiker’s Guide to the Galaxy — buys waste brines, often from desalination plants, and allows the water to evaporate, leaving behind magnesium chloride salts. Next, it passes an electrical current through the salts to separate them from the molten magnesium, which is then cast into ingots or machine components.
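
For anyone curious about the chemistry, that last step is (as far as I can tell, the article doesn't spell it out) just standard molten-salt electrolysis of magnesium chloride:

$$\mathrm{MgCl_2\,(molten)} \;\xrightarrow{\text{electrolysis}}\; \mathrm{Mg\,(l)} + \mathrm{Cl_2\,(g)}$$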

[–] bobburger@fedia.io 2 points 6 months ago

Llamafile is a great way to use an LLM locally. Inference is incredibly fast on my ARM MacBook and RTX 4060 Ti; it's okay on my Intel laptop running Ubuntu.

[–] bobburger@fedia.io 3 points 6 months ago (2 children)

When does it stop being a sandwich and start being a pie? Is a sufficiently cheesy grilled cheese or quesadilla also a pie? Your comment has really opened a can of worms for me.

[–] bobburger@fedia.io 4 points 6 months ago

So it was like a rib bib for chickpea products? That's very esoteric.

[–] bobburger@fedia.io 1 points 7 months ago

A simpler answer might be llamafile if you're using Mac or Linux.

If you're on Windows you're limited to smaller LLMs without some extra work. In my experience the smaller LLMs are still pretty good as chatbots, so they might translate well.
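
If it helps, here's a rough sketch of what using it looks like once a llamafile is running: it serves an OpenAI-compatible API locally (port 8080 by default, as far as I know), so a few lines of Python are enough to send it a chat prompt. The port, model name, and prompt below are placeholders, not anything specific to a particular llamafile.

```python
# Minimal sketch: send a chat request to a locally running llamafile through
# its OpenAI-compatible endpoint. Assumes the default local server address;
# adjust the port if you launched it with different options.
import requests

resp = requests.post(
    "http://localhost:8080/v1/chat/completions",
    json={
        "model": "local-model",  # placeholder; the server uses whatever model it was started with
        "messages": [
            {"role": "user", "content": "Translate 'good morning' into French."},
        ],
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```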
