areyouevenreal

joined 1 year ago
[–] areyouevenreal@lemm.ee 1 points 1 month ago (2 children)

What made you think that?

[–] areyouevenreal@lemm.ee 5 points 1 month ago (4 children)

Stainless steel pans are amazing when used for the right purpose. They weigh much less than cast iron, don't require any maintenance besides cleaning, and are pretty much indestructible. If you burn something badly you can use metal scouring pads or any chemical you damn well like (including sodium hydroxide, which will melt flesh) to get the thing clean again. They tolerate any cooking temperature you would ever use, ever. You can't overheat one with any appliance a normal kitchen would have. This means you can easily pop one in the oven, provided it has a metal handle.

The only issue is that they have no non-stick properties to speak of and relatively little thermal mass. This is good in that they don't take long to heat up, but bad in that the temperature isn't consistent, and you have to know what you are doing with the power control to get the results you want. This makes them essentially useless for cooking things like steak, and it's difficult even to cook an omelet without using a lot of butter, ghee, or oil. Things like tomato sauces though? Perfect. The stainless steel couldn't care less about the acidity.

[–] areyouevenreal@lemm.ee 2 points 1 month ago

People see AI and immediately think of ChatGPT, despite the fact that AI has been around far longer and does far more, including OCR and data mining. It's never been AI that's the problem, but rather certain uses of AI.

[–] areyouevenreal@lemm.ee 0 points 1 month ago (1 children)

I've seen teachers use this stuff and get actually decent results. I've also seen papers where people use LLMs to hack into a computer, which is a damn sophisticated task. So you are either badly informed or just lying. While LLMs aren't perfect and aren't a replacement for humans, they are still very much useful. To believe otherwise is folly and shows your personal bias.

[–] areyouevenreal@lemm.ee 1 points 1 month ago (3 children)

I am not talking about things like ChatGPT, which rely more on raw compute and scaling than other approaches do and are hosted in massive data centers. I actually find their approach wasteful as well. I am talking about some of the open-weights models that use a fraction of the resources for similar quality of output. According to some industry experts that will be the way forward anyway, as purely making models bigger has limits and is hella expensive.

Another thing to bear in mind is that training a model is more resource-intensive than using it, though that's also being worked on.

[–] areyouevenreal@lemm.ee 0 points 1 month ago (5 children)

Bruh, you have no idea about the costs. I doubt you have even tried running AI models on your own hardware. There are literally models that will run on a decent smartphone. Not every LLM is ChatGPT, enormous in size and resource consumption and hidden behind a veil of closed-source technology.
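If you want to see for yourself, here's roughly what running a small open-weights model locally looks like. This is just a minimal sketch using the llama-cpp-python bindings; the model path and generation settings are placeholders, and any GGUF-format model you've downloaded will do.

```python
# Minimal sketch: run a small quantized open-weights model locally
# with llama-cpp-python. The model file path below is a placeholder.
from llama_cpp import Llama

llm = Llama(model_path="./models/small-model.q4_k_m.gguf", n_ctx=2048)

result = llm(
    "Q: What is the capital of France? A:",
    max_tokens=32,        # keep the generation short
    stop=["Q:", "\n"],    # stop before the model starts a new question
)
print(result["choices"][0]["text"])
```

Something like this runs on a laptop CPU with a few gigabytes of RAM, which is the whole point about cost.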

Also, that trick isn't going to work just by looking at a comment. Lemmy compresses whitespace because it uses Markdown. It only shows the extra lines when replying.
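As a rough illustration of the whitespace point, here's a sketch with the Python markdown package; it's just a stand-in, I'm not claiming Lemmy uses this exact library.

```python
# Sketch: Markdown renderers collapse runs of blank lines into a single
# paragraph break, so padding a comment with empty lines has no effect.
import markdown  # pip install markdown; a stand-in for whatever Lemmy uses

source = "first line\n\n\n\n\nsecond line"
print(markdown.markdown(source))
# -> <p>first line</p>\n<p>second line</p>  (the extra blank lines are gone)
```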

Can I ask you something? What did Machine Learning do to you? Did a robot kill your wife?

[–] areyouevenreal@lemm.ee 0 points 1 month ago (7 children)

Even if it didn't improve further, there are still uses for the LLMs we have today. That's only one kind of AI as well; the kind that makes all the images and videos is completely separate, and that has come a long way too.

[–] areyouevenreal@lemm.ee 2 points 2 months ago

From what I've heard they do actually put a lot of effort into simulating airplane aerodynamics, at least for the smaller planes. So the flying part is kind of important.

[–] areyouevenreal@lemm.ee 19 points 2 months ago

(Jumping from an insult about dietary preferences to an insult about war crimes is not the same magnitude)

The potato joke is a joke about war crimes even if the person telling it doesn't realize this. They actually responded in kind.

[–] areyouevenreal@lemm.ee 1 points 2 months ago

Still having these issues very recently.

[–] areyouevenreal@lemm.ee 21 points 2 months ago

GitHub has a container registry you can use.
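If it helps, here's a rough sketch of tagging and pushing an image to ghcr.io with the Docker SDK for Python. The image name, username, and token are placeholders, and the plain docker CLI works just as well.

```python
# Rough sketch: tag and push a local image to the GitHub Container
# Registry (ghcr.io) using the docker SDK for Python. Names and the
# token are placeholders; the token needs the write:packages scope.
import docker

client = docker.from_env()
client.login(username="your-github-user",
             password="ghp_personal_access_token",
             registry="ghcr.io")

image = client.images.get("myapp:latest")
image.tag("ghcr.io/your-github-user/myapp", tag="latest")

for line in client.images.push("ghcr.io/your-github-user/myapp",
                               tag="latest", stream=True, decode=True):
    print(line)  # progress / error output from the registry
```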

[–] areyouevenreal@lemm.ee 20 points 2 months ago (1 children)

Does anybody actually use that feature though?

 

I have a calibre server set up on my home server and was wondering how to sync it with the version on my desktop, so I can upload books to an ereader over USB.

 

Trying to figure out how to set up an aria2 server. It seems to rely on XDG dirs, which aren't normally set up in LXC containers. I don't want to set up a whole GUI VM just for one application.
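What I'm hoping to end up with is a daemon I can drive over aria2's JSON-RPC interface, roughly like this sketch. It assumes the default port 6800, a placeholder host, and a placeholder RPC secret.

```python
# Sketch of the goal: talk to a running aria2 daemon (started with
# --enable-rpc --rpc-secret=SECRET) over JSON-RPC.
# The host, port, URL, and secret below are placeholders.
import json
import urllib.request

payload = {
    "jsonrpc": "2.0",
    "id": "1",
    "method": "aria2.addUri",
    "params": ["token:SECRET", ["https://example.com/some-file.iso"]],
}

req = urllib.request.Request(
    "http://192.168.1.50:6800/jsonrpc",
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.load(resp))  # the response's "result" is the new download's GID
```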

19
submitted 8 months ago* (last edited 8 months ago) by areyouevenreal@lemm.ee to c/piracy@lemmy.dbzer0.com
 

I have been having issues using Real-Debrid. One of the big issues is downloads getting stuck continuously retrying. I think this is because I am on 5G internet, which has issues with latency and occasional packet loss. Similarly, I get buffering issues when using Stremio.

Another problem I am having is with the permissions set by the Real-Debrid client if I use the Docker version. It doesn't allow changing the UID and GID the Real-Debrid process uses. This causes problems for the other services, though I can work around it with a manual installation in an LXC container.

Does anyone with more experience know how to fix this?

Edit: I can't edit the title to fix the spelling. You will just have to deal.

Edit 2: Fixed the permission issues. Also managed to work out that the problem isn't 5G; the download client is just bad. Apparently you can use external downloaders, but this requires deploying services like aria2c, which it turns out is actually quite hard.

 

I am a bit lost as to how you use authentik to do single sign on.

I can connect things that have external access quite easily using the reverse proxy provider that's built into authentik. I am struggling with how to connect things that are on a Docker network and can't be accessed directly. Normally with Nginx Proxy Manager I would put it on the same network, but I don't think that's correct for authentik. Am I supposed to create a Docker outpost?

Other people are using authentik together with Nginx Proxy Manager, and I am a bit lost as to why they are doing that.

 

I am currently living with my parents and we have just started an Internet contract with a 5G wireless company.

The issue is that the MFND settings are behind a password, and the ISP likely doesn't allow access to them. Even if they did, port forwarding on 5G likely isn't possible because of CGNAT. I think I can use Cloudflare Tunnels or Tailscale to get around this, and not many things need to be directly accessible from the Internet.

The more annoying thing is that setting DHCP reservations likely isn't possible without access to those settings. It's going to make setting up static IPs difficult too.

Before anyone asks, fixed-line Internet almost certainly isn't practical in this area. Getting our own modem, while possible, is more expensive and potentially difficult, and would mean cancelling the contract.

Is there a reasonable way to work around these issues?

Any help or advice would be appreciated.

 

Hello based people of lemmy,

I have recently started trying out the BSDs as an alternative to Linux and found out that Spotify isn't supported. Before you say "try it in a browser": that doesn't work, as Spotify uses DRM that doesn't work on BSD OSes.

Now, is there a way to stream music similar to Spotify? I know there is a downloader program available.

Furthermore, do you know what self-hosted options are available? I already have a basic *arr stack and am always up for convoluted server and Linux hijinks.

 

I am having issues with Jellyfin not finding ffmpeg on FreeBSD. Is there any solution to this?
