Scipitie

joined 11 months ago
[–] Scipitie@lemmy.dbzer0.com 7 points 1 week ago* (last edited 1 week ago) (1 children)

Not OP here, but from what I've read, the answer to that question is unknown; he did, however, show significant tolerance for some. Does that make him fine with it himself? In my book: yes.

For me personally it was enough to leave the project behind, as it's so closely tied to the person.

That's a call everyone needs to make for themselves though, of course.

[–] Scipitie@lemmy.dbzer0.com 1 points 1 month ago

I don't hate him that much, but I don't watch him because of the shady selling business he often runs and the apparently sponsored content which is not always disclosed (it's been a while, but his channel misrepresented graphics card benchmarks, for example).

It's like the British yellow press for me: his face alone is enough to discredit the quality of the source. Could it be good? Sure! Will I ever find out? Not anymore.

[–] Scipitie@lemmy.dbzer0.com 3 points 1 month ago

At least in Germany it's the same. It gets ignored in the discussions about the nuclear exit, but it's actually the main reason why I'm not aggressively against it: we have safe areas for nuclear storage, but those regions fight bitterly to avoid hosting it. The areas currently in use are... not good. Paying someone else (such as Finland) is out of budget for both the state and the energy companies. The latter want to handle operations anyway, but not the maintenance and the construction; the state should pay for that.

It's really quite sad for me. The (true) statement that the dangerous waste needs to be stored carefully got corrupted into "it can't be stored".

[–] Scipitie@lemmy.dbzer0.com 2 points 2 months ago (1 children)

Oh sorry, nvidia RTX :) Thanks!

[–] Scipitie@lemmy.dbzer0.com 4 points 2 months ago (3 children)

Lowest price on eBay for me is 290 euros :/ The P100s are 200 each, though.

Do you happen to know if I could mix a 3700 with a P100?

And thanks for the tips!

[–] Scipitie@lemmy.dbzer0.com 0 points 2 months ago

And still there are other people than you who want to do that full-time - and in doing so provide, at least for me, more value than the 600th Marvel billion-dollar movie.

There are educators and entertainers out there who chose this as a job and are good at it. If they could live off of it by going the Patreon route instead of the shitty YouTube ad-spam one, I'd be all for it.

[–] Scipitie@lemmy.dbzer0.com 16 points 2 months ago

Which of those questions from the article would you describe as loaded enough to imply the quite interesting responses?

I expected to read something like "why are Chinese people stupid?" and then some racist shit - but the answers to those questions are... interesting.

[–] Scipitie@lemmy.dbzer0.com 19 points 2 months ago (6 children)

The bankruptcy scenario is correct but the first part isn't: you don't have X shares as collateral that you can liquidate. Instead, you have collateral to cover sum Y.

As long as the collateral contract covers enough stock positions the bank won't lose.

That said, all of this assumes standard contracts. If your bank wrote "0% interest and instead 50% of the revenue growth of Twitter", then this would be an easy way to lose money.

Haven't heard of a stupid banker yet, though, so what would the chances be?
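To make the coverage point concrete, here's a minimal sketch of the check the bank runs; the loan size, share count, price and required ratio are all made-up numbers, not anything from the actual deal:

```shell
# Made-up numbers: the bank only cares that pledged collateral covers sum Y.
loan=1000000000      # loan amount Y (hypothetical)
shares=40000000      # shares pledged as collateral (hypothetical)
price=55             # current share price (hypothetical)
ratio=2              # bank requires collateral worth 2x the loan (hypothetical)

collateral=$((shares * price))   # market value of the pledged stock
needed=$((loan * ratio))         # coverage the contract demands

if [ "$collateral" -ge "$needed" ]; then
  echo "covered - bank is fine"
else
  echo "margin call - pledge more collateral or repay"
fi
```

If the price drops far enough that the check fails, the borrower has to top up the collateral - that's the margin-call mechanism, not the bank liquidating a fixed block of shares.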

[–] Scipitie@lemmy.dbzer0.com 2 points 3 months ago

Because it's basically axiomatic: ssh tries all the keys it knows about. The system can't tell you why it's not using something it doesn't know it should be able to use. You can pass -i with the key file to check whether the problem is the key's content being broken or its location.

That said: having a reason doesn't make -v any more useful for cases like this!

[–] Scipitie@lemmy.dbzer0.com 1 points 3 months ago

I'd try ChatGPT for that! :)

But to give you a very brief rundown: if you have no experience in any of these aspects and are self-learning, you should expect a long ramp-up phase! Perhaps there is an easier route, but if there is, I'm not familiar with it.

First, familiarize yourself with server setups. If you only want to host this you won't have to go into the network details, but it could become a source of errors at some point, so be warned! The usual tip here is to get yourself familiar enough with Docker that you can read and understand docker-compose files. The de facto standard for self-hosting is Linux machines, but I have read of people who used macOS and even Windows successfully.
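As a reading exercise, this is roughly the shape of a minimal compose file; the service name, image and paths are invented for illustration, not from any real project:

```shell
# Write out a minimal (hypothetical) compose file to practice reading one.
cat > /tmp/docker-compose.yml <<'EOF'
services:
  webui:                                 # service/container name (made up)
    image: ghcr.io/example/webui:latest  # placeholder image
    ports:
      - "7860:7860"                      # host:container port mapping
    volumes:
      - ./models:/app/models             # path mapping: host dir -> container
    environment:
      - CLI_ARGS=--listen                # environment variable for the app
EOF
cat /tmp/docker-compose.yml
```

If you can look at a file like that and say what port the app listens on, where the models live on disk, and what gets passed into the container, you're ready for most self-hosting guides.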

One aspect quite unique to the model landscape is the hardware requirements. As much as it hurts my Nvidia-despising heart, at this point in time they are the de facto monopolist. Get yourself a card with 12GB VRAM or more (everything below that will be painful, if you get things running at all; I've tried pulling smaller models on an 8GB card but experienced a lot of waiting time and crashes). Read a bit about CUDA on your chosen OS and what its drivers need.
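A quick way to see what you're working with, assuming the Nvidia driver (which ships `nvidia-smi`) is installed; the 12GB figure is the rule of thumb above, not a hard limit:

```shell
# Report GPU name and total VRAM if an NVIDIA driver is installed,
# otherwise say so instead of failing.
if command -v nvidia-smi >/dev/null 2>&1; then
  nvidia-smi --query-gpu=name,memory.total --format=csv,noheader
else
  echo "no NVIDIA driver/GPU detected"
fi
echo "GPU check done"
```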

Once you understand this whole port, container, path-mapping and environment-variable business, it's just a matter of going to the linked GitHub page, following their guide and starting a container. Loading models is actually the easier part once you have the infrastructure running.

[–] Scipitie@lemmy.dbzer0.com 4 points 3 months ago (3 children)

No offense intended - it's possible that I misread your experience level:

I hear a user asking developer questions. Either you go the route of using the publicly available services (DALL-E and co.) or you start digging into hosting the models yourself. The page you linked hosts trained models for use in your own contexts, not a "click a button and it works" experience.

As a starting point for image generation self hosting I suggest https://github.com/AUTOMATIC1111/stable-diffusion-webui.

For the training part, I'll be very blunt: if you don't intend to spend five- to six-digit sums on hardware or processing power, forget it. And even then you'd need the raw training data to pull it off.

Perhaps what you want to do is fine-tune a pretrained model; that's something I only have a bit of experience with, and only for LLMs, though (and even there I don't have the hardware to get beyond a personal proof of concept).

[–] Scipitie@lemmy.dbzer0.com 17 points 3 months ago (6 children)

Both langchain and ollama run locally and are open source.

To be very frank: your post sounds like fear mongering without having even clicked on the link.
