hendrik

joined 5 months ago
[–] hendrik@palaver.p3x.de 2 points 3 days ago* (last edited 3 days ago)

And we still don't have any worthy successor, in contrast to the Pony Express.

[–] hendrik@palaver.p3x.de 24 points 3 days ago* (last edited 3 days ago) (16 children)

In my experience, the majority of people don't have the slightest clue how email works. Somehow you type it in and put an address into one of the three indistinguishable fields titled "To", "CC", "BCC". And by some black magic it either appears on the other person's screen, or it doesn't. That's about the extent of the knowledge.

So comparing something to this is kind of meaningless.

[–] hendrik@palaver.p3x.de 1 points 3 days ago* (last edited 3 days ago)

Yeah, most PCs get turned on and off about once per day, for example at most offices or at home. That's perfectly within normal use.

[–] hendrik@palaver.p3x.de 1 points 4 days ago* (last edited 4 days ago) (2 children)

Consumer hard disks are made to be spun up and down occasionally. Don't do it every five minutes... But I've been doing it for years and years with my server, which spins up the disks once or twice a day whenever I access some of my archived files. And it's perfectly fine.
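
If you want the disks on a server like that to spin down on their own after some idle time, here's a minimal sketch. It assumes Linux with hdparm installed, that the archive disk is /dev/sdb (a placeholder), and root privileges; adapt it to your setup.

```python
#!/usr/bin/env python3
# Sketch: set an idle spin-down timeout on an archive disk via hdparm.
# Assumptions: Linux, hdparm installed, the archive disk is /dev/sdb,
# and this runs as root. Adjust to your own setup.
import subprocess

DISK = "/dev/sdb"  # assumed device name of the archive disk

# hdparm -S 242 means: spin down after 1 hour of inactivity
# (values 241-251 map to (value - 240) * 30 minutes).
subprocess.run(["hdparm", "-S", "242", DISK], check=True)
```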

[–] hendrik@palaver.p3x.de 4 points 4 days ago* (last edited 4 days ago) (1 children)

I think you can set up a VPN so that it doesn't forward all traffic, just specific traffic to one IP or a certain network, and everything else goes out the default route. That would leave you with your regular connection, except when you're talking to your VPS, which then goes through the tunnel. But that won't help you with Android and multiple VPN apps at the same time.

Maybe you could configure the firewall on the VPS to drop all traffic from the internet and only accept packets from your home IP address? I mean, with most providers your IP is going to change regularly, so you'd need some additional logic or write some script: your VPS would add an exception to its firewall so you can access it, while dropping all other internet traffic by default. That'd be a solution completely without VPNs.
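
A rough sketch of what such a script could look like, assuming your home connection is reachable via a dynamic-DNS name and the VPS uses iptables with a default DROP policy on INPUT. The hostname, state file path, and iptables usage are assumptions; adapt to your own firewall.

```python
#!/usr/bin/env python3
# Sketch: keep a VPS firewall exception pointed at a changing home IP.
# Assumptions: home.example.org is a placeholder dynamic-DNS name, the
# VPS firewall is iptables with a default DROP policy on INPUT, and this
# runs as root (e.g. from cron every few minutes).
import socket
import subprocess

DDNS_NAME = "home.example.org"   # placeholder dynamic-DNS hostname
STATE_FILE = "/var/run/home_ip"  # remembers the last IP we allowed

def current_home_ip() -> str:
    return socket.gethostbyname(DDNS_NAME)

def read_last_ip():
    try:
        with open(STATE_FILE) as f:
            return f.read().strip() or None
    except FileNotFoundError:
        return None

def main():
    new_ip = current_home_ip()
    old_ip = read_last_ip()
    if new_ip == old_ip:
        return  # nothing changed, keep the existing rule
    # Allow the new home IP first, then drop the stale rule (if any).
    subprocess.run(["iptables", "-I", "INPUT", "-s", new_ip, "-j", "ACCEPT"],
                   check=True)
    if old_ip:
        subprocess.run(["iptables", "-D", "INPUT", "-s", old_ip, "-j", "ACCEPT"],
                       check=False)
    with open(STATE_FILE, "w") as f:
        f.write(new_ip)

if __name__ == "__main__":
    main()
```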

Or if it's just a few simple services... lock them behind some login screen, and people would have to log in with username+password to reach your services.

[–] hendrik@palaver.p3x.de 3 points 5 days ago* (last edited 5 days ago) (2 children)

What's the difference for this task? You can rent it 24/7 as a crude webserver. Or run a Linux desktop inside. Pretty much everything you could do with other kinds of servers. I don't think the exact technology matters: it could be a VPS, virtualized with KVM, or a container. And for AI workloads, these containers have several advantages, like being able to spin them up within seconds, scale them, etc. I mean, you're right, this isn't a bare-metal server that you're renting. But I think it aligns well with OP's requirements?!

[–] hendrik@palaver.p3x.de 7 points 5 days ago* (last edited 5 days ago) (4 children)

Well, there's both. I'm with runpod and they bill me for each second I run that cloud instance. I can have it running 24/7, or 30 minutes on demand, or just 20 seconds if I want to generate a single reply/image. Behind the curtains, it's Docker containers. And one of the services is an API that you can hook into. Upon request, it'll start a container, do the compute, and at your option either shut down immediately, meaning you'd have paid something like 2 ct for that single request, or keep listening for more requests until an arbitrary timeout is reached. Other providers offer similar things, or a fixed price per ingested or generated token with some ready-made services.
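
To give an idea of what hooking into such an API looks like, here's a minimal sketch. The URL pattern, endpoint ID, and payload shape are assumptions for illustration only; check the provider's docs for the real interface.

```python
# Sketch: calling a pay-per-request "serverless" GPU endpoint.
# The URL pattern, endpoint ID and payload fields below are assumptions
# for illustration; consult your provider's API docs for the real ones.
import os
import requests

ENDPOINT_ID = "your-endpoint-id"          # placeholder
API_KEY = os.environ["RUNPOD_API_KEY"]    # assumed to be set in the environment

resp = requests.post(
    f"https://api.runpod.ai/v2/{ENDPOINT_ID}/runsync",   # assumed URL pattern
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={"input": {"prompt": "Write a one-line greeting."}},
    timeout=120,
)
resp.raise_for_status()
print(resp.json())  # the worker's output; you're billed only for the seconds it ran
```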

[–] hendrik@palaver.p3x.de 7 points 5 days ago* (last edited 5 days ago) (6 children)

That depends on the use-case. An hour of RTX 4090 compute is about $0.69, while the graphics card alone is like $1,600.00, plus the computer, plus the electricity bill. I'd say you need to use it 4,000+ hours to break even. I'm not doing that much gaming and AI stuff, so I'm better off renting some cloud GPU by the hour. Of course you can optimize that: buy an AMD card, use smaller AI models, and pay for less VRAM. But there is a break-even point for all of them which you need to pass.
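
As a back-of-the-envelope calculation (the rental and card prices are the ones above; the computer cost and electricity numbers are assumptions, just to show where a ~4,000 hour break-even comes from):

```python
# Back-of-the-envelope break-even for buying vs. renting a GPU.
# Rental price and card price are from the comment above; the
# surrounding-computer cost and electricity figures are assumptions.
rental_per_hour = 0.69        # $/h for a rented RTX 4090
card_price = 1600.00          # $ for the card itself
computer_price = 800.00       # assumed: rest of the machine
power_draw_kw = 0.45          # assumed: ~450 W under load
electricity_per_kwh = 0.25    # assumed: $/kWh

running_cost_per_hour = power_draw_kw * electricity_per_kwh
break_even_hours = (card_price + computer_price) / (rental_per_hour - running_cost_per_hour)
print(f"break-even after ~{break_even_hours:,.0f} hours")  # roughly 4,200 h
```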

[–] hendrik@palaver.p3x.de 9 points 5 days ago (1 children)

https://runpod.io

They also offer some templates, instructions, and blog posts about this. And so I'm not advertising for a single company: there's also vast.ai and several others.

[–] hendrik@palaver.p3x.de 22 points 6 days ago (3 children)

Maybe they meant open, as in it leaks your private data and is open to being exploited by advertisers, malware, etc.?

[–] hendrik@palaver.p3x.de 13 points 6 days ago

What a shitshow. And a lot of harm will be done to everyone if this comes true.
