anzo

joined 1 year ago
[–] anzo@programming.dev 4 points 4 months ago

I heard conduit.rs has lower memory requirements. Dunno if there's an easy-to-deploy container tho. Good luck!

[–] anzo@programming.dev 2 points 4 months ago (1 children)

Endless OS had everything bundled

[–] anzo@programming.dev 2 points 4 months ago

Retroshare.net

[–] anzo@programming.dev 2 points 4 months ago

And check their device compatibility list. Also, you'll need plenty of RAM.

[–] anzo@programming.dev 9 points 4 months ago (1 children)

Buckets have a lot of features that postgres doesn't, like mounting via FUSE. And Garage in particular offers some integrations with apps, websites, and so on. I would go with that instead of having a column of byte data in a DB table. The pgsql solution might work in small, simple cases (e.g. storing a user's avatar in a forum), but even so, if I could choose, I wouldn't do it.

[–] anzo@programming.dev 1 points 4 months ago (2 children)

There are different approaches or sentiments that bring people together. For example, there's the left-politics platform disroot.org, and they have also developed some solutions of their own (as in not only hosting, but coding). The Autistici collective has this calendar called ganzo or similar, iirc. That's something amazing to me.

[–] anzo@programming.dev 2 points 4 months ago* (last edited 4 months ago)

Install Lineage, Graphite, or some distro from xda-devs. Have fun!

[–] anzo@programming.dev 2 points 4 months ago (2 children)

Sell the iPhone, get Android. Best lifehack ever.

[–] anzo@programming.dev 2 points 4 months ago (1 children)

A 502 means the reverse proxy isn't getting a valid response from the app behind it, i.e. the app is broken or not responding. For example, if it were a Python Flask app, it might be crashing on an unhandled exception (e.g. divide by zero). If this is happening to many services or apps simultaneously, it is concerning. Turning it off sounds wise at this point.
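A rough stdlib sketch of what the proxy is doing when it hands you a 502: it tries to connect to the upstream app, and if nothing answers it reports Bad Gateway. The host and port here are made up for illustration.

```python
import socket

def probe_upstream(host: str, port: int, timeout: float = 1.0) -> int:
    """Mimic a reverse proxy's check: if the app behind it can't be
    reached, the client gets a 502 Bad Gateway instead."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return 200  # upstream accepted the connection
    except OSError:
        return 502  # crashed / not listening -> Bad Gateway

# A port where nothing is listening stands in for a crashed app
# (47123 is an arbitrary, hypothetical port):
print(probe_upstream("127.0.0.1", 47123))
```

If lots of containers suddenly return 502 at once, it's usually the shared layer (proxy config, network, host resources) rather than every app breaking independently.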

[–] anzo@programming.dev 3 points 4 months ago (4 children)

Another consideration would be building communities around platforms and instances. That's how much of the open source world thrives!

[–] anzo@programming.dev 1 points 4 months ago

Have you tried ollama? Some (if not all) models would do inference just fine on your current specs. Of course, it all depends on how many queries per unit of time you need, and whether you want to load a huge codebase and pass it as input. Anyway, go try it out.
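Once installed, ollama serves a local HTTP API (default port 11434). A minimal sketch of a request body for its /api/generate endpoint; it only builds the JSON here, since actually sending it needs a running ollama with a pulled model, and the model name below is just an example.

```python
import json

# Request body for ollama's /api/generate endpoint.
# "llama3.2" is a placeholder: use whatever model you've pulled.
payload = {
    "model": "llama3.2",
    "prompt": "Summarize what this codebase does.",
    "stream": False,  # one complete response instead of a token stream
}
body = json.dumps(payload)
print(body)
```

POSTing that body to http://localhost:11434/api/generate is then a one-liner with curl or urllib.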
