Wow, I asked the right person. Thanks for the info!
I'm getting a bit concerned about Logseq. It's just kind of backwards to have a web app, packaged as a desktop/Android app, that can be hosted on a server but can't store your files there. I get that they want to monetize sync, but they're bending over backwards to avoid what's inherently a pretty reasonable feature in a web-based app, and it makes me concerned about what they're going to do with the project in the future.
Is there a way to embed portions of one page into another page, such that if you edit it in either place the change shows up in both, like in Logseq?
The documentation is actually pretty good, but I've not been able to find that feature, if it exists. That's probably the last thing keeping me on Logseq.
The way they handle port forwarding is particularly good, compared with PIA, which assigns a random port every time you bring up a connection, so you have to have a script to update the port in your client.
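Just to give a rough idea of what that glue script looks like (the port file path, WebUI address, and credentials below are placeholder assumptions for however your setup actually writes the port out; the qBittorrent part uses its normal WebUI API):

```sh
#!/bin/sh
# Push whatever forwarded port the VPN connection handed out into qBittorrent.
# Assumes your PIA connect script wrote the port to /run/pia/forwarded_port
# and that qBittorrent's WebUI is on localhost:8080.
PORT=$(cat /run/pia/forwarded_port)

# log in and keep the session cookie
curl -s -c /tmp/qb.cookies \
     --data 'username=admin&password=changeme' \
     http://localhost:8080/api/v2/auth/login

# set the listening port to match the forwarded one
curl -s -b /tmp/qb.cookies \
     --data-urlencode "json={\"listen_port\": ${PORT}}" \
     http://localhost:8080/api/v2/app/setPreferences
```

Hook that into whatever brings the tunnel up and you can mostly forget about it, but it's still one more moving part.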
I think people can hide lots of things in code, especially when nobody's really looking at it. And I know people aren't looking at it when they talk about how convenient the AUR is. It's at best marginally more convenient than installing from source.
I'm not at all suggesting that people should place more trust in large companies. I'm suggesting that packages in the AUR with lots and lots of users should be trusted more, specifically because some of those users will be checking out the PKGBUILD and the source, and presumably some of them would notice if the software did something it wasn't supposed to do. Obviously, the larger the software, the harder all of that is to check, and correspondingly you'd want to see many more users using it before you'd extend it any trust.
My point being, I've not seen these discussions taking place. Maybe I've just missed them. But I feel like it's appropriate to bring it up when I see people talking about just how convenient the AUR is. It's really not that convenient if you're using it in a way that I'd consider reasonable.
Yes. It is possible to verify what's going on. That's what I did when I used the AUR. Do you think most people do that, or even look to see how many users are using the software? Or do you imagine they just install it blindly?
If you ever see a help video or article that suggests installing something from source or running some script, the author generally tells the reader not to just run random code without looking at it. I've never once seen anything that suggested people should check the PKGBUILD. I don't have a problem with the AUR. I just think it's not nearly as trustworthy as it's generally made out to be, and I don't think people generally understand that it might even be a concern, or that you can check the validity of the package yourself.
Right, and that's a good reason why you should feel reasonably comfortable installing very popular software from the AUR once it's been there for a while. That's not why people like the AUR.
People like that you can get even unpopular stuff in the AUR, and that's the stuff you need to be suspicious of. If you're getting some niche Y2K-era packet radio software from the AUR, you should be checking how it's packaged and what is actually being packaged. And if you have the knowledge to do that, you might as well get the source and install it yourself. I'll admit that I'm getting old, and I don't know whether that's something people are still willing or able to do these days.
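For what it's worth, the kind of review I'm talking about isn't exotic; the package name here is just an example:

```sh
# Pull the AUR repo by hand and read what it actually does before building anything.
git clone https://aur.archlinux.org/some-niche-packet-radio-tool.git
cd some-niche-packet-radio-tool
less PKGBUILD      # check the source= URLs, checksums, and the build()/package() steps
# also read any *.install scripts or patches the repo ships
makepkg -si        # build and install only once you're happy with what it fetches
```

And if you're willing to do all that, grabbing the upstream source and building it yourself isn't much more work, which was my point.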
Maybe I'm just cranky about Arch, but it seems really stupid to me to go through manually installing and setting up your system just to either install some random crap from the AUR or have to manually review it all because the official repos are pretty bare.
Ordinarily I use apt. Sometimes a Flatpak, if I trust the source. Otherwise it's from source, or usually something I'm running in Docker, where I'll check what it's actually doing if I'm at all suspicious.
I don't want to make too big a deal of the AUR. When I was using Arch and I needed something from the AUR, it was easy enough to see that it was a legitimately packaged piece of software. The only big deal is that checking is a real pain in the ass, and I know most people aren't doing it, and I never see anyone mention it, so I doubt people even consider that it could be an issue.
It comes down to what you trust. I trust the stuff I can get from Debian's repos. I trust some other sources, and everything else I look at. I don't trust the AUR, and I sincerely doubt most people look at the software they're installing from it to make sure it's legit.
It's really none of my business what others are comfortable with. The trustworthiness of where you get your software is a decision you have to make for yourself, and with the way people go on about the AUR, I get the feeling they don't bother to decide. I never hear anyone acknowledge that there's any sort of difference between the AUR and Debian's repos, and treating them as equivalent is frankly an utterly absurd idea.
Do you look at the stuff in the AUR? Because anything you install from there could be messed with; it's a user repository. I specifically left Arch because I had to look into all the packages I installed from the AUR, and the stuff in the official repos was pretty limited compared to something like Debian. That took a lot of time. Or you could always just install whatever you find with zero concern for security.
I've been running Debian for decades with maybe two problems I had to manually resolve with apt. I ran Arch and Manjaro for maybe a year and had a handful. I'm certainly not going to say not to run Arch, but it's in no way easier to keep running than Debian. That's literally Debian's whole gig.
Everyone else is telling you to stay local, which is great advice as far as it goes. But you said you want your website to be publicly available, so I'd recommend getting a cheap VPS and starting there. It's not on your network, so if you screw up the security, the worst case is you start again from scratch. I'd recommend the cheapest VirMach VM you can get, with Debian, or Ubuntu if you like snaps.
First things first: set up SSH with key-based logins (with a passphrase) on a non-standard port. That doesn't provide security, but it will keep your logs from getting inundated immediately. Install UFW, block all incoming traffic, allow all outgoing traffic, and rate-limit connections to your SSH port. Install Docker and add your user to the docker group. Start learning how to use Docker and Compose, and as your first container set up Duplicati to back up your docker directory (including all your volumes, which I would store as folders inside that directory) somewhere else. I'd set it to run every evening after you go to bed, and I'd also set up a cron script to bring down all your containers before the backup runs and bring them back up afterwards. Just in case.
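To put rough commands to that (the SSH port, paths, and folder layout here are just placeholders for whatever you end up using):

```sh
# one-time firewall setup
sudo ufw default deny incoming
sudo ufw default allow outgoing
sudo ufw limit 2222/tcp      # whichever non-standard port you moved sshd to
sudo ufw enable
```

And the down/up wrapper I mean is something along these lines, run from root's crontab a little before Duplicati's scheduled backup:

```sh
#!/bin/sh
# /usr/local/bin/backup-window.sh
# Stop the app stacks so nothing is writing to their volumes mid-backup,
# leave them down while Duplicati runs, then bring everything back up.
DOCKER_DIR=/home/you/docker               # one folder per compose stack

for stack in "$DOCKER_DIR"/*/ ; do
    [ -f "$stack/docker-compose.yml" ] || continue
    case "$stack" in */duplicati/*) continue ;; esac   # leave the backup container running
    ( cd "$stack" && docker compose down )
done

sleep 3600                                # backup window; Duplicati's schedule falls inside it

for stack in "$DOCKER_DIR"/*/ ; do
    [ -f "$stack/docker-compose.yml" ] || continue
    ( cd "$stack" && docker compose up -d )
done
```

A crontab entry like `0 2 * * * /usr/local/bin/backup-window.sh` kicks it off each night.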
I've previously had a problem with my server becoming unresponsive when running Immich. It's been a while, but I remember there being some kind of memory leak related to Immich; it was in their GitHub issues and everything. On my system it would take about a day and a half, and then SSH, along with everything else, would become unresponsive. Rebooting would fix it for another day and a half. I stopped running Immich and it hasn't happened since. I suppose you could try using a cron job to restart Immich periodically and see if that resolves your problem.
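If you do go the restart route, a root crontab entry along these lines would do it (the path is just a placeholder for wherever your Immich compose stack lives):

```sh
# crontab -e: restart the Immich stack every morning at 4
0 4 * * * cd /home/you/docker/immich && docker compose restart
```

Not a fix, obviously, just a way to see whether your symptoms line up with what I was hitting.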
I should probably figure out Discord one of these days. Thanks for letting me know that's where to go for this project.