The only externally accessible service is my WireGuard VPN. Everything else is unreachable unless you're on my LAN or you VPN back into it.
This is the way.
Funnily enough, it's exactly the opposite of where the corporate world is going, where the LAN is no longer treated as a fortress and most services are available publicly but behind 2FA.
Corporate world, I still have to VPN in before much is accessible. Then there’s also 2FA.
Homelab, ehhh. Much smaller user base and within smackable reach.
Everything is behind a WireGuard VPN for me. It's mostly because I don't understand how to set up HTTPS, and at this point I'm afraid to ask, so everything is just HTTP.
I've been using YunoHost, which does this for you, but I'm thinking of switching to a regular Linux install, which is why I've been searching for replacements for YunoHost's features. That's how I came across Nginx Proxy Manager, which lets you easily configure that stuff with a web UI. From what I understand it also handles certificates for HTTPS. Haven't had the chance to try it out myself tho, because I only found it earlier today.
NPM is the way. SSL without ever needing to edit a config file.
It's not hard really, and you shouldn't be afraid to ask; if we don't ask, we don't learn :)
Look at the Caddy web server; it does automatic SSL for you.
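As a sketch, a Caddyfile can be as small as this (the hostname and upstream address are placeholders); Caddy fetches and renews the certificate automatically:

```
# Caddyfile: Caddy obtains and renews the TLS certificate on its own
jellyfin.example.com {
    reverse_proxy 192.168.1.10:8096
}
```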
Thank you. It was mostly meant as a joke tho. I'm not actually afraid to ask, more ignorant, because it's all behind a VPN and that's just so much easier and safer, and I know how to do it, so less effort. HTTPS is just magic for me at the moment and I like it that way. Maybe one day I'll learn the magic spells, but not today.
Everything is accessible through VPN (Wireguard) only
Same. Always-on VPN on my phone for on-the-go ad blocking via Pi-hole.
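For anyone curious, the phone side is roughly a client config like this (keys, addresses, and the Pi-hole IP are placeholders); all traffic goes through the tunnel and DNS points at the Pi-hole:

```ini
# WireGuard client config (imported on the phone, usually via QR code)
[Interface]
PrivateKey = <phone-private-key>
Address = 10.0.0.2/32
DNS = 10.0.0.1                  # Pi-hole, reachable over the tunnel

[Peer]
PublicKey = <server-public-key>
Endpoint = vpn.example.com:51820
AllowedIPs = 0.0.0.0/0, ::/0    # send everything through the VPN
PersistentKeepalive = 25
```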
Nothing I host is internet-accessible. Everything is accessible to me via Tailscale though.
I had everything behind my LAN, but I published things like Nextcloud to the outside after finally figuring out how to do that without a public IPv4 (my provider puts me behind DS-Lite).
I knew about Cloudflare Tunnels but I didn't want to route my stuff through their service. And using Immich through their tunnel would be very slow.
I finally figured out how to publish my stuff using an external VPS that's doing several things:
- being an OpenVPN server
- being a cert server for OpenVPN certs
- being a reverse proxy using nginx with certbot
Then my servers at home just connect to the VPS as VPN clients so there's a direct tunnel between the VPS and the home servers.
Now when I have an app running on port 8080 on my home server, I point the domain at the VPS's public IPv4 and IPv6, and nginx on the VPS routes the traffic through the VPN tunnel to the home server and its port, using the home server's IPv4 inside the tunnel. The clients are configured with static IPv4 addresses inside the VPN tunnel when they connect to the VPN server.
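Roughly, the nginx server block on the VPS looks something like this (domain, tunnel IP, and port are placeholders; certbot adds the TLS bits):

```nginx
# /etc/nginx/sites-available/app.example.com on the VPS
server {
    listen 80;
    listen [::]:80;
    server_name app.example.com;

    location / {
        # 10.8.0.2 is the home server's static IP inside the VPN tunnel
        proxy_pass http://10.8.0.2:8080;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
# then "certbot --nginx -d app.example.com" adds the 443/ssl configuration
```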
Took me several years to figure out but resolved all my issues.
What benefit does that have over a dynamic DNS entry and port forwarding on your internet connection?
With DS-Lite you don't have a public IPv4 at all: not a static one, but not a dynamic one either. The ISP only gives you a public IPv6, and your IPv4 address is shared with other customers to conserve IPv4 space. Without even a dynamic public IPv4, DynDNS and the like are simply not possible.
You could publish your stuff via IPv6 only but good luck accessing it from a network without IPv6.
You could also spin up tunnels over SSH between a public server and the private one (yes, SSH can do stuff like that), but that's very hard to manage with many services, so you're better off building a setup like mine.
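For reference, a reverse tunnel is just a one-liner per service (host names and ports are placeholders), which is exactly why it doesn't scale:

```sh
# run on the home server: expose local port 8080 on the VPS as port 8080
ssh -N -R 8080:localhost:8080 user@vps.example.com
# (the VPS needs GatewayPorts enabled in sshd_config to listen publicly)
```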
Thanks for the great explanation!
100% is LAN only cause my ISP is a cunt
Tailscale with the Funnel feature enabled should work for most ISPs, since it's set up via an outbound connection. Though maybe they're Super Cunts and block that too.
Prompt: Super Cunt, photorealistic, in the style of Jill Greenberg.
I currently keep everything LAN-only because I haven't figured out how to properly set up outside access yet.
(I would like to have Home Assistant available either over the Internet or via VPN so that automations keyed off people's location outside the home would work.)
I used DuckDNS and Nginx to get Home Assistant accessible from outside, but it was horrible, just constantly breaking. Around Christmas I bought myself a domain name for a few years and use Cloudflare to access it, and it's been night and day since.
Sure it cost me money but it was far cheaper than a Nabu Casa account.
There's a wide range of opinions on this. Some people only access their services via a tunnel; some people open most of their services up to the internet, as long as they're authenticated. One useful option for HTTPS services is to put them behind a reverse proxy that requires OAuth authentication, which lets you have services on the internet without increasing your attack surface. But that breaks apps like Nextcloud and Lemmy, so it's not a universal option.
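A rough sketch of that pattern, assuming something like oauth2-proxy on port 4180 in front of an app on 8080 (both placeholders), using nginx's auth_request:

```nginx
# nginx in front of one app, with oauth2-proxy gating every request
server {
    listen 443 ssl;
    server_name app.example.com;
    # ssl_certificate lines omitted for brevity

    # oauth2-proxy handles the login flow and callback
    location /oauth2/ {
        proxy_pass http://127.0.0.1:4180;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-Proto $scheme;
    }

    # lightweight auth check used by auth_request below
    location = /oauth2/auth {
        proxy_pass http://127.0.0.1:4180;
        proxy_set_header Content-Length "";
        proxy_pass_request_body off;
    }

    location / {
        auth_request /oauth2/auth;          # deny unless oauth2-proxy says OK
        error_page 401 = /oauth2/sign_in;   # send unauthenticated users to login
        proxy_pass http://127.0.0.1:8080;   # the actual service
        proxy_set_header Host $host;
    }
}
```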
Available to the internet via reverse proxy:
- Jellyfin
- Navidrome
- Two websites
- matrix chat server
- audiobookshelf
LAN only:
- homepage
- NGINX Proxy Manager
- Portainer
There’s more in both categories but I can’t remember everything I have running.
All of it is LAN only except Wireguard and some game servers.
Everything exposed except NFS, CUPS and Samba. They absolutely cannot be exposed.
Like, even my DNS server is public because I use DoT for AdBlock on my phone.
Nextcloud, IMAP, SMTP, Plex, SSH, NTP, WordPress, ZoneMinder are all public facing (and mostly passworded).
A fun note: All of it is dual-stacked except SSH. Fail2Ban comparatively picks up almost zero activity on IPv6.
Nothing outside the LAN. Just Tailscale installed on my Synology NAS, on HomeAssistant and on all my machines.
Unlike most here, I'm not as concerned with opening things up. The two general guidelines I use are: 1) is it built by a big organization with the intent of being exposed, and 2) what's the risk if someone gets in.
All my stuff is in Docker, so it's compartmentalized with little risk of breaking out of a container. Each service is on its own Docker network to the reverse proxy, so there's no cross-container communication unless it's part of the same stack.
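As a sketch (image and network names are just examples), the compose layout looks something like this: the proxy joins every per-app network, but the apps can only see the proxy, not each other.

```yaml
# docker-compose.yml sketch: one isolated network per app, shared only with the proxy
services:
  proxy:
    image: caddy:latest            # or whatever reverse proxy you run
    ports:
      - "80:80"
      - "443:443"
    networks:
      - nextcloud_net
      - wiki_net

  nextcloud:
    image: nextcloud:latest
    networks:
      - nextcloud_net              # can reach the proxy, but not the wiki

  wiki:
    image: mediawiki:latest
    networks:
      - wiki_net                   # can reach the proxy, but not Nextcloud

networks:
  nextcloud_net:
  wiki_net:
```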
So following my rules, I expose things like Nextcloud and Mediawiki, and I would never expose Paperless which has identity documents (access remotely via Tailscale). I have many low-risk services I expose on demand. E.g. when going away for a weekend, I might expose FreshRSS so I can access the feed, but I'd remove it once I got home.
Nearly all of them. Nextcloud, Jellyfin, Vaultwarden, Spacebar, and 2FAuth all sit behind an NGINX reverse proxy, SWAG. SWAG made it very easy to set up HTTPS, and now I can throw anything behind a subfolder or subdomain.
The only accessible element is the web server. File server, home automation, OctoPi, Proxmox, media, etc. are all isolated.
Nothing is exposed. There are things I want exposed, but I don't want to have to keep security patches up to date, especially when there's a zero-day. I'm looking for someone trustworthy to hire for the things that would be useful to expose, but they are hard to find.
Something like 95% stays local and is accessed remotely via WireGuard. The rest is stuff I need to host under a hostname with a trusted cert, because apps I use require that, or because I need to share links to files for work, school, etc. For the external stuff I use Cloudflare Tunnels, simply because I'm on DDNS and want to avoid (or can't use) port forwarding. Works well for me.
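For the curious, the cloudflared side is roughly a config like this (tunnel ID, hostname, and port are placeholders):

```yaml
# ~/.cloudflared/config.yml sketch
tunnel: <tunnel-uuid>
credentials-file: /home/user/.cloudflared/<tunnel-uuid>.json
ingress:
  - hostname: files.example.com
    service: http://localhost:8080   # the local service being published
  - service: http_status:404         # required catch-all rule
```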
I probably have more accessible from outside than not. Many are required: hosting a website, a media server I can access from anywhere outside the house, my phone system, etc. Some I used to use more than I do now: podcast service, that sort of thing. Then a bunch that are internal only. My phone connects home over Wireguard so that's pretty convenient when out and about for accessing internal only systems.
Everything is accessible, but only through an n2n VPN.
I keep everything behind a VPN so I don't have to worry much about opening things up to the internet. It's not necessarily about whether you're probably fine; it's more about what the risk to you is if a device is compromised, e.g. a NAS with important documents, and about what an infected device could then access.
You could expose your media server and not worry too much about that device, but putting it in a "demilitarized zone", making sure all your firewall rules are correct, and keeping that service always updated is more difficult than running just one VPN that's designed to be secure from the ground up.
Each time I've read into self-hosting, it sounds like opening stuff up to the internet adds a bunch of complexity and potential headaches, but I'm not sure how much of that is practical necessity versus excessive caution.
Limiting the attack surface is a big part of it: geo restrictions, reputation lists, and brute-force mitigation all play a role. Running a vulnerability scanner against your stuff is important to catch things before others do, and regular patching is important too. It can be a rewarding challenge.
Can you recommend a vulnerability scanner?
https://www.tenable.com/products/nessus/nessus-essentials
https://www.rapid7.com/blog/post/2012/09/19/using-nexpose-at-home-scanning-reports/
Both Nessus and Nexpose are typically enterprise-class systems, but they have community licensing available for home labs. Nessus can even be set up in a Docker container. OpenVAS is more or less free and can be upgraded with pro feeds, but last I tried it, it was a bit rougher to use.
Do be aware, though, that a full-force scan will use a lot of CPU and can break things depending on the settings, so it's good to practice on some non-critical systems first to get a feel for them.
Thanks sounds like a fun weekend project. My 72 cores are bored most of the time anyways. 😃
It's always a balance between security and convenience. You have to weigh how much risk you're willing to, well… risk.
PII or anything that would demonstrate clear attribution stays on the LAN; the rest of the "fun" stuff lives on a VPS, with WireGuard between them.