I don't know that the specific thing you want exists (Spotify, but with better downloading).
However, the tool I linked (zotify) does let you download playlists from Spotify as files. Then you can put the files into a self-hosted tool for the actual listening. That said, if you're mainly using Spotify and just trying to get songs synced to another service for offline use, that seems like a lot of work.
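If you do go the zotify route, the pipeline is short: one download step, then let your music server's library scan pick the files up. A minimal sketch (assuming zotify is installed; the playlist URL and music folder are placeholders, and the `--root-path` flag name is an assumption, so check `zotify --help` on your version for the real flags):

```python
import subprocess
from pathlib import Path

# Placeholders: your playlist and the folder your music server watches.
PLAYLIST_URL = "https://open.spotify.com/playlist/<id>"
MUSIC_DIR = Path.home() / "music"

# Download the playlist as audio files. The flag name is an assumption;
# check `zotify --help` for what your version actually takes.
subprocess.run(["zotify", PLAYLIST_URL, f"--root-path={MUSIC_DIR}"], check=True)

# Navidrome/Jellyfin will pick up whatever lands in MUSIC_DIR on the
# next library scan; there's no separate sync step.
```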
If you're willing to pay, you could just try a different service (Tidal, Qobuz, or Apple/Google offerings) and see if you like them better.
Yes, but also no. The TL;DR is: it will work, but video streaming is against Cloudflare's rules. I ran this way for about 2 years with Plex, just for my own use, so roughly 15 hours a week at 480p, and I never got my service suspended. But I've heard stories of others getting suspended... so just know it's a risk.
My understanding is that this clause was quietly removed from the Ts and Cs, perhaps 1 or 2 years ago. I haven't heard of anyone getting banned for it since then.
Personally, while I have Jellyfin set up through Cloudflare, it's almost entirely run local-network only (with a local DNS entry in Pi-hole to connect to the domain directly when on my network), so I haven't had any issues; but then my usage probably wouldn't trigger any unusual-activity alarms in Cloudflare anyway.
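If anyone wants to verify the same kind of split setup, here's a quick sanity check to run from a machine on the LAN (the hostname and IP are hypothetical placeholders for your own):

```python
import socket

DOMAIN = "jellyfin.example.com"  # placeholder: your public domain
LAN_IP = "192.168.1.10"          # placeholder: the server's local address

# On the LAN, the Pi-hole entry should resolve the domain straight to the
# server; anywhere else it should resolve to a Cloudflare edge address.
resolved = socket.gethostbyname(DOMAIN)
if resolved == LAN_IP:
    print(f"{DOMAIN} -> {resolved}: Pi-hole override active, traffic stays on the LAN")
else:
    print(f"{DOMAIN} -> {resolved}: resolving publicly, i.e. through Cloudflare")
```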
I'm not sure I really get what you are looking for when you say "selfhosted".
Yes there are self-hosted music platforms, but you bring your own music files.
Maybe you are looking for something like Spotube (available on F-Droid) that lets you download music (I believe it uses Spotify's search, then downloads the audio from YouTube).
Or maybe you just need to find a tool to download music from Spotify (like zotify), then put the files into a self-hosted music platform like Navidrome or Jellyfin?
Thanks, I think that explains it a bit more. It is unexpected to me, as a non-git expert, and I'm sure to many others.
Ah thanks, this explains it a bit more.
I'm still not sure that answers it. If I fork a project, and the upstream project commits an API key (after I've forked it), then they delete the commit, does that commit stay available to me (unexpected behaviour)? Or is it only available if I synced that commit into my fork while it was still in the upstream repo (expected behaviour)?
Or is it talking about this from a comment here:
Word of caution 2: The commit can still be accessible directly via SHA1. Force push does not delete the commit, it creates a new one and moves the file pointer to it. To truly delete a commit you must delete the whole repo.
Someone replied saying that once garbage collection kicks in, it removes this unreachable commit, but it's not clear to me whether that works on GitHub or just in a local git repo (there's a local demo below).
Perhaps the issue is that these commits are synced into upstream/downstream repos when they should not be?
Like I said, I'm really confused about the specifics of this.
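For what it's worth, the local-git half of this is easy to demonstrate. A throwaway sketch (plain local git in a temp directory, so it says nothing about GitHub's server-side fork network, which is the part I'm unsure about):

```python
import pathlib, subprocess, tempfile

def git(*args, cwd):
    return subprocess.run(["git", *args], cwd=cwd, check=True,
                          capture_output=True, text=True).stdout.strip()

repo = tempfile.mkdtemp()
git("init", cwd=repo)
git("config", "user.email", "demo@example.com", cwd=repo)
git("config", "user.name", "demo", cwd=repo)

pathlib.Path(repo, "readme.txt").write_text("hello\n")
git("add", ".", cwd=repo)
git("commit", "-m", "initial", cwd=repo)

pathlib.Path(repo, "config.ini").write_text("API_KEY=oops\n")
git("add", ".", cwd=repo)
git("commit", "-m", "accidentally commit a key", cwd=repo)
leaked_sha = git("rev-parse", "HEAD", cwd=repo)

# "Delete" the commit the way a force push would: move the branch back one.
git("reset", "--hard", "HEAD~1", cwd=repo)

# The commit is on no branch, but the raw SHA still resolves:
print(git("show", "--stat", leaked_sha, cwd=repo))

# Locally, expiring the reflog and running gc really does destroy it:
git("reflog", "expire", "--expire=now", "--all", cwd=repo)
git("gc", "--prune=now", cwd=repo)
# git("show", leaked_sha, cwd=repo)  # would now fail: the object was pruned
```

The open question is whether GitHub ever runs the equivalent of that last step on repos in a fork network, since you can't trigger gc on their servers yourself.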
The article is really not clear. Is it saying that if a project is forked and the original is then made private, the fork can still access data from the now-private original?
potentially enabling malicious actors to access sensitive information such as API keys and secrets even after users think they’ve deleted it.
Is this saying people misunderstand git and think committing a deletion makes people unable to access the previous version? Or is it saying the sharing between public and private repos can expose keys in private repos?
If you accidentally commit an API key into a public repository... you need to roll that key. Even if it was deleted completely, someone still could have accessed it while it was there.
When I use Organic Maps, I download the parts of the map I need (my country). Is there a reason something like Immich couldn't do something similar?
It's hundreds of MB per map section, but I have hundreds of GB of photos so it would be a drop in the bucket.
From what I've read, I would not trust any process other than the takeout process. Do the album thing to split it up.
Can you do one album at a time? Select the albums you want to download, run that export, then do the next few albums. That way you have direct control over the data you're getting in each batch, and you'll have a week to get each batch instead of having to start again because the whole thing didn't finish in a week.
Battery tech has still come a long way since, say, 10 years ago, even though the "next gen" stuff hasn't made it to scaled production. It looks like this is the beginning of scaled production, though.