harsh3466

joined 1 year ago
[–] harsh3466@lemmy.world 1 points 8 months ago

It’s definitely a YMMV situation. I’ve heard from lots of people that it runs solid as a rock in Docker, and from others, like you and me, for whom it’s flaky af.

[–] harsh3466@lemmy.world 2 points 8 months ago (1 children)

I’m glad yours is stable! I don’t know why, but with mine, if you so much as cut a loud fart near the server, Nextcloud would just shit the bed. God forbid I try to update it.

Like you, I had most plugins disabled, and I was the only user. I first ran Nextcloud using NextcloudPi on an RPi 4, and that ran solid for like four years. However, when I repurposed that Pi and moved Nextcloud to my server in Docker, it just would not run reliably for me no matter what I did. At that point I also wasn’t really using Nextcloud anymore, so I just abandoned it as not worth the effort.

[–] harsh3466@lemmy.world 4 points 8 months ago (5 children)

I abandoned Nextcloud entirely a couple of years ago. It was just too damn flaky (self-hosted via Docker).

[–] harsh3466@lemmy.world 6 points 8 months ago

Chiming in for Radicale. I’ve been running it for a couple of years, and it’s great.

[–] harsh3466@lemmy.world 4 points 8 months ago

That’s because Apple, as they love to do, decided to make their own special version of 2FA inside their little walled garden.

Y’know, instead of going with the accepted TOTP method.

[–] harsh3466@lemmy.world 11 points 8 months ago

Wait. An email just to get a download link AND a cloud account? Fuck that.

[–] harsh3466@lemmy.world 4 points 8 months ago

Exactly. I generally like typing out my commands because I’m learning and it helps me remember what I’m doing and what the commands mean/how they work. And if it’s a particularly long one I’ll make an alias for it.

[–] harsh3466@lemmy.world 2 points 8 months ago* (last edited 8 months ago) (2 children)

Edit: tl;dr is that Grav worked out of the box.

Bear in mind that I just started messing around with Grav, so I’m no expert. With that said, I tried 11ty and Hugo (spent 2-3 weeks messing around with and testing them).

What I was looking for was a static site generator that let me easily use a simple/clean theme, and would generate the webpages from markdown files.

What prompted me to look for this was my WordPress site breaking. I’m currently self-hosting WordPress via Docker. While retooling my server (installing an updated server OS, importing the RAID, relaunching containers), my WordPress container broke. It was still serving my website, but I couldn’t do anything on the backend because of a database permission error. I had just spent a day fighting with and fixing database permissions on another container, and I decided to look into these SSGs and see whether one would simplify dealing with my website.

11ty seemed really promising until I tried to theme it with a starter pack. What was confusing to me at first was that 11ty doesn’t theme the way you’d theme something like WordPress. You don’t set up/install 11ty and then download/install a theme. Instead, you use a starter pack, which is a theme that includes an implementation of 11ty. (You can write your own theme on a barebones 11ty setup, but I’m not a web dev and don’t want to be.) I must have tried 15 different 11ty starter packs, and not a single one of them worked and/or was maintained, and these were the ones linked/provided on the 11ty website.

Had I found a pack/theme that worked (and met my criteria of being simple/clean), I’d have been very happy with 11ty. The core of 11ty worked great for me (take a .md file and make it HTML), but the starter pack situation, IMO, is a disaster for anyone who isn’t a competent web dev.

Hugo was much simpler when it comes to theming: just git clone a repository (which, I get, isn’t ideal for everyone, but also isn’t all that different from downloading a theme zip file for something like WordPress). Hugo seemed promising, but despite the relative ease of cloning a repo for a theme, I couldn’t get it to generate a single page of content. I read the docs, watched tutorials, kept getting errors, got frustrated, and noped out.

Grav, when I tried it, was exactly what I was looking for. Out of the box it has a nice, simple theme, and I can drop a markdown file into a folder and it automatically generates the page. That part is even better than Hugo, 11ty, and the other SSGs: you don’t have to run a command to generate or regenerate the HTML from a markdown file. Grav watches the content folder, and when it finds new or changed .md files it generates the pages automatically.
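To give a rough idea (this is from memory, so check the Grav docs; the folder name here is made up): pages live under user/pages/, each page gets its own folder, and the markdown file inside is named after the template it uses. Dropping in something like this is all it takes:

# hypothetical new page; Grav's stock pages live under user/pages/
mkdir -p user/pages/03.my-new-post
cat > user/pages/03.my-new-post/default.md <<'EOF'
---
title: My New Post
---
Body of the post, written in plain markdown.
EOF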

I also really liked that Grav easily did what these other SSGs claim is easy to do (but, in my experience, failed to deliver on), while adding some optional tooling, like the web UI, that makes management a little easier.

Overall I like Grav, but I’m not actually using it. I ended up fixing the database permissions on my WordPress container. I’m keeping Grav around in case I decide to migrate, or if I ever launch another site/project.

[–] harsh3466@lemmy.world 1 points 8 months ago (4 children)

I was finding the same with SSGs until I ran into Grav. You might like it. It’s got a web UI kind of like WordPress, and you can write right from the web UI, but you can also just write markdown files and dump them into a directory to generate posts/pages.

[–] harsh3466@lemmy.world 2 points 9 months ago

Thank you! (I switched to a lemmy.ml account because I haven’t been able to comment or post on lemmy.world for over a week.)

Not really specific skills; I’m just a hands-on learner/tinkerer. I’ve been messing with self-hosting for around three years now, so spinning up new Docker services is fairly easy (fairly. I still have a lot to learn about Docker). In doing so, I’ve used and referred to GitHub a lot, and have even used git to clone repositories when self-hosting a service, but beyond that I hadn’t looked into it, as it didn’t seem relevant to me at the time.

And thank you for the Forgejo information! I’ll look into it and compare to see which one I’d like to use. Coincidentally, I just saw today or yesterday that Forgejo has gone the hard-fork route.

[–] harsh3466@lemmy.world 1 points 9 months ago

+1 for Portainer CE. Works like a charm.

 

Another fun week of tinkering! Here’s what I learned:

How to implement a for loop in bash scripts using seq

I’ve been working on a script to create folders for my TV show library that play nice with my Jellyfin server. What I wanted was for the script to:

  • prompt me for the show’s name
  • query The Movie Database (TMDB) shows API for the show
  • present me a numbered list of the show results formatted as index showname year tmdb-id
  • prompt me to choose the correct result from the list
  • create a directory formatted as Show.Name.(YYYY).[tmdbid-xxxxx]

Since the number of results will vary from query to query, I couldn’t use a preset range like {0..5} for my for loop. I tried without success to have the loop iterate through the JSON response, but I was unable to figure out how to do that.

So, while it’s likely inelegant, what I did was:

  • take the JSON response and pipe it to jq to get the number of results
  • since jq array indices start at 0, subtract 1 from the number of results and set the result of that calculation as my $count variable
  • loop through the JSON using for i in $(seq 0 $count) ; do to create the indexed list of results to choose from (a sketch of the whole thing is below)
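Here’s a stripped-down sketch of the approach (not my exact script, and the TMDB endpoint/field names are from memory, so double check them against the API docs):

#!/usr/bin/env bash
# Rough sketch only. Assumes TMDB_API_KEY is set in the environment, and that the
# TMDB v3 /search/tv endpoint and field names below are correct -- check the API docs.
set -euo pipefail

read -rp "Show name: " show

json=$(curl -s --get "https://api.themoviedb.org/3/search/tv" \
  --data-urlencode "api_key=${TMDB_API_KEY}" \
  --data-urlencode "query=${show}")

# jq array indices start at 0, so subtract 1 from the number of results
count=$(( $(jq '.results | length' <<< "$json") - 1 ))

# numbered list: index  showname  (year)  tmdb-id
for i in $(seq 0 "$count"); do
  name=$(jq -r ".results[$i].name" <<< "$json")
  year=$(jq -r ".results[$i].first_air_date // \"\" | .[0:4]" <<< "$json")
  id=$(jq -r ".results[$i].id" <<< "$json")
  echo "$i  $name  ($year)  tmdbid-$id"
done

read -rp "Pick the correct index: " pick

name=$(jq -r ".results[$pick].name" <<< "$json" | tr ' ' '.')
year=$(jq -r ".results[$pick].first_air_date | .[0:4]" <<< "$json")
id=$(jq -r ".results[$pick].id" <<< "$json")

mkdir -p "${name}.(${year}).[tmdbid-${id}]"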

How to use jq to work with and extract data from JSON objects

I’m just scratching the surface of jq, but I’m finding it very useful! I’ve worked with JSON before making automations on iOS with the Shortcuts app, so getting up and running with jq was pretty easy once I understood the syntax.
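A few of the jq pieces on their own, using the same TMDB-style response (so the .results/.name/.id field names are assumptions tied to that API):

jq '.results | length' response.json      # how many results came back
jq -r '.results[0].name' response.json    # name of the first result
# jq can also iterate the array natively, which is the piece I couldn't manage in pure bash:
jq -r '.results[] | "\(.name)  \(.first_air_date)  \(.id)"' response.json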

Note: I know tools like Filebot exist to do the kind of thing I’m doing with this script. I’m writing my own scripts from scratch in order to learn.

Git and GitHub are different things

On my post last week, a number of people suggested using Git. I was already aware of GitHub, and because I didn’t know what I didn’t know, I thought Git and GitHub were parts of a whole. I also generally knew that Git/GitHub are used for version control, but that was the extent of my knowledge. I still know very little, but I do now understand that Git and GitHub are independent things that can work together.

I also went ahead and set up a Gitea instance on my server for when I’m ready to create repositories for my scripts and dotfiles.

 

I didn’t get to spend as much time tinkering and learning this week, but I still learned some new things!

  1. WireGuard is great! I had been using OpenVPN because, when I initially set up my machine, my VPN provider had a bug with WireGuard. I was setting up a Raspberry Pi today for some more tinkering and decided to try WireGuard to see if the bug was fixed. Not only is it fixed, but WireGuard is much easier to work with. Not hating on OpenVPN, but I’ll definitely be preferring WireGuard going forward.
  2. Proper use of find, particularly with regex. This is ongoing. I’ve been using find for a while, but without fully understanding its options and syntax. I’m starting to get a better understanding of how to use it to find and manipulate the files I’m looking for. One of the biggest things tripping me up with find and regex is designating the path (see the sketch after this list).
  3. How to set up a new user. This was interesting. I already knew the basics, useradd -m username and sudo passwd username, but what I didn’t know anything about was --skel for copying over the skeleton shell config files. I didn’t even know the skeleton config files existed.
  4. The shell prompt can be customized. I was setting up a non-root user on a VPS I have, and after creating the user, all I had was the $ prompt: no user@host and no working directory. After some reading I found that adding PS1='$(whoami)@$(hostname):$(pwd)$ ' to ~/.profile shows a more traditional user@host:working/directory$ prompt. I’m sure this isn’t the only way to do it, and it may not be the best way, but based on my limited knowledge it’s the way I’m currently doing it on my VPS.
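For anyone else wrestling with find and regex, the bit about the path that finally clicked for me: -regex matches against the whole path that find prints (like ./subdir/file.mkv), not just the filename. Rough examples (the patterns and paths are just illustrations):

# default flavor is emacs-style regex, and the pattern has to match the entire path
find . -type f -regex '.*\.m\(kv\|p4\)'
# GNU find can switch to extended regex, which reads a bit more naturally
find . -type f -regextype posix-extended -regex '.*\.m(kv|p4)'
# for simple cases, -name/-iname globs (matched against the filename only) are often enough
find . -type f \( -name '*.mkv' -o -name '*.mp4' \)

(And on the prompt in item 4: bash also has built-in prompt escapes, so PS1='\u@\h:\w\$ ' in ~/.bashrc gives the same user@host:dir$ prompt without the command substitutions. I’m still learning which approach is preferred.)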
56 points, submitted 10 months ago* (last edited 10 months ago) by harsh3466@lemmy.world to c/linux@lemmy.ml
 

I’ve been homelabbing for a few years now, and recently I’ve really been focusing on learning how to use GNU/Linux. I thought it might be fun to periodically share the things I’ve been learning. The standouts for me this past week were:

  1. Use the full path when referencing files and directories in bash scripts. (Edit: when it makes sense, which is something I’m also still learning. This makes sense when the files will always be located in the same place.)
  2. In a bash script, the parameter expansion ${file##*/} will get you just the filename of the file your script is handling (for example, when looping over files in a directory; see the sketch after this list). I believe that’s standard shell/bash behavior, but I need to learn where it comes from and how it works.
  3. Ubuntu gets a ton of justifiable criticism, but I find Canonical’s Multipass to be a great tool for spinning up Linux virtual machines, especially on Apple silicon Macs.
  4. Piping the output of ls to grep inside a command substitution in a path is a great way to change to a directory you know exists but can’t remember the exact name of. (Example: cd ~/movies/"$(ls ~/movies | grep movie-name)")
  5. The reason Mac CLI utilities have syntax variations compared to the standard GNU/Linux utilities is that macOS and its CLI utilities are BSD-based. This was information I knew at a high level but had never really understood the implications of until this week.
  6. Related to point 5, if you’re on macOS trying to learn and you’re annoyed by the syntax differences between BSD and GNU utilities, you can run this script from darksonic37 on GitHub to replace the BSD utilities on macOS with their GNU counterparts. (I have not run or reviewed the script. I found Multipass first, and so far I’m happy using the Ubuntu virtual machine.)
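A tiny illustration of points 1, 2, and 4 together (the directory names are just examples):

# point 2: ${file##*/} is shell parameter expansion; it strips everything up to
# and including the last "/" in the value of $file
for file in /home/me/movies/*.mkv; do        # point 1: full (example) path
    name="${file##*/}"                       # /home/me/movies/Alien.mkv -> Alien.mkv
    echo "found: $name"
done

# point 4: command substitution inside a path
cd ~/movies/"$(ls ~/movies | grep -i hunger)"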
I'm an idiot (arm) (lemmy.world)
38 points, submitted 10 months ago* (last edited 10 months ago) by harsh3466@lemmy.world to c/linux@lemmy.ml
 

EDIT: Putting this at the top because not everyone is seeing what I actually need. I can unpack the rar archive just fine. What I can’t do (on ARM) is add to/update the files in the rar archive. I have unrar already installed. What I can’t install is the rar package to create/update rar archives.

So I’ve been banging my head against the wall for about half an hour trying to install the rar package from the multiverse repository on an Ubuntu 23.10 VM I have running on my M1 Mac mini. I finally ended up on https://pkgs.org and searched for rar to see if I could download it directly instead of using apt.

And it was there I realized there’s no ARM version of rar.

Side note: any recommendations for an ARM utility that handles rar files? I already have unrar-free installed, but what I need is something that can update/add files to existing rar archives.

Worst case, I unrar them and then repackage them with tar or zip, but if I can just keep working with the rar archives, I’d prefer that.
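If I do end up repackaging, the fallback would be something like this (untested; the unrar syntax below is the non-free unrar’s, and unrar-free’s flags differ):

mkdir extracted
unrar x archive.rar extracted/             # extract, keeping paths
tar -czf archive.tar.gz -C extracted .     # repack as a gzipped tarball
rm -r extracted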

Edit: I got excited for a second remembering that I’ve got rar installed via Homebrew on that same M1 Mac, but when I tried to install Homebrew in the VM, I learned that Homebrew doesn’t officially support ARM on Linux.

201 points, submitted 11 months ago* (last edited 11 months ago) by harsh3466@lemmy.world to c/linux@lemmy.ml
 

I've been reading Mastering Regular Expressions by Jeffrey E.F. Friedl, and since nobody in my life (aside from my wife) cares, I thought I’d share something I'm pretty proud of: my first set of regular expressions, written by me to manipulate the text I'm working with.

What I’m so happy about is that I wrote these expressions myself. I understand exactly what they do and the purpose of each character in each expression.

I've used regex in the past, stuff cobbled together from Stack Overflow, but I never really understood how the expressions worked or what they meant, just that they did what I needed them to do at the time.

I'm only about 10% of the way through the book, but already I understand so much more than I ever did about regex (I also recognize I have a lot to learn).

I wrote the expressions to be used with egrep and sed to generate and clean up a list of filenames pulled out of tarballs (movies I've ripped from my DVD collection and tarballed to archive them).

The first expression I wrote was this one, used with tar and egrep to list the files in the tarball and get just the name of the video file:

tar -tzvf file.tar.gz | egrep -o '\/[^/]*\.m(kv|p4)' > movielist

Which gives me a list of movies; here’s an example line:

/The.Hunger.Games.(2012).[tmdbid-70160].mp4

Then I used sed with a set of expressions to remove:

  • the leading forward slash
  • everything from .[ to the end
  • all of the periods in between words

And the last expression checks for one or more spaces and replaces them with a single space.

This is the full sed command:

sed -E -i 's/^\///; s/\.\[[a-z]+-[0-9]+\]\.m(p4|kv)//; s/[^a-zA-Z0-9\(\)&-]/ /g; s/ +/ /g' movielist

Which leaves me with a pretty list of movies that looks like this:

The Hunger Games (2012)

I'm sure this could be done more elegantly, and I'm happy for any feedback on how to do that! For now, I'm just excited that I'm beginning to understand regex and how to use it!

Edit: fixed title so it didn’t say “regex expressions”
