I get that. It's funny, I think I've gotten advice in the past to always check the results of search engines because they can be wrong (teachers said this to me), or warnings about Wikipedia being unreliable. But nobody does those things nowadays. Perhaps someday LLMs will be good enough that we don't need to check them either.
Second sentence of the description from the man pages: "Otherwise, depending on the options specified, date will set the date and time or print it in a user-defined way." Not sure what they were on about.
Because smartphones are a more normalized piece of technology that (most) people have and use extremely often.
They once wrote me a massive script for parsing a history file instead of telling me about history -i.
Codeberg, but you have to manually apply for the CI/CD part. Also, Codeberg only allows you to host FOSS projects.
What if SourceForge bought it lmao
Ah, okay. That's fair. It wasn't clear they meant a different system lol.
More info if you're seriously considering it: https://codoraven.com/blog/ai/stable-diffusion-the-invisible-watermark-in-generated-images/
I don't actually know if any model creators check for the watermark or not.
...but that output is also from the AI so it would still be watermarked lol
As someone who fiddled with Stable Diffusion, which also has optional invisible watermarks, this is a good feature. It exists so that AI training can avoid content that marks itself as AI generated. If people want to hide that their content is AI generated then, sadly, it's harder to detect.
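For anyone curious what that looks like in practice, here's a rough sketch using the invisible-watermark package (imported as imwatermark) that the reference Stable Diffusion scripts rely on. The watermark string, bit length, and file names below are just placeholders for illustration, not whatever a given model actually embeds.

```python
# Sketch of embedding and reading an invisible watermark with the
# invisible-watermark package (module: imwatermark). The watermark bytes
# and file names are illustrative assumptions, not any model's real values.
import cv2
from imwatermark import WatermarkEncoder, WatermarkDecoder

WATERMARK = b"SDV2"  # 4 bytes = 32 bits; placeholder payload

# Embed the watermark into an image (OpenCV loads images as BGR arrays).
bgr = cv2.imread("generated.png")
encoder = WatermarkEncoder()
encoder.set_watermark("bytes", WATERMARK)
cv2.imwrite("generated_wm.png", encoder.encode(bgr, "dwtDct"))

# Later, anyone building a training set could try to read the mark back
# and skip images that carry it.
decoder = WatermarkDecoder("bytes", len(WATERMARK) * 8)
recovered = decoder.decode(cv2.imread("generated_wm.png"), "dwtDct")
print("AI generated" if recovered == WATERMARK else "no known watermark")
```

The "dwtDct" method hides the bits in frequency space, so it's invisible to the eye and survives normal saving, but heavy re-encoding or cropping can still destroy it.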
Reverse incel maneuver?
OP, you know you don't have to use a table, right? You can make a bulleted list.