- World of Warcraft
- TES V: Skyrim
- Minecraft
 
This is so beautiful it made me want to get a VR headset
Your computer will gradually accumulate security holes that will be problematic to patch. Eventually, programs will stop supporting it as well.
A 60-year-old man who had a "history of studying nutrition in college" decided to try a health experiment: he would eliminate all chloride from his diet
Oh well, it started before ChatGPT even had a chance to make it worse.
All kidding aside, support for a self-hosted server or just a local program to store and visualize your data would be amazing
Alright, we generally seem to be on the same page :)
(Except that numerous great books and helpful short materials exist for virtually any popular major, and, while they take longer to study, they provide an order of magnitude better knowledge)
While I don't fully share the notion and tone of the other commenter, I gotta say LLMs have absolutely tanked education and science, as noted by many and as I've witnessed firsthand.
I'm a young scientist on my way to a PhD, and I get to assist in a microbiology course for undergraduates.
The amount of AI slop coming in on student assignments is astounding, and, worst of all, they don't see it themselves. When I get to check their actual knowledge, it's devastating.
And it's not just undergrads - many scientific articles now show signs of AI slop as well, which messes with research to a concerning degree.
Personally, I tried using more specialized tools like Perplexity in Research mode to look for sources, but it royally messed up the source list: it took actual information from scientific articles, but then cited entirely different articles that bore no relation to it.
So, in my experience, LLMs can be useful for generating simple text or helping you tie known facts together. But as a learning tool... be careful, or rather just don't use them for that. Classical education exists for a good reason: you learn to find factually correct and relevant information, analyze it, and keep it in your head for future reference. It takes more time, but it is ultimately well worth it.
Modern LLMs can serve you for most tasks while running locally on your machine.
Something like GPT4All will do the trick on any platform of your choosing if you have at least 8 GB of RAM (which is true for most people nowadays).
It has a simple, idiot-proof GUI and doesn't collect data unless you allow it to. It's also open source, and, being local, it doesn't need an Internet connection once you've downloaded the model you need (which is normally a single-digit number of gigabytes).
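If you'd rather script it than use the GUI, here's a minimal sketch using the gpt4all Python bindings (installed with `pip install gpt4all`); the model filename and the prompt are just examples, and any GGUF model from the GPT4All catalog should work the same way:

```python
# Minimal local-inference sketch with the gpt4all Python bindings.
# The model file is downloaded once and cached locally; after that,
# no Internet connection is needed.
from gpt4all import GPT4All

# Example model name from the GPT4All catalog (a few GB on disk;
# quantized 7-8B models typically run in roughly 8 GB of RAM).
model = GPT4All("Meta-Llama-3-8B-Instruct.Q4_0.gguf")

with model.chat_session():
    reply = model.generate(
        "Explain the difference between Gram-positive and Gram-negative bacteria.",
        max_tokens=256,
    )
    print(reply)
```

Everything stays on your machine: the prompt, the reply, and the model weights.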
Beautiful