I've found AI helpful for asking it to explain stuff: why is the problem solved like this, why did you use this and not that, could you put it in simpler terms, and so on. Much like you might ask a teacher.
I think this works great if the student is interested in the subject, but if you're just trying to work through a bunch of problems so you can stop working through a bunch of problems, it ain't gonna help you.
I have personally learned so much from LLMs (you can't really take anything at face value and have to look things up independently, but it gives you a great starting place), and it comes from a genuine interest in the questions I'm asking and the things I dig into.
No offense, but that's exactly what the article is highlighting: students, even the good ones, believe they learned. Once it's time to pass a test designed to evaluate whether they actually did, the results aren't that positive.
I mean...
Do you want to have a conversation with me in Japanese? 😅
At the end of the day, I feel like it's how you use the tool. "if you're just trying to work through a bunch of problems so you can stop working through a bunch of problems, it ain't gonna help you." How do you think a bunch of kids are going to use it for school work they're required to finish, but aren't actually interested in?
To an extent, but it's often just wrong about stuff.
It's been a good second step for things I have questions about when I can't immediately find good search results. I don't wanna get off topic, but I have major beef with Stack Overflow, and posting questions there makes me anxious as hell because I'll do so much due diligence to make sure a question is clear, reproducible, and not a duplicate, only for it to still get closed. It's a major fucking waste of my time. Why put all that effort in when it's still going to get closed?? Anyways -- ChatGPT never gets mad at me. Sure, it's often wrong as hell, but it never berates me or makes me feel stupid for asking a question. It generally gets me close enough on a topic that I can search for other terms in search engines and get different, more helpful results.
Yep. My first interaction with GPT pro lasted 36 hours and I nearly changed my religion.
AI is the best thing to come to learning, ever. If you are a curious person, this is bigger than Gutenberg, IMO.
That sounds like a manic episode