this post was submitted on 04 Sep 2024
914 points (98.2% liked)

Does AI actually help students learn? A recent experiment in a high school provides a cautionary tale. 

Researchers at the University of Pennsylvania found that Turkish high school students who had access to ChatGPT while doing practice math problems did worse on a math test compared with students who didn’t have access to ChatGPT. Those with ChatGPT solved 48 percent more of the practice problems correctly, but they ultimately scored 17 percent worse on a test of the topic that the students were learning.

A third group of students had access to a revised version of ChatGPT that functioned more like a tutor. This chatbot was programmed to provide hints without directly divulging the answer. The students who used it did spectacularly better on the practice problems, solving 127 percent more of them correctly compared with students who did their practice work without any high-tech aids. But on a test afterwards, these AI-tutored students did no better. Students who just did their practice problems the old-fashioned way — on their own — matched their test scores.
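The article doesn't reproduce the tutoring prompt, but the "hints, not answers" behavior it describes is essentially a system-prompt constraint on an ordinary chat model. Here is a minimal sketch of that idea, assuming the OpenAI Python SDK; the prompt wording and model name are illustrative, not the researchers' actual configuration:

```python
# A minimal sketch of a "hints only" tutor in the spirit of the study's
# modified chatbot. The prompt wording and model name are assumptions,
# not the researchers' actual configuration.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

TUTOR_PROMPT = (
    "You are a math tutor. Help the student with questions and hints, "
    "one step at a time. Never state the final answer, even if asked."
)

def tutor_reply(student_message: str) -> str:
    """Return a hint-style reply instead of a worked solution."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {"role": "system", "content": TUTOR_PROMPT},
            {"role": "user", "content": student_message},
        ],
    )
    return response.choices[0].message.content

print(tutor_reply("Solve 3x + 7 = 22 for x."))
```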

[–] glowie@h4x0r.host 24 points 2 months ago (5 children)

Of all the students in the world, they picked ones from a "Turkish high school". Is there any clear indication why there, of all places, when the study was conducted by a US university?

[–] catloaf@lemm.ee 19 points 2 months ago (1 children)

I'm guessing there was a previous connection with some of the study authors.

I skimmed the paper, and I didn't see it mention language. I'd be more interested to know if they were using ChatGPT in English or Turkish, and how that would affect performance, since I assume the model is trained on significantly more English language data than Turkish.

[–] adespoton@lemmy.ca 3 points 2 months ago

GPTs are designed with translation in mind, so I could see them being extremely useful for providing instruction on a topic in a student's native, non-English language.

But they haven’t been around long enough for the novelty factor to wear off.

It’s like computers in the 1980s… people played Oregon Trail on them, but they didn’t really help much with general education.

Fast forward to today, and computers are the core of many facets of education, allowing students to learn knowledge and skills that they’d otherwise have no access to.

GPTs will eventually go the same way.

[–] Phoenix3875@lemmy.world 8 points 2 months ago

The paper only says it's a collaboration. It's pretty large scale, so the opportunity might be rare. There's a chance that (the same or other) researchers will follow up and experiment in more schools.

[–] Ilandar@aussie.zone 5 points 2 months ago (1 children)

The names of the authors suggest there could be a cultural link somewhere.

[–] glowie@h4x0r.host 1 points 2 months ago

Ah thanks, that does appear to be the case.

[–] Lemminary@lemmy.world 3 points 2 months ago (1 children)

If I had access to ChatGPT during my college years and it helped me parse things I didn't fully understand from the texts or provided much-needed context for what I was studying, I would've done much better having integrated my learning. That's one of the areas where ChatGPT shines. I only got there on my way out. But math problems? Ugh.

[–] ForgotAboutDre@lemmy.world 23 points 2 months ago (2 children)

When you automate these processes, you lose the experience. I wouldn't be surprised if you couldn't parse information as well as you can now, had you had access to ChatGPT back then.

It's hard to get better at solving your problems if something else does it for you.

Also, the reliability of these systems is poor, and they're specifically trained to produce output that appears correct, not output that actually is correct.

[–] Veddit@lemmy.world 5 points 2 months ago (1 children)

I read that comment, and I use it similarly: more as a super-dictionary/encyclopedia, in the same way I'd watch supplementary YouTube videos to enhance my understanding, rather than as a way to automate the understanding process.

More like having a tutor who you ask all the too-stupid and too-hard questions to, who never gets tired or fed up with you.

[–] Petter1@lemm.ee 1 points 2 months ago

Exactly this! That is why I always have at least one AI chatbot instance running when I am coding, or rather, analysing code for debugging.

It makes it possible to debug kernel stuff without much prior knowledge, if you are proficient at prompting your questions. Well, it did work for me.
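A rough illustration of the workflow described above (not code from the commenter): sending a snippet and a question to a chat model from a small script. The function name, model name, and prompts are assumptions.

```python
# A rough sketch of the "ask the chatbot about this code" loop described
# above. explain_snippet(), the model name, and the prompts are illustrative.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

REVIEW_PROMPT = (
    "You are a careful code reviewer. Point out likely bugs, explain your "
    "reasoning, and say so when you are unsure."
)

def explain_snippet(code: str, question: str) -> str:
    """Send a code snippet plus a question to the chat model."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {"role": "system", "content": REVIEW_PROMPT},
            {"role": "user", "content": f"{question}\n\nCode:\n{code}"},
        ],
    )
    return response.choices[0].message.content

snippet = "if (ptr = NULL) { return -EINVAL; }"
print(explain_snippet(snippet, "Why might this kernel-style check misbehave?"))
```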

[–] Lemminary@lemmy.world 4 points 2 months ago

I quickly learned how ChatGPT works, so I'm aware of its limitations. And since I'm talking about university students, I'm fairly sure those smart cookies can figure it out themselves. The thing is, studying the biological sciences requires you to understand other subjects you haven't learned yet, and having someone explain how everything fits into the overall picture puts you way ahead of the curve because you start integrating knowledge earlier. You only get that from retrospection once you've passed all your classes and have a panoramic view of the field, which, in my opinion, is too late for excellent grades. This is why I think having parents with degrees in a related field, or personal tutors, gives an incredibly unfair advantage to anyone in college. That's what ChatGPT gives you for free. Your parents and tutors will also make mistakes, but that doesn't take away their value, and the same is true for the AIs.

And regarding the output that appears correct, some tools help mitigate that. I've used the Consensus plugin to some degree and think it's fairly accurate for resolving some questions based on research. What's more useful is that it'll cite the paper directly so you can learn more instead of relying on ChatGPT alone. It's a great tool I wish I had that would've saved me so much time to focus on other more important things instead of going down the list of fruitless search results with a million tabs open.

One thing I will agree with you on is the value of learning how to use Google Scholar and Google Books and ~~pirating books~~ using the library to find the exact information as it appears in the textbooks when answering homework questions, which I did meticulously, down to the paragraph. But only I did that. Everybody else copied their homework, so at least at my university it was a personal choice how far you wanted to take those skills. So now, instead of your peers giving you the answers, it's ChatGPT. My question is: are we really losing anything?

Overall, I think other skills need honing today, particularly verifying information, together with critical thinking, which is always relevant. And the former is only hard because it's tedious work, honestly.

[–] technocrit@lemmy.dbzer0.com 0 points 2 months ago* (last edited 2 months ago)

The study was done in Turkey, probably because students are for sale and have no rights.

It doesn't matter though. They could pick any weird, tiny sample and do another meaningless study. It would still get hyped and they would still get funding.