this post was submitted on 24 Feb 2026

Technology


I co-teach AP Computer Science A through Microsoft's TEALS program. The classroom runs on Chromebooks, Google Classroom, and code.org (AWS). Corporate infrastructure top to bottom. This year I added an AI tutor. That's apparently the controversial part.

The research is interesting: a Wharton study found students using standard ChatGPT performed 17% worse on exams—the "crutch" effect. But students using AI with pedagogical guardrails showed no negative effect. The problem isn't AI in education. It's unguided AI. So I built a tutor that asks probing questions instead of giving answers. I'm sharing the prompt I use and how to set one up yourself.

Meanwhile, China made AI education mandatory for six-year-olds this year. We're still deciding whether to block ChatGPT.

top 8 comments
[–] x_pikl_x@lemmy.world 1 points 31 minutes ago

Thanks for volunteering to flood the workforce with useless idiots!

[–] GreenBeard@lemmy.ca 5 points 4 hours ago

Hard pass. There absolutely should be no AI in any classroom under any circumstances. The whole point of a classroom is to build a foundation on which to understand the fundamentals before they slap a set of training wheels on and vibe-code their way into disaster. Most of these LLMs ignore whatever guardrails you slap on them far too frequently.

The most important lesson these kids need to learn is if you can't do it yourself, you shouldn't be letting an LLM do it for you. If the best you can say about the effects is "This version doesn't seem to be actively harming them" then the bar is in hell, and we shouldn't be playing with these tools at all at this point.

[–] XLE@piefed.social 13 points 6 hours ago (1 children)

> I got into volunteering through TEALS, Microsoft’s nonprofit.

Good for you / I'm sorry to hear that

> The class runs on Chromebooks managed by Google Classroom, writing code on code.org—which is powered by AWS.

My condolences to the students. It sounds like they're already being brought up in a world where they are expected to own nothing and be happy.

I hope you teach them about how terrible this privacy violation is, and how they are slowly being groomed into dependency.

> Corporate infrastructure is already the foundation of public CS education.

That's very sad too.

...wait, you're upset because you want to indoctrinate the children with more stuff?

[–] LodeMike@lemmy.today 5 points 7 hours ago (1 children)

"Using" AI is not well defined. I assume the group that showed no difference just found it useless.

[–] davidwkeith@lemmy.world 2 points 6 hours ago

Exactly! The cohort that performed worse wasn't given guidance of any sort, just GPT-4 as a resource. The cohort that held steady had a tutor agent set up, and the students were instructed to treat it like a tutor. Like calculators, computers, and the Internet before it, we need to design curriculum with AI in mind for it to be useful.