Don't worry, this is rapidly emerging in the US too. Entire universities are now trying to figure out how to mandate it in every corner.
I didn't even think about it until you mentioned it, but I've had several college assignments where I'm tasked with asking an LLM a question regarding the course, and then I have to write about what I learned from it. I still have to find sources supporting or refuting the output, so we're not expected to take the output as truth at least. And these aren't CompSci courses either. It's common core cultural intelligence stuff.
When they talk about AI taking over the world, it's always about taking over the Internet and connected industrial machines. No one told me that AI was going to take over the collective consciousness first.
That's so weirdly, almost self-aware. You must use AI, but you also can't trust it an inch.
I mean, with the way things are going, I think that's the right way to go about it. I don't think any of the alternatives are workable. It's out there, people are going to use it, so it's better to model your course around that. It sucks, but I also don't know another option. Yes, you can make it part of the academic policy and all, but is that really effective? It just makes people work harder to cover it up. Instead of using one AI you'll use three: one to research, and the others to reword your text and remove all the AI-isms it brings. I much prefer the approach where you have to disclose AI use and elaborate on how it was used, with penalties for not disclosing.
I am honestly open to any suggestions wrt this.
That is 100% why you're seeing those assignments. As far as I can tell, everybody within most university structures is 100% bought into artificial intelligence in virtually every discipline.
Expect it to get worse, and in strange ways. The educational literature indicates that we have zero idea how to actually use these things without diminishing our own cognitive abilities. The systems largely can't be trusted; look at Grok the Nazi, or the other systems that are now looking to put ads into conversations.
It's a big f****** mess. I'm not having any of it in my classes right now. But I also don't really see the need for them. I need to train my students in base principles, not in the art of asking questions.