submitted 2 weeks ago by uuj8za@piefed.social to c/fuck_ai@lemmy.world

LLM use is the most demoralizing problem I’ve faced as a college instructor.

[-] U7826391786239@piefed.zip 31 points 2 weeks ago

school admin (and everyone else tbh) are completely fucking clueless about how to deal with the AI problem, or worse, are actively trying to incorporate it as part of the process of research. all they're doing is prepping a generation of kids to go out into the workforce without knowing the first damn thing about how to find and use relevant and credible information, much less care--why bother, i'll just ask AI. the fact that it's so often dead wrong doesn't matter--it just needs to look correct. done. next.

[-] Bluegrass_Addict@lemmy.ca 8 points 2 weeks ago

what do you mean recipe, doesn't the food just come out of that hot box thing in the kitchen?

[-] la93@thelemmy.club 5 points 2 weeks ago

"are actively trying to incorporate it as part of the process of research"

Yes

[-] tuckerm@feddit.online 13 points 2 weeks ago* (last edited 2 weeks ago)

This author is bringing up one of my biggest concerns about AI adoption, which is that it's allowing people to come off as an expert while just skipping the learning process entirely. The learning process is where personal development happens. It's where you become a better person. And I don't just mean in college, I mean in all things. And professors are supposed to somehow make sure that students are going through that process, while also being told to adopt AI in the classroom.

Also, I first skimmed the headline as "It's time to teach ChatGPT to know pain," and...YES.

[-] thesohoriots@lemmy.world 12 points 2 weeks ago

“Students often carry misconceptions about coursework. They may view an instructor as an opponent standing in the way of the grade they want. And they see ‘getting the right answers’ as the goal of education because that’s how you secure that grade.”

Pedagogy aside, this is not entirely untrue in the lower division, at least on an institutional level. It’s usually the institution propping instructors up to stand in the way. They are told “you have to meet these learning outcomes, so these assignments have to do X, Y, and Z,” and it turns out X, Y, and Z are bullshit, but they meet accreditation standards if the class is audited. I taught rhetoric and it had to be explicitly Aristotelian. It blew chunks; nobody liked it or really understood it, but it was what the department justified to the institution for accreditation purposes.

I hated giving grades. I wanted to see continual improvement rather than final products. With a gun to my head I wouldn’t go back.

[-] SoleInvictus 8 points 2 weeks ago* (last edited 2 weeks ago)

I agree 100%. My friends, peers, and I all wasted huge amounts of time during our undergrad degrees, and to varying extents even in post-graduate degrees, fulfilling the university's "one size fits all" curriculum standards.

I spent hundreds of hours sitting in lectures over nearly a decade. I DO NOT learn well from oral instruction, but I was still graded on attendance. I did homework far in excess of what was required to learn and practice the material. I wasted so much time that, given full autonomy and responsibility to learn the course material, I could have earned twice as many degrees with less work.

[-] dkc@lemmy.world 10 points 2 weeks ago

The direction I’m moving in, along with other teachers, is to consider only in-class work for assessment, specifically pencil-and-paper work.

If you had asked me to predict what a high-quality education would look like post-Covid, I would have said high-production-value videos featuring rock-star teachers and domain experts, combined with local teachers offering a more personal, tutoring-like experience to clarify the videos.

If you asked me now what a high-quality education will look like in the post-AI future, I would say boarding schools that don’t allow phones, with almost all assessment done in person with pencil and paper.

There’s no denying that technology can make certain topics unnecessary to learn. Calculators can do basic calculus with ease these days, and AI can summarize a text, write a bland essay, or implement a simple intro-to-HTML assignment. But if the goal of education is learning, we have to limit the use of technologies that limit learning.

I suppose there’s another debate to be had about vocational learning and taking advantage of modern tools; that’s usually why I say “high-quality” education when discussing these things. For the average family, AI in education is here to stay. For elite schools, AI is an experiment that is slowly being kicked out.

[-] I_Jedi@lemmy.today 6 points 2 weeks ago* (last edited 2 weeks ago)

10% of the male students, and 20% of the female students of a class use ChatGPT. There are 100 students in the class, and the class is evenly split between male and female. How many students used ChatGPT to answer this question?

[-] Mothra@mander.xyz 6 points 2 weeks ago

Are we rounding up or down?
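[Editor's note: for what it's worth, the word problem above works out to a whole number, so no rounding is needed. A minimal sketch of the arithmetic, assuming the stated even 50/50 split:]

```python
# Worked arithmetic for the word problem above:
# 100 students split evenly; 10% of males and 20% of females use ChatGPT.
males = females = 100 // 2                        # 50 each
users = males * 10 // 100 + females * 20 // 100   # 5 + 10
print(users)  # 15 -- an exact whole number, no rounding required
```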

[-] Renat@szmer.info 6 points 2 weeks ago* (last edited 2 weeks ago)

I'm learning in the time of GPT and it's painful. I can't ask other students about college material because they're tech bros who just answer "ask ChatGPT".

[-] morto@piefed.social 5 points 2 weeks ago

Around here, even some professors are giving "ask AI" as an answer, and they're supposed to be teaching people!

this post was submitted on 13 Apr 2026
100 points (100.0% liked)

Fuck AI


"We did it, Patrick! We made a technological breakthrough!"

A place for all those who loathe AI to discuss things, post articles, and ridicule the AI hype. Proud supporter of working people. And proud booer of SXSW 2024.

AI, in this case, refers to LLMs, GPT technology, and anything listed as "AI" meant to increase market valuations.

founded 2 years ago