I think that's actually a good idea? Sucks for e-learning as a whole, but I always found online exams (and also online interviews) to be very easy to game.
Really sucks for people with disabilities and handwriting issues.
It's always sucked for them, and it always will. That's why we make accommodations for them, like extra time or a smaller/more private exam hall.
And readers/scribes! I’ve read and scribed for a friend who had dyslexia in one of her exams and it worked really well. She finished the exam with time to spare and got a distinction in the subject!
My handwriting has always been terrible. It was a big issue in school until I was able to turn in printed assignments.
Like with a lot of school things, they do a shit thing without thinking about negative effects. They always want a simple solution to a complex problem.
Prof here - take a look at it from our side.
Our job is to evaluate YOUR ability; and AI is a great way to mask poor ability. We have no way to determine if you did the work, or if an AI did, and if called into a court to certify your expertise we could not do so beyond a reasonable doubt.
I am not arguing exams are perfect, mind, but I'd rather doubt a few students' inability (maybe it was just a bad exam for them) than always doubt their ability (is any of this their own work?).
Case in point, ALL students on my course with low (<60%) attendance this year scored 70s and 80s on the coursework and 10s and 20s in the OPEN BOOK exam. I doubt those 70s and 80s are real reflections of the ability of the students, but do suggest they can obfuscate AI work well.
Here's a somewhat tangential counter, which I think some of the other replies are trying to touch on ... why, exactly, continue valuing our ability to do something a computer can so easily do for us (to some extent obviously)?
In a world where something like AI can come up and change the landscape in a matter of a year or two ... how much value is left in the idea of assessing people's value through exams (and to be clear, I'm saying this as someone who's done very well in exams in the past)?
This isn't to say that knowing things is bad or making sure people meet standards is bad etc. But rather, to question whether exams are fit for purpose as a means of measuring what matters in a world where what's relevant, valuable or even accurate can change pretty quickly compared to the timelines of one's life or education. Not long ago we were told that we wouldn't have calculators with us everywhere, and now we could have calculators embedded in our ears if we wanted to. Analogously, learning and examination are probably premised on the notion that we won't be able to look things up all the time ... when, as current AI, amongst other things, suggests, that won't be true either.
An exam assessment structure naturally leans toward memorisation and being drilled in a relatively narrow band of problem solving techniques,^1^ which are, IME, often crammed prior to the exam and often forgotten quite severely pretty soon afterward. So even presuming that things that students know during the exam are valuable, it is questionable whether the measurement of value provided by the exam is actually valuable. And once the value of that information is brought into question ... you have to ask ... what are we doing here?
Which isn't to say that there's no value created in doing coursework and cramming for exams. Instead, given that a computer can now so easily augment our ability to do this assessment, you have to ask what education is for and whether it can become something better than what it is given what are supposed to be the generally lofty goals of education.
In reality, I suspect (as many others do) that the core value of the assessment system is to simply provide a filter. It's not so much what you're being assessed on as much as your ability to pass the assessment that matters, in order to filter for a base level of ability for whatever professional activity the degree will lead to. Maybe there are better ways of doing this that aren't so masked by other somewhat disingenuous goals?
Beyond that there's a raft of things the education system could emphasise more than exam based assessment. Long form problem solving and learning. Understanding things or concepts as deeply as possible and creatively exploring the problem space and its applications. Actually learn the actual scientific method in practice. Core and deep concepts, both in theory and application, rather than specific facts. Breadth over depth, in general. Actual civics and knowledge required to be a functioning member of the electorate.
All of which are hard to assess, of course, which is really the main point of pushing back against your comment ... maybe we're approaching the point where the cost-benefit equation for practicable assessment is being tipped.
^1^ In my experience, the best means of preparing for exams, as is universally advised, is to take previous or practice exams ... which I think tells you pretty clearly what kind of task an exam actually is ... a practiced routine in something that narrowly ranges between regurgitation and pretty short-form, practiced and shallow problem solving.
Ah the calculator fallacy; hello my old friend.
So, a calculator is a great shortcut, but it's useless for most mathematics (i.e. proof!). A lot of people assume that having a calculator means they do not need to learn mathematics - a lot of people are dead wrong!
In terms of exams being about memory, I run mine open book (i.e. students can take pre-prepped notes in). Did you know some students still cram and forget right after the exams? Did you know they forget even faster for coursework?
Your argument is a good one, but let's take it further - let's rebuild education towards an employer centric training system, focusing on the use of digital tools alone. It works well, productivity skyrockets, for a few years, but the humanities die out, pure mathematics (which helped create AI) dies off, so does theoretical physics/chemistry/biology. Suddenly, innovation slows down, and you end up with stagnation.
Rather than moving us forward, such a system would lock us into place and likely create out of date workers.
At the end of the day, AI is a great tool, but so is a hammer and (like AI today), it was a good tool for solving many of the problems of its time. However, I wouldn't want to only learn how to use a hammer, otherwise how would I be replying to you right now?!?
I think a central point you're overlooking is that we have to be able to assess people along the way. Once you get to a certain point in your education you should be able to solve problems that an AI can't. However, before you get there, we need some way to assess you in solving problems that an AI currently can. That doesn't mean that what you are assessed on is obsolete. We are testing to see if you have acquired the prerequisites for learning to do the things an AI can't do.
has led some college professors to reconsider their lesson plans for the upcoming fall semester.
I'm sure they'll write exams that actually require an actual understanding of the material rather than regurgitating the seminar PowerPoint presentations as accurately as possible...
No? I'm shocked!
We get in trouble if we fail everyone because we made them do a novel synthesis, instead of just repeating what we told them.
Particularly for an intro course, remembering what you were told is good enough.
The first step to understanding the material is exactly just remembering what the teacher told them.
They're about to find out that gen Z has horrible penmanship.
Millennial here, haven't had to seriously write out anything consistently in decades at this point. There's no way their handwriting can be worse than mine and still be legible lol.
As a millennial with gen Z teens, theirs is worse, though somehow not illegible, lol. They just write like literal 6 year olds.
There are places where analog exams went away? I'd say Sweden has always been at the forefront of technology, but our exams were always pen-and-paper.
Can we just go back to calling this shit Algorithms and stop pretending it's actually Artificial Intelligence?
It actually is artificial intelligence. What are you even arguing against man?
Machine learning is a subset of AI, and neural networks are a subset of machine learning. Saying an LLM (based on neural networks for prediction) isn't AI because you don't like it is like saying rock and roll isn't music.
I am arguing against this marketing campaign, that's what. Who decides what "AI" is, and how did we come to decide what fits that title? The concept of AI has been around a long time, like since the Greeks, and it has always been the concept of a man-made man. In modern times, it's been represented as a sci-fi fantasy of sentient androids. "AI" is a term with heavy association already cooked into it. That's why calling it "AI" is just a way to make it sound like a high-tech, futuristic dream come true. But a predictive text algorithm is hardly "intelligence". It's only being called that to make it sound profitable. Let's stop calling it "AI" and start calling out their bullshit. This is just another cryptocurrency scam. It's a concept that could theoretically work and be useful to society, but it is not being implemented in such a way that lives up to its name.
Who decides what “AI” is
Apparently you.
But then the investors won't throw wads of money at these fancy tech companies.
Am I wrong in thinking students can still generate an essay and then copy it by hand?
Not during class. Most likely a proctored exam. No laptops, no phones, teacher or proctor watching.
This isn't exactly novel. Some professors allow a cheat sheet. But that just means that the exam will be harder.
A physics exam that allows a cheat sheet asks you to derive the law of gravity. Well, OK, you write the answer at the bottom, pulled from your cheat sheet. Now what? If you recall how it was originally created, you probably write Newton's three laws at the top of your paper... and then start doing some math.
A calculus exam that lets you use Wolfram Alpha? Just a really hard exam where you must show all of your work.
Now, with ChatGPT, it's no longer enough to have a take-home essay to force students to engage with the material, so you find new ways to do so. Written, in-person essays are certainly a way to do that.
When I was in College for Computer Programming (about 6 years ago) I had to write all my exams on paper, including code. This isn't exactly a new development.
So what you’re telling me is that written tests have, in fact, existed before?
What are you some kind of education historian?
This thinking just feels like moving in the wrong direction. As an elementary teacher, I know that by next year all my assessments need to be practical or interview based. LLMs are here to stay and the quicker we learn to work with them the better off students will be.
You can still have AI write the paper and you copy it from text to paper. If anything, this will make AI harder to detect because it's now AI + human error during the transferring process rather than straight copying and pasting for students.
Wouldn't it make more sense to find ways to utilize the tool of AI and set up criteria that would incorporate the use of it?
There could still be classes / lectures that cover the more classical methods, but I remember being told "you won't have a calculator in your pocket".
My point is, they should be prepping students for the skills to succeed with the tools they will have available, and then give them the education to cover the gaps that AI can't solve. For example, you basically need to review what the AI outputs for accuracy. So maybe a focus on reviewing output and better prompting techniques? Training on how to spot inaccuracies? Spotting possible bias in a system that's skewed by its training data?
That's just what we tell kids so they'll learn to do basic math on their own. Otherwise you'll end up with people who can't even do 13+24 without having to use a calculator.
Training how to use "AI" (LLMs demonstrably possess zero actual reasoning ability) feels like it should be a separate pursuit from (or subset of) general education to me. In order to effectively use "AI", you need to be able to evaluate its output and reason for yourself whether it makes any sense or simply bears a statistical resemblance to human language. Doing that requires solid critical reasoning skills, which you can only develop by engaging personally with countless unique problems over the course of years and working them out for yourself. Even prior to the rise of ChatGPT and its ilk, there was emerging research showing diminishing reasoning skills in children.
Without some means of forcing students to engage cognitively, there's little point in education. Pen and paper seems like a pretty cheap way to get that done.
I'm all for tech and using the tools available, but without a solid educational foundation (formal or not), I fear we end up a society of snake-oil users in search of the blinker fluid.
As someone with wrist and hand problems that make writing a lot by hand difficult, I'm so lucky I finished college in 2019.
Well, if I go back to school now I'm fucked; I can't read my own handwriting.
It just brings into question what the point of exams is.
AI in its current form is equivalent to the advent of the typewriter. It's just empowering you to do a whole lot more, a whole lot faster.
Not using it is dumb.
AI is a tool that can indeed be of great benefit when used properly. But using it without comprehending and verifying the source material can be downright dangerous (like those lawyers citing fake cases). The point of the essay/exam is to test comprehension of the material.
Using AI at this point is like using a typewriter in a calligraphy test, or autocorrect in a spelling and grammar test.
Although asking for handwritten essays does nothing to combat use of AI. You can still generate content and then transcribe it by hand.
ChatGPT: answer this question, add 4 consistent typos. Then hand-transcribe it.
Isn't this kind of ableist? I remember when I was in school I had special accommodations to type instead of write, because I had wrists too weak to write legibly, but fingers fast enough to type expediently, they legitimately thought that I was a really stupid kid, until they realized that my spelling tests were not incorrect.
They just couldn't read that I had spelled it correctly. Somehow I wrote the word fly, and the teacher mistook my y for a v. I went from being the dumbest kid to the smartest kid as soon as the accommodation was put in place.
You became the smartest kid because everyone else had a stroke trying to read what you wrote.