College professors are going back to paper exams and handwritten essays to fight students using ChatGPT::The growing number of students using the AI program ChatGPT as a shortcut in their coursework has led some college professors to reconsider their lesson plans for the upcoming fall semester.
I found this too generalizing. Yes, most people only ever need and use productivity skills in their work life. They do no fundamental research. Whether their education was this way or that has no effect on the advancement of science in general, because these people don't do science in their careers.
Different people with different goals will do science, and for them an appropriate education makes sense. It also makes sense to have everything in between.
I don’t see how it helps the humanities and other sciences to teach skills which are never used. Or how it helps to teach methods which no one applies in practice. How is it a threat to education when someone uses a new tool intelligently so they can pass academic exams? How does that make them any less valuable for working in that field? Assuming the exam reflects what working in that field actually requires.
I think we can also spin an argument in the opposite direction: More automation in education frees the students to explore side ideas, to actually study the field.
“I don’t see how it helps the humanities and other sciences to teach skills which are never used.” - I can offer an unusual counter here: you’re assuming the knowledge will never be used, or that we should avoid teaching things that are unlikely to be used. Were this the case, the field of graph theory would have ceased to exist long before it became useful in mapping; indeed, Boole’s algebra would never have led to the foundations of computer science and the machines we are using today.
“How is it a threat to education when someone uses a new tool intelligently, so they can pass academic education exams?” - Allow me to offer you the choice of two doctors: one passed using AI, the other passed a more traditional assessment. Which doctor would you choose, and why? Surely the latter, since they would also have passed with AI, whereas the one who used AI might not have passed the more traditional route due to a lack of knowledge. It isn’t a threat to education; it adds further uncertainty about the outcome of such an education (both doctors might have the same skill level, but there is more room for doubt in the first case).
“Whether their education was this way or that has no effect on the advancement of science in general, because these people don’t do science in their careers.” - I strongly disagree! In an environment where knowledge for the sake of knowledge is not prized, a lie is easier to plant and nurture (take, for example, the antivax movement). Such people can be an active hindrance to the progress of knowledge: their misconceptions create false leads and foster an environment that distrusts the sciences (we’re predisposed to distrust what we do not understand).
Not exactly. What I meant to say is: some students will never use some of the knowledge they were taught. In the age of the information explosion, there is practically unlimited knowledge ‘available’. What part of this knowledge should be taught to students? For each bit of knowledge, we can make your hypothetical argument: it might become useful in the future; an entire important branch of science might be built on top of it.
So on its own, this is not an argument. We need to argue why this particular skill or piece of knowledge deserves the attention and focus to be studied. There is not enough time to teach everything. Which in turn can be used as an argument for more computer-assisted learning and teaching. For example, I found ChatGPT useful to explore topics. I would not have used it to cheat in exams, but probably to prepare for them.
Good point, but it depends on context. You assume the traditional doc would have passed with AI, but that is questionable. These are complex tools with often counterintuitive behaviour. They need to be studied and approached critically to be used well. For example, the traditional doc might not have spotted the AI hallucinating, because she wasn’t aware of that possibility.
Further, it depends on their work environment. Do they treat patients with, or without AI? If the doc is integrated in a team of both human and artificial colleagues, I certainly would prefer the doc who practiced these working conditions, who proved in exams they can deliver the expected results this way.
I feel we left these lands in Europe when diplomas were abandoned for the bachelor/master system 20 years ago. Academic education is streamlined, tailored to the needs of industry. You can take a scientific route, but most students don’t. The academia you describe as if it were threatened by something new might still exist, but it lives alongside a more functional academia where people learn things in order to apply them in our current reality.
It’s quite a hot take to blame academic education for things like the antivax movement. For example, I question whether the people proposing and falling for these ‘ideas’ are academics in the first place.
Personally, I like learning knowledge for the sake of knowledge. But I need time and freedom to do so. When I was studying computer science with an overloaded schedule, my interest in toying with ideas and diving into background material was extremely limited. I was also expected to finish in an unreasonably short amount of time. If I could have sped up some of the more tedious parts of my studies with the help of AI, this could have freed up resources and interest for the sake of knowledge.