HACKER Q&A
📣 amichail

Have students started questioning their curriculums due to ChatGPT?


In particular, what is the point of learning something that is answerable by an AI such as ChatGPT?

How do teachers reply to such questions?

Are teachers starting to change what they teach?


  👤 judofyr Accepted Answer ✓
> In particular, what is the point of learning something that is answerable by an AI such as ChatGPT?

ChatGPT will often make up an answer which is absolutely incorrect. It will even insist that it’s correct.

It appears that the only responsible way to use ChatGPT is for questions where you already know the answer or can quickly validate the correctness somehow.
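That "quickly validate the correctness" step can be made concrete: when a model hands you an answer, check it against something you can compute yourself. A minimal sketch in plain Python, using a hypothetical claim about the roots of a quadratic (the claim and numbers are illustrative, not from this thread):

```python
# Suppose a model claims the roots of x^2 - 5x + 6 are 2 and 3.
# Rather than trusting the output, substitute the values back in.

def p(x):
    return x**2 - 5*x + 6

claimed_roots = [2, 3]

# Every claimed root should make the polynomial evaluate to zero.
verified = all(p(r) == 0 for r in claimed_roots)
print(verified)  # True
```

Verifying an answer is often far cheaper than deriving it, which is why "check what the model says" scales better than "trust what the model says."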


👤 AdamH12113
You're making a very big assumption here, which is that the point of learning something is to be able to answer specific questions of the kind that ChatGPT can answer. This is wrong. Students below the graduate level do not, in general, produce valuable work in the course of their studies. I promise you, your teachers do not need your help to write a high school freshman-level essay on Romeo and Juliet, any more than they need your help to factor a polynomial or discuss the causes of World War 1. The point of learning is to improve your brain. Asking why a student can't use ChatGPT to write an essay is like asking a jogger why they don't use a car. It's missing the point entirely.

People have been having this same discussion about math for many years. A copy of Mathematica can answer just about every question on every math test you will ever take. So what? That didn't stop us from having to learn math in order to understand anything that's based on it. Mindlessly quoting answers from ChatGPT will not teach you new ways of thinking.

As for the knowledge itself, the key thing to understand is that knowledge inside your brain is more useful in more circumstances than knowledge outside your brain. ChatGPT won't notice connections between unrelated subjects for you. It won't notify you in advance when a question ought to be asked.


👤 mattferderer
How is this different from "what is the point of learning something that is answerable by Google, an encyclopedia, a genre-specific reference book, a calculator, etc.?"

👤 recursivedoubts
I teach CS at a university.

I've largely given up on trying to "catch cheaters." At this point, either a student wants to really learn the information or they don't.

What are you gonna do? Cheat your way to a programming degree and then... bomb at some programming job because you don't know how to actually code or think?

I try to set students up for success, w/clear & obvious requirements (e.g. test suites for project grading) and then appeal to their sense of reciprocity: I'm trying to be cool to you, please be cool w/me & don't cheat. If you feel like you need to cheat to pass, come talk to me so we can address the material instead.
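The "test suites for project grading" idea can be sketched in a few lines: publish the cases up front, and the grade is just the pass rate. A toy Python version (the function names and cases are illustrative, not from any actual course):

```python
def grade(submission, cases):
    """Return the fraction of (args, expected) cases the submitted function passes."""
    passed = sum(1 for args, expected in cases if submission(*args) == expected)
    return passed / len(cases)

# A stand-in for a student's submitted function.
def student_add(a, b):
    return a + b

# The published test suite: requirements are clear and obvious.
cases = [((1, 2), 3), ((0, 0), 0), ((-1, 1), 0), ((10, -4), 6)]
print(grade(student_add, cases))  # 1.0
```

Because the requirements are executable, students know exactly what "done" means before they start.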

Regurgitating was never a great way to measure understanding anyway. In an ideal world, I'd have the time to sit down with each student for a 10-minute talk and get a sense of how well they understand the material, like the old English tutorial system.


👤 dougmwne
I think these are great questions. Certainly one answer could be “don’t do anything, don’t change anything, consider using LLMs on par with cheating and plagiarism.” And in the short term, that seems like an OK answer. This is very new technology and we are in the early stages of developing it, let alone putting it to some kind of industry-standard use.

In the long term, assuming AI takes over a significant share of knowledge work, curriculums will need to adapt, just as they adapted to computers and calculators.


👤 g_p
I think the key skill being overlooked is the ability to spot when someone (or something) is wrong, and correctly and accurately call it out and justify why it's wrong.

Critical, evidence-based thinking, the ability to properly apply understanding to solve a problem, and the ability to validate or disprove a proposed solution are things a language model struggles with, as it can't apply a reasoning process to facts, circumstances, and context.

GPT output can be very believable and confident, but I would expect a good student to be able to question and interrogate a position and establish whether it is accurate or not, based on an evaluation of the available evidence.

Education moves slowly as a sector, and I doubt many have changed anything in the last month or two, but the pandemic and the shift to remote or "open book" exams has (for good courses and educators) resulted in more of a focus on explanations, justification, application of knowledge to new areas, and critical evaluation. If an answer could be copied from a book and repeated in an open-book exam, chances are a language model poses the same threat.


👤 chrisco255
This is ultimately no different from what search engines & internet access have been since the late 90s. ChatGPT may be able to answer a question slightly faster than a search in some cases. I suppose the one unique difference with ChatGPT is that you can "teach" it some new material that it can infer responses from, to a limited extent.

A simple answer would be to have students take handwritten exams in class, with no electronics allowed. This is not always possible (e.g., in online classes). Homework has always been subject to a number of methods that relieve the student of the burden of actually learning the material (e.g., parents helping, copying from other students, searching online).

Online classes are a tougher problem to solve. Maybe the answer for that is requiring that they be paired with proctored exams.


👤 floor2
We've had calculators that can do arithmetic for decades but we still teach kids how to do math.

Also, ChatGPT in particular is a terrible "source of truth". It gives plausible sounding answers, but they may not be factually correct.

I just asked ChatGPT to write a press release for a new scientific discovery that could not actually have happened. It happily did so, making up a name for a new element along with a fake quote from a made-up researcher at a made-up National Laboratory. You could do the same for historical events, current institutions, anything.


👤 bfeynman
That's not what education is about. AI doesn't produce novel concepts, and learning is not just rote facts. Things are answerable by AI because humans have written them down and recorded them.

It dismays me to see so many poor takes about how education will supposedly falter, and questions like this one; it's almost comical.

Perhaps people should start to learn that education is about learning how to learn and applying that to new concepts.


👤 csa
Oftentimes, answers are the easy part. It’s unfortunate that this comes across as the basis for much of US education, especially K-12.

As most (all?) researchers will tell you, asking good questions is often the hard part.

A good curriculum, sadly rarely found, will teach you how to think and how to ask good questions. ChatGPT doesn’t really impact this, imho.


👤 taumoeba
I’m about to graduate with a computer engineering degree. I’ve messed around with ChatGPT a fair bit. If I ask it about topics in my curriculum, it’s either wrong or it spits out a high-level overview of some extremely basic concepts that’s less useful than the end-of-chapter summaries in my textbooks.

👤 SOTGO
I think ChatGPT will cause problems similar to those already experienced in STEM disciplines, particularly math. Students can use a camera-based app on their phones that can do an entire worksheet in seconds, computer algebra systems are baked into calculators, Wolfram Alpha can solve fairly complicated problems, and googling effectively can give the answers to the majority of homework questions through a rigorous undergraduate math program. Math class has always made students ask "when are we going to use this?" Since it is already difficult to convince students that math is useful, ChatGPT will probably make students question the usefulness of most of what we currently teach in schools.

👤 juancn
That's faulty reasoning: why learn something when it can be looked up?

Because understanding and looking it up are not the same. We've been able to look up things since we've had libraries and experts.

On one hand, you may end up doing something really stupid or dangerous just because you don't know what the answer means or what its implications are, and you blindly do what the statistical model tells you.

On the other hand, learning gives you "HD vision" for the world.

Anything new you learn will let you see much more detail of the things around you, and you don't really know what you're missing, you really can't know, because until you learn it, you're blind.


👤 Shugarl
Well, I think there's a difference between something being answerable by ChatGPT, and someone being able to take that answer, understand it, and use it to deliver value.

Moreover, while ChatGPT is really good, it can make mistakes, and if said mistake isn't an obvious one, then you'd need to have a good general understanding of the thing you're asking a question about to spot that mistake.

We should also consider that ChatGPT won't be free forever, so it might be a bad idea to change teaching in a way that assumes these kinds of tools will always be accessible to students and in the workplace.


👤 olalonde
Probably "You won't always have a ChatGPT in your pocket" :)

👤 braingenious
ChatGPT has only made me question who will write confident-but-dead-wrong SEO spam in the future. I would only question my curriculum if I were averse to doing any research myself and were willing to blindly and explicitly trust a black box on a website run by strangers.

I don’t really understand the uptick of “Rate how much the idea of a confident-sounding chat bot has shaken your world view to the core on a scale from 1-10” posts. It’s a language model, not a New Internet.


👤 12345hn6789
The point of learning is to apply the knowledge of the subject matter through something. If you just let the AI answer for you, you haven't learned anything.

In most cases, homework / basic questions are to strengthen your knowledge. If you are letting the AI do this for you, are you able to have a conversation about said topic?

Changing what teachers teach will not change the problem. Students will always use this bot now. The cat is out of the bag.


👤 joh6nn
I believe the following Isaac Asimov story best answers your question: https://archive.org/details/1958-02_IF/page/n5/mode/2up?view...

👤 furyofantares
I feel like the first thing I'd do as a teacher is embrace it. Take the assignment I was going to give, pass it through ChatGPT, and give that to the student. Their job is to find the errors and omissions and rewrite it without them.

👤 adamredwoods
At what point will skeptics / conspiracy theorists / religious fundamentalists start to add training inputs to ChatGPT that give misleading information?

👤 icu
I use GPT-3 to work out bodies of knowledge in specialist areas I’m interested in and then get it to suggest books and MOOCs that cover the topics.

👤 add-sub-mul-div
A lot of people are about to get a rude awakening about the differences between the concepts of information, knowledge, and wisdom.

👤 newaccount74
Even if ChatGPT could answer all questions correctly, you would still need to know which questions to ask.

👤 kidme5
Haven't we already been through this with the advent of Google?

👤 gmoore
that's like asking what is the point of learning something that is in a book...

seems like the wrong question