HACKER Q&A
📣 hubraumhugo

How has AI changed your learning methods?


While doing some research on a topic, I was wondering about the impact of AI on traditional study methods.

Any real-world experiences from students or teachers? Is there any research on the impact of AI on learning outcomes?


  👤 pennomi Accepted Answer ✓
AI really speeds up the "I am so new to this topic I don't even know what questions to ask" phase of learning. When approaching a new topic, it gives me a general overview which I can pick apart into deeper queries that I research the old-fashioned way.

👤 milesvp
I can't speak for "traditional" learning methods, since I've long been out of school. But I find ChatGPT is about as good at first-pass research as Google was circa 2003. Google is no longer good for research. It used to be that I didn't know much on a topic, I'd start googling, and by page 3 or so the results would start to have the keywords necessary to blow the topic open. Google no longer even populates page 3.

So I start with basic questions in ChatGPT. I know it's lying to me. But it uses words and phrases that seem interesting. I can iterate over that, and in a short time I know a phrase I can actually search for in Google to find authoritative content.

I wouldn't quite call it a game changer yet, as all it's done is give me back what I once had. But it is a bonus in that it can do an OK job of synthesizing new examples from content that already had lots of disparate examples on the web. It can also give some clues, when you find conflicting information, as to why the information conflicts. It makes stuff up a lot, but there are always good clues in the output.


👤 singleshot_
Since the advent of AI, when I am doing research I have to scroll down a little further after making a search before I can see the results, because now there is some AI stuff in the way at the top of the page.

👤 markusde
Not much. It's too inaccurate for my research and is a bad writer.

Day to day I write Lean, and I use moogle.ai to find theorems. It's... fine as a first pass. The website constantly gets confused about similar-looking theorems, it can't pattern match, and it can't really introspect typeclasses (which can be hiding the theorems I want). However, it can usually help me go from a vague description of what I want to some relevant doc pages, so credit where it's due for that.


👤 hannofcart
This is something I've found useful, though I'm not sure if it applies to learning in general, but here goes:

- I regularly read technical texts, some related to math, some related to programming/software engineering

- I ask ChatGPT-4 (now Claude 3.5 Sonnet) to quiz me on the nuances of topics X, Y, Z

My prompt template looks like so:

"I'd like to test my understanding of {topic}. I'd like you to quiz me one question at a time. Ask nuanced questions. Answers need to be multiple choice that I can pick from. Avoid congratulatory tone in responses. If I pick the right choice for a question, move on to next one without asking. Provide detailed explanations in case I answer something incorrectly."

I've found this surprisingly effective at pointing out gaps in my understanding. YMMV.


👤 Buttons840
I learned linear algebra and would discuss it with GPT-4.

Sometimes I made mistakes, sometimes GPT-4 made mistakes.

Once, GPT-4 wouldn't agree with something the textbook said. I said "it's in the textbook," it said "then the textbook must be wrong," "no, you are wrong," "I'm sorry, but that statement is not generally true," "here's the proof"--only after I gave GPT-4 the proof did it finally accept that the textbook was correct. It was also able to detect subtle mistakes in the proof, and could not be persuaded by a faulty proof. [0]

I think the biggest help was just participating in conversations about math, anytime I needed. It made me more engaged, more focused on what the textbook was saying and whether or not it matched what GPT-4 and I had discussed.

You know the saying, "the easiest way to get an answer online is to post the wrong answer and start an argument," something like that. Well, that's similar to what GPT-4 was doing for me: it would have an engaged discussion, maybe an argument, maybe leave me wondering about something, and that was very motivating when reading the textbook.

The textbook still played a central role in my learning. (GPT-4 did catch a mistake in the textbook once, though.)

[0] Here's a previous comment of mine about learning linear algebra from GPT-4: https://news.ycombinator.com/item?id=36244561


👤 poikroequ
I use it a lot to ask dumb simple questions. I've been delving into a new tech stack at work, and I'm already familiar with the concepts, but I just don't know how to do those things in this specific tech stack. AI saves me a lot of time digging through documentation and SEO spam. It often gets me to the answer faster.

However, I usually only use it for dumb simple questions. When it comes to anything more complex or obscure, it often falls flat on its face and either hallucinates or misunderstands the question. Then I'll do an old-fashioned web search and find a clear-cut answer on Stack Overflow.

My experience has been that AI is very unreliable right now and you simply can't trust what it tells you. So I only use it in very limited ways.


👤 muzani
The ChatGPT app lets you just take photos of everything. I take photos of labels and ask it to explain all the chemicals in my food, shampoo, toothpaste, etc. Which ones are the preservatives? How long do those preservatives last? What's the chemical that makes my teeth hurt when I eat ice cream? What happens if a toothpaste doesn't have that chemical?

Also, lately I've been taking photos of stuff like coffee grinders and making it guess what they are. It's surprisingly accurate, and you can use it to explore the thought process behind why someone might pick a particular set.


👤 nonameiguess
To be honest, I've never felt like I needed additional tooling to make learning faster or easier. You're inherently rate-limited by the bandwidth of your own brain, so it mostly comes down to finding information in the first place. It sounds like others haven't known where to start in the past, but I guess I'm lucky never to have had that problem. I'm drowning in more information than I can take in already. My bookshelves have always been and will always be full of books I intend to get to eventually. Software can't help me with the other things taking up mental time and energy: spending time with my wife, trying to have a family, making and eating food, sleep, exercise. The only time-consuming activity I can potentially offload is paid work, if I can find some other way to pay the bills.

👤 nicbou
It lowered the bar for being curious. I used to Google all sorts of questions that popped into my mind. For a while I even had a list called "things I don't understand".

Then Google got worse, and I started to resent having to refine my query multiple times and sort through junk results.

Now I ask ChatGPT and get a straightforward answer. I am aware that it's sometimes wrong, but as an average of many shallow introductions, it's excellent.


👤 sshine
I've tried to use ChatGPT to answer programming questions where I need either a library reference and/or an example. I find that ChatGPT isn't better than looking up the library reference, but sometimes it's faster to have it generate the example that I'd otherwise have to look through several pages to find.

I also spend too long clarifying what I mean.

For example, I wanted a Rust program to detach into the background, and ChatGPT (with my stupid prompting) kept suggesting I just run `std::process::Command::new("program")`, but I wanted a single executable to detach! Eventually, once I struck the right chord, it suggested the `daemonize` crate. But that wasn't until after I'd already found it by conventional search.
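Roughly the shape of what it eventually pointed me at, as a minimal sketch of the `daemonize` crate's builder API (the exact builder methods vary by crate version, so treat this as illustrative rather than authoritative):

```rust
// Minimal daemonization sketch with the `daemonize` crate (add it to Cargo.toml).
// Builder options shown here are illustrative; check the crate docs for the
// exact set available in your version.
use daemonize::Daemonize;
use std::fs::File;

fn main() {
    // Send the detached process's output somewhere inspectable,
    // since it will no longer have a terminal.
    let stdout = File::create("/tmp/myapp.out").expect("create stdout log");
    let stderr = File::create("/tmp/myapp.err").expect("create stderr log");

    let daemon = Daemonize::new()
        .pid_file("/tmp/myapp.pid") // record the daemon's PID
        .working_directory("/tmp")  // cwd after detaching
        .stdout(stdout)
        .stderr(stderr);

    match daemon.start() {
        // From here on, the process runs detached from the terminal.
        Ok(_) => println!("running in the background"),
        Err(e) => eprintln!("failed to daemonize: {e}"),
    }
}
```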

I sometimes use the Kagi !fgpt bang if I know that what I'm searching for has a good average answer. It'll give that answer and spare me the blinking ads, cookie pop-ups, newsletter pop-ups, and scrolling.

I'm looking forward to having an offline AI assistant that'll search and accumulate, rather than hallucinate answers from a bunch of stolen code snippets - something akin to "copy-pasting from StackOverflow, but with hallucinations."


👤 spurgelaurels
I've gone back to reading paper books.

👤 VladimirGolovin
Thanks to ChatGPT, I'm much less hesitant to delve (haha) into unfamiliar topics. "Hi! I'm a beginner programmer. I'm interested in learning Idris but I know next to nothing about dependent types. Could you explain them in a couple of sentences?"

Then, after the answer, I ask follow-up questions. I also try to check the answers against other sources, e.g. docs or Wikipedia in order to spot hallucinations.


👤 authorfly
Yes... AI helps transform learning content to make it easier to consume in ways that are effective for learning. But it can't help you with things higher up Bloom's hierarchy (like synthesis), and using ChatGPT might indeed harm your ability to do that - it keeps you from feeling the severe pain of bad/inconsistent writing.

Having been a student both before and after ChatGPT, I can say the workload has remained the same, but some of it can obviously be done with AI, and most students do this, or have at least tried it.

I believe that, as a result, the efficacy boost has been offset by the lower amount of time people spend studying - just like most studying technologies throughout history. For every one student who uses AI to learn, there are two or three who prefer to just use it to cheat. That said, it works brilliantly for every type of student for basic explanations, rundowns of historical authors or positions, etc. But this is pretty much just Wikipedia material rearranged to your learning level. It's helpful, but not augmenting.


👤 flessner
For university, it hasn't changed much. Ideally I would want a tool where I could input all the slides, worksheets, and scripts for a module and then get a nice summary, mock exams, and so on. Nothing can do this, which is why I wrote my own tool for it. It works alright, but it isn't 100% accurate and misses bits of information, so I am mostly back to compiling my own summaries and learning from old exams.

As a side project I am currently building a drone myself on a really tight budget. While I am pretty good on the coding side, my understanding of electronics is basically non-existent. When I ask basic questions it's quite helpful; as soon as I give it specifications ("Will this brushless motor and this ESC work with a 4S LiPo?") it breaks down completely - so it's helpful, but far from perfect.


👤 mitthrowaway2
Llama helped me become more comfortable with Lagrangian mechanics. I had to keep correcting its math errors along the way, but it was nonetheless very good at answering my questions -- like having a patient and empathetic but very absent-minded professor at my fingertips. And because I was always on my toes and double-checking its work for mistakes, I had to learn actively with my brain switched on, instead of just reading passively.

So ironically, its flaws made it a pretty good teacher in this case.

I do have to be especially careful not to ask it leading questions, because it's so biased towards positive affirmation that it would rather lie and tell me I'm right than explain why I'm mistaken.


👤 jryb
It's the first thing I go to when I have a "stupid" question. I also use it to learn languages and math, and to get clarification when something in my area of expertise is confusing. It's dramatically reduced my search engine use. It also enabled me to write an Anki clone that I just wouldn't have bothered with otherwise.

That said, it is actively harmful when discussing the components of Chinese characters - it hallucinates so much that it's essentially unusable, so I stick to traditional resources for that. I'm also reading just as many scientific papers as before; there's really no substitute for that yet, and I haven't found it very good at literature searches.


👤 jonplackett
I really like to learn by example so learning SwiftUI via ChatGPT has been brilliant.

I find tutorials often have some kind of weird additional thing in them that I don't care about. Like they're making a list app but can't help overcomplicating it with other stuff like adding in images or videos or parsing XML, when I just want to learn something specific.

ChatGPT has been awesome for this. You can get simple examples and look through them. Ask questions about the code. Try it out. Change things. If it doesn’t work how you expect, paste it back in with a question.

It’s made learning a new language so much easier and probably 5x faster. I’ve started doing the same kind of thing with learning Spanish too.


👤 nathanasmith
Between the hallucinations and the over-the-top agreeableness, I have a hard time finding LLMs useful as a reliable learning aid. First you have to disentangle what output is factual from what it's just making up, and then you have to be careful not to feed it loaded questions, lest you unwittingly lead it on such that it's just agreeing with whatever you say. The amount of work required to get satisfactory results might be better spent doing research in more traditional ways.

All that said, I'm very excited for the future and look forward to these problems being solved, as I believe they eventually will be.


👤 hiAndrewQuinn
I use it a ton to generate example sentences I can then import into Anki to practice vocabulary in a foreign language, specifically Finnish.

I've even been working on a tiny open-source wrapper around the OpenAI API specifically to speed this process up, based on what I've found works in practice: https://github.com/hiAndrewQuinn/wordmeat
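Not the actual code from that repo, but the core of this kind of tool is basically one chat-completions call per vocabulary word. A hedged sketch of that shape (the model name, prompt, and function here are illustrative, not what wordmeat actually uses):

```rust
// Illustrative sketch: ask the OpenAI chat completions endpoint for a few
// Finnish example sentences for one vocabulary word, ready to paste into an
// Anki note. Assumes the `reqwest` crate (with the "blocking" and "json"
// features), `serde_json`, and an OPENAI_API_KEY environment variable.
use std::env;

fn example_sentences(word: &str) -> Result<String, Box<dyn std::error::Error>> {
    let key = env::var("OPENAI_API_KEY")?;
    let body = serde_json::json!({
        "model": "gpt-4o-mini", // any chat model would do here
        "messages": [
            { "role": "system",
              "content": "You write short, natural Finnish example sentences with English translations." },
            { "role": "user",
              "content": format!("Give three example sentences using the word '{word}'.") }
        ]
    });

    let resp: serde_json::Value = reqwest::blocking::Client::new()
        .post("https://api.openai.com/v1/chat/completions")
        .bearer_auth(key)
        .json(&body)
        .send()?
        .error_for_status()?
        .json()?;

    // The generated text lives in the first choice's message content.
    Ok(resp["choices"][0]["message"]["content"]
        .as_str()
        .unwrap_or_default()
        .to_string())
}

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // e.g. generate sentences for the Finnish word for "coffee".
    println!("{}", example_sentences("kahvi")?);
    Ok(())
}
```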


👤 consumer451
I use OpenAI and Anthropic tools for brainstorming/rubber ducking/sanity checks.

I know that the LLM-provided specifics are almost never good enough for a final answer. However, LLMs can get me to think outside of my own personal box.

Basically, this has replaced what I used to get out of being surrounded by human peers. However, I was always reluctant to bother humans, and I have no such reservations about asking a chatbot a dumb question.


👤 rawgabbit
My son used custom GPTs in ChatGPT a lot, e.g., for Organic Chemistry and Advanced Biology. He said they saved him a ton of time because they explained basic concepts when he had trouble understanding the textbook (which is written in the most Orwellian manner possible). Before, he googled LibreTexts, Khan Academy, etc., but the custom GPTs summarize that information nicely.

👤 marksimi
As someone focused on grad school, I find myself getting stumped by problems and rage-quitting much less frequently. I also use some prompts that help speed up my learning in general while making sure the LLM doesn't just give me the answer.

One of my research interests is how humans use expert systems (akin to how Go players' Elo ratings ramped up significantly after the release of AlphaGo).


👤 Crier1002
I think it helps expedite learning/research by quite a bit. Before this, I had to append "site:reddit.com" to my Google searches and spend hours reading multiple posts/articles about the topic.

Now I do my learning/research with something like Phind or Perplexity. I have a "!pl" or "!p" shortcut set up in my address bar, so I just have to type "!pl food recommendation for keto diet", for example, and it will summarize everything for me. After that, all I have to do is read/click a few more links deeper into Reddit or whatever to verify that what the LLM tells me matches its citations, and then I can gauge whether the answers are credible or not. I just ask some follow-ups if I have any. The results have been quite satisfactory so far.


👤 keiferski
I used to ask questions on various /r/Ask subreddits. For example, AskHistorians and AskPhilosophy.

Just as ChatGPT became more available, these subreddits decided to make their posting policies unnecessarily strict. The "answering culture" has also become more hostile, with people downvoting questions that don't fit into the monoculture of Reddit's hivemind.

And so I have found ChatGPT to be very useful for asking about philosophical and historical questions, specifically asking for resources on a particular topic/problem. E.g., "Has any philosopher written about XYZ topic?" It will sometimes give me imaginary resources, but usually it'll recommend actual books written on the subject.


👤 ramon156
I feel like all the low-tier questions I have are solved with ChatGPT. These questions would also be solvable by simply re-reading the documentation, but it's nice to be able to ask another "person" to explain it to you. Maybe my attention span is just non-existent.

👤 solarized
Prompting "ELI5: <the complex subject I want to understand>" and then building a more advanced conversation on top of that really helps me a lot.

The exception is the inconsistency of its answers, which makes it feel like chatting with someone you have trust issues with.

In the end, I have to google again to validate this creature's output. Geez.


👤 ulrischa
It has massively changed my learning in development. Before, I sat in front of my PC and did not know where to start. Understanding the inner workings is also much easier now. I learned Next.js in just a few weeks.

👤 sebastiansm
Last year, when I was studying for an AWS certification, I used ChatGPT a lot to explain services and concepts to me with useful analogies and practical use cases.

👤 qwertyuiop_
I have to do a double take on Google, DuckDuckGo, and Brave because even the search results and content farms are spitting out AI-hallucinated nonsense.

👤 hot_gril
YOLO-pumping out code for a programming language or tooling I'm unfamiliar with, either to use as boilerplate or to learn from as an example.

👤 lolive
I still use StackOverflow 99% of the time for my code issues. #micDrop

👤 sevensor
Now that AI has crapped all over the internet, I use books more.