HACKER Q&A
📣 sabrina_ramonov

Is misinformation a solvable problem?


the code and cost to generate misinformation is essentially free


  👤 aristofun Accepted Answer ✓
It is a fundamentally unsolvable problem.

There is much literature about it, from fiction like https://en.m.wikipedia.org/wiki/In_a_Grove

through pop sci https://www.goodreads.com/book/show/789727.How_Real_Is_Real_

all the way to the deepest epistemological problems.

The main reason is similar to why AGI cannot be achieved — it is impossible to achieve something that you can’t or haven’t clearly defined first.


👤 sky2224
I understand what you're getting at, but I think we're trying to attack the wrong issue here.

Humans have been lying or sharing incorrect information (whether knowingly or not) for thousands of years. The only difference now is that the volume of information being thrown at us has increased exponentially, and our brains haven't evolved to handle that.

So my thinking is that we shouldn't be trying to reel in the "bad" information, we should just be trying to reel in the speed at which information is coming at us.


👤 mikewarot
It's a matter of curating quality sources of information. Society solved this with the concept of reputation as a fairly good heuristic.

👤 pizza
People find groups they want to join in proportion to the ease of finding those groups (which is in proportion to the size of those groups) and the degree to which they agree with them. You could try to change those factors, but it wouldn't be simple.

👤 ddgflorida
Freedom of the press should be protected, not labeled.

👤 rulalala
The problem is the spreading of it, and given human social dynamics, there is no permanent solution.

👤 ActorNightly
Absolutely.

People who spread misinformation aren't evil. They subscribe to a certain set of ideals and feel comfortable within their social circles when those ideals get reinforced. The main desire is to be validated as a person; the issue is that the internet has allowed those social circles to leak and attract other people looking for validation.

All you need to do is to provide a natural organization of those social circles with a technical solution, where people can get the validation while having their ideas reinforced, except those ideas would be factually correct.

And on a hierarchy of needs, things like money, food, shelter, and health come way before ideology, so you just need financial incentives in that form to get people to organize. E.g. something like: "If you post a Wikipedia article that is peer reviewed to be true, you get tax breaks or payouts; if you post things that are factually false, you pay a fee."

There are multiple ways to solve this technically. All the solutions, however, would require a government-funded/run public key registry associated with your unique username, with a corresponding private key derived from password+biometrics, and every piece of social media would need to be signed with that private key.
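A minimal stdlib-only sketch of the signed-post idea. To be clear about assumptions: the registry, usernames, and the `register`/`sign_post`/`verify_post` helpers are all hypothetical names of mine, and HMAC is used only as a symmetric stand-in because Python's standard library has no public-key signing — a real system would register public keys and sign with e.g. Ed25519.

```python
import hashlib
import hmac
import secrets

# Hypothetical key registry: in the proposal this would be a
# government-run service; here it's just an in-memory dict.
# HMAC is a SYMMETRIC stand-in -- a real deployment would store
# public keys and sign with the matching private key (e.g. Ed25519).
REGISTRY: dict[str, bytes] = {}

def register(username: str) -> bytes:
    """Issue a signing key and record it in the registry."""
    key = secrets.token_bytes(32)
    REGISTRY[username] = key
    return key

def sign_post(key: bytes, content: str) -> str:
    """Attach a signature so the post is attributable to one identity."""
    return hmac.new(key, content.encode(), hashlib.sha256).hexdigest()

def verify_post(username: str, content: str, signature: str) -> bool:
    """Anyone can check a post against the registry entry."""
    key = REGISTRY.get(username)
    if key is None:
        return False
    expected = hmac.new(key, content.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)
```

The point of the sketch is just the flow: identity is anchored in one registry, and any tampered or impersonated post fails verification.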


👤 anamax
What is misinformation? What does "solve" mean? (Is that the same as "what should be done about misinformation?")

👤 owenpalmer
There are ways to help prevent it. For example, if social media had a built-in feature for citing scientific papers, that could help a little. There wouldn't really be an excuse not to include them. Also, if there was a better way to quantify the validity of papers (while somehow avoiding Goodhart's law), that could give people a more efficient way to sort through information.

LLMs might help a little, if we could find a way to control for their bias.

Another strategy is comparing both extremes of an opinion. GroundNews has employed this strategy. Whether or not it's effective at preventing misinformation, I don't know. But it's an interesting idea.

The last and most important preventative technique is to teach people how to avoid confirmation bias. I think that is the source of most misinformation: the gullibility of the general population.


👤 andrei_says_
You may want to see the work of linguistics professor George Lakoff who recommends using certain patterns to disarm misinformation, like the “truth sandwich”.

https://soundcloud.com/future-hindsight/the-truth-sandwich-g...

One of the aspects of successful misinformation is that it establishes a desired framing of an issue. Merely repeating the framing, even to point out its incorrectness, reinforces it. The way around it is to never repeat the framing and to use correct framing instead. Political speech nowadays is based on extremely weaponized framing. Once you hear this you can't unhear it.


👤 wittystick
The only thing that isn't potentially misinformation is that which you witness with your own eyes. Since everyone has their own eyes, calling what they witness misinformation is known as gaslighting.

The only way to prevent people sharing their experiences is called censorship.

What you're really asking is "can we stop people from lying?"

The answer is no.


👤 binsquare
I think this is where applying proof of work is interesting - make the cost of spreading information expensive to discourage misinformation saturation.
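A hashcash-style sketch of that idea: producing a post costs a brute-force search, while verifying it costs a single hash. The `mint`/`check` names and the `difficulty` parameter are my own illustration, not any particular system's API.

```python
import hashlib
from itertools import count

def mint(message: str, difficulty: int = 3) -> int:
    """Find a nonce so sha256(message:nonce) starts with `difficulty`
    hex zeros. Expected work grows 16x per extra zero, so the sender
    pays CPU time proportional to the chosen difficulty."""
    for nonce in count():
        digest = hashlib.sha256(f"{message}:{nonce}".encode()).hexdigest()
        if digest.startswith("0" * difficulty):
            return nonce

def check(message: str, nonce: int, difficulty: int = 3) -> bool:
    """Verification is one hash, so recipients pay almost nothing."""
    digest = hashlib.sha256(f"{message}:{nonce}".encode()).hexdigest()
    return digest.startswith("0" * difficulty)
```

The asymmetry is the whole trick: a platform could require a valid nonce per post, making bulk spam expensive while leaving individual posters barely affected.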

👤 pushfoo
TL;DR: "Solvable" sounds like you're mistaking a social and human issue for a technical challenge

> the code and cost to generate misinformation is essentially free

It's always been pretty low. People also want to believe things which feel good, especially when they're blatantly fake. That's not a technological problem, but a social one.

It's like the quote about people having trouble understanding ideas which threaten their income[1], but the payoff is emotional instead of financial.

[1]: https://www.goodreads.com/quotes/21810-it-is-difficult-to-ge...


👤 Khelavaster
Almost every piece of academic fraud is misinformation.

Every shady AI marketing ad is misinformation.

It's absurd that Biden's administration went after individuals sharing bad takes on COVID, but not classic fraudulent advertising.

Very troubling to see such a huge mismatch between stated policy and government activity.