So how do we fix this? I've heard some talk that upvote systems and algorithms might be at fault. Do we ditch them and go back to a literal timeline? Or is this more of a social problem that code can't solve? Let's hear some input on this, because I can't shake the feeling that tech isn't totally innocent in this mayhem.
We need to start emphasizing what we have in common, helping people see that ultimately we're really not all that different from each other. The internet lets people hide behind an isolating veneer that makes it too easy to just shout at each other and behave in extreme ways that would never be socially acceptable in the real world. Interacting in real life significantly moderates people's behaviour, and when social media and the internet facilitate segregation and the amplification of extremes, spending more time in person might really do us a lot of good.
We need to put people in a context where politics is irrelevant and a shared interest has the opportunity to bond people who would otherwise not meet. It's the only way I see of countering Identity Politics that tries to be all-consuming.
It would also be helpful to encourage people to focus on more positive events. Outrage culture is a thing, only exacerbated by social media, where rage = clicks/views = money. We need to change the economics of this equation. We would really benefit from a culture that gave more attention to people doing good rather than just focusing on the villains in society.
Do you know _why_ this uptick has happened? Does anyone?
I get the feeling, in general, when people talk about wanting to "Do Something" about this sort of thing, they are always focused on managing the symptom and never focused on addressing the cause. Frequently, when I try to bring this up, the cause is waved away with "oh they're just stupid/ignorant/angry" or some other such explanation. The current-year favourite is "they just believe fake news", and I'm sure next year there will be a political explanation, and then the year after that we'll loop back to generic ones. (The irony of denouncing other people as causing polarization "because they're just angry and stupid" is lost on most people.)
I don't know why this is happening. But I do know that until the why is addressed, and addressed with care and respect, this problem will be papered over at best but never solved.
Also
> Do we ditch them and go back to a literal timeline?
If I got one wish to change the world, but I had to give up a bunch of great things, I would sell out all technological wonder in order to get literal timelines back. As soon as other people decided they knew what I wanted to see better than I did, the problem started. And as soon as they got tired of doing that and made computers do it for them, it got worse.
Here's the thing. It's leaking. This stuff used to be contained, kept in a little secret segmented-off piece of your world that was 'the internet'. Now it's invading your real life, your real-world friendships, your workplace, even your family.
Think of the Greater Internet F-ckwad Theory: "Normal Person + Anonymity + an Audience => Total F-ckwad". Now run that function over the entire populace, map/reduce style. Now take that F-ckwad, and remove the Anonymity by requiring them to use their real name. And bingo, now you live in Interesting Times.
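If you want that function spelled out, here's a tongue-in-cheek sketch; everything in it is invented for illustration:

```python
# Tongue-in-cheek sketch of GIFT, taken literally.

def gift(person, anonymity, audience):
    """Normal Person + Anonymity + an Audience => Total F-ckwad."""
    if person == "normal person" and anonymity and audience > 0:
        return "total f-ckwad"
    return person

# Run it over the (toy-sized) populace, map/reduce style:
populace = ["normal person"] * 1000
fkwads = [gift(p, anonymity=True, audience=10**6) for p in populace]

# Now remove the anonymity by requiring real names. The behaviour was
# already learned; only the mask comes off.
interesting_times = [(fw, "real name attached") for fw in fkwads]
```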
Everybody remembers the collective insanity of the Covington Catholic school fiasco, where journalists and celebrities were publicly wishing for the death of a bunch of kids wearing MAGA hats, or at the very least for them to get doxxed or hurt, all because of a one-minute video taken out of context. And those who approached the story in a cautious manner were called out by the rest for being too soft, or accused of being part of "the bigots". Ironically, the account that initially tweeted the video snippet was banned after a few days.
I don't think the population is more deranged than before though, I mean the 70's were pretty violent politically, more than today.
The solution is to quit social media or join smaller communities free of wedge issues and identity politics. Twitter, Facebook or Reddit aren't going to fix themselves, they make money off outrage and petty divisions.
And finally, you don't owe activism on a specific cause or an opinion to anyone. It's OK not to have an opinion or not to want to get involved in a political debate. Anybody that attacks you for refusing to take a stance on an issue should probably be muted/blocked or removed from your life; that person isn't your friend and will try to make you look bad at the first opportunity just to get internet brownie points for themselves.
The fact that people have increasingly divergent views (or rather, are voicing them; they probably always held those views) is just a fact of life. It's not unique to any online community or even any country. It might simply be that too much diversity is inherently destined to collapse.
'Toxicity' is a made-up new meaning for the word; in practice it means anything undesirable from the point of view of the speaker.
Labelling the former as the latter is one of the root causes of the feedback loop you're describing.
The first step is for both sides of polarising issues to acknowledge each other, that both think they're doing the right thing, and that neither is 'toxic'. Without that common ground, dialogue can never begin.
Instead, you should try to accept the truth: free speech is ugly, deal with it. There are always some people who are nice, but all people are never nice. A lot of different people with different opinions freely voicing those opinions in the public space will always be ugly.
Ok, now let's assume you don't care and get to actually answering your question. In order to remove "ugly" you need to remove some of the elements:
1. Different opinions. People with different opinions are removed from society; everybody in the community must be as similar as possible. There are multiple ways to achieve that to varying degrees, but the key is closed communities, since banning people faster than they appear (or find a way to return, which is relevant for the web) is hard work.
2. Freely voicing them. Again, multiple ways: good old moderation of the content, making everybody have a stake in what they are saying (reputation systems, goods exchange, game mechanics enforcing cooperation), and so on. (A rough sketch of such a stake mechanism follows this list.)
3. Public space. Move all discussions to PM, so they still argue, but you don't see it and feel good about yourself and your platform.
4. Speech. Just don't let people communicate. Easiest to achieve on a given platform, and the most effective.
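To make item 2 concrete, here is a minimal sketch of one such stake mechanism, a reputation gate. All names and thresholds are invented; this is the shape of the idea, not a real design:

```python
# Posting escrows some reputation; the post earns it back (or forfeits
# it) depending on how the community receives it.

class Member:
    def __init__(self, name, reputation=10):
        self.name = name
        self.reputation = reputation

POST_STAKE = 2          # reputation escrowed per post
REFUND_THRESHOLD = 0    # net score needed to get the stake back

def try_post(member, text):
    if member.reputation < POST_STAKE:
        return None  # not enough stake to speak right now
    member.reputation -= POST_STAKE
    return {"author": member, "text": text, "score": 0}

def settle(post):
    # Called once voting has died down: well-received posts refund the
    # stake plus a small dividend; badly received ones forfeit it.
    if post["score"] >= REFUND_THRESHOLD:
        post["author"].reputation += POST_STAKE + max(post["score"], 0)
```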
There was a theory in 2004, known as GIFT: Normal Person + Anonymity + Audience = Total Fuckwad
Guess what? People without anonymity are total fuckwads too - this happens on Facebook, Instagram and Twitch, in an era where you can probably find someone's personal details if you look hard enough.
I think the key is audience. Bullying feels good for a lot of people. Bullies will go for low hanging fruit where they won't be struck back.
You see people acting this way even in once helpful sites like Stack Overflow; downvotes will pull bullies in like a magnet. You see people picking on anti-vax, flat earthers, Justin Bieber, not so much because they do harm, but because they're easy targets to hold down.
Viral algorithms amplify this effect. They highlight bad news that everyone can join in and rage about. It's not a new thing; news channels have done this for decades.
We can't really fix it. I'm more a community nomad these days. It's easy to move on to other new and old communities, away from this effect. I've been happy with HN, Discord, and IRC lately. IRC has picked back up to 500 people per channel, and it's probably no coincidence.
You will realize that social media is intentionally designed in a way that draws toxicity out of society.
No technology is neutral. Technology gives and technology takes away. The medium is the message. Communication evolved because of the medium, and every evolution gives something and takes something away. From oral communication to the printing press to the telegraph to TV to the internet to social media: every evolution gives birth to something and destroys something.
For example, TV makes everyone reachable (what it gives), but turns everything into entertainment (what it takes). Even important topics such as politics, religion, war, poverty, pestilence, and science become pure entertainment, juxtaposed between endless drama, reality shows, and ads, coupled with background music and personas that manipulate the mind. The more of an "entertainer" you are, the better, regardless of whether you are a dumb scientist or a dumb lawmaker whose policy will affect many people's lives.
Take social media such as Twitter: everyone now has a voice (what it gives), but the character limit (what it takes) leaves no room for critical thinking and rational debate, so everything spirals down into madness. The more outrageous you are, the better, because it will go viral and people will react in a predictable way.
A good example of a person who knows exactly how TV audience and social media audience will behave in a predictable way, and took advantage of that, is President Trump.
You want to design a medium/platform in such a way that the pros outweigh the cons. But I think the hard part is knowing how people will use the platform. Those social media giants started out with good intentions, and only much later down the road, which is where we are right now, do we discover their true effects.
Smaller internet forums, subreddits, Discord/Slack groups, etc tend to be a lot more civil than the likes of Twitter or YouTube are.
So a revival of those types of sites and communities will help a lot.
As will returning to the days of multiple pseudonyms for different websites. Because people are not one-sided. They don't always act the same way in every setting.
No, their behaviour depends on the company they're with. They might act one way with family, another way with friends, another way at work, etc.
That's how society stays together to some degree. People don't know how others act in other settings, and they don't care. Your coworkers likely have a whole mix of political opinions, but since it likely doesn't come up during work, it doesn't really matter.
Social networks seem to be trying to demolish this sense of separation between sides of people's personalities, and that's making society more and more fragile, as one wrong move means someone's entire life gets destroyed by the internet mob.
Oh, and decent moderation too. Unfortunately for Facebook and co, you can't automate moderation and expect it to work well, and you can't outsource it to a bunch of full time employees in a distant office somewhere. It has to be done by people with a real investment in the community, which is again where a well run small community shines.
No seriously, just stop. This isn't a "complex problem that needs nuanced technical and legislative solutions".
Back in the old days, there was a saying: "Do not feed the trolls." Sadly, we've forgotten that.
Our current approach is, "create rigorous 30-minute point-by-point takedown videos to defeat their point of view". Our urge to debate and correct people who are wrong just fuels them to make more content. A troll needs reactions to survive. Just downvote and move along.
A lot of the people complaining about "toxicity" on the internet seem to be under the impression that if we "deplatform" the so-called toxic people, that will fix things. But on the contrary, that makes things worse.
If someone says something awful on the internet, and everyone either ignores them or politely presents a counterargument, they either move on because they feel they've been heard, or they engage in a polite discussion. Maybe they change their mind, maybe they don't. If they really can't engage in polite discussion, then they come across as making their ideas look worse, so they aren't really doing much harm.
If someone says something awful on the internet, and everyone rails about how awful it is and gets them banned, then that person is angry, and that anger motivates them to keep posting about it everywhere and spreading their idea. Meanwhile, they will integrate that idea into their identity, which makes it far harder to change their mind. And if you actually manage to get them to go away, they will go to cesspools like Voat, where they are even less likely to be exposed to ideas that change their mind, and where in fact they are likely to be exposed to even worse ideas.
Let's get some perspective: what you're complaining about is people saying things you don't like on the internet. Yes, what they are saying spreads ignorance, but the solution to ignorance isn't silencing the ignorant, it's education.
MLK and Harvey Milk both recognized that the source of the bigotry they fought against was fear borne of ignorance. But the average left-leaning person today doesn't see bigots even as people any more. All it takes nowadays is for someone to say one of a list of banned phrases and they're completely written off as even human. If we're going to bridge the gap here, we on the left have got to consider that we might be the toxic ones.
I've found that when I actually talk to so-called "toxic" people politely, they are willing to listen. It's not them that are causing the polarization.
For example, I started reading Slashdot not long after it started ('96?). I really enjoyed it, read it every day, occasionally posted. Then sometime around 2000 I just got disgusted with the quality of the comments- it seemed like there had been a long, slow decline, and no longer did the average commenter seem like they were my age or older, my level of experience or greater, my level of knowledge or greater- instead it just seemed like a cesspool of pointless, toxic flamewars about stupid things that existed for their own sake. Reading Slashdot comments started making me feel stupider and angrier, not more informed. So I eventually stopped reading Slashdot.
But then, over the years I would talk to various people younger than me with the same experience- the only thing that changed was which year they thought Slashdot was good, and which year they thought it went so downhill that they just couldn't read it anymore. Like, "Slashdot was awesome circa 2007 but by 2011 all the commenters sounded like toxic teenagers". Which makes me think that what actually happened was that Slashdot comments were always terrible, but if you started reading in your larval hacker years you might not know any better. Then you grow up some more, get more knowledgeable and mature, and the same level of commentary seems really inane.
EDIT: Possibly against my better judgment I'll take the question a bit more at face value. I think it's all about the climate or culture of a place. As an engineer I've tended to dismiss talk of "culture" esp. where a company or workplace is concerned, but in recent years I've changed my mind. For example, one message or pronouncement by the CEO can change people's entire experience of working there. Same thing for a website, which is a virtual place. And what kind of culture do you expect will develop in a place where the incentives, motivations, values etc. of the people running the site are actively hostile to yours? A place where the very premise of being there is already a hostile act against other humans? It's a corrupt and fundamentally dishonest (not just incidentally dishonest, I would argue) place where the social order has already broken down before a word is even said.
Yeah okay, being a surveillance platform that optimizes for dark patterns and "engagement" doesn't guarantee that it'll be full of trolls. But it's not exactly a surprise when it turns out to be that way.
Steve Jobs was obviously pretty shitty in some ways, but at the height of Apple's revolutionizing of the UI (before it turned shitty) it made you feel like there was something noble about it and you felt better and more of a sense of decorum just by being there. Meanwhile Twitter refuses sensible and awesome ideas for UI/UX over and over for years because they get more engagement by keeping it shitty. And shittiness has a way of spreading. If the place is shitty you will act shitty. But if you suddenly get invited to go have lunch with the Queen of England or something, you probably behave better despite yourself, even if you're Johnny Rotten.
We don't "behave better" because, fewer and fewer of us are getting invited to lunch with the Queen, and because we're not the (U)ser whose (X) the ones running the place care about, and because we can tell we're being taken advantage of, and the nearest person available to take out our grievances on, is each other.
I don't know what internet you were using, but this does not match my experience. The internet has been a dickfighting playground since at least the 90s. The biggest difference is that comment threads are the majority of how people create content on the internet now and they weren't before.
Moderators now are also more willing to ban accounts that are non-spam. Certain views are "harmful" and apparently people cannot be asked to not react to seeing things, or even ignore people, whether via some tool or just by saying "Oh, it's her again, I'll just skip over that." This places pressure upward on the moderators to make certain people go away.
Folks love gaming the algorithms that decide what will and will not be read, and algorithms come into play because of the aforementioned large scales. Face it, the Internet runs on ad money and that can be tight, so everyone gets the bright idea to either let the algorithms do the moderation or to let people with free time on their hands and the motivation do it. Usually the motivation is expressed in some seemingly-altruistic manner, but it always boils down to pushing their own views. Eventually some views become acceptable, others anathema, and you get that ghastly distillation where more moderate voices are driven off until finally you get a kind of ghost town.
This is a pretty tough nut to crack from this vantage point.
One of the ways I see out is the admittedly computationally-intensive tack of making the algorithms only affect a user's personal view rather than making it site-wide. You can still run into the echo chamber issue, only in this case it is a bunch of people in their own private speech bubbles in one large chamber.
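Roughly what I mean, as a sketch: the ranking model reorders only the requesting user's copy of the thread, while the shared store stays chronological. The `score` function below stands in for whatever per-user preference model you like; the per-request sorting is exactly what makes this computationally intensive.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Comment:
    author: str
    text: str
    posted_at: float

def site_view(comments: List[Comment]) -> List[Comment]:
    # The one shared, site-wide ordering: plain chronology.
    return sorted(comments, key=lambda c: c.posted_at)

def personal_view(comments: List[Comment],
                  score: Callable[[Comment], float]) -> List[Comment]:
    # Reordering happens per user, per request; nothing global changes,
    # so one user's bubble never becomes everyone's ranking.
    return sorted(comments, key=score, reverse=True)
```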
I've spent a lot of time thinking about this because I have watched this sort of thing kill communities for the past thirty years now.
"Have you ever noticed that anybody driving slower than you is an idiot, and anyone going faster than you is a maniac?" - George Carlin.
Funny... but what's even funnier is that EVERYBODY THINKS THIS.
People don't realize that it's not that there's a group of bad drivers out there; it's that everyone drives badly sometimes. It's always different people.
The toxicity online is similar. There are just a lot more people online a lot more hours per day, and everybody is a jerk sometimes.
My personal feeling is that reverse chronological order with maybe throttling your chronic over-sharing friends would probably go a long way to helping blunt some of the negative effects we see today. But it would also lower engagement and lead to revenue losses so they probably haven't earnestly considered it. Kinda like how the cigarette companies don't want you to know that smoking is bad for you, because you might smoke less.
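For what it's worth, the feed I have in mind is only a few lines; the cap below is an invented parameter:

```python
from collections import defaultdict

def build_feed(posts, max_per_friend_per_day=3):
    """Strict reverse-chronological feed, but each friend is capped at
    a few posts per day, so chronic over-sharers get throttled.

    posts: iterable of (author, day, timestamp, content) tuples.
    """
    kept = defaultdict(int)
    feed = []
    for post in sorted(posts, key=lambda p: p[2], reverse=True):
        author, day = post[0], post[1]
        if kept[(author, day)] < max_per_friend_per_day:
            kept[(author, day)] += 1
            feed.append(post)
    return feed
```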
The real problems are:
A) real life sucks unless you are part of the subset of the rich or have good support structures,
B) toxic behavior helps you get ahead in certain situations in life, and people with low or no resources often resort to it,
C) there's no cost to joining certain social networks so it attracts those with no resources and/or the lowest common denominator, including people who have been damaged by life and only know toxic behavior as a norm,
D) people are interesting and useful when doing something productive or creative, but not otherwise.
It's hard to say if the category of people who fall into C is increasing in absolute number, or only in visibility due to the popularity of certain social networks.
Social networks that aren't dedicated toward doing something productive or creative, or who are open to everyone for free, will be overtaken by toxic behavior as a result. They should be avoided, or new ones developed in their place.
Another factor is the downfall of respect for the news media in the US, this makes discussions of news events attract comments from the type of people in C as they think their opinions are more important than they should be.
If war is politics by other means (Clausewitz), we can turn that around and say that politics is war by other means. See: the Russian concept of hybrid warfare.
What ends war is when enough people in a society are convinced it's not worth it. And that happens when enough people are exposed to the horrors of war, when they see what it does to their own lives and to the lives of people they care about.
Now we're in a period when most people haven't been exposed to that level of horror, so unfortunately we're on an upswing.
The way through is for some number of us to keep our humanity toward others (in the positive sense), to act accordingly, to carry that through until conflict blows over, and to rebuild society afterwards.
But why is this happening recently? I think the rise of smart phones has facilitated way more people engaging on the internet instead of just browsing. This in turn has opened up people to the vast amount of viewpoints in the world, and unfortunately people tend to get angry at people that are different.
Another thing that has become a problem is people being paid to support or attack a cause. These professional rabble rousers are stirring up shit inside of people that would rather just have a sandwich.
Eh, people were being jerks to each other on Usenet and BBSes way before this.
Godwin's law is 30 years old at this point.
I've found it easy to ignore content I don't like, but it is getting harder to find open discussion forums.
Absolutely, because it's wrong to regard the web as a thing that's separate from the real world and its effects. We're talking about people's identities in the sense of how they see themselves, and that's shaped by their environment (what they see, hear and do), online and offline.
What happened is that people are able to find a tribe, whereas before the web was a thing, finding a tribe was more difficult and often exclusive to cities.
People living in rural areas were likely to adapt to existing "big village tribes"; branching off wasn't viable, as a matter of circumstance, because of a lack of different subcultures within those rural areas.
The web changed that, everybody can socialize and find a tribe online, it's like a big city full of subcultures, and subcultures provide identity.
Early people on the web were often already part of urban tribes, I think, and took up positions within those, so if they encountered a rival group online (think of the poppers-and-rockers stereotypes from the 80's), toxicity was almost ensured. So what happens if a mainstream culture that's fed with stereotypes through Hollywood etc. is let loose on the web?
The previously tribeless population, which was forced into a tribe by circumstance, is basically discovering the world and going through its teenage years.
That's my explanation of what's happening.
When I released my app 8 years ago, I thought support would be my most hated part of releasing a product. To my surprise, it was one of the most rewarding parts; I met some great people via email, some of whom I will visit one day.
But support has its ugly "toxic" side as well. This is particularly prevalent in 1-star reviews for trivial issues. All the reviewer sees is a box to vent their thoughts without considering that there is a real person on the other side. But I'm a real person who cares and some comments do stir bad emotions. This has brought some of my lowest and darkest days of app development.
Sometimes I will make contact with the reviewer and as soon as they get to know me as a real person, they are friendlier and more respectful.
The faceless nature of the internet causes people to treat others poorly. In the real world, where we meet people face-to-face, our initial and natural position is to treat each other with respect.
So here are a few ideas: 1. Imagine a person, someone's face, not a textbox, website or company - add a persona to him or her. 2. Ask this simple question: "Would I treat or talk to someone (my friend) like this in real life?" 3. Would you be proud of the way you're conversing if people you admire were watching, e.g., respected colleagues, friends, parents?
This will change the way you write.
2. M1S gains traction from hackers who don't want to be part-time sysadmins, but the first widely-used killer app for it is a social media front-end combined with a CDN. Self-hosted files and "publish once, syndicate everywhere" starts to flourish and Facebook begins to lose its position as middle-man to the web's social activity.
3. Bezos buys Keybase with spare change from his couch cushion and integrates it into M1S; now all server-side apps, and the users thereof, natively have cryptographic identities.
4. Users begin to demand M1S integration from businesses that used to hold (and resell) their data. Widespread adoption by webapps cyclically drives further adoption by users, making an M1S identity (if not heavy usage) almost ubiquitous.
5. With most client/server use cases sharing a single cryptographic auth network, the last vestiges of anonymity disappear from the web as people's usernames, real names, cryptographic identities, and checking accounts melt into a blob.
6. With their real identity attached to online activity, online activity becomes "activity", and, while toxicity is not eliminated, it is at least no worse than the real world.
(Crazy? maybe. But I am honestly surprised every day that I wake up and find out that Amazon hasn't made something like step 1 yet...)
At one point Merlin's in a library. We learn that Druids are not allowed to write. Once you write something down, it becomes fixed. Fixed on paper. Fixed in your mind. Rigid and unalterable... this is not conducive to magic, which cannot work under such rigidity.
It reminded me of Socrates & Phaedrus' conversation on reading & memory. Words in a book don't work the way words in your head do. Reading about a conversation isn't like discussing a topic in Socrates' garden. When writing replaces memorisation as a method of learning, things are lost as well as gained. The ideas themselves change.
It's particularly interesting as Socrates was famous for his conversational philosophical method, where allegory and flexible style make the case. His student is famous for his writing, and is more precise, detail-oriented, reductive and, uhm... platonic in his method.
Anyway... the internet has made us all writers. We write whatever enters our head. Then it becomes fixed. Then we defend it... like cranky old academics defending pet theories they wrote in their youth.
Humans love observing an argument.
Twitter is too concise to form decent arguments.
Imagine a website called Dual, where people could register a username, post on the forum where they were arguing to verify their Dual account, and then continue the argument on a platform set up for settling written arguments and integrating external references.
People would talk less anonymous smack, or have to back up their smack on Dual and then write at a level that convinces the masses they know what they are talking about. No swearing on Dual. The disagreements would actually educate everyone, and the voting and following would tell you what to advertise.
If someone was talking smack on a forum and didn't register on Dual, you could assume they were a coward or a troll. If followers could supply duelers with arguments, we would get to the truth in an entertaining way very quickly, and any misunderstanding would be highlighted and documented.
People often do great thinking in the heat of an argument. We should harness the anonymous arguments. Politicians should accept duels instead of televised debates.
Limit the responses to 300 words and six references or something effective.
I wish I had the money and health to clone Twitter and do it, but someone should do it. Both sides have to watch the arguments unfold, so it brings observers and duelers closer to the truth rather than polarising them.
The trick is in trusted auto verification of anonymous duelers and getting enough duels and spectators to make internet dueling a thing.
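The "trusted auto verification" part is probably the most tractable piece. One way it might work, sketched with hypothetical names: Dual issues a random token, the user posts it from the forum account in question, and Dual checks that the token shows up under that account.

```python
import secrets

pending = {}  # dual_username -> (forum, forum_username, token)

def start_verification(dual_username, forum, forum_username):
    # Dual hands the would-be dueler a one-time token to post from
    # the disputed forum account.
    token = secrets.token_urlsafe(8)
    pending[dual_username] = (forum, forum_username, token)
    return token

def confirm(dual_username, fetch_recent_posts):
    # fetch_recent_posts(forum, username) -> list of post bodies; it's
    # assumed to exist (forum API or scraping) and is the hard part.
    forum, forum_username, token = pending[dual_username]
    return any(token in body
               for body in fetch_recent_posts(forum, forum_username))
```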
What changed? My rough list is:
1. News bloggers got online and turned flamebait into their business model.
2. Normies got iPhones and started posting online. They never read a netiquette guide. If they had heard of the concept at all, they dismissed it as computer nerd nonsense. They don't think about online communication systematically and expect every interaction online to be like talking to customer service: they get to say whatever they want, and only hear a very constrained window of things back. This backfires in all the obvious ways.
3. "Platforms" with KPIs based on engagement are incentivized to enable the largest flamewars possible and put it in as many people's faces right up to the point they start causing bad PR.
4. People with basically zero forethought thought they could fix society by being paternalistic to strangers.
My most recent investment was on this thesis: http://blog.rlucas.net/vc/exchanging-thoughts-on-a-thoughtex...
Everything is an echo chamber these days. No one wants to be wrong or look foolish online. People don't want to even read opinions other than their own. There is no more civil discourse.
It's not just online - I'm noticing the same thing in real life.
I don't know how to fix it...but I hope someone does. Whatever the case, we need to treat the cause, not just the symptoms.
What struck me was how much the described trolling and toxicity resembled what we observe nowadays. The book is set in the 70's, so the anonymous bullying and hysteria just materialized via letters, newspapers and the phone instead of social media.
Also, it has been mentioned on HN several times that society didn't become more toxic - possibly even less so. But more and more people have gained access since the 90s and take the opportunity to abuse it in such ways. And there are now troll farms, and a single person without a job can spread poison like a Gatling gun spreads bullets.
1: https://en.wikipedia.org/wiki/The_Lost_Honour_of_Katharina_B... 2: https://en.wikipedia.org/wiki/Red_Army_Faction
The only fix is aggressive moderation, and even that isn't great, because you'll be shaped by the ideas of the moderators.
The internet is a wonderful tool. It magnifies reach of all thoughts and facts and opinions. Sometimes that's a good thing, sometimes it's a bad thing, and what each person considers good or bad is different.
We've always had "crazies" with strange ideas. Back in the day they just stood on street corners shouting. Then they passed out leaflets. Then they got on public access cable.
And now they have the internet, where they can reach the whole globe at once.
The one thing I can tell you that won't work is pure democracy. Something I learned at reddit early on is that pure democracy just doesn't work, because the trolls have far more time and patience than everyone else, and they will manipulate the system.
Also, a lot of people just don't want to think for themselves. They will just follow the loudest voice assuming it is correct. This also makes democracy not work well.
But democracy is still the best system we've got.
They'll say it's the AI, and it's only giving people what they want, but if you ask a kid what he wants to eat he'll become malnourished on an all-sugar diet. A healthy diet means serving up some vegetables even if your kids don't like you as much for it.
In other words, even if it hurts your ad revenue and engagement metrics a little, it's the morally right thing to do. I'm not saying to bury misdeeds or censor anything, but give people some positivity once in a while.
Dumping controversy and callout posts that have barely any backlinks at the top of people's results gives them artificial weight and makes people see the world as more and more polarized, which results in them acting accordingly, and we're all the worse for it.
When I redesigned my blog/website last year, I intentionally removed Disqus and all commenting ability. In fact, what prompted the redesign was that I wanted to add a box pleading with people to not use comments for tech support for various open source projects I've created and to file Github issues instead.
But the more I thought about it, the more I realized how bad comments have really gotten. Trolling, bullying, racism, sexism, homophobia, transphobia, abuse, and general mean behavior have become such a rampant problem that "don't read the comments" has become an Internet meme on par with "don't feed the trolls." Think about it: when was the last time you changed your opinion because of a comment you read on a blog or news article? More than likely it just made you mad or made you sad.
I have been low-key nursing this idea for a while now that the Internet would be better off if a lot of sites killed off comments. When comments become a breeding ground for the worst aspects of human behavior, why should we continue to enable it? Why should we host it and give it a voice? Why should we implicitly endorse it under our brand? Because that is what we do when we host comments.
Commenting should probably still exist on social media sites (Facebook, Reddit, HN, etc.) But most sites would be a lot better off if they just dropped comments entirely and reallocated those resources to more productive uses, and I think their users would be better off for it as well.
Yes, I am just one person with a small blog about programming. But if this is the position that I want to take, I should be the change that I want to see in the world. Thus, the removal of commenting. And I've been pushing this idea a bit in other areas that, just maybe, users don't need to be able to directly comment on things.
- Ignore them and despise people arguing on the web while you build yourself a better life.
- Laugh at them and possibly feed the troll for more fun, but it gets boring and you'll get back to strategy 1.
We live in a clown world where too many people spend 90 percent of their time on their screens receiving information from strangers instead of living a normal life. Plus, the more opinionated people are on the web, the more likely it is they have nothing actually going on in their life... it's like the fact that all psychologists are people with poor mental health who went into the field hoping they would understand themselves. The louder someone is about a subject...
Do you really want to invest time into policing this clown world ?
If you're not convinced, just go into a flat earth group on Facebook. You'll understand that "don't argue with pigs: you'll get dirty, and the pig loves it".
I believe there is no sane way to stop the polarization/toxicity in the first place (an insane way is to deanonymize and watch everyone - China and Russia implement this); all we can do is develop immunity.
- smaller communities are more civil. As forums get more popular and their power to influence grows, assholes want to use that power to get their way
- reasonable people might put in their 2 cents on a discussion, but will not continually argue with the assholes. The assholes know this, so they pour lots of energy into turning a discussion into a shitstorm.
I have seen this same thing here at HN: two people go back and forth on a point until the discussion is two words per line on the far right of the screen. Once I see that starting, I usually quit reading the whole topic.
I'm not that familiar with the reputation thing here, but in my previous life I had some ideas about how to rescue our forums. Most of the ideas revolved around limits (a rough sketch follows the list), like:
- limit the number of posts per userid on a topic. This would require that a person get all his thoughts out on the table and think before posting to do that, rather than going back and forth forever.
- limit the number of posts per userid per day. Then if someone went on a rampage, they couldn't just pollute one topic and go to another to vent (or create new topics to vent). My hope was that if you force a person to cool off for a day, they might not want to get into the same argument the next day.
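A back-of-the-napkin version of both limits; all the numbers are made up:

```python
from collections import defaultdict

MAX_POSTS_PER_TOPIC = 3   # get your thoughts out, then stop
MAX_POSTS_PER_DAY = 10    # forced cool-off for rampages

posts_in_topic = defaultdict(int)   # (userid, topic) -> count
posts_today = defaultdict(int)      # (userid, date) -> count

def may_post(userid, topic, today):
    return (posts_in_topic[(userid, topic)] < MAX_POSTS_PER_TOPIC
            and posts_today[(userid, today)] < MAX_POSTS_PER_DAY)

def record_post(userid, topic, today):
    posts_in_topic[(userid, topic)] += 1
    posts_today[(userid, today)] += 1
```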
Once our support forums got popular, they got ugly. We tried everything we could think of, but eventually had to shut them down and went to a model where we anonymously posted questions that users had and posted our responses. Users couldn't post. The support forums were still informative, but all the heat (and a lot of the interest) was gone.
I'd never have forums in another business. They're great in the beginning, but in my experience, impossible to manage when they get popular. HN has done a great job here, but it's also a specialized, highly educated community.
An algorithmic solution would be to rigorously hide everything that looks like mere unproductive outrage from public discourse. However, not only would that amount to severe censorship, it'd probably also not be exactly easy to implement.
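Just to show the shape of the problem, a deliberately naive sketch; a real system would need far more than keyword matching, and any marker list like the one below is itself an editorial judgement, which is exactly the censorship concern:

```python
# Naive outrage filter, for illustration only. The marker list and
# threshold are arbitrary choices, i.e. someone's editorial judgement.
OUTRAGE_MARKERS = {"disgusting", "traitor", "how dare", "shameful"}

def looks_like_outrage(text: str) -> bool:
    t = text.lower()
    return sum(marker in t for marker in OUTRAGE_MARKERS) >= 2

def visible_posts(posts):
    return [p for p in posts if not looks_like_outrage(p)]
```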
Besides, platforms such as Twitter or good old-fashioned news thrive on outrage. If you take that from them there's probably not much left, which is why it's not in these platforms' interest to do something against the issue.
So, it's back to square one: Yourself. Keep your identity small and try not to perpetuate outrage on the Internet.
It's probably already been attempted, but a forum that had some understanding of language structure and could programmatically provide some feedback by adding a "Review" step to posting would be interesting. The system would have full access to the context a post is being made in, and checks could be built around it. The fewer checks a post ignores, the better its rank.
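For instance, the Review step might run a handful of context-aware checks and report which ones a draft trips; the checks below are toy stand-ins for real language analysis:

```python
def review(post: str, thread: list) -> list:
    """Return warnings for the author to consider before posting."""
    warnings = []
    if post.isupper():
        warnings.append("All caps reads as shouting.")
    if any(post.strip() == p.strip() for p in thread):
        warnings.append("This repeats an earlier post verbatim.")
    if "you people" in post.lower():
        warnings.append("Addressing a group this way tends to escalate.")
    return warnings  # fewer ignored warnings -> better rank, per above
```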
That's no longer true. If Google or Facebook don't want to censor content, then they need to change the UI and warn that some of the results contain wrong/misleading information, or even complete lies.
This fact-checking wasn't needed long ago because the pages with the most backlinks had almost always factual content.
I think we're looking back at history with rose-tinted glasses here. Toxicity has always been around. Maybe in recent years it has grown, but I think that's just the nature of the technological beast.
It doesn't mean that on a broad scale technology always actively contributes to making things more toxic- but just that it amplifies and contributes to all sorts of things, and that includes the negative. We're more connected than we've ever been, and the unfortunate side-effect is that the degrees of separation between us and those that are toxic have been significantly reduced. Anonymity is maybe a small part, but there are plenty of openly toxic people, both online and off.
Toxic people are just toxic people and I think it's a social issue that is just always going to exist at some level. Maybe there are ways to use technology to assist in fighting against it and maybe not. I think it's just a potentially-unsolvable complex problem that will always arise in society.
I don't know the most effective ways to fight it. However, I'm trying my best not to contribute to it, and maybe personally fighting it within myself will have an outward effect on others. While I wouldn't say I'm mean/toxic/etc online, I do try to stay self-aware of my actions/reactions/emotions and what I post online (and have failed to do this sometimes) because it's very easy to get caught up in negative news/misery and then it's easy to branch off from there into an unhelpful level of anger/negativity.
I try to do my best not to assume the worst of others, to realize there are beings with entire lives unknown to me behind every screen name (if it's not a bot) and to realize it's sometimes difficult to properly infer the tone of what someone is saying online. It's still an internal work-in-progress, but I think I'm far more mindful of my behavior now than in years past.
I've been pondering this issue for over a decade. I think the general attitude is partially right: we can't do anything about troll postings. But we can create incentives for troll (meaning nonconstructive) comments in online discussions to funnel together while non-troll conversations seeking literally any goal have incentive to maintain their conversations through something similar to the "like" mechanism.
The basic idea is rather than a single comment and reply widget for online articles and posts, the "comment reply" widget has an "attitude indicator" that declares "this post is in this attitude". Others give the post a +/- rating for how well it fits the attitude.
It is a simple mechanism that alters how online discussions are handled. It has to be simple, or it will not be adopted. The change could be significant, enabling a diversity of on-topic conversations. Just pick your attitude, your sub-culture and talk with birds of a feather. Whereas no serious conversations can take place online today without random trolls stomping on everyone's china - all because we have a single reply widget funneling all voices together into a troll happy mess.
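A minimal sketch of the widget, with placeholder attitude names: each reply declares an attitude, readers rate how well it *fits* that attitude (not whether they agree), and each attitude funnels into its own sub-conversation.

```python
ATTITUDES = {"earnest-debate", "devils-advocate", "joking", "venting"}

class Reply:
    def __init__(self, text, attitude):
        assert attitude in ATTITUDES
        self.text = text
        self.attitude = attitude
        self.fit_score = 0  # +/- votes for "fits its declared attitude"

def threads_by_attitude(replies):
    # Funnel each attitude into its own sub-conversation instead of
    # one widget mixing all voices together.
    out = {a: [] for a in ATTITUDES}
    for r in replies:
        out[r.attitude].append(r)
    return out
```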
Anyone else thinking of possible solutions?
Text is not, unless you're a 1% top-tier writer.
Would love to see more experiments with video and voice to bring back the subtleties of human communication.
Do not participate in forums where this type of behavior happens. And participate actively in moderation (flag/upvote/downvote) to keep forums civil and interesting.
I'd say forums aren't well-suited to discuss political issues. But I agree that tech could help. For instance, Stack Exchange Politics is readable but last time I checked, it didn't really provide much value. Mostly, arguing about politics is a waste of time, on a forum or anywhere else.
At some moment we will have to agree that big online platforms aim to change behaviour, create engagement, etc. All of this has side effects, and these side effects have tangible political consequences. The tangible political consequences have in turn real, hard consequences for individual people and groups of people.
We technicians need to think more about the social impact of the things we build. If it is a nice business idea but is a bad idea in the sense of Kant's categorical imperative, just scrap it and move on. Ideas are cheap.
Also: get away from the screen. Meet and talk. Even with people with whom you disagree. Try to keep your own bubble permeable and aim to do the same for others — you will not win your culture war anyway, so don't try to.
Instead try to widen people's horizon by taking them and their ideas seriously, even if it feels stupid to you.
It reminded me very much of how I first experienced the net in 1995: I was too amazed at talking to some random person from across the world to bother with petty arguments about politics or religion.
Now, novelty has worn off and, perhaps more importantly, for some people it's never been new - they weren't around to consciously and with great curiosity try it out for the first time. Add to this the factor of prevalence, the scale that brings and how centralized things have become.
I used to hang out on different special-interest forums and IRC channels, and I have to say there was as much drama, clique-forming, in-fighting and groupthink there as there is now. It was just smaller and more self-contained.
With today's massive community tools like Instagram, Facebook and Twitter, where there are no clear boundaries between factions or groups of interest, it's just a free-for-all: there's always someone willing to argue about something, and they're all on the same platform, which they have access to all the time, as opposed to having to deal with the diplomacy of dial-up and a shared family phone line.
If tech is to blame, it is only because it's too cheap and ubiquitous: The Great Invention that was supposed to bring us together finally has, and it turns out that we just fucking love to argue.
Humans confronted with scarcity (real or imagined) react with rational self-interest. They react more easily and strongly to perceived threats, become selfish for themselves and their in-group, and over-demonstrate their allegiance to that in-group.
Make people feel (economically) safe, that they have opportunities to accumulate wealth, and can generally do what they want. They'll be less fearful, and less afraid to see Others getting ahead as well.
Check these articles for ideas on how that might be accomplished:
Teenagers will always find it amusing to say naughty things and be transgressive. Trolls never bother finding something more interesting to do.
It is like a prank call. You used to call the local pizza joint, and ask for orange chicken, annoying the manager until they flew into a rage. Now you get on a comment section and annoy people.
The old story about the tower of Babel is true. It seems humanity has faced these moments before. The solution is to separate the bubbles in separate, more cohesive communities. Ideas need such sheltering in order to grow, even if this means they won't blend easily with other ideas.
> Do we ditch them and go back to a literal timeline?
Probably doesn't help. Look at YouTube comments. You can set Twitter to a chronological timeline; it doesn't make a lot of difference.
If we want to be those people, the answer is 'personal growth', and lots of it.
Well, of course, but it's our inability to mutually agree on the terms "bad" and "toxic" that makes this policy fundamentally impractical, and freedom of expression the only workable solution.
These are not new problems, they're old problems in a new suit.
(1) Normie-fication of the internet. Regular people who aren't necessarily enthusiasts have a tendency to want or expect the internet to mirror their daily life; they're not escaping into something better, they're just using the internet as an extension of their normal life.
(2) An obsession with the self. Everything on social platforms today is self-promotion. "Showing off," to put it simply. When the bulk of your internet activity is focused on self-promotion rather than pursuing an intellectual interest, you are naturally exposed to criticism. Some of that criticism will be harsh or inconsiderate, but frankly, that is just to be expected, and it's always been this way.
In terms of practical, "what can we do," there's only one thing I can think of: getting rid of identities. The identity-obsessed, self-promotional social platforms will always be doomed. There is no way to "make people be nice." There is only a means to control the extent to which identity is exposed.
Bruh you have clearly never been to usenet lol
Disagreements are always solved the same way. Either by coming to an agreement or agreeing to disagree.
Bad manners is a much bigger problem that has a variety of solutions that may or may not work, and are totally up to the individual. Bad manners also prevents problem #1 (disagreements) from being resolved.
* single user - evidenced by harassment, trolling, attacks
* group - evidenced by an echo chamber, which is typically present from down votes drastically outnumbering replies or replies suggesting silencing or solicitations for recruitment
Both forms are equally toxic, but they are not equally recognizable or distinguishable. A lone attacker, for example, is typically repulsive and starts out as an obvious violator to most users in a well-moderated environment, so they rarely attract supporting attention from other users.
Contrarily, group attacks typically look benign at first, but misery loves company. The negative attention grows on itself, drawing in users with insecurities who tend to fear diversity. While group toxicity may start out benign, its toxic nature becomes starkly apparent as it is allowed to fester, resulting in comments that are direct and hostile attacks claiming justification from group support or agreement to a premise.
Toxic groups do occur on HN, but they are rarely able to grow wildly out of control in any measurable way because downvotes to any contribution are capped at -4. The only evidence of group attacks in an online environment like this are the quantity of down votes and the nature of the reply comments present.
It is my opinion that toxic group behavior is a more serious concern than the lone wolf, because the lone-wolf flamer is easier to identify. In many cases users have no idea they are contributing to group toxicity, as conformance without explanation may feel natural. It's also more serious because it is substantially harder to correct.
Either way the nature of toxic behavior is about attention whether it's to draw attention to a single user's contribution or to silence a disagreement.
But for a moment consider another factor, a broader scope. One of the ways I like to describe the internet, and its effect on the first generations to adopt it, is "we can share notes now".
What I mean by that is: previously a combination of physical barriers, geography, institutions, power structures, and communication constraints made it such that the vast vast majority of us really only got information from, and discussed information with, people within a day's-walk radius of where we lived. On the upside that meant lots of it was face-to-face and rooted in your family and community. On the downside that meant we were all incredibly isolated and living in an almost completely false understanding of the world.
The internet has pierced (or broken) that. I can watch in near-realtime as the forest fires drive the people of Sydney from their homes. I can watch the chart of the Wuhan flu grow exponentially every day. I have seen high-def video of the slums in India, Nigeria, and Sacramento.
People of the past had far more first-hand experience with suffering than I do, but they hadn't the slightest inkling of the scale that we are all as aware of as we choose to be now.
I think that combination of your brain reeling at the vastness of "the problems", in conjunction with the powerlessness you feel about it, leads to a sort of emotional shrieking in horror that plays out as "toxic" posting.
Which is in part to say, I don't think it's going to get better anytime soon. As a matter of fact, as the challenges and constraints of climate change add pressure every year, it seems very, very likely to get worse.
Human nature will prevail. Guerrilla tactics will continue to be used by organized groups to shame/banish those who do not conform to their cultural views. High-profile people will continue to be shamed for associating with "bad" people. The era of civility and mutual respect on the Internet died over 20 years ago.
As methods of communication get cheaper and more accessible, it opens the world up to see what humanity is, not what humanity wants itself to be.
People that maliciously attack those online are broken people. Broken people exist in reality, thus broken people will exist online.
Google and Facebook can write code to stop harassment, but they can’t do so without marginalizing people who are already on the fringes of society. Plus, their system can be gamed. The output of such changes would mean that it would stop some harassment at the cost of losing accessibility for everyone and a meta game playing in the background that costs the corporations engineering resources.
The best answer right now, in my opinion, is to create tools to fight harassment, but give those tools to the users of the site rather than automated algorithms. A good example of this is a mute feature. Other tools like customized word filters, block lists and safe lists can also help empower users.
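The crucial property is that these tools act in the reader's own client, so nothing is removed for anyone else. A rough sketch, with invented structures:

```python
class UserFilters:
    def __init__(self):
        self.muted = set()          # authors this user never sees
        self.blocked_words = set()  # customized word filter
        self.safe_list = set()      # always shown, overrides word filter

    def wants(self, author, text):
        if author in self.muted:
            return False
        if author in self.safe_list:
            return True
        t = text.lower()
        return not any(w in t for w in self.blocked_words)

def my_view(posts, filters):
    # posts: iterable of (author, text); filtering is per-reader only.
    return [(a, t) for (a, t) in posts if filters.wants(a, t)]
```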
I believe that we, as a society need to engage with one another better. Be less quick to judge. Be less quick to mock. Learn to get along with people that hold views that are antithetical to our own -- just like we have to in the real world. And when someone comes in to your community with the explicit intent of disruption, simply ignore them.
To start:
- Stop using social media
- Don't use services that have public digital scorekeeping and other vanity metrics centered around ego and narcissism
- Recall that the 1st Amendment is the first amendment for a very good reason
- Recall that it's perfectly OK to disagree with others, no matter the subject
- Recall that the more difficult a subject often the more nuance is required
- Reject divisive political and social movements
- Reject identity politics
- End outrage mobs
- End PC policing nonsense
- End cancel culture
- Don't engage in 'call out culture' or 'woke' nonsense which incentivizes toxicity, division, and mob behavior both online and offline.
As for future generations, perhaps K-12 schools should have mandatory studies of inquisitions, mob mentalities, lynchings, witch burnings, and cultural revolutions, and the tremendously negative outcomes associated with those types of behaviors.
That's never (even before the internet) been true in unmoderated open online fora, and it's still true in closed and heavily moderated fora that seek that outcome.
Unmoderated open fora have probably become more dominant because effective moderation doesn't scale.
So then the question is: whose view of toxicity is right? The answer we have so far is: whoever protests the loudest or whoever controls the medium.
Although it's the answer we've got, it's far from obvious that it's the right answer.
Progressives seem to have gained the upper hand at this point.
Back in the day, many Usenet groups were OK during summer, when only professionals were online, and suddenly became toxic in the fall, as students went online. And then, with the first public ISPs, they became toxic all year. That is, Eternal September.
So we can choose where to hang out. If we want interesting discussions, we can frequent highly moderated sites, such as HN and professional subreddits. If we want toxicity, we can frequent the chans or whatever.
Edit: I ought to have said that some "toxic" sites may be useful for finding stuff that's suppressed elsewhere. If for no other reason, to know what's happening.
I think if we employ the same tactic and let polarization happen online, there will be communities where the ROI of discussion drops exponentially, and there will be places where less of that stuff happens for whatever reason, and those will be the truly engaging places to hang out online.
A lot of people like to throw around the term “fake news” when they see an article from the news source that is opposite their side of the aisle. While fake news does exist, CNN and Fox News are not fake news. They produce heavily biased news. There is a major difference. While both news sources have been caught lying before, both of them are usually telling the truth. It may only be half of the truth and purposely mislead the viewer, but that doesn’t mean the information being shared is fake or false.
I’m relatively young so I don’t know if there has ever been anything remotely close to “unbiased news,” but just presenting the public with facts that aren’t tarnished by bias would go a long way to helping people become more informed and, potentially, less worked up.
Stuff like ranked choice voting and open/no primaries that would result in more moderate politicians winning and less need for political parties.
Also a UBI might help diminish an “us vs them” mentality. The more good policies we have, the more we ALL benefit.
I also think we have non-CS folk entering the internet en masse, and they are thereby changing the rules and norms, which generates mania over things that were just ignored before.
How do we stop it? I don't think we stop it. We build resistance to it. Every generation has its detractor and for us it is the bad effects of this network. Over time, we will start ignoring the noise and use these for what they are actually useful for (or the next detractor takes over).
Higher barriers to entry. The right to comment and occupy mind space is given away too easily. Penalize new accounts, start them with limited capabilities, disallow short comments, etc.
Highlight exemplary content.
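One possible shape for those barriers; every number here is made up:

```python
MIN_COMMENT_CHARS = 80     # disallow drive-by one-liners
NEW_ACCOUNT_DAYS = 14      # probation period for new accounts
NEW_ACCOUNT_DAILY_CAP = 2  # limited capabilities while on probation

def may_comment(account_age_days, comments_today, text):
    if len(text.strip()) < MIN_COMMENT_CHARS:
        return False
    if account_age_days < NEW_ACCOUNT_DAYS:
        return comments_today < NEW_ACCOUNT_DAILY_CAP
    return True
```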
I don't have any data on how this has changed over time, but my guess is that social media polarization has increased over time.
People who are in the middle, or have some beliefs on each side, or are willing to seriously consider points from both the left and the right just don't post much anymore. On the rare occasion that they do, they get downvoted to the point that they are invisible.
So yes, when you look at social media you see the extreme left and the extreme right lobbing insults at each other. Or you might catch a glimpse of the extremists working to silence the moderates, if you dig deep enough.
I don't think it used to be as bad as it is now.
The polarization is continuing to get worse and worse.
How do we stop it? I wish I knew.
IMHO you won't fix it with technology as it runs much deeper. But you can remove comments to sweep a big part of the toxicity under the rug.
If you'd like a good overview of the longer story, Ezra Klein, who just released a book on the subject, was on Chris Hayes' "Why is this Happening?" podcast earlier this week ("Why we're polarized, with Ezra Klein").
But what I see constantly is that people have gone from a "what about you" mindset to "what about me" mindset. My conversations end as soon as the person is done answering my questions about them. Like, zero reciprocation, which is god awful.
If these people participate online, where they can mainly get away with being a shitty personality, then this toxicity is not that surprising to see.
I believe others want this too.
To combat my desire for this I'm developing two things. The first is a Discord community open to all, but with a private section for paying customers. I believe that when people pay for access to a community that expects a certain behaviour from them, they run with it.
Secondly I'm also developing a localised Meetup called Brisbane CloudOps. I want to enable physical meetings and discussions too.
I think the answer, to be honest, is smaller, niche, private communities and not vast open arenas.
I don't think it's a "problem" and I don't think there is or should be a technocratic solution.
This is the new normal, where political struggle has evolved into a prolonged people's posting war, and civility is not respected. This is freedom of expression, and it's a reflection of ourselves and society; you can break the mirror if it upsets you, but it won't change the underlying truth.
It's probably going to be like this for a while, and that's okay.
Limit how many media outlets any one entity can own. When all the news and all the entertainment is owned by a giant inhuman corporation whose bottom line is solely profit, it can and will act much more in its own interest than in the interest of any of the individual humans that make up the corporation, never mind the humans it provides news and entertainment to.
If you have a shitload of tech money, move the fuck out of Expensive Tech City, stop giving your money to landlords, start giving it to organizations and politicians who are working to put back the barriers to this sort of thing that corporations have been relentlessly campaigning against for their entire lifespan. If you run a tech company that's making obscene profits then start sinking some of that into that sort of stuff, fuck "increasing shareholder value", have your accountants figure out creative ways to stiff your VC if you've got that to deal with instead of finding creative ways to pay less taxes. Give money to unions, unions work to make shit better for every worker, and right now part of why people are so easy to stir up is because they are broke and afraid of the fact that they are one financial crisis away from homelessness/death/etc.
Oh also I guess yeah stop trying to make another goddamn Zuckerfortune by relentlessly promoting whatever creates "engagement" on your new social media site, not giving two shits about the fact that nothing gets people to keep coming back to your hellsite like an intense argument, because all you care about is how long they're there looking at the ads you've parasitized all around their human-to-human communications. Social sites are not helping, but this shit has been building since before "social media" was even a thing anyone said.
Dealing with eternal September's new users was a Sisyphean task. I assume that was an earlier stage of fake toxicity. Everyone complained about it then, too, even though there was usually heavier human moderation. No one could agree about how to handle the "problem."
Want to put controls on the damage software can do to society? Regulate. Maybe even force us to be real engineers. I would like to think that software killing people (e.g. 737 Max) would be a bigger factor in that legislation than social networks encouraging meanness. The economics of the two seem to favor the former more, too.
I think some radio shows etc. have a greater ability to be polarizing with this out of the way, and our respective bubbles are freer from outside disturbance.
Reddit, Facebook and this website are some of many with this issue, which leads to echo chambers and an inability to see the opposing point of view, as it is literally hidden.
As a starting point, higher karma should be allocated to posts with the highest quantity of votes where the plus ones and minus ones sum to zero. That way you are awarded karma for engaging both sides of an argument, not for being just some karma whore mindlessly agreeing with someone else (a toy scoring sketch follows after this comment).
Reddit and Hacker News REWARD groupthink, and REWARD brigading. They are the problem.
I enjoy posting counter arguments, but no-one ever sees them because they get downvoted to invisibility quickly.
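For illustration, here's a toy version of that scoring rule; the exact weighting is my own invention, just meant to show the shape of the idea:

```python
def balanced_karma(upvotes: int, downvotes: int) -> float:
    """Toy scoring rule: reward posts that draw lots of votes
    from *both* sides. 50 up / 50 down outscores 100 up / 0 down."""
    total = upvotes + downvotes
    if total == 0:
        return 0.0
    # balance is 1.0 when up == down, 0.0 when all votes are one-sided
    balance = 1.0 - abs(upvotes - downvotes) / total
    return total * balance

assert balanced_karma(100, 0) == 0.0    # pure groupthink scores nothing
assert balanced_karma(50, 50) == 100.0  # a genuine argument scores big
```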
Doing things on the internet used to be harder. It used to be a network of loosely interconnected websites. You wanted to have a voice you made a website. You wanted to talk to others you joined a chat room or newsgroup or message board.
Then we invented comments. Now we have apps with SSO. You don’t even need to make a username anymore. The entirety of the internet congealed into 50 or so web properties.
Slashdot is still around. It’s relatively hard to use. Their moderation system brutally punishes trolls and rewards good behavior. It also allows for jokes and sarcasm.
The internet abstracts all that away. If there are natural limits on what and how you engage with things online, they're vastly different from the offline ones.
We need to make the internet closer to what we can cope with.
From my light understanding of "non-violent communication," I think the NVC creator would take the view that the internet just amplifies our existing deficiencies. I think it would be worth trying forums that impose NVC patterns; perhaps NLP could be applied to automate this?
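To be honest about how crude such automation might start out, here's a toy heuristic along those lines; it's nothing like real NVC analysis or production NLP, just a regex-based nudge:

```python
import re
from typing import Optional

# Crude stand-in for NVC-aware NLP: flag accusatory "you-statements"
# and absolutes, and nudge toward observation-plus-feeling phrasing.
ACCUSATORY = re.compile(r"\byou\s+(always|never|people|are)\b", re.IGNORECASE)

def nvc_nudge(comment: str) -> Optional[str]:
    """Return a rephrasing hint, or None if nothing was flagged."""
    if ACCUSATORY.search(comment):
        return ("This reads as a judgment of the other person. "
                "Consider an observation plus a feeling instead, "
                "e.g. 'I feel frustrated when ...'")
    return None

print(nvc_nudge("You always ignore the actual argument."))
```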
Step two - You need to ask the correct question. There are lots of answers here that are useless yet still technically answer you.
Step three - You need to get people to answer the actual question.
You need the better answers to rise to the top.
You need the answers to be directly actionable.
You preferably need them to be quantifiable.
You need to understand how it will work in a marketplace.
As an example, we should try to understand what happened when Twitter doubled its tweet size. Did it make things better? Did it hurt their market? How important is message size?
Hear me out: There's another article here about how the EARN IT Act would attack Big Tech and reduce user freedom by limiting Section 230. Maybe Section 230 has helped the Internet grow into what it is today, but it might also be the reason for a lot of the toxicity.
I'm personally in favor of privacy and digital freedom, but if repealing that law would turn the Internet into a forum for civil intellectual discourse, it might be worth it.
Just wanted to point out to the thread: this is a question that is a great fit for VC3 (https://vc3.club). I started VC3 to try and bring together people to answer questions like this (and others) that pertain to building more meaningful community through the Internet. If you're interested in discussing this, you'd be a great fit! Send us an application and we'll get you in the discussion!
I have a feeling this issue is pretty complex. Looking forward to reading more about it soon.
People disagree; people have always disagreed. The fact that we are now repeatedly being told this is "toxic" is part of an ongoing campaign to suppress dissenting voices.
It seems often to be the case that speech which oppresses people is "free"; while speech which identifies that oppression is "toxic".
I've not noticed a difference over the years TBH. HackerNews is substantially worse than most other sites IMO as well. Ironic that I'm disagreeing with you! Ha!
I just mean - maybe you're at the part of your life where those extremes are apparent, and a degree of humility, understanding & conversation goes a lot further than most people think.
> Or is this more of a social problem that code can't solve?
More of a social problem I think.
I myself have some opinions that are controversial and not politically correct and for reference I'm a well-educated software engineer. There are people out there with far more extreme opinions than me, and the Internet gives them the opportunity to share them.
And the reward of angering and fighting with people who disagree, with no real consequences.
Take immigration: people who wish to discuss immigration often end up getting called racist or xenophobic, and the response is then "f*ck you, I'm not a racist." Thus you have polarized the community.
And, well, polarization isn't exactly the end of the world. Polarization has occurred throughout human history and is only occurring once again, and is a seemingly integral part of the human condition.
There are plenty of parts of the web that aren't toxic or polarising; you've just got to find the ones you enjoy interacting with.
Speaking generally we've lost our ability to put ourselves in someone else's shoes.
This is the result of rampant tribalism and surely several other contributing factors including the gasoline of social media magnifying the issue.
We gotta just start empathizing more!
Just like bullies, they're a fact of life, and the key to dealing with them is to not take them seriously. A bully's goal is to get a reaction out of you, so to win: don't react.
Sticks and stones may break my bones but words will never hurt me - this applies online and offline.
For instance, I myself am running a bootstrapped software business, draw immense positive energy from seeing satisfied customers and feeling that I'm the person that made it possible.
As far as I can see, due to commoditization of most labor (better organization, moving of production offshore), it is becoming harder and harder for people to self-actualize professionally. It's hard to draw personal satisfaction from driving Ubers all day or being a cog in a corporate machine. You know perfectly well that one day you can be replaced by a fresh graduate with minimal training who would do just as fine. You can no longer have a self-identity as "the best baker in town", because most people buy bread from a supermarket. There are certainly artists, video bloggers, etc, but they're a tiny percentage of the population and it pays a fraction of what the soul-crushing corporate jobs do.
So, since people cannot self-actualize professionally, they seek it elsewhere. Since the West has a culture of openness and acceptance toward all kinds of minorities and subcultures, many people's self-identity becomes their belonging to a certain social group, and their feeling of self-growth comes from having others acknowledge their point of view. Except this is a zero-sum game: instead of creating value for those who need it, people begin competing for others' attention and alignment.
For instance, if I am selling ice cream and the guy across the road is selling chocolate cakes, we're at peace with each other, because whoever wants ice cream will come to me and whoever wants chocolate cakes will go across the road. But if I go on a crusade trying to convince everyone that ice cream is the only correct dessert, while my neighbor does the same for chocolate cakes, we quickly become political enemies intolerant of each other.
To sum it up, if you want to fix it for yourself, put your passion into something that creates value rather than aims at redistributing it, and you will feel much better. This would be extremely hard in the current economy though, and yes, it feels sucky because the previous generations sort of had it for granted. I've no idea how to fix it globally.
Back when I first got online there was this idea of internet etiquette. I am not sure where that disappeared to.
It helps if the online communities can meet in person. I remember we did this with the first indiehackers meetup and it was great to see real people and make a connection.
If you want to be a miserable person online, find an ISP willing to host your garbage until they get held responsible as well.
One possible answer: Disconnect, ignore, and avoid it while at the same gravitate towards positive, solution-oriented actions/people.
Stop banging your head against the wall, it's not going to work.
It's probably more annoying now since:
1) There are a lot more people online now than 10 years ago. We've gone from the Wild West to the industrial phase of the internet.
2) Since the web 2.0 thing there are now comments everywhere. It became even worse with social media.
I don't think any of that is true; there were tons of arguments on forums and IRC and stuff.
I believe this is purely a "now we see it all" effect, whereas before it was hidden behind private invite-only communities and hidden/private chat rooms.
I think the big thing that's changed is that internet access and computing's audience has increased a lot over the past two decades, and it now includes people who would normally be polarized in public.
Anger pushes us to action more than agreement.
Truth is hard to find and difficult to defend.
When the world is right we’re not driven to action, nothing needs fixing.
I don’t know the answers but the odds are stacked against civility and truth right now.
With growing awareness + press coverage, those platforms who deliberately implemented bad algorithms to harvest maximum attention (FB, YT, TW) increasingly find it damages their brand image. There is an incentive to (at least marginally) improve their features / biz models.
Then there is digital literacy. We used to have netiquette, and that can be taught. If you read the (very entertaining) article The Internet of Beefs [0] you'll see there is a solution: don't get involved in a beef, and extract yourself if you got baited. Similarly, it helps to know how to avoid trolls [1].
It will not be easy to change cultures large-scale. Many people just like to beef (there is a similarity with road rage too; people forget themselves online). Others enjoy starting beef wars for the lolz, or - more sinister - with strategic objectives.
At least we should be able to create more safe harbours, where people thrive from uplifting experiences online. Changing culture bit by bit.
[0] https://www.ribbonfarm.com/2020/01/16/the-internet-of-beefs/
[1] https://github.com/prettydiff/wisdom/blob/master/Avoiding_Tr...
I think it's just a reflection of our nature -- we all claim to seek boring stability, yet actually thrive on drama. People who don't want drama have no trouble avoiding it online.
1) Anonymity when it's not useful. Many forums are relatively anonymous to the average user, but most of the time it's not useful. There's no life-or-death situation on reddit. Sometimes anonymity can be useful, but most of the time it's not. People post a lot of stuff they would never say in the real world because they have no responsibility to back it up or be associated with it. Just look at what they say on T_D.
2) There are objectively a lot of instances of "censorship" of "wrongthink". There just is. Not all opinions are good, important, etc, but the more you shut them down, the more people will intentionally become toxic. This is especially made worse by point 1.
3) Radical moderation of communities. User moderated areas like reddit are shaped by the mods. T_D is largely the way it is due to rabid people shaping what is and is not allowed. Same with twitter.
Take, for instance, climate change. Young people are politely asking for needed solutions to be implemented. At some point they will stop asking so nicely...
What we can do to make it better is to invest in teaching critical thinking, media literacy, and actual argumentation.
And it is important to know when to walk away. There are a lot of times when I see someone respond to a comment in a way that is shitty or just plain wrong and I would love to try to correct them. However, I try to resist. If it is reddit or something I will look at their post history. If they mostly post on certain subreddits I know that there is no point in my continuing the conversation. Nothing I can write is going to change this person's mind so I stop engaging with that person. I'm not always successful in resisting the urge to tell them how wrong they are but I try.
Yes. Any tool for communication can be used both ways. The way to fight negative uses of the tool is to use it positively, not to try to limit the tool.
I don't believe that polarization or toxicity are endemic to online spaces.
Change the material conditions that cause people to become polarized or toxic and online spaces will reflect that change.
In essence it works by mimicry. We mimic our peers and our peers mimic us. We just gotta be careful about what we mimic.
The tricky part is that this process is largely (but not entirely) subconscious. It also takes a lot of focus and well-applied effort to change previously learned behaviors, especially when "everyone else (within your social circles) does this."
> Before, around 2010-2012, people who disagreed would usually leave it at that and walk away respectfully
I can tell you've never played Call of Duty.
In a diverse society it is normal to have a wide spectrum of opinions, and a lot of them are going to seem wrong. Wrong ideas are not a threat, so long as we accept that they are just opinions. What is dangerous is giving a small group the power to decide which ideas are valid and which ones are not, because that small group will be taken over by opportunists who will use the power to control speech to further their own agenda.
This hasn't been going on for 7 years; it has been happening for at least 40.
The rage stems from the wage, which has stagnated for the working class for as long as I have been around to see it. People are trying to telegraph that unpleasant things will be happening soon, if all the people who are constantly dumping on their inferiors don't at least start handing out some umbrellas.
I think the tech has actually been a mitigating factor. Online mobs can't throw real-life firebombs. When people go out to actually do something, there are fewer people in the same physical place to pump each other up.
People stopped being "all technical" assholes that argued about everything and started to think more about social stuff.
That's all we can do. But it's the most important step.
Someone posting anonymously or under a pseudonym has nothing to lose. He might not intend to offend, but he has no reason to guard his tone or consider how his audience will receive him.
Someone who posts under his real name on Facebook but only shouts into an echo chamber filled with like-minded "friends" also has nothing to lose.
Shame is a great motivator. Fear of loss of friends or career prospects is also a great motivator.
The whole HN narrative around this presumes that this is an unnatural outcome caused by the bad incentives of tech platforms. This is just wrong.
The polarization comes directly from people being able to communicate freely.
The polarization is real. It was there before, but it was controlled by censorship, basically. The polarization isn't caused by the web, it's unleashed by it. We can't stop it without neutering the web, and we don't want that.
Until the cold war between globalism and nationalism finds some resolution this atmosphere will likely persist.
There's a reason that reddit and twitter developed Nazi problems while other old fashioned human moderated forums didn't.
I literally lol'd.
The .000001% is heavily invested in keeping everyone divided and conquered.
The only incident I can remember was playing a game with chat mode where kids went full scale anti-Semitic etc.
PS: I mostly avoid Facebook, reddit or wherever toxicity normally pops up
Oh, I completely forgot the most toxic place on earth: Stack Overflow. But I completely removed any activity there almost 10 years ago.
Stop advertising it as "news"
Everyone starts out as a "noob," seeing the unfiltered cesspool, and nobody with significant grouping weight sees their comments (except other noobs, and people who've categorized themselves as "noobs" by Liking inflammatory nonsense and Disliking reasoned debate).
In the year or so it takes for your identity to begin to develop weight toward your cluster, you see less and less “noob” content, and more “cluster” content.
If you like one-sided (eg. dismissive statist left/right, or aggrieved libertarian, ...) or perhaps reasoned principled respectful debate — that’s what you’ll see.
Everyone wants to see debate they’re comfortable with. If you want garbage, that’s what you’ll see. If you want understanding of opposition views in a reasoned, respectful environment, that’s what you’ll see.
What you won’t see is — trolls. They’ll all be busy trolling each other, and we won’t see their garbage.
The idea that we can use single-axis rating to create a universally acceptable online environment is just, well, crazy. And yet, since the mid-80’s, that is what literally every online “social” tool has been trying, vainly, to accomplish! It’s stunning, really.
Massive tech companies like Google have perfected how to move someone from unaware and uncaring into a firmly aware, caring, loyal brand consumer. They extend that ability to anyone willing to pay for it.
All the Twitter ads, all the Facebook ads, all the Google ads. It doesn't matter if it's for potatoes or Trump; they have the same ability to subtly move you into their camp. Even a basic search engine on reddit or Google will constantly feed you the same BS day in and day out based on what you already search for. Take Google News: they aren't going to show you much outside your region; even in world news, as an American you are usually going to see American policy and whatever topics the engine thinks are important to you based on your previous views.
The whole of Internet technology is built to divide you and feed you the confirmation-bias info you love: only stuff that already aligns with your views. Couple that with the fact that AI algorithms are literally designed to take non-linear data and plop it into definite categories, and you'll get independent/grey thinkers slowly being pushed to one side or the other, no matter how complex the data.
Unless there is a way to forcefully break the feedback loop, you won't stop it. If you sat day in and day out twirling a butterfly knife or painting or whatever, you would be an expert eventually. That's what is happening now but not with a useful skill, just information aligned with what you already believe. Then after years of programming someone comes along from the other category and tries to break your beliefs, of course you'll get toxicity because it's hard to unlearn how to paint. The brain is not meant to unlearn constantly practiced behavior.
[1] no Twitter, Instagram or Facebook.
Once that happened, more and more opinion pieces started to be passed around as "news" articles. As opinions are biased, those biases became more and more apparent over time. Lacking a way to respond, people turned to Twitter and other services.
Add to that, a continual redefining of what speech is "appropriate" and what speech is "problematic". This was a further attempt to control the discussion/debate by defining what tools could and could not be used when discussing/debating. This is seen by many who do not consider themselves "liberal" or deeply concerned with social justice issues as another attempt to exert power and control the narrative by silencing any opposition.
That's where it's all coming from. "Toxic," "problematic," etc. are just labels used by those who exert power, namely the few who are given a voice (the self-described journalists), to mark stuff they don't like. They use other terms like racist, misogynist, Nazi, alt-right, all as synonyms for people who hold ideas contrary to their own.
As soon as we have a level playing field online, where all voices can be heard, not just the blue checkmarks, then maybe we'll make progress. Until then, things will only get worse, and these avenues of expression (FB, Twitter) will just be more places where you can express yourself freely as long as you say what you are allowed to say.
The problem with the internet is that it’s simply too easy to spread nonsensical views that have no basis in fact, like the anti-vaxxers and White supremacists.
People also want to blame others for what's wrong with their lives ("[ethnic group X] are stealing our jobs"). Worse, it's just as easy for opposing views to do the same. This makes the first group feel like they're under attack.
These people are easy to manipulate and they are manipulated through the politics of fear. Look no further than Fox News and Donald Trump.
So this is only an Internet problem in the sense that the barrier to saying stupid, unfounded shit is now so low that anyone can (and does).
But the single biggest factor that's driving this, in my opinion, is recommendation systems. Every source of information now follows the same pattern to increase engagement: Fill up the screen with as many different links and recommendations as possible, and determine the links shown by a machine learning algorithm to increase engagement.
If you have an app to communicate with friends, don't show them their friend list, show them a "news feed". A video player? Put recommendations on the side, and at the end of the video, and pop up recommendations when they hit pause, and autoplay the next recommendation. Sort all comments and replies based on engagement.
This increases income, but it makes people go crazy. Because what kind of content increases engagement? For a lot of people, it's crazy inflammatory content. Women insulting all men, men insulting all women, Black Lives Matter, Back the Blue, flat Earth, vaccines, conspiracy theories... eventually the algorithm will find what triggers you to click or stay engaged. And once it figures that out, you're hooked in a crazy polarized hatred cycle.
And it doesn't help that people and companies and Russian troll farms realize this and push inflammatory content for clicks and followers.
How do you stop this? I have no idea. On a personal level, I use uBlock to clean up all recommendations on sites that I visit, and I make an effort not to engage in any of the toxicity. But that doesn't really solve the problem.
Companies don't want to solve it because they'll lose money. Legislation doesn't seem like the solution. Individuals will complain if you took away recommendations. No one really wants it to stop.
Also: Fire every moderator, everywhere, and never let anyone be a moderator ever again.
tl;dr: limit comments per day, require a minimum amount of positive reputation to leave a comment, and make it customizable per website by its owners (a sketch of such a policy follows below).
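As a sketch of what that tl;dr could mean in practice (the field names and defaults here are hypothetical, just to show the per-site knobs):

```python
from dataclasses import dataclass

@dataclass
class CommentPolicy:
    """Hypothetical per-site knobs a site owner could tune."""
    max_comments_per_day: int = 5
    min_reputation_to_comment: int = 10

def may_post(policy: CommentPolicy, reputation: int, posted_today: int) -> bool:
    """Gate commenting on reputation and a daily quota."""
    return (reputation >= policy.min_reputation_to_comment
            and posted_today < policy.max_comments_per_day)

# A stricter site owner might dial both knobs up:
strict = CommentPolicy(max_comments_per_day=2, min_reputation_to_comment=50)
print(may_post(strict, reputation=120, posted_today=1))  # True
print(may_post(strict, reputation=120, posted_today=2))  # False
```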
"I wish I could code because I have an idea for something I think would be valuable. Basically I think debate is a good thing, and especially online debate, where you can meet & talk with so many different people and learn so much you might have otherwise missed (and teach, too).
But the problem with the platforms people debate and converse on (Facebook, reddit, YouTube) is that they're essentially popularity contests. They reward being visible and playing to people who already agree with you, which means many good ideas get steamrolled by enthusiastic trolls.
So I was thinking that maybe instead of relying on the "wisdom" of the mob, I mean crowd, to determine what's noteworthy and important and productive - which inevitably leads to a constant churn of middle school student council-like dynamics - I thought maybe you could flip it.
Don't ask people who are like you if you're making a good point. Ask the people who AREN'T like you. If a bunch of gun owners look at what a gun control advocate has to say and think, "You know, they're onto something," maybe that's more valuable than getting support from other advocates.
So I imagine a debate space where people can talk with each other in a structured fashion. Other people can watch. Somehow we have an idea of the "way" people lean politically, identity-wise, etc., or at least how they compare to others. Isn't this something that NN-driven text analysis would be good at? And at the end, you can rate someone's ideas and performance, and how much your opinion counts is weighted by how you compare to them, and then the system's understanding of who you are is adjusted accordingly. It need not even assign human-readable tags to a given personal value; all we'd know is that this person is very or not very similar to a given other. Maybe this helps to disarm trolls and to push thoughtful, respectful voices to the top. Maybe it allows voices that would usually get drowned out on any given issue to have a say. Maybe it helps us get to the heart of what debate truly gives us, which is understanding and compromise and solidarity."
tl;dr: Keep upvotes/downvotes but weight them based on how similar the voter is to the person being voted on (a toy sketch follows below).
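A minimal sketch of that weighting, assuming each user has some kind of "lean" vector from text analysis as the comment imagines; the vectors and scaling here are invented for illustration:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity of two lean vectors, in [-1, 1]."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def weighted_vote(voter_vec, author_vec, vote):
    """vote is +1 or -1. An upvote from someone *unlike* the author
    counts for more; an echo-chamber upvote counts for less."""
    similarity = cosine_similarity(voter_vec, author_vec)   # [-1, 1]
    dissimilarity = (1.0 - similarity) / 2.0                # [0, 1]
    return vote * dissimilarity

# An upvote across the aisle outweighs one from a clone:
print(weighted_vote([1.0, 0.0], [-1.0, 0.0], +1))  # 1.0
print(weighted_vote([1.0, 0.0], [1.0, 0.0], +1))   # 0.0
```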
But here are some of my observations:
- Everyone's online. Everyone. That includes people a) who had a less-than-stellar upbringing (whatever that means to each of us); b) with mental illness (some of them untreated and/or self-diagnosed); c) from other cultural backgrounds (which don't necessarily align with Western points of view); d) who are underage; e) who are (very) bored (and some find entertainment in provoking chaos); f) who lack proper education; g) who simply do not care about educating themselves and others; and h) who are naturally antagonistic for whatever reason.
- Bots. There are also a good number of botnets out there (the "like" economy comes to mind) or subverted systems (smartphones, routers, PCs, etc) which act without the owner's knowledge;
I'm convinced there are also botnets out there that purposefully amplify certain controversies online (websites' comment sections come to mind). Now, understand, this doesn't apply to every controversy. But I've seen some weird stuff on YouTube, for example. And, quite frankly, it's not hard to extrapolate that since there are "like" bots there are "hate" bots as well. The end game is, in most cases, visibility. In others, it's probably social engineering for whatever reason. Mostly political.
Unfortunately, media outlets have fallen into this scheme as well. I mean, it's not like the concept of clickbait is new. Attractive headlines are as old as newspapers/journalism.
- Highly progressive opinions will always clash with the status quo;
- There's a fundamental and unavoidable loss (or lack) of signal whenever people communicate in short sentences, or when they do not have enough time to fully understand what's being said, or when they communicate with people who they don't know;
If I have a friend who's a prankster/jester I might be used to their shenanigans and tolerate them--a stranger might not find them funny. Maybe the stranger will find it funny after they know my friend, but that would depend on a lot of other variables as well.
- There's also the problem of the person who transmits information not expressing themselves properly (for whatever reason) which will unavoidably lead to miscommunication;
- Social networks are fundamentally designed for engagement--now that everyone is online there will be a clash of ideas. Tribes will organically form (just like offline). And - with few exceptions - what was meant as a benign message ends up in a declaration of war from the other tribe.
So, in sum, I've just scratched the surface of the problem. It's very hard to make sense of all the noise and come up with a proper solution.
Is it an education problem? Is democracy the root cause? Would a totalitarian system (regardless of whether it's Right-Wing or Left-Wing) work? Is it something fundamentally ingrained in our human condition that makes this problem impossible to solve? (Look at bees, for instance: they're ruthless and yet highly efficient at what they do.) Is it a fundamental purpose (or lack thereof) that each and every one of us has defined, through whatever heuristic and for whatever reason, that's to blame for all this chaos? Is it its visibility?
Yeah. It's a hard problem to solve. Maybe lots of compromise would help.
I honestly don't know.
The quote
"All models are wrong, but some are useful." --George Box
comes to mind.
0. Behavior online doesn't arrive out of a vacuum. If people are miserable because they're working harder, making less, and their society is in retreat, they're probably going to take their frustrations out on the easiest targets.
1. Unplug from social and mainstream media because it doesn't have much value.
2. Stop demonizing, hating on groups of people and falling for the unthinking of mob tribalism, even rhetorically. Hate bad ideas, not people.
3. Realize that we're divided and conquered if we let a few rich people and their corporate media keep us set against each other. Solidarity is the only way.
4. Anonymity is good in small doses, but it's too easy for people to act unreasonably hiding behind it (cyberdisinhibitionism). DHH's company wrote a blog article about improving the quality of discussions with profile pictures and real names.
5. Stop and use every instance of it as a teachable moment, where feasible. It only works, though, if people have shame and can be brought around to the Golden Rule/empathy... it seems to me most parents these days aren't as involved in active parenting, so their kids run feral, and so more people grow up to act more brutally and sociopathically. Furthermore, the current prevalence of parasitic vulture capitalism, valuing myopic greed and selfishness above all else, reinforces a disinterest in the concerns and well-being of others... which is antithetical to community and civilization.
6. There's nothing yet so far to replace the community function filled by religion, and so many people aren't interested in behaving themselves or doing right by their neighbors or strangers if they can get away with it.
7. More people have lost most of their hope about the future. For example, no healthy society has mass shootings/suicides nearly every day that no longer make the news.
I remember being VERY active in that forum, but eventually I quit and sent an email to the owner ( I know I know ), his name was Carl I think?...saying that I could no longer deal with the absolute political polarization (anti-Libertarian) I found on the site.
So yes...you "fix" it by being the change you want to see. There is really nothing else one can do.
One thing we could do, which would not solve the problem but perhaps illuminate a better way forward, would be to work on communications technology. Not simply technology for data transfer, but for actual information and understanding transfer. Technologies and systems which facilitate the good behavior without facilitating the bad nearly as well. By that I mean things such as systems which enable presenting large volumes of nuanced information in ways accessible to more people, and systems which enable both constructing, sharing, and refining complex arguments.
As an increasingly large portion of humanity conducts themselves online, we must keep in mind that there is much of human behavior which is not admirable. The solutions to that behavior are not technological except in the most dystopian and (philosophically to me at least) disgusting scenarios. There has never been, nor ever will be, a happy, prosperous police state. Autonomy and free expression are not luxuries, they are necessary for human health. We must also always be vigilant that the systems we implement are flexible and permit society to change both within and through them. Take a thought experiment I came up with for example. Imagine that tomorrow morning 90% of the population of planet earth awoke to a realization that agitation over nudity was ludicrous. Would it be possible for our existing systems to accommodate this change, or would it actively thwart every single attempt by any individuals to live their life according to this newly adopted principle? Would it result in a global relaxation of pointless anxieties, or would it result in increased anxiety as people felt themselves isolated in their realization, 'judged' by their technology which would filter them, block them, and reject them at every stage?
At no point in history has any society, so far as we know, hit upon "The Correct Ideas" which represent unvarnished truth, eternal and unchanging. And we should be careful to consider that our current social ideas are not unwittingly treated as such, ossifying human culture.
In Eric Schmidt's book "The New Digital Age" he speaks about wishing to play a very active role in exactly this kind of cultural ossification, expressing an extremely elitist view that due to the fact Google is rich, they are Better and should therefore take steps to actively guide and mold society in the ways which Eric Schmidt believes are best. Those just so happen to be the social values of the late 1990s when Google was introduced and which facilitated their wealth-building. That is, to my mind, a dangerous game. Past history would suggest that attempts at "social engineering" which do not rely completely upon broad social consensus and upon society reaching its own conclusions and doing its own enforcement of its own ideals tends to backfire in spectacularly catastrophic and inevitably violent ways.
The essential reason for the toxicity is the contestation of online spaces, which are virtual territory over which proxy wars can be fought as a low-cost substitute for physical violence (although there is a direct nexus to physical violence, and the threat thereof, by both individual and state actors, is a factor in the aforementioned contestation).
There are 3 basic approaches to encroaching toxicity:
a. Ignore it, aka "don't feed the trolls."
b. Implement technical solutions to manage it.
c. Fight it, aka flame wars.
Ignoring it doesn't work. It just tells the most vulnerable members of a community that they don't matter and that if they are repeatedly harassed, other community members will sympathize but not really do anything to help.
Technical solutions originate with the California preference for systems thinking, and are reflective of the legislative and administrative technology in which they've been incubated. They are somewhat effective, but any system can be gamed. Most sites opt for a mix of technical means and hands-on moderation by a benevolent* dictatorship, which works moderately well but is not responsive or effective against a determined attack.
* benevolent in terms of close alignment with the ethos of the forum, whatever that happens to be
dictatorship in terms of being arbitrary rather than mechanistic, semi-transparent, and unilateral
Flame wars are upsetting to everyone, and people in the first two camps view them as the worst-case outcome, because they take over the thread/forum/platform where they occur and are destructive of comity, much like their real-world analogs. However, they can be effective in repelling invasive toxicity, if a sufficient majority of the forum regulars participate cooperatively. If too few participate, or forum norms inhibit or punish participation, then toxicity will prevail or advance.
Here's some empirical evidence supporting this based on data collected from raiding behavior on Reddit, which is similar enough to HN to serve as a useful comparison (includes links to papers, slides): https://snap.stanford.edu/conflict/
Cross-platform raiding behavior has existed as long as bulletin boards, and has been systematized and refined in line with the systematization and refinement of game and software development strategies. Here's a (somewhat offensive) overview from some years ago of trolling strategies, summarized near the end in a convenient flowchart: https://digitalvomit.wordpress.com/the-ultimate-guide-to-the...
The sophistication of raiding tactics, documentation, and so on has increased significantly since that was published. Ultimately, toxicity online is neither a product of technology nor the exposure of an inherent flaw of human nature, but the visible manifestation of multilateral information warfare, which is itself preliminary maneuvering and battlespace preparation for more overt forms of conflict like cyberwarfare, open economic warfare, and kinetic warfare.
Some folks find that really depressing because they get so much more negativity in their headspace than they used to. A primary approach to not being depressed about The News is to stop letting it take up so much of your time and attention. Actively seek to tune it out and focus on other things.
Similarly, the internet is bigger than it used to be, so some of this is perceptual and/or a numbers game. It's easy to find fightiness and feel like "It's everywhere." It's relatively easy to take good things for granted and underappreciate them.
Some things I find helpful:
1. Actively seek constructive engagement. For me, this involves declining to indulge the knee-jerk reaction to rebut anything and everything that directly disagrees with some comment I made (ie replies to me that tell me "You are wrong!" or similar). This is a bad habit of mine that just makes things worse and no amount of trying to justify to myself why I tend to do this makes it not a bad habit.
2. If I do choose to reply to people who disagree with me or who are being negative, think about what my goal is and what I'm trying to accomplish. Is there particular information I would like to put out as a result of their comment? Can I do it without just going down that path of "No, you!"?
3. Grow a thicker skin. I don't absolutely have to have every single person who replies to me on the internet like me, be nice to me, be my friend, blah blah blah. It's okay for other people to disagree with me and to talk about what they think. I can decline to take it so freaking personally that the entire world doesn't always agree with me.
4. Keep in mind that people are much more likely to reply to you online if they disagree. There isn't a whole lot to say if you agree and HN in particular actively discourages low value replies. So you aren't going to see a lot of vacuous "Me toos" here. That doesn't mean people here hate me.
5. View some of it through the lens of "HN/The Internet is bigger than it used to be. It's a numbers game. Multiple replies disagreeing with me say more about that fact than about me, this opinion, etc."
6. Work on my communication skills. This is an ongoing effort. Some phrases or framings tend to get knee-jerk negative engagement. Learning to say it better helps reduce the nonsense.
At the same time, I try to make my peace with the fact that no amount of effort on my end will ever completely put a stop to other people choosing to do whatever the heck they choose to do. "You can't please all of the people all the time" and that sort of thing.
(This comment is not intended to be comprehensive. It's just an off-the-cuff forum comment, not a PhD thesis.)
> ten years ago Steve 'Asshole' Jobs played a hilarious prank on all the digitally-illiterate 20th century luddites by pick-pocketing their ol' trusty, reliable telephones and replacing them with computers instead. Tee-hee! Let's see if they notice the difference! And then he promptly died.
> The result, we see today, is a stratified understanding of "the internet".
> One level consists of everyone who really knows what the internet is. The internet's early adopters were a bunch of nerdy white males out of touch with society, guilty as charged. Annoying atheists and such. The thing is that this group (which I count myself among) knows how communities online function because they've been part of them for decades. They've lived the lifecycle of growth and collapse of forums (platforms) again and again. They see how technology brings people together, but not the people in their immediate life. Rather, it safely brings anonymous groups of people who hide behind pseudonyms and avatars together through common interest and lively discussion and debate. [...] They see the new digital medium for what it is: connections between individual users spread across many different forums and platforms which are ever-changing, rising and falling. Friends/strangers/communities first, platforms/sites second.
> Another level, the social media level, is what Fruit Juice Jobs foisted onto the unsuspecting public, who now think they understand current technology and the state of the art of digital communications. Of course, they don't. They are sold a bill of goods and put all their identities into profiles which are used to sell them shit. And of course now they are vulnerable to a) people who spend hours arguing on the internet and tearing strangers' ideas apart (me right now) and b) actual trolls who love exploiting psychological vulnerabilities to make their victims squirm and squee and cry and throw tantrums for the sake of drama (4chan, etc.). This decade-young group has no experience of what the internet actually is and takes no personal responsibility for its own online safety. For instance, they assume or act like a) Twitter will be around forever and b) Twitter can just block the bad guys and create a peaceful, harmonious online community. They cede all personal responsibility to a corporate hierarchy, like customers in a fast-food joint demanding to speak to a manager or some shit. They see and use the old medium in the new one, as McLuhan would say. Like how every town they visit has the same half-dozen franchises, their internet consists of the same top-5 "apps" and Google. Platforms and friends first, strangers/communities second.
(b) The legitimacy of a platform is based on what is said there, not the other way around. You view Voat as less legitimate than Reddit because you disagree with what is said on Voat more than you disagree with what is said on Reddit. If someone said something bigoted on Reddit, would you view it as more legitimate? No? Then why do you think anyone else sees it that way?
This creates an incentive for people to write provocative and controversial content because it's one of the easiest ways to get attention, because the platform assumes that if you're getting reactions from people, your content deserves to be seen by more eyeballs.
I've thought a lot about this and I think there are many changes, mostly design changes, that would decrease toxicity on social media platforms. I think most platforms wouldn't implement these ideas because they would decrease engagement and ad revenue on their sites.
Here are some of my crazy ideas (in paragraphs because I had trouble formatting bullets):
Allow negative reactions from normal users to make content less visible to others. The advantage of a downvote system is that it takes the burden of content moderation off of moderators and puts it more on the people who consume the content most. The disadvantage of a downvote system is that often, when people are downvoted, they don't learn anything because they often don't know why they were downvoted and they have to guess. Maybe when you downvote someone, you should have to pick a reason and then a breakdown of the downvote reasons should be available transparently for everyone to see.
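A rough sketch of how downvote reasons with a transparent breakdown might be wired up; the reason codes here are placeholders, not any real platform's taxonomy:

```python
from collections import Counter

# Hypothetical fixed reason codes a site might offer.
DOWNVOTE_REASONS = {"off-topic", "uncivil", "factually wrong", "low effort"}

class Post:
    def __init__(self):
        self.downvote_reasons = Counter()

    def downvote(self, reason: str) -> None:
        """Every downvote must carry a reason the author can learn from."""
        if reason not in DOWNVOTE_REASONS:
            raise ValueError(f"unknown reason: {reason}")
        self.downvote_reasons[reason] += 1

    def public_breakdown(self) -> dict:
        """Visible to everyone, so no one has to guess."""
        return dict(self.downvote_reasons)

post = Post()
post.downvote("uncivil")
post.downvote("uncivil")
post.downvote("off-topic")
print(post.public_breakdown())  # {'uncivil': 2, 'off-topic': 1}
```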
Content recommendation improvements. YouTube's algorithm is infamous for converting people from moderates to Neo-Nazis. Part of this is that the algorithm that shows you what video you should watch next just shows you what videos keep people watching YouTube. It makes no effort to separate news and facts from opinion and infotainment, so the lines get extremely blurry. If there were separate communities on YouTube for news on different topics, or for opinion and infotainment on various topics, it could shield people who really just want the news from being exposed to three-hour-long alt-right rants, if that is not what they were looking for in the first place.
We need more awareness of emotional manipulation on social media and news content. As an example, if a headline has the word "disturbing" in it, I would call that emotional manipulation because the headline is trying to do your thinking for you, reach conclusions for you, and tell you what to think and feel before you had a chance to read the article and digest it. Too often, we vaguely point to "education" as a way to protect people from sharing toxic content or disinformation online. The fact is that educated people are human too, just as emotional as anyone else, and are not going to be in a fact-checking mindset when they see a social media post with a headline or image that makes them feel sad, angry, or disgusted. We need a platform design that lets ordinary people flag, downvote, or otherwise participate in the fight against emotionally manipulative content.
We need to improve the moderation process so that people trust moderators more, can understand their decisions more, and can contest their decisions if necessary. I think that when a moderator removes a post, it shouldn't disappear. It should be replaced with a list of the rules that were violated for everyone to see. And moderators should be able to hide a post temporarily to give the author a chance to edit it and resubmit it. Also, the first time someone posts in a forum, when they write a comment, they should see sitewide rules and community rules right before they submit, giving them a chance to go back and revise their post. (This mainly applies to a place like Reddit, but I think a platform like Facebook would also be better off with community rules.) These changes would make it so that people understand moderation more, can see a more transparent process, will stop being surprised by moderation, and will overall have a more positive experience with moderators.
On Facebook, Twitter, and YouTube, I believe some of the problem is that the platform encourages you to follow people. When you follow personalities instead of topics, it introduces a lot of potential toxicity and content moderation problems. There's no benchmark for civility, nothing you're encouraged to talk about, just opportunities and rewards for creating drama. If these platforms made it harder to orient everything around personal brands, and easier to engage in communities that at least have a stated purpose, that would probably help to discourage toxicity, because in a community with a stated purpose, it's easier for a community member to say, "That type of content doesn't belong here, this community is supposed to be about x," where x is a fairly uncontroversial topic.
You're crazy my dude(tte). I've been on the interbutts since 1991 and Usenet days, and it's exactly as it always was. The big difference is we have reporter ding dongs on Twitter thinking Twitter, aka the comments section they removed from their web presence, is the real world. Also old people on Facebook who never learned the lessons of being on Usenet in the early 1990s, aka arguing on the internet is a lame and addictive hobby.
There are minor accelerants for this; YouTube really did have some kind of pathological radicalization rabbit hole in its recommendation engine for a while (now it's just boring and useless and shows you "more of the same" from "blue checked" accounts). And of course, the other media encourage polarization and demonization of the other for dumb short-term gain. That's a purely American phenomenon, and nothing's going to change it until the people pushing this swill on MSNBC, Fox and CNN decide to change it.
You seem to be assuming a. that these are real people and not bots and b. that these people are not paid shills etc. - yet we have ample evidence to the contrary on both points. If you ask me, we are in the midst of an information war that is crossing over with a culture war. I wish it would blow over but we seem to be stuck with it until the people behind it either win or become demoralised. The only solution that springs to mind is the complete removal of anonymity online, but that would bring its own problems.
It is a feedback loop. The American ideal of rugged individualism has mutated into anti-neighbor and is exactly the model for echo chambers (not neighbors with diverse opinions but cultural clones!), which are facilitated by technology. The ease at which our culture Balkanizes is backed up by trillion dollar profits from a handful of companies, which then pay politicians to look the other way.
This online toxicity is a reincarnation of the populist hate that has driven wars for millennia. It is a cancer that has been given super-steroids because of the intersection of huge amounts of cash and limitless political control. Not to mention the propaganda that idolizes billionaires and strongmen.
I think the crash and burn of America would be a solution: the collapse of the trillion-dollar companies and monopolies that own the government. Unfortunately this is infecting other countries, where the top 1% wants to be as rich as America's, hence their urgency to imitate what is happening here for their own profit.
There's no cure: the tech monopoly and insatiable greed of the top 0.1% will be the main drivers of global poverty and oppression, and to get there they need to keep us hating each other and divided. They even picked wedge issues like race and religion, which are almost 100% irreconcilable.
I would give the top-tier credit, but I think stimulating base beliefs for profit isn't really that hard, you just need a country-sized platform.
How do you stop people from behaving like they do..? And why would you want to? Who should decide how people should behave (beyond established laws, of course)?
If people don't do something illegal that should be reported, they're being people. You have nice people, assholes, politically correct, politically incorrect, etc.
The web has nothing to do with it.
For a while we thought more people were getting cancer. Turned out cancer detection rates were low and lots of mysterious deaths in history were probably just cancer.
The web has helped illuminate in the US how big the political ideology gap always was. It was papered over by information manipulation of the corporate press for decades, coddling sensibilities of luddites, innumerate, and nesters who preferred to stay home rather than see for themselves. They also made up the biggest voting block for years.
I grew up in Trump country and left two decades ago. Was shocked to find coasties really were convinced it was Leave it to Beaver land while rural folks were convinced urban areas are universally slums. Those are REAL narratives I get from people today. Shocked about how ass backwards the other cohort feels about life. Ridiculously sheltered attitudes on both sides. Complete disinterest in negotiating. As we see in Congress.
Consider that idyllic rural life and urban police dramas are common fiction tropes, and it's not hard to see why those emotional descriptions are the knee-jerk go-to for the masses.
Free speech doesn’t oblige anyone else to abide the embedded semantics of the speech in question. Emit whatever syntax you want, no one has to put their agency into the behaviors the speaker thinks achieve the outcome they seek with their speech.
Good luck.
The more mainstream a site gets, the worse it seems to be. Especially where voting is involved (reddit, facebook etc).
Add to this "grassroots" efforts to "takeover" public internet forums, which has been going on for decades by both far left and right wingers. Applying for moderator roles, not because they want to moderate discussion, but have complete control of it..
Are all vaccines good? If not, who chooses? The government? What about when Gov. Rick Perry tried to force the HPV vaccine on all teenage girls in TX, even though it was suspected he had financial incentive to do so?
Is it ok for the state to force you or your kids to receive a medical treatment? Even when there are multiple demonstrated cases of drug companies breaking the law and corruption?
My point isn't to fall on either side of the debate.
My point is that it's wrong to silence the debate and to label people who are thinking about these things so simply as "anti-vaxxers" and dismiss them.
Reality is complex. As soon as you're 100% sure you're right, it'll bite you.