HACKER Q&A
📣 atleastoptimal

What's a strong tech opinion you have that few agree with you on?


  👤 anovikov Accepted Answer ✓
Clouds are a scam and are designed for:

- fools who can't do simple math or read the fine print ("I have no idea how much twenty $2-per-hour instances pumping data at 10 Mbit/s add up to in a month, and OMG, that data is not free when I already pay for the instance?! And besides, they give me a whopping $100,000 credit - that will last an eternity!")

- corporate tricksters ("if we don't invest in our own hardware and buy AWS instead, our next quarter's bottom line will look GREAT and I'll get a nice bonus, and by the time the truth transpires, I'll have jumped ship to the next shop where I pull the same trick")

- people with gaps in basic logic and a total lack of foresight ("I can't afford to buy all the hardware for my small pet startup, so I'll make do with just $200 a month on AWS - not realising that this only works for as long as my startup is unsuccessful and has no users. Once that's no longer the case, I'll be vendor-locked into tech solutions built on AWS, with petabytes of data locked up there at $0.05 per GB to download, and I'll bleed money for years").

They should be avoided at all costs except for development purposes. If you don't know how to, or can't afford to, do something without clouds, you just don't know how to do it or can't afford it.

In Europe, none of my clients use clouds. They have dedicated setups with reputable providers that work a lot better than cloud-based ones and cost pennies. Also, I realise that my custom software development biz doesn't really work with EU clients: I barely make a profit with them, and they can be a real pain. That probably suggests the educational level in Europe is a lot higher.


👤 jmarchello
You should start with a single monolithic application on a single server that you scale vertically as much as possible before even thinking of scaling horizontally. Most apps won't ever need an architecture more complex than this.

👤 necovek
Making proper use of the functional (stateless) paradigm in non-functional languages embodies a bunch of other good practices (testability, isolation, dependency inversion...).
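As a minimal sketch of what that buys you (in TypeScript, with invented names): the pure version below needs no setup to test, takes everything it depends on as arguments, and can't be broken by hidden state.

```typescript
// Stateful style: the result depends on hidden mutable state,
// so every test has to construct and manage an instance.
class CartStateful {
  private items: number[] = [];
  add(price: number): void { this.items.push(price); }
  total(): number { return this.items.reduce((sum, p) => sum + p, 0); }
}

// Stateless style: a pure function over explicit inputs.
// Output depends only on arguments, so it is trivially testable
// and composes without any shared state.
const cartTotal = (prices: readonly number[]): number =>
  prices.reduce((sum, p) => sum + p, 0);
```

The pure version also inverts dependencies for free: anything it needs arrives through the parameter list rather than being reached for internally.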

Refactoring can always be done with a running (no-downtime) system at no extra cost (time or money) compared to a rewrite or a downtime-requiring approach.

You can always deliver user value, and "paying down technical debt" can and should be done as part of regular work (corollary from the above: at no extra cost).

We'll never do away with physical keyboards for inputting text (yet I only have one mechanical keyboard, which I don't even use regularly :).


👤 amatecha
So many.

"AI" is the dotcom bubble all over again (notice how every big company HAS to get in on it, no matter how ridiculous their application is?)... Further, it will simply allow those who wield power over others to do so in an even more egregious and deeply-reaching way.

Advertising should be illegal.

Proprietary software is basically always a trap (if it's not harmful or coercive at first, it eventually will be, well after you're locked in).

The web has been ruined by turning it into an operating system (also see "advertising should be illegal"). 99% of the time I just want very lightly-styled text, and some images. I don't need (or want) animated, drop-shadowed buttons.

Graphical OS user experience was basically "solved" 30 years ago, and there hasn't been much of anything novel since -- in fact, in terms of usability, most newer OSes are far worse to use than, say, Macintosh System 7 (assuming you like a GUI for your OS). The always-online forced updates of modern OSes exacerbate their crappiness -- constant change, and thus constant cognitive load, disrespectfully altering how things work despite how much effort you spent familiarizing yourself with them.


👤 gary_0
HTML, and retained-mode GUIs and DOMs generally, are all you need. Anything more complex is over-engineering. JavaScript was, broadly speaking, a mistake. 90% of what we need computers to do is some I/O plus putting text, colored rectangles, and JPEGs/WEBMs on a screen, and that shouldn't be this complicated.

A lot of good things about the way we wrote websites and native applications back in the early 2000s were babies that got thrown out with the bathwater. That's why we can't seem to do what we could do back then anymore -- at least not without 4x as many people, 3x as much time, and 20x more computing power.

(Maybe more than a few people on HN will agree with this, now that I think of it...)


👤 caprock
Almost all software best practices and programming idioms are just shared, personal preferences and not objectively valuable.

👤 RetroTechie
User time is more valuable than programmer time. Read: programmers should operate as if CPU cycles, RAM, disk space, etc. are precious. Less is more.

Why? If a programmer builds something only for him/herself, or a few of their peers, it really doesn't matter. Do as you like. But be aware that a one-off / prototype != a final product.

The commonly held view is that programmers are a small % of the population, thus their skills are rare (valuable), and thus if programmer time can be saved by wasting some (user) CPU cycles, RAM, etc. (scripting languages, I'm looking at you!), so be it. Optimize only if necessary.

BUT! Ideally, the programming is done only once. If software is successful, it will be used by many users, again & again, over a long time.

The time / RAM / storage wasted over the many runs of such software (not to mention bugs), by many users, outweighs any saving in programmer time.

In short: fine, kick out a prototype or something duct-taped from inefficient components.

But if it catches on: optimize / re-design / simplify / debug / verify the heck out of it, to the point where no CPU cycle or byte can be taken out without losing the core functionality.

The existing software landscape is too much duct tape: compute-expensive but never-used features, inefficient, RAM-gobbling, bug-ridden crap that should never have been released.

And the fact that the developer has a beefy machine doesn't mean users do.


👤 mikewarot
We will eventually adopt Capability Based Security out of necessity. Until then you really can't trust computers. I think it's still at least a decade away.

WASM is as close as we've been since Multics. Genode is my backup plan, should someone manage to force POSIX file access onto WASM in order to "improve" or "streamline" it.
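For readers unfamiliar with the idea, here is a rough illustration of the capability style in TypeScript (all names invented; real systems enforce this at the OS or runtime level, not merely by convention):

```typescript
// Under ambient authority, any code in the process can name and open
// any resource (e.g. read an arbitrary file path). In a capability
// design, code can only use handles it was explicitly given, so
// authority is scoped by reference.
type ReadCap = { read(): string };

// Only whoever holds the underlying resource can mint a capability.
function capFor(contents: string): ReadCap {
  return { read: () => contents };
}

// Untrusted code receives its authority as arguments; it cannot
// reach resources it was never handed, and cannot forge new ones.
function untrustedWordCount(doc: ReadCap): number {
  return doc.read().split(/\s+/).filter(w => w.length > 0).length;
}
```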


👤 jhp123
TypeScript is worse than plain JavaScript. The type system is not sound and can't actually catch that many bugs; at the same time, it adds a lot of verbosity and wasted time spent satisfying the type checker.
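One well-known soundness hole, for illustration: TypeScript treats arrays as covariant, so the snippet below type-checks cleanly yet sets up a runtime failure.

```typescript
interface Animal { name: string }
interface Dog extends Animal { bark(): string }

const dogs: Dog[] = [{ name: "Rex", bark: () => "woof" }];
const animals: Animal[] = dogs;   // allowed: arrays are covariant
animals.push({ name: "Felix" });  // a plain Animal lands in a Dog[]

// dogs[1].bark() would now throw at runtime -- bark is undefined --
// even though every line above satisfied the type checker.
```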

Mandatory code reviews on every merge are a net negative. Too many people waste time on nits and YAGNI "improvements". Actually improving the code in a structural way is too hard, and most reviewers won't spend the effort. It would be better to dedicate time and resources to code audit and improvement on a regular cadence, e.g. pre-release.

A/B testing is pure cargo-cult science; it has nowhere near the rigor to actually determine anything about human behavior (look at the replication crisis in real psychology, where they use 10x as much rigor!). You might as well use a magic eight-ball.


👤 russellbeattie
Heh. This thread is like an invitation to get downvoted.

I'll bite: Using const for all variables in JavaScript is moronic. It's a trend that should have been killed with prejudice in the crib. If you want type safety, use another language. Use let for variables and const for actual constants. Words have meanings. The const statement wasn't created so developers could litter their code with it to show how cool they are.

Imagine you're learning JavaScript at a young age: "Here are three ways to declare variables - var, let, and const - but you should only use the most confusing one, which doesn't actually make verbal sense for general use, nor does it do what you'd think it would based on its description. Use it anyway. Because reasons."
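For what it's worth, the confusion described here is easy to demonstrate: const constrains the binding, not the value, which is rarely what a newcomer expects the word to mean.

```typescript
const scores = [1, 2];
scores.push(3);      // fine: the binding is constant, the array is not
// scores = [];      // TypeError: assignment to constant variable

let count = 0;       // `let` says plainly: this value will change
count += 1;
```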


👤 mattbillenstein
I'd almost always rather build something than use a 3rd party service.

I understand it, I run it, and I don't have to deal with a 3rd party's changes, performance problems, or downtime. Also, fewer bugs related to data consistency.


👤 admissionsguy
PHP + MySQL is all you really need

(Modern web stacks are 99% bloat and impediment to human progress.)


👤 keiferski
The actual forms of computing devices have gotten worse and more boring over time. Everything is just a flat slab these days. There seems to be little interest in experimenting with the sculptural design of phones or laptops. A few decades ago, this wasn’t the case.

This is especially relevant with touch screens, which I wish weren’t so omnipresent. I really don’t want to use a touch screen in a car, for example…


👤 cratermoon
I have two, at least.

1. Personal names are not generic strings, any more than dates are numbers or money is a floating-point value. Name, as in a person's given and family name, or whatever, should be a type in every standard library, with behaviors appropriate for the semantics of a name, language, and country. Yes, there are lots of conflicting international differences. But we've managed to handle languages and calendars of significant complexity; we can do sufficiently well for the things we use to identify ourselves.

2. "AI" is a marketing term for a kind of automation technology able to use pattern matching to reproduce plausible new versions of the data used to train the model's algorithms. It couches this automation as especially powerful or magical as a way to draw a smokescreen around the real problems and limitations of the technology.
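Point 1 could be sketched roughly like this (a hypothetical TypeScript shape, all names invented; a real library would carry locale-specific display and collation rules rather than the placeholder below):

```typescript
// A name is an ordered sequence of parts plus the locale whose rules
// govern how it is displayed and sorted -- no forced given/family split.
interface PersonalName {
  readonly parts: ReadonlyArray<string>;
  readonly locale: string; // a BCP 47 language tag, e.g. "en-GB"
  display(): string;
}

function personalName(parts: string[], locale: string): PersonalName {
  return {
    parts: [...parts],
    locale,
    // Placeholder behavior: a real implementation would consult
    // locale rules instead of naively joining with spaces.
    display: () => parts.join(" "),
  };
}
```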


👤 mikewarot
Quantum annealing is the only truly useful quantum computing architecture to date.

I further believe that someone will figure out an algorithm that runs on classical computers and matches the speed of quantum annealing; thus quantum annealing machines, while currently useful, have a limited shelf life.

Shor's algorithm, which runs on Quantum Computers, requires the use of many repeated cycles of small rotations of qubits in the complex plane of the Bloch sphere. These are analog operations that accumulate phase error. It's not possible to use error correction with these operations, as those techniques necessarily sacrifice information in the complex component plane, leaving the real component "corrected".


👤 tenahu
Software should not be free for users. The expectation that profits must be made through advertising, paired with the entitlement of users who scoff at the idea of paying for a product, is a recipe for disaster over the long term.

👤 fragmede
I knew I harbored one, and apparently my belief that we'll always have general-purpose computing is controversial! That is, unlocked bootloaders on general-purpose computers - laptops and desktops - will continue to exist well into the future. I guess others here are more cynical than I am about a dystopian future coming about: macOS is supposedly just a few releases away from flipping the switch and only running blessed software, and any day now we won't be able to run our own programs on our own computers, because iPhones already don't let you do that.

Days I learn things about other people are always welcome!


👤 bnchrch
Oooh this is fun!

Ok quick list

* Class Inheritance is an anti-pattern.

* Python is an anti-pattern.

* The 80/20 of functional programming: use pure functions, and acknowledge that everything is a list or an operation on a list.

* Javascript/Typescript will continue to win.

* Erlang/Elixir/BEAM is the better timeline. But unfortunately we're not on it.

* Microservices are a luxury and most people should just vertically scale their monolith

* GraphQL is better than REST when you have multiple clients, codegen, or any AI/RAG needs

* Frontend Frameworks are not changing as fast as a backend developer wants you to believe

* Including a FE framework in your application from the beginning is a smart move for anything consumer-facing.

* The Haskell community is their own worst enemy.


👤 hnthrowaway0315
I want to build as many tools by myself as possible.

Obviously this is shunned in every company.


👤 solardev
The future of gaming is streaming, and home PC gaming hardware will eventually go the way of DVD players. GeForce Now is succeeding where OnLive and Stadia failed (20+ million users now, apparently). Offloading rendering to the cloud means much improved thermals, graphics, and battery life -- especially for laptop users and Mac owners. Apple Silicon is cool, but it's not going to beat a 4080 for pure graphics performance. And that's just computers... GFN can also stream to tablets, phones, TVs, and anything else with a screen and an internet connection.

It makes more sense for Nvidia to put GPUs in data centers with good cooling, shared between multiple gamers and idle workloads (AI, etc.), instead of having them sit in expensive but mostly unused home desktops for most of the day.

Nvidia is the only one who has a real shot at this because they're the only ones who can directly allocate GPUs to cloud gaming (unless what's left of AMD wants to get in on the action too). And they're the only ones that Steam specifically partners with for its Cloud Play beta: https://partner.steamgames.com/doc/features/cloudgaming

Stadia failed not because of the technology, but because Google mismanaged it and never understood PC gaming culture. Nvidia and Steam do, and it's a much, much better product for it.


👤 mikewarot
BitGrid[1] is the most efficient general purpose compute architecture possible for Petaflops and beyond.

It eliminates memory bandwidth issues.

It's a nice, elegant alternative architecture. I'm quite surprised nobody has actually tried it earlier.

[1] https://github.com/mikewarot/Bitgrid

* I give 50/50 odds I'm a crackpot on this one... I'd really like to know for sure which way that coin toss comes out in the end


👤 calmbeluga54
Something being powered by "AI" is actually not that strong a selling point for most normal people. I've seen several ads that say "try [product], now powered with AI". From what I can tell, most people don't want to use something just because it's powered by AI. It's almost like bragging about automated phone systems: "Call our customer service, now powered by automated machines!"

👤 caprock
Software as a team sport usually produces both poor software and a poor team experience.

👤 barrysteve
We could have kept social rules outside of the CPU and made sure the machines were unintelligible.

We should have kept the machines blind and retained social control between people and families.

Once we allowed machine code to be intelligible, we poured control of services, governments and speech into machines.

Truly free speech in a CPU became synonymous with the concept of a virus, and real free speech has followed: programmed rules on electronic devices are socially silencing people.

The concept of a virus outlawed one of the best uses computers could have provided: maintaining speech globally.

Now we are racing to the bottom, to put the rules of society into a machine. Then lock society out of the machine, with a few holding all the keys to the computing kingdom.

Slavery to machines, as enjoyable as it is, will eventually become nonsensical.

As power becomes centralized, all the new moves will be made by those outside the system.

So I think that CPUs should be running unintelligible code, and LLMs are a baby step towards that.


👤 thefz
VR is a solution in search of a problem and will pass.

👤 caeril
Python is not "readable" to anyone with eyes.

Using whitespace as scope definition is a war crime.

(same goes for YAML)


👤 archagon
Technology must serve humanity, not the other way around.

If a new technology threatens to eliminate hundreds of thousands of jobs, and the benefits are marginal or largely in favor of the capital class, then we should probably not pursue that technology.


👤 deterministic
1. The less code I depend on written by others, the fewer maintenance problems I have.

2. Most developers 10x over-complicate solutions and spend 20x too much time implementing them.

3. Don't ever deprecate APIs! An API is a contract that potentially thousands of systems or developers rely on, so find ways to move forward without breaking things. You're a pro, right? So act like one. Linus gets it. Most don't.


👤 Am4TIfIsER0ppos
Smartphones are the worst invention of mankind.

👤 erik_seaberg
I see a lot of untapped potential in using a domain-specific language to drive a framework, instead of using the general-purpose language that doesn’t know anything about what the framework has to offer. In general we’re too eager to throw boilerplate at problems, and too reluctant about tools that require learning something extra.

👤 karmakaze
Concurrency via an async function type is dumb. Either make and compose futures/promises with normal language features, or support auto-yielding coroutines/goroutines/fibers (lightweight threads).
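The first alternative named here (composing promises with ordinary language features, no `async` coloring of the function type) looks like this as a minimal sketch:

```typescript
// `double` returns a Promise but is not marked `async`; its type is
// just (n: number) => Promise<number>, like any other function.
const double = (n: number): Promise<number> => Promise.resolve(n * 2);

// Composition is plain .then chaining -- no special function "color"
// has to propagate up through every caller.
const quadruple = (n: number): Promise<number> => double(n).then(double);
```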

👤 efortis
Monospace fonts slow you down.

https://x.com/efortis/status/1501988964198191113


👤 flappyeagle
Instead of technical debt you should take on product debt.

Do not take on tech debt. If you absolutely need to, pay it off ASAP.


👤 CableNinja
Late to the party, but my big one of late.. Job Titles.

- Job titles in tech are out of control. Searching for a job requires me to use like 3-7 searches because everyone is making up titles as they go, and it's just insane. We call ourselves engineers, and it's time we took another page from the book of engineering for job titles. Engineer titles are very pointed, and thus job function is limited by the title. Example: Cloud Infrastructure Engineer [I] would be a generalist cloud role at the jr/entry level, with eventual knowledge of all clouds and their services as you reach [III] status (or [V], depending on company size). Their only function is infrastructure for cloud services. Today, such a title would also require devops knowledge, CI/CD, various services for monitoring and logging, kubernetes, etc. These should obviously be jobs for multiple people, but we just keep letting things get stacked onto a single title. I've even seen ridiculous titles that are very obviously 3 jobs in one, especially when you look at the responsibilities/qualifications. We really need to get a grasp on this, though likely nothing will get fixed without the entire unionization of people in tech, unfortunately.

- React, Next, etc. are steaming piles of code. I was going to build a React frontend for a project that was size-constrained. The out-of-the-box build, adding only React and some bare-minimum packages, resulted in well over 4 MB of JS, AND I DIDN'T EVEN HAVE A WORKING ANYTHING YET. Utterly disgusting. Add to that, it seems any dev who does front/backend work worships their framework of choice to the point of using it in places it has absolutely no reason to be. I've seen these developers trying to write shell scripts in js/ts, using React. The number of things I've seen written in JS that had no right to be is way too high.


👤 peteradio
Declarative programming is terrible.