HACKER Q&A
📣 amichail

Why don't smartphones encourage programming like early 80s computers?


Early 80s computers started up with a BASIC prompt and hence encouraged you to learn programming right away.

Why don't smartphones do something similar?


  👤 nindalf Accepted Answer ✓
The answer is obvious - most people don’t want to do this. They want to text or call a friend, look up the weather or sports, stream some entertainment or read the news. They’re experts in their own domains and don’t have the time or inclination to learn programming.

There’s some heavy condescension in this thread, claiming that not programming “keeps people stupid”. As if creating can only happen in an IDE and not in Procreate or iMovie. It would help if programmers like us eased up on everyone else. Not everyone needs to be exactly like us.

Also, you have to be living under a literal rock if you’ve missed the potential impact of LLMs, specifically ChatGPT plugins. Once that’s generally available, everyone can tell the LLM what they want in their native language and have it done. Everyone can build custom recipes that combine multiple apps and APIs in novel ways to get stuff done. That’s a revolution waiting to happen - where everyone will be limited only by their imagination, not by their knowledge of programming. And it’ll happen without having to show every smartphone user a BASIC terminal.


👤 danuker
I'd argue that a mobile phone is not an adequate platform for coding. The touchscreen and keyboard layouts make it difficult to type needed symbols, distracting from the problem at hand.

Coding requires deep thought, but phones are optimized for moving around rather than sitting in one place and thinking. As such, use cases like maps, calendars, communication and alarms get priority.

The reason people buy a phone is to communicate and get around. And manufacturers cater to those needs.


👤 pdntspa
Because smartphones are designed to keep you stupid. Computer literacy means you are able to poke holes in their shitty abstractions. The industry does not want you programming or creating, the industry wants you to be consuming. Preferably with your credit card.

👤 everdrive
Right now phones are really only consumption devices. This is mostly an intentional limitation. I'm not sure if you've ever actually hooked up a keyboard to a phone, but even with a keyboard (usually bluetooth) a phone is still far less convenient than a simple laptop. Some keyboard commands work. Focus is very spotty, and often you have to touch the screen to interact with elements so you can actually use the keyboard. There seems to be no app switching, but perhaps I'm just missing the keyboard command. Your list of applications is limited to what's available in the app store, and much of it is questionable. I could go on, but phones are basically the TV replacement in a certain sense. The user gets what's provided to them, and creativity and control are kept at a minimum.

👤 taffronaut
1. 80s computers were basically a mechanical keyboard connected to a low resolution CRT display (typically a TV). Text entry was the default. There was no GUI or pointing device. For smartphones, text entry isn't the default.

2. In the 80s seeing your choice of text on a screen was something really new to most people. Teletext would not be universal until the end of the decade. Nothing in the house had even a text display. Calculators were numeric only. These days, everything has a display.

3. There were no polished programs to compete with. A lot of early games were very simple. There was no App Store or Steam. You could write a Tic Tac Toe game never having seen a commercial implementation. Even if it was bad, your friends would be impressed.

If you want inclusive coding, Scratch is the world’s largest coding community for children - https://news.ycombinator.com/item?id=35373052 . The 40-character-wide BASIC terminal was of its time.


👤 mhd
Because they're not equivalent to 80s computers, they're equivalent to 80s TVs.

👤 alt227
Nokia tried this in 2009. They released the N900 phone, which IMO was the best phone ever made.

It came with a full version of Linux on it, with a hardware keyboard. Their marketing around it was that it was 'the hacker's phone', and they tried to build a community around it by launching competitions and advertising the best hacks and out-of-the-box ideas that could be achieved on the phone.

I remember running aircrack-ng on it to crack wifi handshakes, which could be captured on the stock phone. Running nmap, war driving, etc. It truly was incredible.

As expected though, it was dead in the water. There was so little uptake in the community that Nokia stopped developing the Maemo OS it had made for it, and silently killed the brand. I sometimes like to imagine what could have been achieved if they had pressed on with that product line and the marketing that encouraged breaking the norm and creating something cool on your phone.


👤 musictubes
Speaking for myself, the command line prompt never inspired me to learn programming. I messed with quite a few Commodore and even Sinclair computers back in the day and the command line was only ever used to load programs from disk.

I did try programming from scratch and even typed in some of the programs published in BYTE and whatnot. Nothing I ever came up with on my own ever seemed worth the effort and certainly wasn’t how I wanted to use my time. What a pain in the ass. Typing all that crap and debugging was never as interesting as using programs.

The vast majority of people have no interest in messing with programming. Most people have no interest in making anything. You can complain that it’s a symptom of consumer culture or whatever but I think it has always been the case that more people want to use a tool than make a tool. Even back in the 80s computers were bought primarily to use programs. Only a tiny minority were interested in programming and that will always be the case.


👤 ajuc
In BASIC times there were dozens of incompatible platforms and they were competing on the available software. Programmers were very scarce (and you had to persuade them to write for your platform instead of other platforms).

You couldn't expect to have most use-cases solved by the available code, so people were willing to code what they needed. 8-bit computer without BASIC was almost useless.

Software was also very crude back then - it was possible for some teen to write a commercially useful piece of code in a few weeks - that code could then help you sell your hardware to other people.

Nowadays we have millions of programmers worldwide writing for only a few platforms, and the low-hanging fruit has long since been picked. Writing successful software today usually takes millions of man-hours. So hobbyists aren't that important.


👤 nickdothutton
When I got my first computer, a Z80-based system running BASIC from ROM and able to boot CP/M 2, it came with almost no software. If you wanted to make it do something, you got out the BASIC manual or the Z80 assembler. Since most professional software was way out of my price range, and shareware was hard to come by (2400bps modem + expensive calls), you really had to be a builder. I’m grateful I didn’t have to do any soldering though.

👤 joeman1000
Of course, as the current top comment mentions: because most people don’t want to make programs. Computing in the 80s was limited to nerds, business users, and school kids. All of these groups have some incentive or drive to write programs, above that of the average person from the population.

Anyway, this is not an excuse for the state of operating systems today, which dissuade people from programming. Yes, I even mean Unix-like operating systems. Why? Simply because the mental gap between ‘programs as we use them’ and ‘programs as we write them’ is so large on a modern OS. Consumers are stuck essentially playing in the sandpit and don’t really learn anything about computing while they use the computer. Unless of course they have some weird drive that helps them weather the pain of going against the tide and actually learning what’s going on in their computer.

There is very little you can learn from an application (such as Firefox, telegram etc.) because they offer zero introspection and are completely alien things compared to how they are conjured.

Think of the Apple II: there was very little gap between using the ‘OS’, making a program, and using a program. They were nearly the same thing, or felt the same.

How would you interact with the computer? Enter some text. How did you make a program? Enter some text and save it. How did you run and interact with the program? …

Alan Kay talks about this a lot. If you imagine Smalltalk (now Squeak) as an OS, the gap between programming and using programs is tiny, so you learn by consuming and thus the leap to proficiency isn’t so big.

Another great counter-example is emacs. It offers introspection into everything and lets you modify a lot of its behaviour in a uniform way. I think this is why people like it so much. You learn it as you use it and you’re encouraged to modify it. It’s a far cry from how most software is made today.


👤 ofalkaed
Early 80s computers were a niche market largely sold to people interested in computers and programming.

👤 Ciantic
In the 80s computer programming was done with the same device. With today's mobile phones, programming is done with another device, usually a laptop.

This is of course changing, as mobile phones gain more functions. However, for it to happen it needs a change in culture. Phones have been largely seen as consumption devices, and still, a lot of people aren't comfortable writing long pieces with them.

It could also be as simple as what priorities the executive class gives their devices. I can imagine an alternate world where Steve Wozniak still had influence at Apple and pushed their lineup in a more hackable direction.


👤 johannes1234321
Early computers were made by people who enjoyed exploring the possibilities, for people who liked the same, combined with the fact that software, especially general-purpose software, was limited. Niche products for niche markets. Apart from playing games there was little to do. If you wanted to use the machine for anything else, you'd have to program the software yourself.

In the nineties computers got Excel and similar, which took over basic tasks, and people built their databases for their hobby or business, their ledgers, and so on, on top of that.

Today there is special software for most of those things and computers are mass market products, made by companies which want users to consume media (via iTunes, or YouTube ads etc.) and be approachable for everybody. The product managers fear that just seeing a glimpse of a programmable interface drives people away and reduces the media consumption.


👤 ksec
I don't think it is anything about smartphones. It is about the software/technology today, or abstractions.

In the 80s and 90s, one could still have a very high-level understanding of every single part of the computer, from hardware to software. The whole stack.

We now have a whole generation of programmers who don't understand a thing about hardware, not even at an enthusiast level. Nor do they understand anything beyond their own domain within software.

The bar is also a lot higher now than when basic graphics were a luxury and a command line was still acceptable. Back then you could quickly make a script to do something you wanted.

And then there is, I would argue, a whole generation of needless complexity added on top of complexity. Take web development, for example. CGI or PHP used to be simple. It wasn't easy, but it was simple. I don't know where to begin describing today's web dev.


👤 smitty1e
A smartphone is not an exploratory device.

Its tiny form factor and lack of keyboard input are forbidding.

Even a lower-end unit is a relatively sealed package. I can get a command prompt on my 'droid unit via Termux, but it's not going to show much, or easily integrate with the OS, for security reasons.

The kicker is that, from a risk-management perspective, I don't want to jack around with my (non-cheap) 'droid unit and risk bricking or destabilizing it.


👤 13415
Lack of a keyboard is not the only reason. A good ad hoc/hobby programming environment needs to offer a lot nowadays:

- The IDE and language must be preinstalled or available with a one-click installer.

- The programming language must be extremely simple and easy to learn. Almost no currently popular language satisfies this requirement. Even Python is too complicated and requires learning too many libraries.

- Easy input and output, ideally with a GUI, but at least console style.

- Simple way to run a program. Just click or type "run", for example.

- Integration into the target platform. If programs are started by clicking on an icon, the deployment must provide apps with icons, of course.

- Easy deployment, either by source code or by a single file that can be run everywhere with an interpreter.

In addition to this, for modern phones there would need to be an interpreter for running programs on desktops, too, and an online library of extension packages and programs.

Not many languages/implementations/IDEs for phones satisfy these criteria. There are not even many for desktop. How many IDE/language combos do you use that are easy to learn and allow one-click deployment to all major platforms?
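As a concrete target, the kind of program such an environment should make trivial looks something like this afternoon-sized Python sketch - no project setup, no libraries, just a function and a couple of prints (Python here is only a stand-in for whatever simple preinstalled language a phone might ship):

```python
# A two-minute program of the sort a beginner-friendly, preinstalled
# environment should make trivial: a little arithmetic and console
# output, with nothing to install or configure.

def c_to_f(celsius):
    """Convert a Celsius temperature to Fahrenheit."""
    return celsius * 9 / 5 + 32

print("0 C   =", c_to_f(0), "F")
print("100 C =", c_to_f(100), "F")
```

If typing, running, and sharing something this small took more than a tap or two, the environment would already have failed the criteria above.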


👤 gabereiser
Because it's not in their best interest. Smartphone makers are big tech, big tech only wants you to learn their framework to be locked into using big tech for life. No one wants you to learn to think on your own. Learn Dashly(tm) for Pear Phone or don't program at all.

The reality is the time for tinkering and stuff is over (for pc workloads, you can tinker all you like with hardware/agri/space/radio/nuclear open-source). Computers, whether they are on your lap or in your pocket, are being controlled by big corporations. You might say "But, Linux isn't controlled by a big corporation" and you'd be wrong. All these big tech companies want to wall you into their garden to shake you down anytime they want more capital.

The best way to get that 80s tinkering feeling again is to go get a Raspberry Pi or something and start building your own thing. Don't expect any of the consumer tech to ever cater to that kind of crowd again. They may pay homage to their roots but their bottom line depends on you forking over cash on their services and app stores.


👤 seba_dos1
Some of them do. A lot of my programming skills come thanks to hacking on my Openmoko Neo Freerunner around 2008 when I was a teenager. A few years later I was hacking on (although definitely not as much) a Nokia N900, and these days hacking on a Librem 5 (which can even be connected to an external screen, keyboard and mouse and act as a regular GNU/Linux PC) is my job.

Both Android and iOS are designed to be locked-down consumer goods owned by the manufacturer and only used by you; they're not tools that encourage you to learn and tweak. You need to look elsewhere for that.


👤 ergonaught
Profit motive as the only motive, combined with scale, mostly, with a side of "freedom helps bad people do bad things" variations.

ex: In many respects there is vastly more effort involved in making them general purpose computing devices. Why harm profits catering to people who want that when the crushingly vast majority don't want it, when they can't be trusted to use it (directly or indirectly) without messing something up (harming profit via support costs), and when they aren't able to prevent bad actors from misusing it and harming them (again harming profits)?

ex: Why do game developers skip story? Because 90% of the market doesn't care about it. Why do movie studios crank out films with garbage stories/plot? Because 90% of the market doesn't care about it. As long as people and therefore companies believe that generating revenue is the only thing that matters, this results.


👤 seydor
Smartphones were designed to do the opposite, they will never do that. The worst thing is that desktop development has been influenced by mobile and become just as restrictive.

👤 herbst
When Smartphones came out I couldn't think of anything else than creating apps for it.

Naturally I learned Java and Android as soon as I could. I published my first app soon after.

Well, it didn't work so well on all devices. It worked on all my test devices, so I was stuck, and the app was removed from the store.

Years later I released a game. It was taken offline several times, until I stopped caring. Half of the time I didn't know why; I just submitted a new build on the current SDK, it got approved, and a few weeks later it was taken offline again.

The whole ecosystem sucks. You build really temporary apps that likely won't work on future phones without rebuilding several times for new SDKs. Java is not a language that encourages fun in coding and Android makes the Java experience just worse.


👤 kzrdude
Early 80s computers were boring if you didn't code something. That's a simplified way to explain it. It reminds me of the programs people wrote on TI-83 calculators, also because they didn't have any diversions unless you created them.

👤 pier25
People in this thread are complaining about smartphones not being designed for tinkering as a bad thing. As a dev I agree with the sentiment but otoh I'm very happy I don't have to tinker with my car or my fridge.

👤 beej71
There's no money in it, is why.

But it can be done to an extent. Grab a cheap external BT keyboard, a folding monitor stand, and Termux or a Scheme interpreter, or a BASIC interpreter (I haven't looked, but I assume they exist), and you're there.

I've done a lot of programming and writing on the phone that way.
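For what it's worth, the BASIC interpreter doesn't even have to be an app you download - a toy one fits in a screenful of Python you could type into Termux yourself. This is only an illustrative sketch (line numbers, LET/PRINT/GOTO/END, with expressions delegated to Python's eval), not a real BASIC:

```python
# A toy line-numbered interpreter in the spirit of 80s BASIC.
# Supports LET, PRINT, GOTO, and END; expressions are handed to
# Python's eval over a simple variable table.

def run(source):
    lines = {}
    for raw in source.strip().splitlines():
        num, stmt = raw.split(None, 1)
        lines[int(num)] = stmt.strip()
    order = sorted(lines)   # execution order by line number
    env = {}                # BASIC variables
    out = []                # captured PRINT output
    i = 0
    while i < len(order):
        stmt = lines[order[i]]
        if stmt == "END":
            break
        elif stmt.startswith("PRINT"):
            out.append(eval(stmt[5:], {}, env))
            i += 1
        elif stmt.startswith("LET"):
            var, expr = stmt[3:].split("=", 1)
            env[var.strip()] = eval(expr, {}, env)
            i += 1
        elif stmt.startswith("GOTO"):
            i = order.index(int(stmt[4:]))
        else:
            raise SyntaxError(stmt)
    return out

program = """
10 LET X = 2
20 LET Y = X * 20 + 2
30 PRINT Y
40 END
"""
print(run(program))  # [42]
```

Nothing here is phone-specific, which is rather the point: the barrier is distribution and culture, not capability.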


👤 bdcravens
Computers from the era were pretty much completely contained units. Smartphones require some other service to do anything useful, and those companies don't want to support devices that have been tinkered with.

👤 mrwh
Lots of interesting replies here. I'd question the premise though. At least where I grew up, programming was a niche interest. Plenty of people I knew had Amigas and Atari STs; very few had any interest in programming them. It wouldn't surprise me if, just as a small proportion of people back then learned to write their own programs (or tried to), a similar proportion today are delving into Xcode and Android Studio - and Unity and all the rest - and doing so for free, with a wealth of tutorials.

👤 JoeDaDude
I'd add that modern PCs don't encourage programming. Sure, a potential programmer could seek out ways to write and run JavaScript in the browser, which is typically included with the PC, but it would still require curiosity and determination. Without that BASIC prompt, most users will never even have the concept that THEY TOO could program this thing.

👤 onion2k
I often wonder how many people actually did any programming on those computers. For a very long time I assumed loads of people did - I did, my brothers and Dad did, my friends did. There were listings in magazines and books. It seemed wildly popular. But looking back now, and having met far more people from that era than I knew in my small town at the time, hardly anyone says they did too. They had computers but they used them for gaming and nothing much else. I was in a bubble, living in place where a major employer was a tech company. What reality was like outside seems to explain why programming was so appealing - it made me a bit special and opened a lot of doors back in the 90s.

I suspect that, even now, including an IDE as a default app with a language compiler or interpreter would be a waste of space for most users. They wouldn't use it, and they'd complain about it taking up valuable photo and music storage.


👤 zumu
Programming is a means to an end.

If kids could have just instantly downloaded any program they wanted, they wouldn't have been copying programs out of the back of magazines.

Personally, I came a little later, but I also learned stuff because I had to in order to do the things I wanted to do. After enough exposure I realized it was fun and interesting.


👤 masswerk
Form factor and mobile devices aside (there are already plenty of good comments on that), I simply don't see any eligible programming language for this. I.e., something that can be learned in an afternoon or two, a single loop construct, simple variable management, etc. Much like JavaScript used to be in the 1990s (up to and including ES3). Modern languages are just too tailored towards software engineering, IDEs and static analysis, have multiple redundant layers of syntax and semantic sugar, and, even once you have managed all that, the core language takes you nowhere, as the meat is in APIs and/or frameworks. In other words, we may need a simple language that is more on the art than on the engineering side of programming, in order to be accessible for fun activities and exploration.

👤 amadeuspagel
Also: Why don't browsers encourage programming? Browsers have amazing dev tools, but you're never going to discover them unless you know what to look for. And if you do, you won't know how to get started. Why not integrate a tutorial?

👤 mg
The C64 & Co. did not have a browser.

Nowadays you can go to a page like this ...

https://no-gravity.github.io/html_editor/

... edit the code and see the result in real time.


👤 simne
Because only about 10% of people can enjoy mathematics with current education technology, and I think people who enjoy programming are a subset of those who enjoy mathematics. This is a natural limitation, unfortunately.

This is not to say that some people are better than others - just that every human has some strong sides from the possible set, while very few people are strong at everything.

For simplicity, there are three main classes - math, arts, psychology (as an example, Betazoids in Star Trek are not totally fictitious; there exist people so strong at sensing others that it can appear as if they read your thoughts). This correlates well with the share of graduates in Germany - ~35% (I think ~10% math, ~10% art, ~10% psychology). Some people are physically strong.

Programming is mostly math.

So early 80s computers put great tools into the hands of math people, and that duet bore a lot of fruit.

After computer penetration grew far beyond 10%, it became impossible for the education system to educate more math people, so for everyone else the computer became just entertainment.

I believe that with AI teachers, private highly-customized education will become more affordable, so a larger percentage of math-capable people will appear - I can't predict the exact number; maybe it will double, and the same for the others (arts, psychology), so 70% will have a diploma.


👤 jmclnx
The bar of entry is much higher, and smartphones are designed that way. It is impossible to play around with programming on smartphones when you need to either jailbreak the phone or pay for expensive licenses.

The last thing companies want to happen with smart phones is what happened to PCs. People were able to upend many large commercial companies in the 80s and 90s due to the low bar of entry.

Just to name a few: DEC, CDC, Data General, Wang. IBM almost folded due to the PC. So Smart Phones were made closed to prevent the same revolution from repeating itself.


👤 eternityforest
Because that's a terrible way to interact with computers for 99% of applications.

Now the market is so big, we have enough customers to justify a polished, easy to use, featureful app for every task.

And people want security, meaning we need lots and lots of boilerplate to deal with all the permissioned APIs, plus we want reliability, so we can't just let people peek and poke. If we let people do that the app store would fill up with apps that used it, and half of them would be unreliable, and stuff would suck.

In fact, if there was a prompt at all, hackers could trick people into typing evil commands like they do to Linux newbies. Then those of us with tech experience would have to deal with it.

Much better to offer equivalent capability in a way that doesn't have an easy to exploit hole on the human side.

Smartphones DO encourage programming. There's like a million apps. They don't need a BASIC prompt; they just need to be there, be really capable, and make people feel it's worth learning to develop for them.

There are various sandboxed easy to use dev apps on Android for those who want to do old school 80s BASIC stuff. And it's a pretty cool thing, but it's best kept where it belongs, in an app, not conflicting with the primary use case of the phone.

I do wish Google would support Flutter a little better and make stuff easier so we didn't need buggy third party libs though.


👤 RobotToaster
It's more profitable to sell apps and micro-transactions through their respective walled gardens.

👤 corobo
Because the UX of that would be terrible and nobody would buy the product and they would go out of business. Best case only people who wanted to learn programming would use the weird nerdy programming phone.

Why aren't AAA games running on text based interactive fiction engines? Because for the target market, that's crap.

Most people do not want to learn programming. I know, it's a horrible thought, but it is what it is.


👤 randcraw
Early PCs lacked two things that made smartphones a whole new ballgame: a network, and software. When the Altair 8800, IBM PC, Apple II, Commodore PET, and Tandy TRS-80 shipped (1975-1980), there was no way to communicate with other people, nor was there much software (nor was there convenient storage for it — cassette tapes and floppies were severely limiting and a major pain to use). If the machine didn't have a BASIC interpreter, it was neither useful (little software), nor engaging to the hobbyist (no way to explore the machine).

By the time smartphones shipped, communication with others and availability of vast repos of apps were almost unlimited. Only a tiny fraction of owners ever thought about coding up their own apps or exploring the infrastructure of the device.

And of course the makers of smartphone OSes had no intention of opening up their sealed gardens so users could compete with their near-total control of the user experience — a lesson that Microsoft's domination of the PC's OS taught to all budding monopolists.


👤 javajosh
Nindalf is correct: the primary reason is lack of desire.

But there are other reasons - these are the reasons that people who already like to program don't do so on their phones. Optimistically, improved interfaces (e.g. AI listening to your words and grokking your system diagrams) will improve the programmability of phones, perhaps even beyond the laptop.

The difference between the Apple IIe and the iPhone 12 is scale. 40 years of Moore's law means the phone you hold in your hand has roughly 2^10 more components at 1/100 the volume. With early computers, you could look at them with your eyes, fix (and break) them with your hands. The same was true of the experience of using a computer - it was all very "close to the metal", with small, simple abstractions (like booting into BASIC, or restarting your computer with a new floppy in it - computers were essentially stateless).

It feels like a stroke of luck to have grown up alongside each iteration of PC technology because I can see how it would be overwhelming to try to understand it starting from 0.


👤 xaduha
I'd say a Raspberry Pi 400 could ignite a similar spark - I don't think you're comparing apples to apples here.

Smartphones are consumer devices. And a Raspberry Pi 400 is probably even more niche than 80s computers were in comparison.

In any case I don't think it's about the devices at all, there are just better things to do even for an introvert who doesn't go out much. Just different times.


👤 bawolff
Because keyboards aren't the native interface to a smartphone.

And the target audience doesn't want to be a programmer or even customize their phone.


👤 kkfx
Classic systems were designed with the idea of desktop computing: having something flexible at the user's fingertips. Modern mobile devices are designed as consumption and monitoring devices for someone else's service.

The user in the classic desktop design was meant to have a powerful tool in his or her hands; modern mobile+cloud is designed to milk data and keep people entertained.


👤 ElfinTrousers
Early 80s computers were meant for hobbyists and enthusiasts, people who liked to tinker with the fancy machine they just bought. A smartphone is, as others have pointed out, a consumption device with mass appeal. They have some of the same kinds of hardware, and do some of the same things, but they are intended for very different things in the end.

👤 daviddever23box
IMHO a mobile handset's primary use case is availability to respond to (mobile telephony) messaging events; as a consequence, its power management domains are intentionally oriented toward short-duration events.

A tablet device, on the other hand, is more likely to possess a higher-capacity battery and be usable for content generation / long-form editing.


👤 rerdavies
There's nothing stopping you from downloading Android development tools, and programming away to your heart's content.

And nothing stopping you from pushing your app to Google Play so all your friends can download it onto their phones.

Probably the major reason early 80s computers made BASIC so prominent is that there wasn't much else to do with them.


👤 8bitsrule
They certainly could. The CPUs are fast enough, they've plenty of memory. They could have a programming mode you could flip into and displace all the other kludges.

They could have a USB port that takes a keyboard, because typing with your thumbs is simian. They could have a video-output port, because the screen is too damn small. They could have an easy-to-use filesystem ... but they don't.

So they're unsuited to purpose, on purpose I'm sure. They're not trying to get to thinkers. If I wanted to program away from a desktop, it'd be something on Linux, something tablet-sized, a USB port for a roll-up keyboard, an analog audio-jack. Maybe with a RasPi. No phone, radio, bluetooth, GPS or spyware, no fat cluster of worthless junkware. I can wait. Until then, a laptop. And if the code can't run on a phone, oh well. They had their chance.


👤 PhilipRoman
I have very sweet memories of installing an app called "Terminal IDE". At the time I had no idea how to install (or use) Linux, so this was my first taste of one.

The best part was that the author had included a lovely help manual which briefly showed how to use command line tools, gdb, tmux, vim, irc, git, etc. but it was cryptic enough that I had to experiment a lot. And the environment was pretty constrained - BusyBox, a couple of pre-compiled tools but no package manager or X11.

This was my gateway into "real" programming, as opposed to typing things in an IDE.

But there is nothing phone-specific about this, if I had an easy way to try Linux on my computer, I would have preferred that. Obviously now WSL exists so it kind of fills that niche.

I think the app is no longer available and there are better alternatives, but I still have it on an 8 year old device.


👤 stuartjohnson12
Why... why would they?

👤 rmellow
There are positive and negative factors presented by mobile platforms:

Negative:

1. Highly abstract, hiding away internal functionality. 99% of mobile is GUI. iOS even abstracts away the filesystem. Android is more transparent.

2. Highly locked down: walled gardens increase the friction to run custom code. A terminal can be installed as an app, but on unrooted devices it's almost useless compared to a terminal on a desktop.

3. Ergonomics: smartphone keyboards and screens are not conducive for onboard development. Solutions exist but are pretty niche, and most people would prefer attaching a real keyboard.

Positive:

1. Arguably mobile platforms have increased the number of coders! The app stores facilitate distribution and payment, encouraging new programmers to make an impact. Desktops are in most cases the actual development platform though.


👤 flipcoder
Because then Apple wouldn't get its 30% cut or force you to buy a Mac to develop for iOS

👤 pluc
For the same reasons you can't open up a smartphone and tinker with it whereas it was and still is one of the great things about most non-Apple computers. They'd love to kill the culture that birthed them, because then there can't be another.

👤 throw_m239339
The interface? You kind of need a keyboard to type code, and a touch screen doesn't make typing easier or faster. But you could create some sort of visual programming language (with nodes, for instance, or bricks) that is easy to edit on mobile.

👤 hammyhavoc
Because whilst more people than ever use computers, their computer literacy, i.e., their understanding of how they function, is at an all-time low.

Computers used to be for nerds. Now they're for grandmothers. They're idiot-proof appliances for consumption.


👤 nubinetwork
> Why don't smartphones do something similar?

The same reason google and apple try to chain you to the app store... money. If they let you easily write your own apps, then they lose out on all those app store "fees".


👤 saltcured
From just the headline, I expected this question was going to lament the "wasteful" coding practices of mobile apps and pine for the days of hand-tuned, low-footprint software. So I was all set to point out that smartphones with multiple cores, GBs of RAM and storage, high-speed networking, and massive screen resolutions are nothing like 80s computers. These things aren't even as limited as computers from the mid-2000s!

But, as others have mentioned in various posts, I think the answer to the question of BASIC REPLs has multiple angles.

First, most early 80s computers did not boot straight to a programming REPL either. As soon as you had some kind of disk operating system you ended up in a shell and would have to invoke a BASIC interpreter or other programming environment if you wanted it. As far as I can tell, the feature threshold was having enough storage to make multiple programs persistently available. That was true across CP/M, MS-DOS, and Apple computers. Then, when they added GUIs, the shell became some kind of GUI menu instead of a CLI. With more storage, you get more built-in apps and less emphasis on a "blank machine" ready to take custom code.

After decades of this growth of storage, a smartphone is really not marketed as a general purpose compute platform. It still has more than vestigial "communications appliance" characteristics and is morphing into "cloud appliance". There is a feedback loop where vendors are marketing an integrated experience that sets the expectation for the next round of products too. And at the mass market volumes they are reaching, economies of scale mean that this approach is targeted towards the largest consumer markets.

It's almost fractal, but the cloud applications themselves are going through the same kind of shift. The commercial pressures are to create ever more integrated experiences for the mass-market user. It is a niche interest to want a general purpose platform where lots of capabilities are available, but the integrated whole is absent and waiting for a new custom program to be entered. Most consumers don't want the device or app that lacks this complete solution, and vendors don't want to provide all the infrastructure and then have some other party come in and claim all the value-add experience that is most visible to the paying customers.


👤 pipeline_peak
Because programming on phones is awkward and miserable.

We need something that utilizes touch screens to implement logic.

Something like Scratch but that would let you import libraries. Although I don’t think that’s imaginative enough to surpass traditional keyboard programming.

The problem with visual languages thus far is that they don't really accomplish anything beyond text languages. I find them to be more difficult to understand than, say, Python or JS.

The future of business logic app programming is going to be describing the details and having some AI implement it.


👤 bonestamp2
Because most of the complexity of smartphones is hidden from the user.

Meanwhile in the 80s, you had to know some command line just to run a game. If you wanted free games, you had to hand copy the code from a book. Once you learned a little bit, you were either content and kept playing the games or you wanted to learn more about how the game and the computer works.

We don't have that same barrier to entry now. That's good and bad, but that's part of why computers and smartphones are so ubiquitous now.
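The kind of type-in listing described above could be written today in a few lines of Python. This is an illustrative sketch of a classic guess-the-number magazine game, not a transcription of any actual book listing; the guesses are passed in as a list so the round runs without a keyboard:

```python
import random

def guess_the_number(guesses, secret=None, lo=1, hi=100):
    """Play one round against a fixed list of guesses.

    Returns a transcript of (guess, response) pairs, in the spirit
    of 80s type-in listings: pick a number, loop until it's guessed.
    """
    if secret is None:
        secret = random.randint(lo, hi)
    transcript = []
    for g in guesses:
        if g < secret:
            transcript.append((g, "too low"))
        elif g > secret:
            transcript.append((g, "too high"))
        else:
            transcript.append((g, "you got it!"))
            break
    return transcript

# Example round with a known secret, so the output is deterministic:
print(guess_the_number([50, 75, 62], secret=62))
```

Typing in (and inevitably debugging) a listing like this was exactly the "barrier to entry" the comment describes: you couldn't play until you understood enough of it to make it run.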


👤 swader999
Phones are designed for consuming, not creating, unlike the desktop.

👤 hotpathdev
Prior to smartphones, programming on phones was much less accessible due to hardware fragmentation. Mobile apps are popular and APIs are available everywhere. Some deployed apps include implementations of BASIC, Python and other languages which you can use to develop right on the phone. The problem is, typing on a smartphone is terrible, and it's worse for programming. You need a keyboard, and by then you might as well use a laptop.

👤 friend_and_foe
Because you cannot be productive in that way on a smartphone alone.

The smartphone is descended from simple communication devices, and adapted into a richer communication environment and a content consumption device.

You need a computer to do anything hacky with a smartphone, so you're more likely to get into the details on a computer in the first place. Smartphones are mostly useful for consumption of media, an activity that isn't really conducive to tinkering and modifying.


👤 HarHarVeryFunny
I grew up programming in the late 1970s. Early computers were simple, simple to understand, and simple to program - whether in assembler, BASIC, or anything else.

The closest modern equivalent to the early 70s/80s computers as something to learn programming on are things like the Arduino.

Smartphones just don't compare - they are inherently complex, not fully open/documented anyway, and the tooling doesn't help. You can't even develop for an iPhone without owning a Mac too.


👤 surgical_fire
People buy iPhones, dude. I can hardly think of something more hostile to tinkering than that.

Cellphones are consumed by the vast majority of people as expensive toys and nothing else.


👤 geoduck14
I would disagree. I wrote an app for a non profit I work with. Phones and tablets provide so many great sensors that my laptop doesn't.

The non-profit hosts a race where the participants build their own cars. My app helps track the race cars. We use GPS, 3G, wifi, a touch screen, and a 12-hour battery to track cars and send their location live to our website. This simply wasn't possible before tablets - and it is really cheap with a cheap tablet.


👤 never_inline
The "why" aside, if you have an android, check out Termux.

That's a UNIX-ish shell environment.

I wish there was a way to use Android APIs from a scripting environment on an Android phone. I tried a bit during the last semester, but didn't get anywhere due to lack of time. What I wrote was a BeanShell environment which lets you evaluate statements like a REPL. But since Android and Java bytecode are different, it can't even pass lambdas properly.
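For what it's worth, the REPL loop described above is small enough to sketch in Python (which Termux does package via `pkg install python`). This is a generic illustration of an embedded evaluate-statements-like-a-REPL environment, not the BeanShell tool the comment refers to:

```python
import code
import contextlib
import io

def eval_lines(lines):
    """Feed source lines to an embedded interpreter, REPL-style,
    capturing whatever each statement prints. State (variables,
    imports) is shared across lines, just like a real REPL session."""
    interp = code.InteractiveInterpreter()
    out = io.StringIO()
    with contextlib.redirect_stdout(out):
        for line in lines:
            interp.runsource(line)
    return out.getvalue()

# Two statements evaluated in one shared session: the second
# line sees the variable defined by the first.
print(eval_lines(["x = 6 * 7", "print(x)"]))
```

The standard library's `code` module does the heavy lifting (compiling each line in "single" mode and keeping a shared namespace), which is roughly what any embedded scripting REPL has to reimplement on platforms without one.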


👤 sadpolishdev
Because smartphones give access to a lot of distractions and content out of the box. When you got your Atari XE or similar machine, you had 10 floppies with games and an introduction to BASIC - at some point you'd start writing it out of pure boredom. Now you can infinitely scroll social media and never have a glimpse of a thought about doing something by yourself.

👤 analog31
For that matter, why don't PCs do something similar?

I started with the BASIC prompt in 1981. And I still feel some nostalgia for those times. I don't write "software" per se, but use programming as a problem solving tool. On the PC, when I'm creating stuff rather than consuming, I typically live inside a Jupyter notebook.

I need a keyboard and a screen. We haven't come up with a better coding interface.


👤 nigamanth
Everyone nowadays is lazier than the average man in the 80s. In the 80s, you probably had to buy music in a store rather than using Spotify or Apple Music, or had to look up information in the library instead of googling it.

The industry wants you to get everything you want through one click. Nowadays it's not even a click; you just need to talk about it, and Alexa or Google Home will pick up on it.


👤 Pamar
Allow me to add a bit of context (my first computer was an Apple ][, Xmas 1979).

Yes, computers at that time started with a BASIC prompt... because there was no app store, and (at the start) very little in terms of apps. You were supposed to program it yourself (even by tediously copying listings from magazines) and, especially for the early generation of products, storage was either very finicky (cassette tapes) or quite expensive (floppy).

In the same years, consoles also started to become a real product, like the Atari 2600, ColecoVision, etc.

Guess what? Consoles did not start up with Basic or Assembly... because you were supposed to use cartridges to play something that had been programmed by a guy working for a company.

So I would argue that personal computers were built and sold (at least at the very start) for people who had a background in electronics and/or an interest in programming. VisiCalc changed this almost overnight, because after that computers became "interesting" for small business... and this in turn created a market for word processors, small inventory management systems and so on. But also a big push to make "serious" computers (CP/M) that could fit the format/size/price of the Apple and TRS-80.

What I am trying to argue here is that PCs were at the start mostly intended as educational devices, because there was very little in terms of shrink-wrapped software to sustain other business cases. And if you just wanted something for your children to play with, you would buy an Atari at a fraction of the price of a Commodore PET or Apple.

Smartphones were always sold as a "communication device first", and the business infrastructure to almost immediately create a large portfolio of apps was already in place (if you remember, the original idea for the iPhone was not to develop for iOS but just to create web apps).

TL;DR: If you bought an Apple at the end of the 70s you absolutely needed a Basic interpreter or it would have been just a very expensive paperweight. If you bought an iPhone in 2007 you wouldn't need to write your own software to get any use out of it.


👤 EVa5I7bHFq9mnYK
Smartphones allow such things. In fact, I wrote an app with an interpreter, IDE, debugger and even its own app store, that was intended for building small programs (a few lines of JS can do wonders). There wasn't a word of discouragement from Google. But very few people used it. I guess it was too much trouble.

👤 6gvONxR4sf7o
You program computers on computers. You program phones on computers. You don’t program phones on phones. It seems as simple as that.

👤 eimrine
How about the attitude of businesses? They want you to run as much non-free code as possible, especially banks. The need to run non-free code is not compatible with the idea of an open filesystem, which makes people afraid of corrupting something. That's why people do not request a terminal and developers do not develop one.

👤 aristofun
Obviously because they are designed for consumers to consume. Not for producers to produce like 80s computers.

👤 ornornor
I was watching videos about late-90s/early-noughties PDAs, and things like the Psion or the Palm had ways to code directly on the device, for the device. That's quite unusual in this age; you need permission from the platform now to even hope to write any code to run on your own smartphone.

👤 34679
I missed out on 80's home computing, but my 90's TI-85 calculator was definitely my first step into programming. I'd like cell phones to be more like that. It had plenty of functions built in, but I could always add more, and it was fun.

👤 Am4TIfIsER0ppos
Because they are Read Only.

Also imagine how much better the world would be if you turned on a smartphone and all you got was a BASIC prompt and had to program the rest of the system yourself.


👤 failrate
I occasionally program on my smartphone, and the big problems are the manual interface and the lack of total control of my file system.

👤 cannabis_sam
A BASIC prompt does not appeal to most people, a smartphone does.

If you wanna make youtube videos, I’d take an iPhone over a BASIC prompt any day.


👤 turmeric_root
Have you tried writing code on a smartphone?

👤 kaffeeringe
Companies want you to consume, not to create and become a competitor.

👤 flappyeagle
No keyboard. Maybe with modern dictation we can figure something out

👤 b20000
because people now have less time and expect everything to be done for them

👤 mdmglr
See Shortcuts on iOS.

👤 BruceEel
Kudos, OP. Great question and really interesting HN thread.

👤 more_corn
Raspberry pi

👤 anontrot
Money

👤 here4U
Bec

👤 7373737373
Probably because natural language will supersede programming language soon: https://youtu.be/PgT8tPChbqc