HACKER Q&A
📣 bckr

Weirdest Computer Architecture?


My limited understanding of “the stack” is:

  Physical substrate: Electronics
  Computation theory: Turing machines
  Smallest logical/physical parts: Transistors, Logic gates
  Largest logical/physical parts: Chips
  Lowest level programming: ARM/x64 instructions 
  First abstractions of programming: Assembly, C compiler
  Software architecture: Unix kernel, Binary interfaces
  User interface: Screen, Mouse, Keyboard, GNU
Does there exist a computer stack that changes all of these components?

Or, at the least, one that uses electronics but substitutes something else for Turing machines and above.


  👤 runjake Accepted Answer ✓
Here are some architectures that might interest you. Note that these links lead to rabbit holes.

1. Transmeta: https://en.wikipedia.org/wiki/Transmeta

2. Cell processor: https://en.wikipedia.org/wiki/Cell_(processor)

3. VAX: https://en.wikipedia.org/wiki/VAX (Unusual for its time, but many of its concepts have since been adopted)

4. IBM z/Architecture: https://en.wikipedia.org/wiki/Z/Architecture (This stuff is completely unlike conventional computing, particularly the "self-healing" features.)

5. IBM TrueNorth processor: https://open-neuromorphic.org/blog/truenorth-deep-dive-ibm-n... (Cognitive/neuromorphic computing)


👤 jecel
"Computer architecture" is used in several different ways and that can lead to some very confusing conversations. Your proposed stack has some of this confusion. Some alternative terms might help:

"computational model": finite state machine, Turing machine, Petri nets, data-flow, stored program (a.k.a. Von Neumann, or Princeton), dual memory (a.k.a. Harvard), cellular automata, neural networks, quantum computers, analog computers for differential equations

"instruction set architecture": ARM, x86, RISC-V, IBM 360

"instruction set style": CISC, RISC, VLIW, MOVE (a.k.a TTA - Transport Triggered Architecture), Vector

"number of addresses": 0 (stack machine), 1 (accumulator machine), 2 (most CISCs), 3 (most RISCs), 4 (popular with sequential memory machines like Turing's ACE or the Bendix G-15)

"micro-architecture": single cycle, multi-cycle, pipelines, super-pipelined, out-of-order

"system organization": distributed memory, shared memory, non uniform memory, homogeneous, heterogeneous

With these different dimensions for "computer architecture" you will have different answers for which was the weirdest one.
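
To make the "number of addresses" dimension concrete, here is a minimal sketch (my own illustration in Python; the mnemonics are made up) of the same statement, a = b + c, on a 0-address stack machine and a 1-address accumulator machine, with the 2- and 3-address forms noted in a comment:

  def run_stack_machine(program, memory):
      # 0-address style: operands live on an implicit stack.
      stack = []
      for op, *args in program:
          if op == "PUSH":
              stack.append(memory[args[0]])
          elif op == "ADD":
              stack.append(stack.pop() + stack.pop())
          elif op == "POP":
              memory[args[0]] = stack.pop()

  def run_accumulator_machine(program, memory):
      # 1-address style: one memory operand per instruction; the accumulator is implicit.
      acc = 0
      for op, addr in program:
          if op == "LOAD":
              acc = memory[addr]
          elif op == "ADD":
              acc += memory[addr]
          elif op == "STORE":
              memory[addr] = acc

  mem = {"a": 0, "b": 2, "c": 3}
  run_stack_machine([("PUSH", "b"), ("PUSH", "c"), ("ADD",), ("POP", "a")], mem)
  run_accumulator_machine([("LOAD", "b"), ("ADD", "c"), ("STORE", "a")], mem)
  assert mem["a"] == 5
  # 2-address (CISC): MOV a, b ; ADD a, c    3-address (RISC): ADD a, b, c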


👤 defrost
Setun: a three-valued (ternary) logic computer instead of the common binary: https://en.wikipedia.org/wiki/Setun
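
For flavor, here's a quick sketch (my own illustration, not from the article) of the balanced ternary Setun used: digits are -1, 0, and +1, so negative numbers need no separate sign bit.

  def to_balanced_ternary(n):
      # Return the balanced-ternary digits of n, least significant first.
      digits = []
      while n != 0:
          r = n % 3
          if r == 2:            # digit 2 becomes -1, with a carry into the next trit
              digits.append(-1)
              n = n // 3 + 1
          else:
              digits.append(r)
              n //= 3
      return digits or [0]

  assert to_balanced_ternary(8) == [-1, 0, 1]    # 8 = -1*1 + 0*3 + 1*9
  assert to_balanced_ternary(-8) == [1, 0, -1]   # negation just flips every trit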

Not 'weird', but any architecture that doesn't have an 8-bit byte raises questions and discussion.

E.g. the Texas Instruments DSP chip family for digital signal processing: they're all about deeply pipelined FFT computations with floats and doubles, not piddling about with 8-bit ASCII. There are no hardware-level bit operations to speak of, and the smallest addressable memory size is either 32 or 64 bits.


👤 mikewarot
BitGrid is my hobby horse. It's a Cartesian grid of cells, each with 4 bits in and 4 bits out, implemented as LUTs (look-up tables) and latched in alternating phases to eliminate race conditions.

It's a response to the observation that most of the transistors in a computer are idle at any given instant.

There's a full rabbit hole's worth of advantages to this architecture once you really dig into it; a sketch of a single cell follows the links below.

Description: https://esolangs.org/wiki/Bitgrid

Emulator: https://github.com/mikewarot/Bitgrid
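
A minimal sketch of a single BitGrid-style cell as I read the description above (an approximation, not the linked emulator): four input bits index four 16-entry lookup tables, one per output bit, and the outputs are latched so cells can fire in alternating phases.

  class Cell:
      def __init__(self, luts):
          self.luts = luts             # four 16-bit truth tables, one per output
          self.outputs = [0, 0, 0, 0]  # held until this cell's phase fires

      def step(self, inputs):
          # Latch new outputs from the 4 input bits (the four neighbors).
          index = inputs[0] | inputs[1] << 1 | inputs[2] << 2 | inputs[3] << 3
          self.outputs = [(lut >> index) & 1 for lut in self.luts]

  # Example: a cell whose first output is the XOR of inputs 0 and 1.
  xor_lut = sum(1 << i for i in range(16) if (i & 1) ^ ((i >> 1) & 1))
  cell = Cell([xor_lut, 0, 0, 0])
  cell.step([1, 0, 0, 0])
  assert cell.outputs == [1, 0, 0, 0]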


👤 jy14898
Transputer

> The name, from "transistor" and "computer", was selected to indicate the role the individual transputers would play: numbers of them would be used as basic building blocks in a larger integrated system, just as transistors had been used in earlier designs.

https://en.wikipedia.org/wiki/Transputer


👤 amy-petrik-214
There was some interesting funk in the '80s. Lisp machine: https://en.wikipedia.org/wiki/Lisp_machine (these were very hot in 1980s-era AI). Connection Machine: https://en.wikipedia.org/wiki/Connection_Machine (a gorillion one-bit processors in a supercluster).

Let us also not forget the Itanic.


👤 ithkuil
The CDC 6000's peripheral processors formed a barrel processor: https://en.m.wikipedia.org/wiki/Barrel_processor

Mill CPU (so far only patent-ware, but interesting nevertheless): https://millcomputing.com/


👤 sshine
These aren't implemented in hardware, but they're examples of esoteric architectures:

zk-STARK virtual machines:

https://github.com/TritonVM/triton-vm

https://github.com/risc0/risc0

They're "just" bounded Turing machines with extra cryptography. The VM architectures have been optimized for certain cryptographic primitives so that you can prove properties of arbitrary programs, including the cryptographic verification itself. This lets you e.g. play turn-based games where you commit to make a move/action without revealing it (cryptographic fog-of-war):

https://www.ingonyama.com/blog/cryptographic-fog-of-war

The reason this requires a specialised architecture is that, in order to prove something about the execution of an arbitrary program, you need to arithmetize the entire machine: create a set of equations that are true when the machine performs a valid step, where these equations also hold for certain derivatives of those steps.
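
A toy version of the idea (my own illustration, far simpler than a real zk-STARK): encode one machine step as a polynomial constraint that must vanish on every consecutive pair of states in the execution trace.

  def step_constraint(state, next_state):
      # Zero exactly when next_state = state + 1 is a valid step of the toy machine.
      return next_state - state - 1

  trace = [0, 1, 2, 3, 4]   # a claimed execution trace
  assert all(step_constraint(a, b) == 0 for a, b in zip(trace, trace[1:]))

  # A real zk-STARK interpolates the trace into polynomials over a finite field
  # and proves the constraints vanish everywhere, without revealing the trace.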


👤 mikewarot
I thought magnetic logic was an interesting technology when I first heard of it. It's never going to replace semiconductors, but if you want to compute on the surface of Venus, you just might be able to make it work there.

The basic limits are the Curie point of the cores and the source of the clock drive signals.

https://en.m.wikipedia.org/wiki/Magnetic_logic


👤 drakonka
This reminds me of a talk I went to at the 2020 ALIFE conference, in which the speaker presented an indefinitely scalable architecture called the "Movable Feast Machine". He suggested relinquishing hardware determinism: the hardware can give us wrong answers and the software has to recover, and in some cases the hardware may fail catastrophically. The hardware is a series of tiles with no CPU. Operations are local and best-effort, and determinism is not guaranteed. The software then has to reconcile that.

It was quite a while ago and my memory is hazy tbh, but I put some quick notes here at the time: https://liza.io/alife-2020-soft-alife-with-splat-and-ulam/


👤 CalChris
Intel's iAPX 432, begun in 1975. Instructions were bit-aligned; the machine was stack-based, 32-bit, segmented, capability-based, .... It was so late and slow that the 16-bit 8086 was created.

https://en.wikipedia.org/wiki/Intel_iAPX_432


👤 muziq
The Apple 'Scorpius' thing they bought the Cray in the '80s to emulate: RISC, multi-core, but it could put all the cores in lockstep to operate as pseudo-SIMD. Or, failing that, the 32-bit 6502 successor, the MCS65E4: https://web.archive.org/web/20221029042214if_/http://archive...

👤 mac3n
FPGA: non-sequential programming

Lightmatter: matrix multiply via optical interferometers

Parametron: coupled oscillator phase logic

rapid single flux quantum logic: high-speed pulse logic

asynchronous logic

https://en.wikipedia.org/wiki/Unconventional_computing


👤 nailer
The giant global computers that are Solana mainnet / devnet / testnet. The programs are compiled from Rust into (slightly tweaked) eBPF binaries, state updates every 400 ms, and VDFs sync clocks between the leaders that are allowed to update state.

👤 yen223
A lot of things are Turing-complete. The funniest one to me is PowerPoint slides.

https://beza1e1.tuxen.de/articles/accidentally_turing_comple...

https://gwern.net/turing-complete



👤 metaketa
HVM, which uses interaction nets as an alternative to Turing-style computation, deserves a mention. Google: HigherOrderCompany

👤 0xdeadbeer
I heard of counter machines on Computerphile: https://www.youtube.com/watch?v=PXN7jTNGQIw

👤 AstroJetson
Huge fan of the Burroughs Large Systems Stack Machines.

https://en.wikipedia.org/wiki/Burroughs_Large_Systems

They had an attached scientific processor to do vector and array computations.

https://bitsavers.org/pdf/burroughs/BSP/BSP_Overview.pdf



👤 GistNoesis
https://en.wikipedia.org/wiki/Unconventional_computing

There is also soap-bubble computing, and various forms of annealing computing (like quantum annealing or adiabatic quantum computation), where you set up your computation as the optimal value of a physical system you can define.


👤 elkekeer
The multi-core Propeller processor by Parallax (https://en.wikipedia.org/wiki/Parallax_Propeller), in which multitasking is done by cores (called cogs) taking turns: first, code executes on the first cog, then, after a while, on the second, then on the third, and so on.

👤 vismit2000
How about water computer? https://youtu.be/IxXaizglscw

👤 yencabulator
Just the operating system, but I like Barrelfish's idea of having a separate kernel on every core and doing message passing. Each "CPU driver" is single-threaded, non-preemptible (no interrupts), shares no state, is bounded-time, and runs to completion. Userspace programs can access shared memory, but the low-level stuff doesn't. Bounded-time run-to-completion kinda makes me think of seL4, if it were designed to be natively multicore.

https://en.wikipedia.org/wiki/Barrelfish_(operating_system)

https://barrelfish.org/publications/TN-000-Overview.pdf
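
A loose sketch of that model (my analogy in Python, not Barrelfish code): one share-nothing, single-threaded "CPU driver" per core, talking only through message channels.

  from multiprocessing import Process, Queue

  def cpu_driver(core_id, inbox, outbox):
      # Single-threaded, shares no state; each message is handled to
      # completion before the next one is fetched (no preemption).
      while True:
          msg = inbox.get()
          if msg == "shutdown":
              break
          outbox.put((core_id, f"done: {msg}"))

  if __name__ == "__main__":
      outbox = Queue()
      inboxes = [Queue() for _ in range(4)]   # one private channel per "core"
      drivers = [Process(target=cpu_driver, args=(i, inboxes[i], outbox))
                 for i in range(4)]
      for d in drivers:
          d.start()
      inboxes[2].put("map page")              # ask core 2's kernel to do work
      print(outbox.get())                     # -> (2, 'done: map page')
      for q in inboxes:
          q.put("shutdown")
      for d in drivers:
          d.join()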


👤 Joker_vD
IBM 1401. One of the weirdest ISAs I've ever read about, with basically human-readable machine code thanks to BCD.

👤 29athrowaway
The Soviet Union's water integrator. An analog, water-based computer for solving partial differential equations.

https://en.m.wikipedia.org/wiki/Water_integrator


👤 trealira
The ENIAC, the first general-purpose electronic computer, didn't have assembly language. You programmed it by fiddling with circuits and switches. Also, it didn't use binary integers but decimal ones, with 10 vacuum tubes per digit to represent 0-9.

👤 jareklupinski
Physical Substrate: Carbon / Water / Sodium

Computation Theory: Cognitive Processes

Smallest parts: Neurons

Largest parts: Brains

Lowest level language: Proprietary

First abstractions of programming: Bootstrapped / Self-learning

Software architecture: Maslow's hierarchy of needs

User Interface: Sight, Sound


👤 variadix
From the creator of Forth: https://youtu.be/0PclgBd6_Zs

144 small computers in a grid that can communicate with each other (the GreenArrays GA144)


👤 porridgeraisin
Crab powered computer:

http://phys.org/news/2012-04-scientists-crab-powered.html

Yes, real crabs


👤 RecycledEle
Using piloted pneumatic valves as logic gates blew my mind.

If you are looking for strangeness, microcontrollers from the 1990s to early 2000s had I/O ports, but every single I/O port was different. None of them followed a standard that would let us (for example) plug a 10-pin header into any of the I/O ports on a single microcontroller and connect the same peripheral, much less into any microcontroller in a family of microcontrollers.


👤 mbfg
I've got to believe x86 is in the running. We don't think of it because it is the dominant architecture, but it's kind of crazy.

👤 PeterStuer
In the '80s our lab lobbied the university to get a CM-1. We failed, and they got a Cray instead. The Connection Machine was a really different architecture aimed at massively parallel execution: https://en.wikipedia.org/wiki/Connection_Machine

👤 dwrodri
If you really want to see some esoteric computer architecture ideas, check out Mill Computing: https://millcomputing.com/wiki/Architecture. I don't think they've etched any of their designs into silicon, but the ideas are fascinating nonetheless.

👤 jacknews
Of course there are things like the molecular mechanical computers proposed/popularised by Eric Drexler et al.

I think Transport-triggered architecture (https://en.wikipedia.org/wiki/Transport_triggered_architectu...) is something still not fully explored.


👤 BarbaryCoast
Look at the earliest computers, that is, those around the time of ENIAC. Most were electro-mechanical; some were entirely relay machines. I believe EDSAC was the first practical _stored-program_ electronic computer.

As for weird, try this: ENIAC instructions modified themselves. Back then, an "instruction" (they called them "orders") included the addresses of the operands and the destination (which was usually the accumulator). So if you wanted to sum the numbers in an array, you'd put the address of the first element in the instruction, and as ENIAC repeated that instruction (a specified number of times), the address in the instruction would be auto-incremented.

Or how about this: a computer with NO 'jump' or 'branch' instruction? The ATLAS-1 was a landmark of computing, having invented most of the things we take for granted now, like virtual memory, paging, and multi-programming. But it had NO instruction for altering the control flow. Instead, the programmer would simply _write_ to the program counter (PC). Then the next instruction would be fetched from the address in the PC. If the programmer wanted to return to the previous location (a "subroutine call"), they'd be obligated to save what was in the PC before overwriting it. There was no stack, unless you count a programmer writing the code to save a specific number of PC values, and adding code to all subroutines to fetch the old value and restore it to the PC. I do admire the simplicity -- want to run code at a different address? Tell me what it is and I'll just go there, no questions asked.
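
A toy version of that control-flow style (my own illustration, not Atlas code): there is no jump instruction, only ordinary writes, and a write whose destination is the program counter's address *is* the jump.

  PC = 0                          # the program counter lives at memory address 0
  mem = {}

  def run(program):
      mem.clear()
      mem.update(program)
      mem[PC] = 1                 # execution starts at address 1
      while mem.get(mem[PC]) is not None:
          op, dst, val = mem[mem[PC]]
          mem[PC] += 1            # default: fall through to the next address
          if op == "SET":
              mem[dst] = val      # writing to address 0 (the PC) alters control flow
          elif op == "ADD":
              mem[dst] += val

  run({
      1: ("SET", 100, 5),         # mem[100] = 5
      2: ("SET", PC, 10),         # "jump" by overwriting the program counter
      3: ("ADD", 100, 99),        # never reached
      10: ("ADD", 100, 1),        # mem[100] = 6
  })
  assert mem[100] == 6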

Or maybe these shouldn't count as "weird", because no one had yet figured out what a computer should be. There was no "standard" model (despite Von Neumann) for the design of a machine, and cost considerations plus new developments (spurred by wanting better computers) meant that the "best" design was constantly changing.

Consider that post-WWII, some materials were hard to come by. So much so that one researcher used a Slinky (yes, the toy) as a memory storage device. And had it working. They wanted acoustic delay lines (the standard of the time), but the Slinky was more available. So it did the same job, just with a different medium.

I've spent a lot of time researching these early machines, wanting to trace the path each item in the now-standard model of an idealized computer took to get there. It's full of twists and turns, dead ends, and unintentional invention.


👤 supercoffee
I'm fascinated by the mechanical fire-control computers of WWII battleships.

https://arstechnica.com/information-technology/2020/05/gears...


👤 ChristopherDrum
Mythic produces an analog processor: https://mythic.ai/

There is also The Analog Thing, an analog computer: https://the-analog-thing.org/


👤 sshb
This unconventional-computing magazine comes to mind: http://links-series.com/links-series-special-edition-1-uncon...

Compute with mushrooms, compute near black holes, etc.


👤 t312227
hello,

great collection of interesting links - kudos to all! :=)

idk ... but isn't the "general" architecture of most of our computers "von Neumann"!?

* https://en.wikipedia.org/wiki/Von_Neumann_architecture

but what i miss from the various lists is the "transputer" architecture/ecosystem from INMOS - a concept of heavily networked arrays of small cores from the 1980s

about transputers

* https://en.wikipedia.org/wiki/Transputer

about INMOS

* https://en.wikipedia.org/wiki/Inmos

i had the chance to take a look at a "real life" ATW - Atari Transputer Workstation - back in the day at my university's CS department :))

mainly used with the Helios operating system

* https://en.wikipedia.org/wiki/HeliOS

to be programmed in occam

* https://en.wikipedia.org/wiki/Occam_(programming_language)

the "atari transputer workstation" ~ more or less a "smaller" atari mega ST as the "host node" connected to an (extendable) array of extension-cards containing the transputer-chips:

* https://en.wikipedia.org/wiki/Atari_Transputer_Workstation

just my 0.02€


👤 dsr_
There are several replacements for electronic logic; some of them have even been built.

https://en.wikipedia.org/wiki/Logic_gate#Non-electronic_logi...


👤 solardev
Analog computers, quantum computers, light-based computers, DNA-based computers, etc.


👤 osigurdson
I'm not sure what the computer architecture was, but I recall the engine controller for the V-22 Osprey (the AE1107) used odd formats like 11-bit floating-point numbers, 7-bit ints, etc.

👤 joehosteny
The PipeRench runtime-reconfigurable FPGA out of CMU:

https://research.ece.cmu.edu/piperench/


👤 ranger_danger
9-bit bytes, 27-bit words... middle endian.

https://dttw.tech/posts/rJHDh3RLb
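
For the middle-endian part, a quick sketch (my illustration, assuming the classic PDP-11-style scheme: a 32-bit value stored as two little-endian 16-bit halves, most significant half first):

  import struct

  def middle_endian(value):
      hi, lo = value >> 16, value & 0xFFFF
      return struct.pack("<HH", hi, lo)   # high half first, each half little-endian

  assert middle_endian(0x0A0B0C0D) == bytes([0x0B, 0x0A, 0x0D, 0x0C])      # middle
  assert struct.pack("<I", 0x0A0B0C0D) == bytes([0x0D, 0x0C, 0x0B, 0x0A])  # little
  assert struct.pack(">I", 0x0A0B0C0D) == bytes([0x0A, 0x0B, 0x0C, 0x0D])  # big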


👤 vapemaster
since this is a bit of a catch-all thread, i'll toss Anton into the ring: a whole bunch of custom ASICs for molecular dynamics simulations, from D. E. Shaw Research

https://en.wikipedia.org/wiki/Anton_(computer)


👤 dongecko
Motorola used to have a one-bit microprocessor, the MC14500B.

👤 ksherlock
The Tinkertoy computer doesn't even use electricity.

👤 gjvc
Rekursiv