There are a lot of different ways to make code, but at this point there's very little evidence that new languages engender interesting new capabilities. There are stylistic differences, but most code reads much the same if you squint. Rust is one of the most interesting languages out there, and its interesting additions are... constraints. We need new frontiers of possibility, not just constraint. Languages are not leading us to new potentials these days.
Mojo [1] is another approach to solving "the two-language problem," similar to what Julia is trying to do, but the code looks like Python. Mojo is being designed by Chris Lattner, who knows a thing or two about language & compiler design.
[1] https://en.wikipedia.org/wiki/Mojo_(programming_language)
If only it could run outside of the JS realm: there is almost nothing more to desire from a general-purpose application language (as opposed to a systems/low-level one).
I am excited about the problems that go away under the unison model.
For example:
- no dependency version resolution
- fully incremental builds
- no merge conflicts
- fully incremental distribution
- easy, fine-grained tree shaking
- no code formatting (except for display in your editor)
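Most of these benefits fall out of Unison's core idea: definitions are identified by a hash of their content, not by a name. Here is a toy Python sketch of content addressing, under loose assumptions (the `codebase` dict and whitespace normalization are made-up stand-ins; Unison actually hashes a typed AST, not source text):

```python
import hashlib

def content_hash(body: str) -> str:
    """Hash a definition by its normalized body, ignoring its name."""
    normalized = " ".join(body.split())  # crude stand-in for AST normalization
    return hashlib.sha256(normalized.encode()).hexdigest()[:12]

# A toy "codebase": content hashes map to definitions; names are just metadata.
codebase: dict[str, str] = {}

def add_definition(name: str, body: str) -> str:
    h = content_hash(body)
    codebase[h] = body  # identical code is stored exactly once
    return h

# Two people define the same function under different names and formatting:
h1 = add_definition("increment", "x + 1")
h2 = add_definition("succ", "x +  1")

assert h1 == h2          # same content, same hash: nothing to merge
assert len(codebase) == 1
```

Because callers reference the hash rather than the name, renames break nothing, formatting is purely a display concern, and a build only has to recompile definitions whose hashes changed, which is where the incremental-build and no-merge-conflict claims come from.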
https://github.com/merrymercy/awesome-tensor-compilers
Modern ML training/inference is inefficient and lacks portability. These frameworks are how that changes...
As random examples, TVM runs LLaMA on Vulkan faster than PyTorch CUDA, and AITemplate almost doubles the speed of Stable Diffusion. Triton somewhat speeds up PyTorch training in the few repos that use it now, and should help AMD hardware even more than Nvidia.
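A large share of those speedups comes from operator fusion: rather than launching one kernel per op and materializing an intermediate tensor each time (as eager PyTorch does), these compilers emit a single kernel that applies several ops per memory access. A toy Python illustration of the idea on plain lists (the function names are invented for this sketch; real compilers fuse at the IR level):

```python
def unfused(xs):
    # Eager style: each op is a separate "kernel" writing a full intermediate.
    scaled = [x * 2.0 for x in xs]        # kernel 1, writes an intermediate
    shifted = [x + 1.0 for x in scaled]   # kernel 2, reads it back
    return [max(x, 0.0) for x in shifted] # kernel 3 (ReLU)

def fused(xs):
    # Fused style: one pass over the data, no intermediates hitting "memory".
    return [max(x * 2.0 + 1.0, 0.0) for x in xs]

data = [-1.5, 0.0, 2.0]
assert unfused(data) == fused(data)  # same math, a fraction of the traffic
```

Since elementwise ops on GPUs are usually memory-bound rather than compute-bound, cutting the reads and writes of intermediates is often worth more than any cleverness in the arithmetic itself.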
1. Unison
2. Gleam
3. Futhark (and similar GPU compute PLs)
4. Lisps with static typing / algebraic type systems
It strikes a balance between the simplicity of Go, explicitness of control flow, and the performance of C, but with types.