HACKER Q&A
📣 mg

Is genetic programming still actively researched?


It is fascinating that neural networks are having such a run at the moment. I wonder if this will continue "forever", or if we will see a different paradigm eclipse them in the future.

Is anybody still doing research in the area of genetic programming?

The genetic programming books of John R. Koza were the first I ever read about machine learning. It felt like magic at that time.

I have the feeling that the approach of generating programs for the CPU via evolution still has a lot to offer if it were explored further.

If there is research going on out there, I would love to follow it.


  👤 cookiengineer Accepted Answer ✓
The genetic programming scene kind of evolved into NEAT, HyperNEAT and ES-HyperNEAT [1] as a meta-learning concept.

Connections between layers/nodes are serialized as genes of agents with phenotypes and dominant/recessive markers, and an observing CPPN learns to categorize agents into different traits to find more efficient breeding mechanisms.
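If it helps to see the encoding concretely, here is a rough Python sketch of a NEAT-style genome; the class and field names are illustrative, not taken from any particular library:

    import random
    from itertools import count

    # Minimal NEAT-style genome sketch (illustrative names, not a real library).
    # Each connection gene carries an innovation number so crossover can align
    # matching genes between two parents.

    class ConnectionGene:
        def __init__(self, in_node, out_node, weight, innovation, enabled=True):
            self.in_node = in_node        # id of the source node
            self.out_node = out_node      # id of the target node
            self.weight = weight          # connection weight, tuned by mutation
            self.innovation = innovation  # historical marker for aligning genes
            self.enabled = enabled        # disabled genes are kept but not expressed

    class Genome:
        def __init__(self, num_inputs, num_outputs):
            self.nodes = list(range(num_inputs + num_outputs))
            self.connections = []

        def mutate_add_connection(self, innovations):
            # Structural mutation: connect two nodes, with a fresh innovation number.
            a, b = random.sample(self.nodes, 2)
            self.connections.append(
                ConnectionGene(a, b, random.uniform(-1, 1), next(innovations)))

        def mutate_weights(self, rate=0.8, scale=0.1):
            # Parametric mutation: jitter existing connection weights.
            for c in self.connections:
                if random.random() < rate:
                    c.weight += random.gauss(0, scale)

    innovations = count()
    genome = Genome(num_inputs=3, num_outputs=1)
    genome.mutate_add_connection(innovations)
    genome.mutate_weights()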

It's a strong concept, and AFAIK it's still used a lot in the robotics world, where you have to be able to guarantee and reproduce behaviors due to safety regulations.

There is a nice intro video on the underlying base concept, NEAT, by a YouTuber named SethBling [2].

[1] http://eplex.cs.ucf.edu/ESHyperNEAT/

[2] https://m.youtube.com/watch?v=qv6UVOQ0F44


👤 versteegen
It's completely mistaken to think that the NN craze means no one works on anything else. Academia has a great many people researching whatever they want, full-time or on the side. AI in particular has many veteran researchers stubbornly following long-standing lines of research with unimpressive results. No one can say they're wrong. Hinton was once that guy doing unfashionable research into NNs.

Anyway, there are also memetic algorithms, which extend genetic algorithms by adding local search (some form of local improvement, such as gradient following or simple hand-coded heuristics) to the genetic global search. It's actually a very simple idea (e.g. alternate mutation and/or recombination steps with optimisation steps). They tend to perform better than pure genetic algorithms because they can actually use gradient information or heuristics. It's a very broad class of algorithms, and they tend to have many hyperparameters.
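A toy sketch of the pattern, assuming a simple 1-D objective and hill climbing as the local search (all names and constants are illustrative):

    import math, random

    # Toy memetic algorithm sketch: genetic-style mutation and selection for
    # global search, plus a short hill climb (local search) applied to every
    # offspring. Everything here is illustrative.

    def fitness(x):
        # A multimodal 1-D objective with many local optima.
        return math.sin(5 * x) - 0.1 * x * x

    def local_search(x, step=0.01, iters=20):
        # Hill climbing: accept small random moves that improve fitness.
        for _ in range(iters):
            candidate = x + random.uniform(-step, step)
            if fitness(candidate) > fitness(x):
                x = candidate
        return x

    def memetic(pop_size=20, generations=50):
        population = [random.uniform(-5, 5) for _ in range(pop_size)]
        for _ in range(generations):
            # Global step: mutate each parent; local step: improve the offspring.
            offspring = [local_search(x + random.gauss(0, 0.5)) for x in population]
            # Selection: keep the best half of parents plus offspring.
            population = sorted(population + offspring, key=fitness, reverse=True)[:pop_size]
        return max(population, key=fitness)

    best = memetic()
    print(best, fitness(best))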


👤 jacquesm
This is probably a good starting point:

https://sig.sigevo.org/index.html

Genetic programming is a bit of a misnomer; evolutionary algorithms is probably a better name.


👤 deadly_penguin
Yes, although it is much more active in robotics. York still has quite active research into evolutionary algorithms and genetic programming (https://www.york.ac.uk/physics-engineering-technology/resear...).

It's been used to do things like find design parameters (https://pure.york.ac.uk/portal/en/publications/evolving-desi...) and to attempt to evolve robots to fit an environment (https://www.york.ac.uk/robot-lab/are/).


👤 bionhoward
Yes, it is still going on. I wound up having to write a new Python GP library from scratch due to DEAP's license being GPL-3. Now I'm translating that to Rust. One point of order: don't only try stuff randomly; also try enumerating candidates in shortlex order. And make sure you use hyperparameter optimization on the outside of the GP evolution process, or else you'll wind up with too many parameters to hand-tune.
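For anyone unfamiliar with the term, shortlex order just means enumerating candidates shortest-first and lexicographically within each length. A toy Python sketch, with a made-up symbol alphabet:

    from itertools import count, product

    # Shortlex enumeration sketch: generate candidate symbol sequences
    # shortest-first, in lexicographic order within each length.
    # The alphabet of primitives below is made up for illustration.

    ALPHABET = ["x", "1", "+", "*", "neg"]

    def shortlex_candidates(alphabet):
        # All length-1 sequences, then all length-2 sequences, and so on.
        for length in count(1):
            for seq in product(alphabet, repeat=length):
                yield seq

    gen = shortlex_candidates(ALPHABET)
    for _ in range(8):
        print(next(gen))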

I think the link between Pascal's Simplex, Koza GP Tree Words, and Levin Search is fascinating.


👤 androidbishop
One of the reasons I got my degree in Biotechnology was that I realized the technology of life is mind-bogglingly advanced, and learning how it does things can yield profound insights into how we solve other problems. The process of mutation and evolution is definitely a strong contender here, maybe one of the most important and powerful.

👤 Aicy
I'm a non-expert here, but it seems intuitive to me that directly moving towards an improved model via gradient descent is more efficient than randomly changing your model and then running a natural-selection simulation to improve fitness.
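As a toy illustration of the contrast being drawn - following a gradient directly versus mutate-and-select - here is a sketch on a 1-D quadratic (not a fair benchmark, and the constants are arbitrary):

    import random

    # Toy contrast on a 1-D quadratic: follow the gradient directly versus
    # mutate-and-select. This illustrates the intuition; it is not a benchmark.

    def loss(x):
        return (x - 3.0) ** 2

    # Gradient descent: step downhill along the analytic gradient.
    x = 0.0
    for _ in range(100):
        grad = 2 * (x - 3.0)
        x -= 0.1 * grad

    # Mutation + selection: propose random changes, keep only improvements.
    y = 0.0
    for _ in range(100):
        candidate = y + random.gauss(0, 0.5)
        if loss(candidate) < loss(y):
            y = candidate

    print("gradient:", x, "evolutionary:", y)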

👤 bjornasm
Google Scholar shows hundreds of genetic programming papers published in 2023. If we expand the range by a few years, several of them have hundreds of citations, so the field seems alive and well. I hope to find some uses for it myself, just because I like the concept.

👤 aaron695
Intuitively, to me, Monte Carlo would be better. Monte Carlo rocks.

Intuitively, to me, intelligent design is going to beat genetic programming.

It's the constants, and knowing which intelligently designed algorithm is better, that are impossible to know up front - and that is what Monte Carlo solves.

Look at the antenna design on Wikipedia and think how easy that would be with Monte Carlo - https://en.wikipedia.org/wiki/Genetic_algorithm#:~:text=The%....
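Monte Carlo in this sense is just plain random search: sample parameter vectors, evaluate each once, keep the best. A toy sketch, with a made-up objective standing in for a real simulator:

    import random

    # Plain Monte Carlo random search: sample parameter vectors at random,
    # evaluate each once, keep the best. The objective below is a made-up
    # stand-in; a real use would call an antenna or wind-farm simulator.

    def objective(params):
        target = [0.2, -1.3, 0.7]   # arbitrary "good" design, for illustration
        return -sum((p - t) ** 2 for p, t in zip(params, target))

    best_params, best_score = None, float("-inf")
    for _ in range(10000):
        params = [random.uniform(-2, 2) for _ in range(3)]
        score = objective(params)
        if score > best_score:
            best_params, best_score = params, score

    print(best_params, best_score)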

John R. Koza's book was from 1992; the computational power we have now lets us smash things.

Here's a comparison between Monte Carlo and genetic algorithms for a wind farm design (Monte Carlo was better) - https://rera.shahroodut.ac.ir/article_2146_5e7bee97938fcd513...

But it's really interesting, have fun looking into it. Have a look through HN articles - https://hn.algolia.com/?dateRange=all&page=0&prefix=true&que...

[edit] I haven't differentiated between genetic algorithms and genetic programming here - see "What are the differences between genetic algorithms and genetic programming?" - https://stackoverflow.com/questions/3819977/what-are-the-dif...