A simulation of evolution — Two

louis030195
6 min read · Aug 18, 2020

Make sure to read my previous post:

Antifragility

Some things benefit from shocks; they thrive and grow when exposed to volatility, randomness, disorder, and stressors and love adventure, risk, and uncertainty. Yet, in spite of the ubiquity of the phenomenon, there is no word for the exact opposite of fragile. Let us call it antifragile. Antifragility is beyond resilience or robustness. The resilient resists shocks and stays the same; the antifragile gets better.

— Nassim Nicholas Taleb

The fittest survivors are antifragile: they take shocks, fail, fall, and rise much stronger.

We tend to say "survival of the species", but there is no such thing: it's survival of the individual, and the individual is the gene.
Genes use all life forms as vehicles to survive across the generations, or, as Dawkins likes to say, as "survival machines".
When a bee stings and dies, it does so for the survival of its genes, which are likely present in other bees, not for the survival of its own body.

The theory of evolution is everywhere

In any case, I believe that understanding the theory of evolution can help in many aspects of life, as a kind of mental model, a powerful heuristic. Everything follows the rules of evolution, even software engineering:

In software engineering, when you implement a new feature, you open a pull request and collaborators review and criticize your work; you then improve your code accordingly until it's approved and merged. Trial and error: the best ideas "have sex", producing higher-"fitness" code.

Importance of competition

https://www.pexels.com/photo/kick-chess-piece-standing-131616/

The struggle for existence never gets easier. However well a species may adapt to its environment, it can never relax, because its competitors and its enemies are also adapting to their niches. Survival is a zero-sum game.

— Matt Ridley.

Let me do a quick time-travel back to my previous blog post, where I talked about Evol.

https://github.com/louis030195/Evol_ML

The blue cubes are herbivores trying to survive by eating the white cubes, while the red cubes are carnivores trying to eat the herbivores. All I did was tell them that they can move around, see around them, gain health when eating, and get punished when dying; over time, they learned that it's good to eat. Simple!

So both species were trying to improve at the expense of the other; in game theory we call this a zero-sum game, and this setting allows great growth for both adversarial species.

Competition has had a great impact on machine learning research too; many techniques that introduce competition between two networks have improved performance a lot, for example competitive self-play with Policy Gradient methods in reinforcement learning, and, most famously, Generative Adversarial Networks (GANs).
You don't have to know what machine learning is; let's say it's an extension of computer programming.
A GAN's purpose here is simply to generate images.
At a high level, it has two parts: one tries to create images that look real and hands them to the other, which tries to tell whether they are real or not.
As the "learning" in machine learning suggests, there is a learning process: both parts start with poor skill and evolve, each trying to be better than the other. Competition.

https://www.researchgate.net/figure/Generative-Adversarial-Network-Architecture_fig3_334100947

Result:

https://miro.medium.com/max/2468/1*Yw2KxjmIkj8yqS-ykLCQCQ.png

Nature has evolved over millions of years and has "learned" that competition is a powerful way to maximize the fitness of the gene pool; we should learn from this and not avoid competition.

Red Queen Hypothesis

The Red Queen Hypothesis is closely related to competition. The name comes from the famous book by Charles Lutwidge Dodgson, better known by his pseudonym Lewis Carroll: Through the Looking-Glass.

In Through the Looking-Glass, Alice, a young girl, is taught by the Red Queen an important life lesson that many of us ignore: Alice finds herself running faster and faster while staying in the same place.

Alice never could quite make out, in thinking it over afterwards, how it was that they began: all she remembers is, that they were running hand in hand, and the Queen went so fast that it was all she could do to keep up with her: and still the Queen kept crying ‘Faster! Faster!’ but Alice felt she could not go faster, though she had not breath left to say so.

The most curious part of the thing was, that the trees and the other things round them never changed their places at all: however fast they went, they never seemed to pass anything. ‘I wonder if all the things move along with us?’ thought poor puzzled Alice. And the Queen seemed to guess her thoughts, for she cried, ‘Faster! Don’t try to talk!’

Finally, the queen stops running and pushes Alice against a tree, telling her to rest.

Alice looked round her in great surprise. ‘Why, I do believe we’ve been under this tree the whole time! Everything’s just as it was!’

‘Of course it is,’ said the Queen, ‘what would you have it?’

‘Well, in our country,’ said Alice, still panting a little, ‘you’d generally get to somewhere else — if you ran very fast for a long time, as we’ve been doing.’

‘A slow sort of country!’ said the Queen. ‘Now, here, you see, it takes all the running you can do, to keep in the same place.

If you want to get somewhere else, you must run at least twice as fast as that!’

The Red Queen hypothesis tries to explain why competition makes evolution thrive, and even why sexes exist.
A prey must constantly adapt to the evolution of its predators, and the predators must constantly adapt to the prey's evolution; the same goes for hosts in relation to parasites, or even males in relation to females.

Erutan

Coming back to the subject of simulation, Erutan was another simulation of evolution.

https://github.com/The-Tensox/Erutan-unity

This time, I went for simple heuristics for the behaviors; I had in mind a simulation that could be viewed from several clients, with the possibility of applying artificial selection to the populations.

For the technical details, I used Unity for the front-end and Go for the back-end, with gRPC, Protobuf, an Entity Component System (ECS) design, plus the cherry on the cake: an Octree data structure.

Octree

https://en.wikipedia.org/wiki/Octree

Imagine that, someday, you lose your wallet.
You remember having it until you drove back home.
You will therefore search for your wallet around your car; you won't search the whole universe to find it, right?
That's the principle of the Octree data structure.

At first I implemented the "search the whole universe" solution to check whether objects overlap (collision detection), because it's the easiest to implement, but as you might guess, it's inefficient: O(n²) time to check all pairs, versus roughly O(log n) per lookup with an Octree (about O(n log n) for all objects). Concretely, an Octree is a way to efficiently compute collision detection in three dimensions.

Simulating physics is hard

As I was keeping a decent inertia on this fun project, I wasn't thinking long-term, just playing around; I didn't anticipate that simulating physics is hard and that I was actually reinventing the wheel.

So I decided to use Unity's physics engine instead, while keeping Go for communication and coordination of the Unity nodes and the other components of the pipeline.

Next

In the next post I'll present niwrad, a "distributed sandbox evolution simulation", which is the offspring of all these experiments.

https://github.com/louis030195/niwrad

Further reading
