This is a simulation of agents living inside an environment where they compete for food. Each creature moves through its surroundings, using its own simple neural "brain" to decide which direction to move based on a small set of inputs, or "senses", about its immediate surroundings.
The creatures are ranked by their fitness (quantity of food found), and the most successful creatures are colored most brightly. Periodically, the worst-performing creatures are removed, while the best performers are given offspring whose brain wirings are based on their parents', plus a random set of mutations.
Over time, these random mutations and pairings, coupled with the selection process, result in creatures that are increasingly adept at navigating their surroundings and securing food. If you leave this page running for many generations, you'll be able to observe the results. This process is analogous to the natural phenomenon of evolution.
Usually, genetic algorithms create children by naively combining traits from one parent with traits from the other. When those traits are the weights of a neural network with hidden nodes, this approach has the drawback of destroying semantically similar but differently configured sets of weights. This is known as the "Competing Conventions Problem" or the "Permutations Problem": when there are many equivalent but incompatible ways to express a strategy in a neural network, genomes representing the same solution do not route that information through the same hidden nodes, and crossover is highly likely to produce invalid (dramatically less fit) offspring.
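To make the problem concrete, here is a small sketch (the 2-4-1 network shape, the permutation, and the uniform crossover mask are all illustrative, not this simulation's actual code). Two parents compute the exact same function because one's hidden units are just a reordering of the other's, yet naive weight-wise crossover mixes weights belonging to different hidden units:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two parents computing the SAME function: parent B's hidden units are a
# permutation of parent A's (hypothetical 2-input, 4-hidden, 1-output net).
w1_a = rng.normal(size=(4, 2))
w2_a = rng.normal(size=(1, 4))
perm = [2, 0, 3, 1]                        # reorder the hidden units
w1_b = w1_a[perm]                          # same rows, different order
w2_b = w2_a[:, perm]                       # matching column reorder

def forward(w1, w2, x):
    return w2 @ np.tanh(w1 @ x)

x = rng.normal(size=2)
# Identical behaviour despite different genomes:
assert np.allclose(forward(w1_a, w2_a, x), forward(w1_b, w2_b, x))

# Naive uniform crossover on the raw weights: hidden unit 0 may inherit its
# input weights from A but its output weight from B, whose unit 0 encodes
# something entirely different.
mask1 = rng.random(w1_a.shape) < 0.5
mask2 = rng.random(w2_a.shape) < 0.5
w1_c = np.where(mask1, w1_a, w1_b)
w2_c = np.where(mask2, w2_a, w2_b)

# The child's output generally matches neither parent, even though both
# parents encode the same strategy -- the competing conventions problem.
print(forward(w1_a, w2_a, x), forward(w1_c, w2_c, x))
```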
This system's solution to that problem is to replace genetic crossover (the parents' genomes directly recombining to create children) with a simulated "tutoring" process. In this process, a child is first created by cloning one of the parents. Then, we take a set of snapshots of inputs from the previous generation's lifetime, and record the other parent's response to each input snapshot. These pairs of inputs and responses are used as training pairs to do backpropagation training on the child, teaching that child to respond to inputs more similarly to that parent. In biological terms, the child is created by a method that aims to transfer phenotypic properties over genetic ones.
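The tutoring process described above can be sketched roughly as follows. This is a minimal illustration, not the simulation's actual code: the brain shape, the learning rate, and the random snapshot inputs are all placeholder assumptions, and the backpropagation is written out by hand for a single hidden layer:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical "brain": 4 senses -> 6 tanh hidden units -> 2 outputs.
def forward(brain, x):
    w1, w2 = brain
    h = np.tanh(w1 @ x)
    return w2 @ h, h

def make_brain():
    return [rng.normal(scale=0.5, size=(6, 4)),
            rng.normal(scale=0.5, size=(2, 6))]

parent_a, parent_b = make_brain(), make_brain()

# Step 1: the child starts as a clone of parent A.
child = [w.copy() for w in parent_a]

# Step 2: snapshot inputs from the previous generation's lifetime (random
# stand-ins here), with parent B's responses recorded as training targets.
snapshots = [rng.normal(size=4) for _ in range(200)]
pairs = [(x, forward(parent_b, x)[0]) for x in snapshots]

def avg_loss(brain):
    return np.mean([np.sum((forward(brain, x)[0] - t) ** 2)
                    for x, t in pairs])

loss_before = avg_loss(child)

# Step 3: backpropagation (plain SGD on squared error) nudges the clone of
# parent A toward parent B's responses.
lr = 0.02
for _ in range(100):
    for x, target in pairs:
        w1, w2 = child
        y, h = forward(child, x)
        err = y - target                       # dL/dy for L = ||y - t||^2 / 2
        grad_w2 = np.outer(err, h)
        grad_h = w2.T @ err
        grad_w1 = np.outer(grad_h * (1 - h ** 2), x)  # tanh' = 1 - tanh^2
        w1 -= lr * grad_w1
        w2 -= lr * grad_w2

loss_after = avg_loss(child)
print(loss_before, loss_after)  # the tutored child now mimics parent B better
```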
In the future, I would like to improve this method by performing clustering on the input samples, assigning each cluster of input samples to one parent or the other, and then training the child on both sets.
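That future improvement might look something like the following sketch, which uses a simple 2-means clustering to split the input snapshots into two groups, one per tutoring parent (all names, sizes, and the choice of k-means are my illustrative assumptions, not an implemented feature):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical input snapshots from the previous generation (4 senses each).
snapshots = rng.normal(size=(300, 4))

# Plain 2-means clustering over the snapshots.
centers = snapshots[rng.choice(len(snapshots), size=2, replace=False)]
for _ in range(20):
    # Assign each snapshot to its nearest center...
    dists = np.linalg.norm(snapshots[:, None, :] - centers[None, :, :], axis=2)
    labels = dists.argmin(axis=1)
    # ...then move each center to the mean of its cluster.
    for k in range(2):
        if (labels == k).any():
            centers[k] = snapshots[labels == k].mean(axis=0)

# Each cluster would be tutored by a different parent: the child trains on
# (input, parent_a(input)) pairs for one cluster and parent_b for the other.
cluster_for_a = snapshots[labels == 0]
cluster_for_b = snapshots[labels == 1]
print(len(cluster_for_a), len(cluster_for_b))
```

The idea is that each parent teaches the child only in the region of input space where it gets to be the tutor, so the child can inherit different behaviors from each parent instead of averaging them everywhere.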
If you're interested in seeing more artificial evolution experiments like this one, I recommend checking out creatures avoiding planks or the 1996 research project/game Darwin Pond. A more established solution for the Competing Conventions Problem is the NeuroEvolution of Augmenting Topologies (NEAT) algorithm, which evolves network topology, maintains a concept of separate species, and uses those subdivisions to inform how weights can be shared between creatures.
To make collision detection more efficient, a spatial hash data structure (binning) is built and maintained. I rendered it and mapped density to hue for debugging, but kept it because it looks kind of cool.
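A spatial hash of this kind can be sketched as below (cell size and API names are illustrative, not this project's actual code). Each creature is binned by its grid cell, so a collision query only has to examine the 3x3 block of cells around a point instead of every creature:

```python
from collections import defaultdict

CELL = 10.0  # hypothetical cell size, roughly the interaction radius

def cell_of(x, y):
    # Map a world position to integer grid-cell coordinates.
    return (int(x // CELL), int(y // CELL))

def build_hash(positions):
    # Bin every creature index by the cell its position falls in.
    grid = defaultdict(list)
    for i, (x, y) in enumerate(positions):
        grid[cell_of(x, y)].append(i)
    return grid

def nearby(grid, x, y):
    """Candidate colliders: only creatures in the 3x3 cells around (x, y)."""
    cx, cy = cell_of(x, y)
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            yield from grid.get((cx + dx, cy + dy), ())

positions = [(3.0, 4.0), (5.0, 5.0), (95.0, 80.0)]
grid = build_hash(positions)
print(sorted(nearby(grid, 4.0, 4.0)))  # -> [0, 1]; the far creature is skipped
```

Mapping each cell's occupancy count to a hue is then just a per-cell lookup of `len(grid[cell])`, which is presumably what produces the density visualization mentioned above.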