Why Averages are Evolutionary Dead Ends

February 17, 2026

A long time ago, people thought inheritance worked like mixing paints: crossover took two fit parents and blended their qualities together to produce a better offspring. That’s a common misconception about evolution, and it’s how I too had pictured crossover. Mix red paint with white and you get pink; mix that pink with white again and you get a lighter pink; keep going and eventually you get pure white. If crossover worked like this, variation in the population would be lost within a few generations, and everyone would look the same. For evolution, everyone looking the same means the process stops entirely: there’s nothing for natural selection to filter out. Natural selection relies on variation (diversity) among individuals. If everyone blends into the average, there’s nothing left to select for or against.
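The paint analogy is easy to see in a few lines of code. This is a toy sketch (the `mix` function is hypothetical, just averaging RGB channels) showing how repeated blending converges on a single color:

```python
def mix(c1, c2):
    """Blend two RGB colors by averaging each channel ("mixing paints")."""
    return tuple((a + b) // 2 for a, b in zip(c1, c2))

red, white = (255, 0, 0), (255, 255, 255)
color = red
for _ in range(10):
    color = mix(color, white)  # keep mixing the result back with white

print(color)  # each mix halves the remaining distance to pure white
```

Every blend is irreversible: once the red is diluted, no amount of further mixing can recover it. That is exactly the loss of variation the paragraph above describes.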

Modern genetics has shown that inheritance is not a blending of qualities. Genes act like a deck of cards, and crossover is just shuffling the deck: the cards mix, but they never blend. Imagine instead that the card values were averaged during crossover; after a few generations, every card would sit near the numerical average (e.g., ~7). The point is that an offspring is a combination of discrete genetic units inherited from its parents, not an average of them. That’s why, in the bit string analogy, I mentioned that an offspring’s genes should be discrete 1s or 0s rather than floating-point values (though many algorithms do use floating-point genes, relying on high mutation rates to counteract the averaging effect).
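The card-shuffle idea maps directly onto uniform crossover on bit strings. A minimal sketch (function name and parent values are my own, for illustration): each gene in the child is copied whole from one parent or the other, never averaged.

```python
import random

def uniform_crossover(p1, p2):
    """Each offspring gene is copied intact from one parent (a "card"
    taken from one deck or the other), never averaged between them."""
    return [random.choice(pair) for pair in zip(p1, p2)]

random.seed(1)
parent_a = [1, 1, 1, 1, 0, 0, 0, 0]
parent_b = [0, 0, 1, 1, 1, 1, 0, 0]
child = uniform_crossover(parent_a, parent_b)
print(child)  # a shuffle of the parents' bits, each still exactly 0 or 1
```

Note that no 0.5s can ever appear: the child is a recombination of discrete units, so the population can keep both alleles in circulation indefinitely.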

An effect of this non-blending crossover is the preservation of variance. A trait can be “hidden” (recessive) in one generation and re-emerge fully formed in the next. A child can have blue eyes like their grandfather even if both parents have brown eyes. This shows that the “blue eye” packet of information was passed down intact, not averaged out by the “brown eye” information. It also ensures that natural selection has variation to work with even when environmental conditions change rapidly.
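The blue-eyes example can be simulated with the classic Mendelian model. A toy sketch under that assumption (genotype strings and the `child_genotype` helper are mine): 'B' is the dominant brown allele, 'b' the recessive blue one, and each parent passes one allele down intact.

```python
import random

def child_genotype(parent1, parent2):
    """Each parent contributes one of their two alleles whole
    (Mendelian inheritance); nothing is averaged."""
    return random.choice(parent1) + random.choice(parent2)

random.seed(2)
# Both parents are brown-eyed carriers ("Bb"): brown shows, blue hides.
children = [child_genotype("Bb", "Bb") for _ in range(10_000)]
blue_eyed = sum(1 for c in children if c == "bb") / len(children)
print(blue_eyed)  # roughly 0.25: the hidden allele re-emerges fully formed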

Note on evolutionary algorithms

In Machine Learning terms, “blending” collapses the search distribution toward a single point (the mean). Evolution requires a broad distribution (high variance) to explore the “fitness landscape” effectively. If a population averages out too quickly, the algorithm gets stuck in a “local optimum” and we end up with a sub-optimal solution.

Another note on evolutionary algorithms with LLMs

A lot of newer approaches use LLMs as mutation and crossover operators. My hypothesis is that while mutation works well, crossover might converge to an average very quickly. More work is needed to confirm that an LLM-based crossover does not blend two parents, but mixes them discretely, like shuffling a deck of cards.