I’m not sure whether or not to be surprised when computer programming and philosophy collide. The two disciplines seem so incompatible, yet are both based on an ability to look at the world and discern truisms in the way things behave. It is always fascinating to me when I discover some fundamental truism in life that is reflected in, or uncovered by, a piece of software. This will hopefully be the first in a series of articles on the subject.
The most philosophically influential software I’ve encountered was a Plague applet I wrote about 8 years ago while working on a more realistic variation of the Game of Life. Among other things, the applet reveals the cyclic nature of a natural system that is balanced by opposing forces. Which, I suppose, is where the saying, “The only constant is change” comes from.
However, the most important thing it revealed to me was how balanced systems react when attempts are made to suppress one or more of the forces at work. Such suppression will work for a little while, but the system eventually adapts and returns to more or less the same state as before. The only problem being that maintaining that state now requires you to keep the suppression measure in place; as soon as you remove it, the opposing force – now stronger and more evolved – will run rampant.
The real-world example – the one on which the applet is based – is our modern dependency on antibiotics. They worked great when initially developed. But as their use has become pervasive, we are starting to see reduced efficacy and the emergence of “super germs”.
It’s not just medicine where we see such behavior. Virtually every aspect of our modern world behaves this way to some degree: the Cuban trade embargo, the fight for water in the American Southwest, carpool lanes, rent-controlled apartments and, most especially, the War on Terror. All are dynamic systems that react in unpredictable ways to attempts to influence them.
Thus, as a society we need to be much more critical of the policies and infrastructure we develop, because short-term fixes that are merely responses to a surge in public opinion or an anomalous event create political, social, and technological dependencies that last for generations.
3 responses to “Wisdom in Algorithms, Part I – Be Careful What You Wish For”
“Infections spread faster/easier between cells that have different colors.”
But shouldn’t they spread more easily between cells of similar colors/genetic code? I’m more likely to catch the stomach flu from my son than his goldfish. (I happen to know this empirically.)
What happens to the simulation if you make this change?
“What happens to the simulation if you make this change?”
The short answer is that the simulation becomes very, very boring. 🙂
Infection is the dominant force responsible for changing the color of a cell. If only similar-color cells got infected, the system would quickly trend toward a couple large patches of the uniform color, and then not change much.
My original logic behind the different-color approach – aside from wanting the simulation to be visually appealing – was that viruses work by changing our genetic code (or, in this case, the color). Or, to put it another way, the color indicates what the cell is resistant to.
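For readers curious what that different-color rule might look like in code, here is a minimal sketch in Python. The original applet's source isn't shown, so every name and parameter here (grid size, number of colors, infection probability) is a hypothetical stand-in; the only rule taken from the discussion above is that infection crosses color boundaries and recolors the infected cell.

```python
import random

SIZE = 20          # hypothetical grid dimension
COLORS = 4         # hypothetical number of cell colors
INFECT_PROB = 0.3  # hypothetical chance a different-color neighbor infects

def step(grid):
    """One generation: each cell may be infected (recolored) by a
    randomly chosen orthogonal neighbor of a *different* color."""
    new = [row[:] for row in grid]
    for y in range(SIZE):
        for x in range(SIZE):
            # pick one random orthogonal neighbor (toroidal wrap-around)
            dx, dy = random.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
            neighbor = grid[(y + dy) % SIZE][(x + dx) % SIZE]
            # infection only spreads across color boundaries
            if neighbor != grid[y][x] and random.random() < INFECT_PROB:
                new[y][x] = neighbor
    return new

# start from a random field of colors and run a few generations
grid = [[random.randrange(COLORS) for _ in range(SIZE)] for _ in range(SIZE)]
for _ in range(50):
    grid = step(grid)
```

If you flip the condition to `neighbor == grid[y][x]`, you get the "boring" same-color variant described above: infection can no longer change anything, so the grid freezes into large uniform patches.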
Instead of using goldfish and people as an analogy, think in terms of human populations, like the Native Americans when Europeans first arrived in America. Genetically very similar, but immunologically drastically (catastrophically!) different.
Thanks for including “carpool lanes, rent controlled apartments”… two issues you and I have discussed, and two of my pet peeves, primarily for the reasons mentioned in your thesis here.