theory of complexity must both account for the intuitive pull of these measures and confront the troubling relativism lurking beneath their surfaces.

The Shannon entropy measure suffered from two primary problems. First, since Shannon entropy is an *information theoretic* quantity, it can only be appropriately applied to things that have the logical structure of *messages*. To make this work as a general measure of complexity for *physical systems*, we would have to come up with an uncontroversial way of representing parts of the world as messages generally—a tall order indeed. Second, we saw that there doesn't seem to be a strict correlation between changes in the Shannon entropy of messages and the complexity of the systems with which those messages are associated. I argued that in order for Shannon entropy to function as a measure of complexity, a requirement called the correlation condition must be satisfied: a monotonic increase in the complexity of physical systems must be correlated with either a monotonic increase or a monotonic decrease in the Shannon entropy of the messages associated with those systems. The paradigm case here (largely in virtue of being quite friendly to representation as a string of bits) is the case of three strands of DNA: one that codes for a normal human, one that consists of randomly paired nucleotides, and one that consists entirely of cytosine-guanine pairs. In order for the correlation condition to obtain, it must be the case that either the system consisting of the randomly paired nucleotides (which has an associated message with maximal Shannon entropy) *or* the C-G pair molecule (which has an associated message with minimal Shannon entropy) is more complex than the system consisting of the human-coding DNA molecule (which has an associated message with Shannon entropy that falls between these two extremes). This is not the case, though: any reasonable measure of complexity should rate a DNA strand that codes for a normal organism as

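The entropy ordering claimed for the three strands can be checked numerically. The following is a minimal sketch, not real genomic data: the `shannon_entropy` helper computes per-symbol entropy from nucleotide frequencies, the all-cytosine string stands in for one strand of the uniform C-G molecule, and the repeated six-letter motif is a purely illustrative stand-in for a coding sequence with uneven nucleotide frequencies.

```python
import math
import random
from collections import Counter

def shannon_entropy(message: str) -> float:
    """Per-symbol Shannon entropy: H = -sum_i p_i * log2(p_i)."""
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

random.seed(0)
# Randomly paired nucleotides: each of A, C, G, T equally likely.
random_strand = "".join(random.choice("ACGT") for _ in range(10_000))
# One strand of a molecule made entirely of C-G pairs: a uniform run.
uniform_strand = "C" * 10_000
# Hypothetical stand-in for a coding sequence: nonuniform but not random.
coding_like = "AAATGC" * 1_000

print(f"random:  {shannon_entropy(random_strand):.3f} bits/symbol")   # near the maximum of 2
print(f"uniform: {shannon_entropy(uniform_strand):.3f} bits/symbol")  # the minimum, 0
print(f"coding:  {shannon_entropy(coding_like):.3f} bits/symbol")     # between the extremes
```

On this rendering the random strand sits near the 2-bits-per-symbol maximum, the uniform strand at zero, and the coding-like strand in between, which is just the ordering the correlation condition must reconcile with our intuitive complexity judgments.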