Page:Lawhead columbia 0054D 12326.pdf/67


complex than frogs and ferns. This isn't going to do it, then: while size certainly matters somewhat, the mereological size measure fails to capture the sense in which it matters. Bigger doesn't always mean more complex, even if we can solve the all-important problem of defining what "bigger" even means.

In the case of Strevens’ proposal, we might well be suspicious of what Wikipedia editors would recognize as “weasel words” in the definition: a complex system is one that is made up of many parts that are somewhat independent of one another, and yet interact strongly. It’s difficult to extract anything very precise from this definition: if we didn’t already have an intuitive grasp of what ‘complex’ meant, a definition like this one wouldn’t go terribly far toward helping us acquire one. How many parts do we need? How strongly must they interact? How autonomous can they be? Without clear and precise answers to these questions, it’s hard to see how a definition like this can help us understand the general nature of complexity. In Strevens’ defense, this is not in the least fatal to his project, since his goal is not to give a complete analysis of complexity (but rather just to analyze the role that probability plays in the emergence of simple behavior from the chaotic interaction of many parts). Still, it won’t do for what we’re after here (and Kiesling can claim no such refuge, though her definition does come from an introductory-level talk). We’ll need to find something more precise.

2.1.2 Complexity as Hierarchical Position

First, let's try a refinement of the mereological size measure. The language of science (and, to an even greater degree, the language of philosophy of science) is rife with talk of levels. It's natural to think of many natural systems as showing a kind of hierarchical organization: lakes are