to the words one, two, three, . . . A child may learn to know these words in order, and to repeat them correctly like the letters of the alphabet, without attaching any meaning to them. Such a child may count correctly from the point of view of a grown-up listener, without having any idea of numbers at all. The operation of counting, in fact, can only be intelligently performed by a person who already has some idea what the numbers are; and from this it follows that counting does not give the logical basis of number.
Again, how do we know that the last number reached in the process of counting is the number of the objects counted? This is just one of those facts that are too familiar for their significance to be realised; but those who wish to be logicians must acquire the habit of dwelling upon such facts. There are two propositions involved in this fact: first, that the number of numbers from 1 up to any given number is that given number—for instance, the number of numbers from 1 to 100 is a hundred; secondly, that if a set of numbers can be used as names of a set of objects, each number occurring only once, then the number of numbers used as names is the same as the number of objects. The first of these propositions is capable of an easy arithmetical proof so long as finite numbers are concerned; but with infinite numbers, after the first, it ceases to be true. The second proposition remains true, and is in fact, as we shall see, an immediate consequence of the definition of number. But owing to the falsehood of the first proposition where infinite numbers are concerned, counting, even if it were practically possible, would not be a valid method of discovering the number of terms in an infinite collection, and would in fact give different results according to the manner in which it was carried out.
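The contrast between the two propositions can be made concrete with a small worked illustration. The sketch below is not part of the original argument: it borrows the ordinal symbol omega and the cardinal symbol aleph-null, neither of which has yet been introduced, to show how counting the same infinite collection in two different orders reaches two different "last numbers," while the one-one correlation of the second proposition still assigns a single number to the collection.

```latex
% Proposition (1), finite case: the numbers from 1 up to n are n in number.
\[
  \#\{1, 2, 3, \dots, 100\} = 100.
\]
% Proposition (1), infinite case: counting the natural numbers in two
% different orders reaches two different order-types, so "the last number
% reached" depends on the manner in which the counting is carried out.
\[
  1,\ 2,\ 3,\ 4,\ \dots \;\longrightarrow\; \omega,
  \qquad
  2,\ 3,\ 4,\ \dots,\ 1 \;\longrightarrow\; \omega + 1.
\]
% Proposition (2) survives: the one-one correlation n -> n + 1 pairs each
% number with exactly one object and vice versa, so the two collections
% have the same cardinal number, aleph-null, whatever order is used.
\[
  n \;\longmapsto\; n + 1
  \quad\text{pairs}\quad
  \{1, 2, 3, \dots\} \;\text{with}\; \{2, 3, 4, \dots\}.
\]
```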