Page:EB1911 - Volume 22.djvu/392

PROBABILITY


principle, of which the following may be taken as an equivalent. If we distribute the favourable cases into several groups, the probability of the event will be the sum of the probabilities pertaining to each group.[1]
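The addition of probabilities over disjoint groups of favourable cases may be illustrated by a minimal sketch (a die throw; the example is supplied here, not taken from the text):

```python
from fractions import Fraction

# A die throw: six equally likely cases.  The favourable cases for the
# event "an even number turns up" may be distributed into the groups
# {2}, {4}, {6}; the probability of the event is the sum of the
# probabilities pertaining to each group.
groups = [{2}, {4}, {6}]
p_groups = [Fraction(len(g), 6) for g in groups]
p_event = sum(p_groups)   # 1/6 + 1/6 + 1/6 = 1/2

print(p_event)   # 1/2
```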

10. Another important instance of unverified probabilities occurs when it is assumed without specific experience that one phenomenon is independent of another, in such wise that the probability of the double event is equal to the probability of the one event multiplied by the probability of the other—as in the instance already given of two aces occurring. The assumption has been verified with respect to “runs” in some games of chance;[2] but it is legitimately applied far beyond those instances. The proposition that very long runs of particular digits, e.g. of 7, may be expected in the development of a constant like π (e.g. a run of six consecutive sevens if the expansion of the constant were carried to a million places of decimals) may be given as an instance in which our conviction greatly transcends specific verification. In the calculation of probable, and improbable, errors,[3] it has to be assumed without specific verification that the observations on which the calculation is based are independent of each other in the sense now under consideration. With these explanations we may accept Laplace's third principle: “If the events are independent of each other, the probability of their concurrence (l'existence de leur ensemble) is the product of their separate probabilities.”[4]
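The reckoning behind the run of sevens may be sketched as follows, under the article's specifically unverified assumption that the digits of a constant like π behave as independent, equally likely digits 0 to 9:

```python
# Product rule for independent events: the chance of six consecutive
# sevens at a given place is (1/10) multiplied by itself six times.
p_seven = 1 / 10                  # chance that any one digit is a 7
p_run = p_seven ** 6              # six independent sevens: 10^-6
places = 10 ** 6                  # a million places of decimals

# Expected number of places at which such a run begins, and the chance
# of at least one run (treating the starting places as independent,
# a simplifying assumption of this sketch).
expected_runs = places * p_run                # about 1
p_at_least_one = 1 - (1 - p_run) ** places    # about 1 - 1/e, i.e. 0.63

print(expected_runs, p_at_least_one)
```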

11. Interdependent Probabilities.—Among the principles of probabilities it is usual to enunciate, after Laplace, several other propositions.[5] But these may here be rapidly passed over as they do not seem to involve any additional philosophical difficulty.

12. It has been shown that when two events are independent of each other the product of their separate probabilities forms the probability of their concurrence. It follows that the probability of the double event divided by the probability of either, say the first, component gives the probability of the other, the second component event. The quotient, we might say, is the probability that when the first event has occurred, the second will occur. The proposition in this form is true also of events which are not independent of one another. Laplace exemplifies the composition of such interdependent probabilities by the instance of three urns, A, B, C, about which it is known that two contain only white balls and one only black balls.[6] The probability of drawing a white ball from an assigned urn, say C, is ⅔. The probability that, a white ball having been drawn from C, a ball drawn from B will be white, is ½. Therefore the probability of the double event, drawing a white ball from C and also from B, is ⅔ × ½, or ⅓. The question now arises: supposing we know only the probability of the double event, which probability we will call [BC], and the probability of one of them, say [C] (but not, as in the case instanced, the mechanism of their interdependence), what can we infer about the probability [B] of the other event (an event such as, in the above instance, drawing a white ball from the urn B)—the separate probability irrespective of what has happened as to the urn C? We cannot in general say that [B] = [BC]/[C], but only that [B] is that quotient increased or diminished by some unknown quantity k, which may be either positive or negative. It would, however, be improper to treat k as zero on the ground that it is equally likely (in the long run of similar data) to be positive or negative.
For given values of [BC] and [C], k has not this equiprobable character, since its positive and negative ranges are not in general equal; as appears from considering that [B] cannot be less than [BC], nor greater than unity.[7]
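Laplace's urn instance can be checked by direct enumeration. The urns and the probabilities ⅔, ⅓ and ½ are from the text; the enumeration itself is supplied here:

```python
from fractions import Fraction

# Laplace's instance: three urns A, B, C, two containing only white
# balls and one only black, without knowing which is which.  The three
# possible positions of the black urn are equally likely.
cases = ["A", "B", "C"]   # the urn that holds the black balls

def prob(event):
    # Probability of an event as a fraction of the equally likely cases.
    return Fraction(sum(event(black) for black in cases), len(cases))

p_C = prob(lambda black: black != "C")                    # white from C: 2/3
p_BC = prob(lambda black: black not in ("B", "C"))        # white from both: 1/3

# The quotient of the double event by the component gives the
# probability that, C having yielded white, B will also yield white.
p_B_given_C = p_BC / p_C                                  # 1/2

print(p_C, p_BC, p_B_given_C)
```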

13. Probability of Causes and Future Effects.—The first principles which have been established afford an adequate ground for the reasoning which is described as deducing the probability of a cause from an observed event.[8] If with the poet[9] we may represent a perfect mixture by the waters of the Po in which the “two Doras” and other tributaries are indiscriminately commingled, there is no great difference in respect of definition and deduction between the probability that a certain particle of water should have emanated from a particular source, or should be discharged through a particular mouth of the river. “This principle,” we may say with De Morgan, “of the retrospective or ‘inverse’ probability is not essentially different from the one first stated (Principle I.).”[10] Nor is a new first principle necessarily involved when after ascending from an effect to a cause we descend to a collateral effect.[11] It is true that in the investigation of causes it is often necessary to have recourse to the unverified species of probability. An instance has already been given of several approximately equiprobable causes, the several values of a quantity under measurement, from one of which the observed phenomena, the given set of observations, must have, so to speak, emanated. A simpler instance of two alternative causes occurs in the investigation which J. S. Mill[12] has illustrated—whether an event, such as a succession of aces, has been produced by a particular cause, such as loading of the die, or by that mass of “fleeting causes” called chance. It is sufficient for the argument that the “a priori” probabilities of the alternatives should not be very unequal.[13]
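Mill's question about the succession of aces can be put in the form of inverse probability. The equal priors and the supposed effect of the loading below are illustrative assumptions, not given in the article, which requires only that the a priori probabilities be not very unequal:

```python
from fractions import Fraction

# Two alternative causes of a run of aces: a loaded die, or chance
# with a fair die.  Priors and the loading model are assumptions.
prior_loaded = Fraction(1, 2)
prior_fair = Fraction(1, 2)
p_ace_loaded = Fraction(1, 2)   # assumed effect of the loading
p_ace_fair = Fraction(1, 6)     # fair die

def posterior_loaded(n_aces):
    # Inverse probability of the cause "loaded" after n_aces aces:
    # each cause's prior is multiplied by the chance it gives the
    # observed run, and the products are compared.
    like_loaded = prior_loaded * p_ace_loaded ** n_aces
    like_fair = prior_fair * p_ace_fair ** n_aces
    return like_loaded / (like_loaded + like_fair)

for n in (1, 3, 6):
    print(n, posterior_loaded(n))
```

Even with these modest assumptions the posterior probability of loading mounts rapidly: 3/4 after one ace, 27/28 after three.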

14. (2) Whether Credibility is Measurable.—The domain of probabilities according to some authorities does not extend much, if at all, beyond the objective phenomena which have been described in the preceding paragraphs. The claims of the science to measure the subjective quantity, degree of belief, are disallowed or minimized. Belief, it is objected, depends upon a complex of perceptions and emotions not amenable[14] to calculus. Moreover, belief is not credibility; even if we do believe with more or less confidence in exact conformity with the measure of probability afforded by the calculus, ought we so to believe? In reply it must be admitted that many of the beliefs on which we have to act are not of the kind for which the calculus prescribes. It was absurd of Craig[15] to attempt to evaluate the credibility of the Christian religion by mathematical calculation. But there seem to be a number of simpler cases of which we may say with De Morgan[16] “that in the universal opinion of those who examine the subject, the state of mind to which a person ought to be able to bring himself” is in accordance with the regulation measure of probability. If in the ordeal to which Portia's suitors were subjected there had been a picture of her not in one only, but in two of the caskets, then—though the judgment of the principal parties might be distorted by emotion—the impartial spectator would normally expect with greater confidence than before that at any particular trial a casket containing the likeness of the lady would be chosen. So the indications of a thermometer may not correspond to the sensations of a fevered patient, but they serve to regulate the temperature of a public library so as to secure the comfort of the majority. 
This view does not commit us to the quantitative precision of De Morgan that in a case such as above supposed we ought to “look three times as confidently upon the arrival as upon the non-arrival” of the event.[17] Two or three roughly distinguished degrees of credibility—very probable, as probable as not, very improbable, practically impossible—suffice for the more important applications of the calculus. Such is the character of the judgments which the calculus enables us to form with respect to the occurrence of a certain difference between the real value of any quantity under measurement and the value assigned to it by the measurement. The confidence that the constants which we have determined are accurate within certain limits is a subjective feeling which cannot be dislodged from an important part of probabilities.[18] This sphere of subjective probability is widened by the latest developments of the science[19] so far as they add to the number of constants for which it is important to determine the probable—and improbable—error. For instance, a measure of the deviation of observations from an average or mean value was required by the older writers only as subordinate to the determination of the mean, but now this “standard deviation” (below, par. 98) is often treated as an entity for which it is important to discover the limits of error.[20] Some of the newer methods may also serve to countenance the measurement of subjective quantity, in so far as they successfully apply the calculus to quantities not admitting of a precise unit, such as colour
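The judgment of limits within which the error of a measurement probably lies may be sketched numerically. The data are hypothetical, and the factor 0.6745 (the probable error of a normal law of error as a multiple of the standard deviation) is a standard figure not stated in this passage:

```python
import math

# Hypothetical repeated measurements of one quantity.
observations = [10.2, 9.8, 10.1, 9.9, 10.0, 10.3, 9.7]

n = len(observations)
mean = sum(observations) / n
# Standard deviation of the observations about their mean.
standard_deviation = math.sqrt(sum((x - mean) ** 2 for x in observations) / n)

# "Probable error": the deviation which the error of a single
# observation is as likely as not to exceed, assuming a normal law.
probable_error = 0.6745 * standard_deviation
# Probable error of the mean of the n observations.
probable_error_of_mean = probable_error / math.sqrt(n)

print(mean, standard_deviation, probable_error)
```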


  1. Bertrand on “Probabilités composées,” op. cit. art. 23.
  2. In some of the experiences referred to at par. 5.
  3. See below pars. 132, 159.
  4. Op. cit. Introduction.
  5. There is a good statement of them in Boole's Laws of Thought, ch. xvi. § 7. Cf. De Morgan, “Theory of Probabilities” (Encyc. Metrop.), §§ 12 seq.
  6. Laplace, op. cit. Introduction, IVe Principe; cf. Ve Principe and liv., II. ch. i. § 1.
  7. In such a case there seems to be a propriety in expressing the indeterminate element in our data, not as above, but as proposed by Boole in his remarkable Laws of Thought, ch. xvii., ch. xviii., § 1 (cf. Trans. Edin. Roy. Soc. (1857), vol. xxi.; and Trans. Roy. Soc. (1862), vol. clii. pt. i. p. 251); the undetermined constant now representing the probability that if the event C does not occur the event B will. The values of this constant—in the absence of specific data, and where independence is not presumable—are, it should seem, equally distributed between the values 0 and 1. Cf. as to Boole's Calculus, Mind, loc. cit., ix. 230 seq.
  8. Laplace's Sixth Principle.
  9. Manzoni.
  10. De Morgan, Theory of Probabilities, § 19; cf. Venn, Logic of Chance, ch. vii. § 9; Edgeworth, “On the Probable Errors of Frequency Constants,” Journ. Stat. Soc. (1908), p. 653. The essential symmetry of the inverse and the direct methods is shown by an elegant proof which Professor Cook Wilson has given for the received rules of inverse probability (Nature, 1900, Dec. 13).
  11. Laplace's Seventh Principle.
  12. Logic, book III., ch. xviii. § 6.
  13. Cf. above, par. 8; below, par. 46.
  14. Cf. Venn, Logic of Chance, p. 126.
  15. See the reference to Craig in Todhunter, History . . . of Probability.
  16. Formal Logic, p. 173.
  17. Ibid. Cf. “Theory of Probabilities” (Encyc. Metrop.), note to § 5, “Wherever the term greater or less can be applied there twice, thrice, &c., can be conceived, though not perhaps measured by us.”
  18. It is well remarked by Professor Irving Fisher (Capital and Income, 1907, ch. xvi.) that Bernoulli's theorem involves a “subjective” element, a “psychological magnitude.” The remark is applicable to the general theory of error, of which the theorem of Bernoulli is a particular case (see below, pars. 103, 104).
  19. In the hands of Professor Karl Pearson, Mr Sheppard and Mr Yule. Cf. par. 149, below.
  20. Cf. Edgeworth, Journ. Stat. Soc. (Dec. 1908).