Page:The New International Encyclopædia 1st ed. v. 16.djvu/489

PROBABILITY.

If an event can happen in $a$ ways and fail in $b$ ways, all equally likely to occur, the probability of its happening is defined to be $\frac{a}{a+b}$, and the probability of its failing is defined to be $\frac{b}{a+b}$, the two being complementary. In this case, the odds in favor of the event are said to be $a$ to $b$, and the odds against the event are said to be $b$ to $a$. E.g. there are five ways of drawing one black ball from five black balls and three ways of drawing a white one from three white balls. Hence the probability of drawing a black ball from the whole eight on the first trial is $\frac{5}{8}$, and of not drawing a black ball, or, what is the same thing, of drawing a white one, is $\frac{3}{8}$. The odds in favor of drawing a black ball are 5 to 3; the odds against this are 3 to 5. Likewise, the odds in favor of drawing a white ball are 3 to 5, and the odds against it are 5 to 3.

If the probabilities of two independent events taking place are respectively $\frac{a}{a+b}$ and $\frac{a'}{a'+b'}$, the probability that both will happen is $\frac{aa'}{(a+b)(a'+b')}$. The probability of both events failing is $\frac{bb'}{(a+b)(a'+b')}$: when "fail" is substituted for "happen," $bb'$ must be substituted for $aa'$. Similarly, the probability that the first event happens and the second event fails is $\frac{ab'}{(a+b)(a'+b')}$, and the probability that the first event fails and the second event happens is $\frac{a'b}{(a+b)(a'+b')}$. E.g., if $p$ and $p'$ are the respective probabilities that each of two events happens, then $pp'$ is the probability that both happen. In like manner, if there are any number of independent events, the probability that they will all happen is the product of their respective probabilities of happening.

If $p$ represents the probability of the happening of an event in one trial and $q$ the probability of its failing, the probability that it will happen exactly $r$ times in $n$ trials is $\frac{n(n-1)\cdots(n-r+1)}{r!}\,p^r q^{n-r}$. The probability that an event will fail exactly $r$ times in $n$ trials is $\frac{n(n-1)\cdots(n-r+1)}{r!}\,p^{n-r} q^r$. In the expansion of $(p+q)^n$, viz.
$$p^n + np^{n-1}q + \frac{n(n-1)}{2!}\,p^{n-2}q^2 + \cdots,$$

the terms represent respectively the probabilities of the happening of the event exactly $n$ times, $n-1$ times, $n-2$ times, and so on, in $n$ trials. Hence the most probable number of successes and failures in $n$ trials is given by the greatest term in the corresponding series. E.g. the probability of throwing an ace in one trial with a die is $\frac{1}{6}$ and of failing to do so is $\frac{5}{6}$. Also $\left(\frac{1}{6}+\frac{5}{6}\right)^4 = \frac{1}{1296} + \frac{20}{1296} + \frac{150}{1296} + \frac{500}{1296} + \frac{625}{1296}$; hence the probability of throwing an ace 4 times in 4 throws is $\frac{1}{1296}$, the probability of throwing an ace 3 times in 4 throws is $\frac{20}{1296}$, the probability of throwing an ace 2 times in 4 throws is $\frac{150}{1296}$, the probability of throwing an ace 1 time in 4 throws is $\frac{500}{1296}$, and the probability of throwing an ace no times in 4 throws is $\frac{625}{1296}$. Since the last fraction is the largest, the case of no ace in 4 throws of a die is more probable than that of 1, 2, 3, or 4 aces.

A problem in life insurance, a subject to which the theory of probability has been of indispensable service, will serve to show the applications of the subject. A table of mortality gives the numbers alive at each successive year of their age, out of a given number of children born. If $A_n$ and $A_{n+1}$ be the numbers in the table corresponding to the $n$th and $(n+1)$th years of age, the inference from the table is that of $A_n$ individuals now alive, and of $n$ years of age, $A_{n+1}$ will live one additional year at least. Hence the chance that any one of them dies during the year is $\frac{A_n - A_{n+1}}{A_n}$. Calling this $1-p$, $p$ is the chance that any one of them will survive the year. Of two individuals, one $n$ years old, and the other $n'$, what are the chances that (a) only one lives a year? (b) one, at least, lives a year? (c) both do not live a year? Calling the individuals A and B, the chance of A living out the year is $p$, and the chance of his dying within the year is $1-p$. For B these are $p'$ and $1-p'$.
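The opening rules, the definition $\frac{a}{a+b}$ and the product formulas for independent events, can be checked with exact rational arithmetic. The sketch below is illustrative only: the helper name `probability` and the second event (a fair coin, probability $\frac{1}{2}$, standing in for $\frac{a'}{a'+b'}$) are assumptions, not from the text.

```python
from fractions import Fraction

def probability(ways_to_happen, ways_to_fail):
    # The article's definition: an event that can happen in a ways and
    # fail in b ways, all equally likely, has probability a / (a + b).
    return Fraction(ways_to_happen, ways_to_happen + ways_to_fail)

# The ball example: 5 black balls, 3 white balls.
p_black = probability(5, 3)          # 5/8
p_white = probability(3, 5)          # 3/8
assert p_black + p_white == 1        # the two are complementary
# Odds in favor of black are 5 to 3, i.e. a to b.

# Product rules for two independent events; the second event here
# (a fair coin, probability 1/2) is a hypothetical illustration.
p1, p2 = p_black, Fraction(1, 2)
both_happen = p1 * p2                # aa'/((a+b)(a'+b'))
both_fail   = (1 - p1) * (1 - p2)    # bb'/((a+b)(a'+b'))
first_only  = p1 * (1 - p2)          # ab'/((a+b)(a'+b'))
second_only = (1 - p1) * p2          # a'b/((a+b)(a'+b'))
assert both_happen + both_fail + first_only + second_only == 1
```

The four products cover every outcome of the pair of events, so they sum to unity, mirroring the complementarity noted in the text.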
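The four-throw die computation can be reproduced from the general term of the binomial expansion. A minimal sketch (the helper name `exactly` is an assumption):

```python
from fractions import Fraction
from math import comb

def exactly(n, r, p):
    # The general term of the expansion of (p + q)^n with q = 1 - p:
    # the chance of exactly r successes in n independent trials.
    return comb(n, r) * p**r * (1 - p)**(n - r)

p_ace = Fraction(1, 6)                     # chance of an ace in one throw
probs = {r: exactly(4, r, p_ace) for r in range(5)}

# P(4 aces) = 1/1296, P(3) = 20/1296, P(2) = 150/1296,
# P(1) = 500/1296, P(0) = 625/1296; the five terms sum to 1.
assert sum(probs.values()) == 1
most_probable = max(probs, key=probs.get)  # no ace is the likeliest case
```

The greatest term picks out the most probable number of successes, as the text states: here zero aces, with probability $\frac{625}{1296}$.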
Hence the chance that A lives and B dies is $p(1-p')$, and the chance that B lives and A dies is $p'(1-p)$. Hence the answer to (a) is $p(1-p') + p'(1-p) = p + p' - 2pp'$. The second case includes, in addition to the conditions of (a), the chance that both survive, which is $pp'$. Hence the answer to (b) is $p + p' - pp'$. In the third case the chance that both live a year is $pp'$. Hence the chance that they will not both live is $1 - pp'$.

The theory of probability also furnishes a measure of expectation. The law of expectation in its simplest form may be stated thus: The value of a contingent gain is the product of the sum to be gained into the chance of winning it. Suppose A, B, and C have made a pool, each subscribing $1, and that a game of pure chance (i.e. not dependent on skill) is to be played by them for the $3. What is the value of the expectation of each? By the conditions, all are equally likely to win the pool; hence its contingent value must be the same to each, and, obviously, the sum of these values must represent the whole amount in question. The worth of the expectation of each is therefore $1. That is, if A wishes to retire from the game before it is played out, the fair price which B or C ought to pay him for his share is simply $1. But this is obviously $\frac{1}{3}$ of $3, i.e. the value of the pool multiplied by his chance of getting it.

Another very important application of the theory of probability is to the deduction of the most probable value from a number of observations, each of which is liable to certain accidental errors. In a set of such observations, the probable error is a quantity such that there is the same probability of the true error being greater or less than it, and this probable error has been shown to be least when the sum of the squares of the errors is a minimum. The method for obtaining this least error is called the method of least squares. See Least Squares, Method of.

The doctrine of probabilities dates as far back as Fermat and Pascal (1654).
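The three insurance chances just derived can be checked numerically. The mortality-table figures below are hypothetical, chosen only to give concrete values for $p$ and $p'$:

```python
from fractions import Fraction

def survival_chance(alive_now, alive_next_year):
    # p = A_{n+1} / A_n; 1 - p is the chance of dying within the year.
    return Fraction(alive_next_year, alive_now)

# Hypothetical mortality-table entries, for illustration only.
p  = survival_chance(1000, 990)   # individual A, aged n
pB = survival_chance(800, 784)    # individual B, aged n'

only_one     = p * (1 - pB) + pB * (1 - p)   # (a) exactly one survives
at_least_one = only_one + p * pB             # (b) at least one survives
not_both     = 1 - p * pB                    # (c) not both survive

assert only_one == p + pB - 2 * p * pB       # the form given in the text
assert at_least_one == p + pB - p * pB
```

As the text observes, (b) exceeds (a) by exactly the chance $pp'$ that both survive, and (c) is the complement of that chance.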
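The law of expectation in the pool example admits a one-line check; the function name here is an illustrative assumption:

```python
from fractions import Fraction

def expectation(prize, chance_of_winning):
    # Value of a contingent gain: the sum to be gained
    # multiplied by the chance of winning it.
    return prize * chance_of_winning

# A, B, and C each subscribe $1 to a $3 pool in a game of pure chance.
pool = 3
value_each = expectation(pool, Fraction(1, 3))
assert value_each == 1          # a fair buy-out price is $1
assert 3 * value_each == pool   # the three expectations exhaust the pool
```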
Huygens (1657) gave the first scientific treatment of the subject, and Jakob Bernoulli's Ars Conjectandi (posthumous, 1713) and De Moivre's Doctrine of Chances (1718) raised the subject to the plane of a branch of mathematics. The theory of errors may be traced back to Cotes's Opera Miscellanea (posthumous, 1722), but a memoir prepared by Simpson in 1755 (printed 1756) first applied the theory