[Ans. 7/12.]

(b) What is the probability that the first ball drawn was red,
given that the second ball drawn was black?

[Ans. 2/5.]

Supplementary exercises.

13. Assume that in the World Series each team has probability 1/2 of
winning each game, independently of the outcomes of any other
game. Assign a tree measure. (See Section ?? for the tree.) Find
the probability that the series ends in four, five, six, and seven
games, respectively.

14. Assume that in the World Series one team is stronger than the
other and has probability .6 for winning each of the games. Assign
a tree measure and find the following probabilities.

(a) The probability that the stronger team wins in 4, 5, 6, and
7 games, respectively.
(b) The probability that the weaker team wins in 4, 5, 6, and 7
games, respectively.
(c) The probability that the series ends in 4, 5, 6, and 7 games,
respectively.
[Ans. .16; .27; .30; .28.]
(d) The probability that the strong team wins the series.
[Ans. .71.]

15. Redo Exercise 14 for the case of two poorly matched teams, where
the better team has probability .9 of winning a game.

[Ans. (c) .66; .26; .07; .01; (d) .997.]

16. In the World Series from 1905 through 1965 (excluding series of
more than seven games) there were 11 four-game, 14 five-game,
13 six-game, and 20 seven-game series. Which of the assumptions
in Exercises 13, 14, 15 comes closest to predicting these results?
Is it a good fit?

[Ans. .6; No.]

17. Consider the following assumption concerning World Series: Ninety
per cent of the time the two teams are evenly matched, while 10
per cent of the time they are poorly matched, with the better
team having probability .9 of winning a game. Show that this
assumption comes closer to predicting the actual outcomes than
those considered in Exercise 16.

18. We are given three coins. Coin A is fair while coins B and C are
loaded: B has probability .6 of heads and C has probability .4 of
heads. A game is played by tossing a coin twice starting with coin
B. If a head is obtained, B is tossed again, otherwise the second
coin to be tossed is chosen at random from A and C.

(a) Draw the tree for this game, assigning branch and path
weights.
(b) Let p be the statement “The first toss results in heads” and
let q be the statement “The second toss results in heads”.
Find Pr[p], Pr[q], Pr[q|p].
[Ans. .6; .54; .6.]
19. A and B play a series of games for which they are evenly matched.
A player wins the series either by winning two games in a row,
or by winning a total of three games. Construct the tree and the
tree measure.
(a) What is the probability that A wins the series?
(b) What is the probability that more than three games need to
be played?
20. In a room there are three chests, each chest contains two drawers,
and each drawer contains one coin. In one chest each drawer
contains a gold coin; in the second chest each drawer contains a
silver coin; and in the last chest one drawer contains a gold coin
and the other contains a silver coin. A chest is picked at random
and then a drawer is picked at random from that chest. When
the drawer is opened, it is found to contain a gold coin. What is
the probability that the other drawer of that same chest will also
contain a gold coin?
[Ans. 2/3.]



4.7 Bayes's probabilities
The following situation often occurs. Measures have been assigned in
a possibility space U. A complete set of alternatives, p1, p2, . . . , pn, has
been singled out. Their probabilities are determined by the assigned
measure. (Recall that a complete set of alternatives is a set of state-
ments such that for any possible outcome one and only one of the
statements is true.) We are now given that a statement q is true. We
wish to compute the new probabilities for the alternatives relative to
this information. That is, we wish the conditional probabilities Pr[pj|q]
for each pj. We shall give two different methods for obtaining these
probabilities.

The first is by a general formula. We illustrate this formula for
the case of four alternatives: p1, p2, p3, p4. Consider Pr[p2|q]. From the
definition of conditional probability,

    Pr[p2|q] = Pr[p2 ∧ q] / Pr[q].

But since p1, p2, p3, p4 are a complete set of alternatives,

    Pr[q] = Pr[p1 ∧ q] + Pr[p2 ∧ q] + Pr[p3 ∧ q] + Pr[p4 ∧ q].

Thus

    Pr[p2|q] = Pr[p2 ∧ q] / (Pr[p1 ∧ q] + Pr[p2 ∧ q] + Pr[p3 ∧ q] + Pr[p4 ∧ q]).

Since Pr[pj ∧ q] = Pr[pj]Pr[q|pj], we have the desired formula

    Pr[p2|q] = Pr[p2]Pr[q|p2] / (Pr[p1]Pr[q|p1] + Pr[p2]Pr[q|p2] + Pr[p3]Pr[q|p3] + Pr[p4]Pr[q|p4]).
Similar formulas apply for the other alternatives, and the formula gen-
eralizes in an obvious way to any number of alternatives. In its most
general form it is called Bayes's theorem.
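
The computation is easy to carry out mechanically. As a small illustration (ours, not part of the original text; the function name bayes_posteriors is hypothetical), the following Python sketch computes the revised probabilities for any number of alternatives from the priors Pr[pj] and the conditionals Pr[q|pj]:

    def bayes_posteriors(priors, likelihoods):
        """Return the revised probabilities Pr[p_j | q] for a complete set of alternatives.

        priors      -- Pr[p_j] for each alternative p_j
        likelihoods -- Pr[q | p_j] for each alternative p_j
        """
        # Numerators: Pr[p_j] * Pr[q | p_j] = Pr[p_j ∧ q]
        joint = [p * l for p, l in zip(priors, likelihoods)]
        total = sum(joint)  # this is Pr[q]
        return [j / total for j in joint]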

Example 4.14 Suppose that a freshman must choose among mathe-
matics, physics, chemistry, and botany as his or her science course. On
the basis of the interest he or she expressed, his or her adviser assigns
probabilities of .4, .3, .2 and .1 to the student's choosing each of the
four courses, respectively. The adviser does not hear which course the
student actually chose, but at the end of the term the adviser hears
that he or she received an A in the course chosen. On the basis of the
difficulties of these courses the adviser estimates the probability of the
student getting an A in mathematics to be .1, in physics .2, in chemistry
.3, and in botany .9. How can the adviser revise the original estimates
as to the probabilities of the student taking the various courses? Using
Bayes's theorem we get

    Pr[The student took math|The student got an A]
        = (.4)(.1) / [(.4)(.1) + (.3)(.2) + (.2)(.3) + (.1)(.9)] = 4/25.
Similar computations assign probabilities of .24, .24, and .36 to the
other three courses. Thus the new information, that the student re-
ceived an A, had little effect on the probability of having taken physics
or chemistry, but it has made mathematics less likely, and botany much
more likely. ♦
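
As a numerical check, the sketch above reproduces these figures (again an illustration of ours, not the text's notation):

    posteriors = bayes_posteriors([.4, .3, .2, .1], [.1, .2, .3, .9])
    print([round(p, 2) for p in posteriors])  # [0.16, 0.24, 0.24, 0.36]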

It is important to note that knowing the conditional probabilities
of q relative to the alternatives is not enough. Unless we also know the
probabilities of the alternatives at the start, we cannot apply Bayes's
theorem. However, in some situations it is reasonable to assume that
the alternatives are equally probable at the start. In this case the
factors Pr[p1], . . . , Pr[p4] cancel from our basic formula, and we get the
special form of the theorem:

If Pr[p1] = Pr[p2] = Pr[p3] = Pr[p4], then

    Pr[p2|q] = Pr[q|p2] / (Pr[q|p1] + Pr[q|p2] + Pr[q|p3] + Pr[q|p4]).


Example 4.15 In a sociological experiment the subjects are handed
four sealed envelopes, each containing a problem. They are told to open
one envelope and try to solve the problem in ten minutes. From past
experience, the experimenter knows that the probability of their being
able to solve the hardest problem is .1. With the other problems, they
have probabilities of .3, .5, and .8. Assume the group succeeds within
the allotted time. What is the probability that they selected the hardest
problem? Since they have no way of knowing which problem is in which
envelope, they choose at random, and we assign equal probabilities to
the selection of the various problems. Hence the above simple formula
applies. The probability of their having selected the hardest problem
is
    .1 / (.1 + .3 + .5 + .8) = 1/17.   ♦
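
With equal priors the general sketch above reduces to this special form automatically; for instance (again our own illustration):

    posteriors = bayes_posteriors([.25, .25, .25, .25], [.1, .3, .5, .8])
    print(posteriors[0])  # 0.0588... = 1/17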

The second method of computing Bayes's probabilities is to draw a
tree, and then to redraw the tree in a different order. This is illustrated
in the following example.

Example 4.16 There are three urns. Each urn contains one white
ball. In addition, urn I contains one black ball, urn II contains two,
and urn III contains three. An urn is selected and one ball is drawn. The
probability for selecting the three urns is 1/6, 1/2, and 1/3, respectively. If
we know that a white ball is drawn, how does this alter the probability
that a given urn was selected?
First we construct the ordinary tree and tree measure, in Figure
4.7.




Figure 4.7




Figure 4.8




Figure 4.9

Next we redraw the tree, using the ball drawn as stage 1, and the
urn selected as stage 2. (See Figure 4.8.) We have the same paths as
before, but in a different order. So the path weights are read off from
the previous tree. The probability of drawing a white ball is

    1/12 + 1/6 + 1/12 = 1/3.

This leaves the branch weights of the second stage to be computed. But
this is simply a matter of division. For example, the branch weights
for the branches starting at W must be 1/4, 1/2, 1/4 to yield the correct
path weights. Thus, if a white ball is drawn, the probability of having
selected urn I has increased to 1/4, the probability of having picked urn
III has fallen to 1/4, while the probability of having chosen urn II is
unchanged (see Figure 4.9). ♦
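
The division step is the same Bayes calculation as before. A quick check with exact fractions (our own illustration, not part of the text) reproduces the urn figures:

    from fractions import Fraction as F

    priors      = [F(1, 6), F(1, 2), F(1, 3)]  # urns I, II, III
    likelihoods = [F(1, 2), F(1, 3), F(1, 4)]  # Pr[white | urn]

    joint = [p * l for p, l in zip(priors, likelihoods)]  # path weights 1/12, 1/6, 1/12
    print(sum(joint))                       # Pr[white] = 1/3
    print([j / sum(joint) for j in joint])  # branch weights at W: 1/4, 1/2, 1/4
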
This method is particularly useful when we wish to compute all the
conditional probabilities. We will apply the method next to Example
4.14. The tree and tree measure for this example in the natural order are
shown in Figure 4.10. In that figure the letters M, P, C, and B stand
for mathematics, physics, chemistry, and botany, respectively.
The tree drawn in reverse order is shown in Figure 4.11. Each
path in this tree corresponds to one of the paths in the original tree.
Therefore the path weights for this new tree are the same as the weights
assigned to the corresponding paths in the first tree. The two branch
weights at the first level represent the probability that the student
receives an A or that he or she does not receive an A. These probabilities
are also easily obtained from the first tree. In fact,

    Pr[A] = .04 + .06 + .06 + .09 = .25

and

    Pr[¬A] = 1 − Pr[A] = .75.




Figure 4.10




Figure 4.11




Figure 4.12

We now have enough information to obtain the branch weights at
the second level, since the product of the branch weights must be the
path weights. For example, to obtain pA,M we have

    .25 · pA,M = .04;   pA,M = .16.

But pA,M is also the conditional probability that the student took
math given that he or she got an A. Hence this is one of the new
probabilities for the alternatives in the event that the student received
an A. The other branch probabilities are found in the same way and
represent the probabilities for the other alternatives. By this method
we obtain the new probabilities for all alternatives under the hypothesis
that the student receives an A as well as the hypothesis that the student
does not receive an A. The results are shown in the completed tree in
Figure 4.12.
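
The same arithmetic can be carried out in a few lines (a sketch of ours; the variable names are not from the text):

    priors     = {"M": .4, "P": .3, "C": .2, "B": .1}  # Pr[course]
    pr_A_given = {"M": .1, "P": .2, "C": .3, "B": .9}  # Pr[A | course]

    path = {c: priors[c] * pr_A_given[c] for c in priors}  # path weights ending in A
    pr_A = sum(path.values())                              # .25
    branch_A = {c: w / pr_A for c, w in path.items()}      # .16, .24, .24, .36
    branch_not_A = {c: priors[c] * (1 - pr_A_given[c]) / (1 - pr_A) for c in priors}
    print(branch_A)
    print(branch_not_A)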

Exercises
1. Urn I contains 7 red and 3 black balls and urn II contains 6 red
and 4 black balls. An urn is chosen at random and two balls are
drawn from it in succession without replacement. The first ball
is red and the second black. Show that it is more probable that
urn II was chosen than urn I.
2. A gambler is told that one of three slot machines pays off with
probability 1/2, while each of the other two pays off with probability
1/3.

(a) If the gambler selects one at random and plays it twice, what
is the probability that he or she will lose the first time and
win the second?

[Ans. 25/108.]

(b) If the gambler loses the first time and wins the second, what
is the probability he or she chose the favorable machine?

[Ans. 9/25.]


3. During the month of May the probability of a rainy day is .2. The
Dodgers win on a clear day with probability .7, but on a rainy
day only with probability .4. If we know that they won a certain
game in May, what is the probability that it rained on that day?

[Ans. 1/8.]


4. Construct a diagram to represent the truth sets of various state-
ments occurring in the previous exercise.

5. On a multiple-choice exam there are four possible answers for
each question. Therefore, if a student knows the right answer, he
or she has probability 1 of choosing correctly; if the student is
guessing, he or she has probability 1/4 of choosing correctly. Let us
further assume that a good student will know 90 per cent of the
answers, a poor student only 50 per cent. If a good student has
the right answer, what is the probability that he or she was only
guessing? Answer the same question about a poor student, if the
poor student has the right answer.

[Ans. 1/37; 1/5.]


6. Three economic theories are proposed at a given time, which ap-
pear to be equally likely on the basis of existing evidence. The
state of the American economy is observed the following year,
and it turns out that its actual development had probability .6 of
happening according to the ¬rst theory; and probabilities .4 and
.2 according to the others. How does this modify the probabilities
of correctness of the three theories?

7. Let p1 , p2 , p3 , and p4 be a set of equally likely alternatives. Let
Pr[q|p1 ] = a, Pr[q|p2 ] = b, Pr[q|p3 ] = c, Pr[q|p4 ] = d. Show that if
a + b + c + d = 1, then the revised probabilities of the alternatives
relative to q are a, b, c, and d, respectively.

8. In poker, Smith holds a very strong hand and bets a considerable
amount. The probability that Smith™s opponent, Jones, has a
better hand is .05. With a better hand Jones would raise the bet
with probability .9, but with a poorer hand Jones would raise
only with probability .2. Suppose that Jones raises, what is the
new probability that he or she has a winning hand?
[Ans. 9/47.]


9. A rat is allowed to choose one of five mazes at random. If we know
that the probabilities of his or her getting through the various
mazes in three minutes are .6, .3, .2, .1, .1, and we find that the
rat escapes in three minutes, how probable is it that he or she
chose the first maze? The second maze?

[Ans. 6/13; 3/13.]


10. Three men, A, B, and C, are in jail, and one of them is to be
hanged the next day. The jailer knows which man will hang, but
must not announce it. Man A says to the jailer, “Tell me the
name of one of the other two who will not hang. If both are to go
free, just toss a coin to decide which to say. Since I already know
that at least one of them will go free, you are not giving away
the secret.” The jailer thinks a moment and then says, “No, this
would not be fair to you. Right now you think the probability
that you will hang is 1/3, but if I tell you the name of one of the
others who is to go free, your probability of hanging increases to 1/2.
You would not sleep as well tonight.” Was the jailer's reasoning
correct?

[Ans. No.]

11. One coin in a collection of 8 million coins has two heads. The
rest are fair coins. A coin chosen at random from the collection
is tossed ten times and comes up heads every time. What is the
probability that it is the two-headed coin?

12. Referring to Exercise 11, assume that the coin is tossed n times
and comes up heads every time. How large does n have to be
to make the probability approximately 1/2 that you have the two-
headed coin?

[Ans. 23.]

13. A statistician will accept job a with probability 1/2, job b with
probability 1/3, and job c with probability 1/6. In each case he or
she must decide whether to rent or buy a house. The probabilities
of buying are 1/3 if he or she takes job a, 2/3 if he or she takes job
b, and 1 if he or she takes job c. Given that the statistician buys
