Mr. A and Ms. S are betting on repeated flips of a coin. At the start of the game Mr. A has a dollars and Ms. S has b dollars; at each flip the loser pays the winner one dollar, and the game continues until either player is "ruined". Making use of the fact that in an equitable game each player's mathematical expectation is zero, find the probability that Mr. A will win Ms. S's b dollars before he loses his a dollars.
Can someone please explain the approach? Thanks.
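
In case it helps to fix notation, here is a minimal sketch of how the zero-expectation hint is usually applied (the symbol p is my own notation, and the sketch assumes the game ends with probability 1): let p be the probability that Mr. A wins Ms. S's b dollars, so with probability 1 - p he loses his a dollars. His net gain is then +b with probability p and -a with probability 1 - p, and setting his expectation to zero gives

\[ p\,b - (1-p)\,a = 0 \quad\Longrightarrow\quad p = \frac{a}{a+b}. \]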