Can someone please help me with this one?
A college is trying to fill one remaining seat in its Masters programme. It
judges the merit of any applicant by giving him an entrance test. It is known
that there are two interested applicants who will apply sequentially. If the
college admits the first applicant, it cannot admit the second. If it rejects the
first applicant, it must admit the second. It is not possible to delay a decision on
the first applicant till the second applicant is tested. At the time of admitting or
rejecting the first applicant, the college thinks the second applicant’s mark will
be a continuous random variable drawn from the uniform distribution between 0
and 100. (Recall that a random variable x is uniformly distributed on [a, b] if the
density function of x is given by f(x) = 1/(b − a) for x ∈ [a, b].) If the college wants
to maximize the expected mark of its admitted student, what is the lowest mark
for which it should admit the first applicant?
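
For part (i) I tried to check my intuition numerically. Here is a small Monte Carlo sketch (Python, function names are my own, not from the problem): for a candidate cutoff t, the college admits the first applicant iff his mark is at least t, and otherwise takes the second. My guess is that the best cutoff should be the expected mark of the second applicant, i.e. (0 + 100)/2, but I am not sure.

```python
import random

def expected_admitted_mark(threshold, trials=200_000):
    """Estimate the expected mark of the admitted student when the
    first applicant is admitted iff his mark >= threshold; otherwise
    the second applicant (Uniform(0, 100)) is admitted."""
    total = 0.0
    for _ in range(trials):
        first = random.uniform(0, 100)
        second = random.uniform(0, 100)
        total += first if first >= threshold else second
    return total / trials

# Scan a few cutoffs; the estimate should peak near whichever cutoff
# is actually optimal.
for t in (30, 40, 50, 60, 70):
    print(t, round(expected_admitted_mark(t), 2))
```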
(ii) Now suppose there are three applicants who apply sequentially. Before an
applicant is tested, it is known that his likely mark is an independent continuous
random variable drawn from the uniform distribution between 0 and 100. What
is the lowest mark for which the college should admit the first student? What is
the lowest mark for which the college should admit the second student in case
the first is rejected?
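
For part (ii) my attempt is a backward-induction sketch along the same lines (again my own code, assuming the "admit iff the mark beats the expected value of continuing" logic carries over): with one applicant left the college must take him, with two left it compares the current mark against that continuation value, and so on.

```python
def continuation_value(applicants_left, lo=0.0, hi=100.0):
    """Expected mark of the eventually admitted student when
    `applicants_left` applicants (each Uniform(lo, hi)) remain and the
    college uses the cutoff strategy described above."""
    if applicants_left == 1:
        return (lo + hi) / 2          # the last applicant must be admitted
    v = continuation_value(applicants_left - 1, lo, hi)
    p_admit = (hi - v) / (hi - lo)    # chance the current mark beats v
    mean_if_admitted = (v + hi) / 2   # mean mark given it beats v
    return p_admit * mean_if_admitted + (1 - p_admit) * v

# If the cutoff for admitting an applicant is the value of continuing
# without him, these should be the answers to part (ii):
print("cutoff for the second of three:", continuation_value(1))
print("cutoff for the first of three:", continuation_value(2))
```

Does that reasoning look right, or am I missing something?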