# mathematics, statistics

You observe k i.i.d. copies of the discrete uniform random variable X_i, which takes values 1 through n with equal probability.

Define the random variable M as the maximum of these random variables, M = max_i(X_i).

1.) Find the probability that M <= m, as a function of m, for m ∈ {1, 2, …, n}.

2.) Find the probability that M = 1.

3.) Find the probability that M = m for m ∈ {2, 3, …, n}.

4.) For n = 2, find E[M] and Var(M) as functions of k.

1. 5.) As k (the number of samples) becomes very large, what is E[M] in terms of n?
As k → ∞, E[M] → ?

3. 1. (m/n)^k
   2. (1/n)^k
   3. 1 - (1/n)^k
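The CDF answer to part 1, P(M <= m) = (m/n)^k, can be sanity-checked with a quick Monte Carlo simulation. A minimal sketch; the values of n, k, and the trial count are arbitrary choices for illustration:

```python
import random

# Sanity-check P(M <= m) = (m/n)^k by simulation.
# n, k, and trials are arbitrary illustration values.
random.seed(0)
n, k, trials = 6, 3, 200_000

counts = [0] * (n + 1)  # counts[m] = number of trials with M <= m
for _ in range(trials):
    m_max = max(random.randint(1, n) for _ in range(k))
    for m in range(m_max, n + 1):
        counts[m] += 1

for m in range(1, n + 1):
    empirical = counts[m] / trials
    exact = (m / n) ** k
    print(f"m={m}: empirical={empirical:.4f}, exact={exact:.4f}")
```

The empirical frequencies land within Monte Carlo noise of (m/n)^k for every m, which matches the reasoning that M <= m exactly when all k draws are <= m, and each draw is <= m with probability m/n.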

4. 4. E[M] = 2 - 1/2^k, Var(M) = 1/2^k - 1/2^(2k)
   5. E[M] = n
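The n = 2 answers follow from the PMF alone: P(M = 1) = (1/2)^k and P(M = 2) = 1 - (1/2)^k, so no simulation is needed. A short exact check, with the range of k an arbitrary choice:

```python
# Exact check of E[M] = 2 - 1/2^k and Var(M) = 1/2^k - 1/2^(2k) for n = 2,
# computed directly from the PMF P(M=1) = (1/2)^k, P(M=2) = 1 - (1/2)^k.
for k in range(1, 11):
    p1 = (1 / 2) ** k
    p2 = 1 - p1
    mean = 1 * p1 + 2 * p2
    var = (1 - mean) ** 2 * p1 + (2 - mean) ** 2 * p2
    assert abs(mean - (2 - 1 / 2 ** k)) < 1e-12
    assert abs(var - (1 / 2 ** k - 1 / 2 ** (2 * k))) < 1e-12
print("E[M] and Var(M) formulas verified for k = 1..10")
```

The variance also simplifies by hand: with p = (1/2)^k, Var(M) = p(1 - p), which is exactly 1/2^k - 1/2^(2k).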

5. I think 1. is:
P(M<=m)=(1/m)^k.

6. I think 1 is: (n-m+1)/n. There are always n numbers to choose from, but only a subset of those will be >= the current max?

7. 3.) Find the probability that M = m for m ∈ {2, 3, …, n}.
   3. 1 - (1/n)^k
Shouldn't this probability depend on m somehow?
P(M=m) = P(M<=m+1) - P(M<=m) = ((m+1)/n)^k - (m/n)^k
?

8. Yeah melo, your number 3 is clearly wrong. It depends on m, first of all.

Anon is right that it must depend on m, but the difference is taken the other way: P(M=m) = P(M<=m) - P(M<=m-1) = (m/n)^k - ((m-1)/n)^k
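This final PMF can be verified exhaustively by enumerating all n^k equally likely sample sequences. A small sketch, with n and k arbitrary small values:

```python
from itertools import product

# Exhaustive check of P(M = m) = (m/n)^k - ((m-1)/n)^k by enumerating
# all n^k equally likely sequences (n and k kept small on purpose).
n, k = 4, 3
pmf = [0] * (n + 1)
for sample in product(range(1, n + 1), repeat=k):
    pmf[max(sample)] += 1

for m in range(1, n + 1):
    exact = (m / n) ** k - ((m - 1) / n) ** k
    assert abs(pmf[m] / n ** k - exact) < 1e-12
print("Counts by m:", [pmf[m] for m in range(1, n + 1)])
```

The enumeration makes the telescoping visible: exactly m^k - (m-1)^k of the n^k sequences have maximum m (all entries <= m, minus those with all entries <= m-1), and note the formula also covers m = 1, where it reduces to (1/n)^k as in part 2.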

