You observe π‘˜ i.i.d. copies 𝑋₁, …, 𝑋ₖ of a discrete uniform random variable, each taking values 1 through 𝑛 with equal probability.

Define the random variable 𝑀 as the maximum of these random variables: 𝑀 = maxᡒ 𝑋ᡒ.

1a. Find the probability that π‘€β‰€π‘š, as a function of π‘š, for π‘šβˆˆ{1,2,…,𝑛}.
1b. Find the probability that 𝑀=1.
1c. Find the probability that 𝑀=π‘š for π‘š ∈ {2,3,…,𝑛}.
1d. For 𝑛=2, find 𝐄[𝑀] and Var(𝑀) as a function of π‘˜.

1e. As π‘˜ (the number of samples) becomes very large, what is 𝐄[𝑀] in terms of 𝑛?

As π‘˜β†’βˆž, 𝐄[𝑀] >

𝑛.

1a. The event {𝑀 ≤ π‘š} occurs exactly when all π‘˜ samples are less than or equal to π‘š. Each sample is equally likely to be any value from 1 to 𝑛, so the probability that a single sample is at most π‘š is π‘š/𝑛. Because the samples are independent, the probability that all π‘˜ samples are at most π‘š is (π‘š/𝑛) raised to the power π‘˜:

𝑃(π‘€β‰€π‘š) = (π‘š/𝑛)^π‘˜

1b. The probability that 𝑀=1 is the probability that all π‘˜ samples are equal to 1, which is (1/𝑛)^π‘˜.

1c. For π‘š ∈ {2,3,…,𝑛}, the event {𝑀 = π‘š} means that every sample is at most π‘š but not every sample is at most π‘š-1. (Note that several samples may tie at the maximum, so the event is not "exactly one sample equals π‘š"; counting it that way would miss outcomes with two or more samples equal to π‘š.) Using the CDF from part 1a, 𝑃(𝑀 = π‘š) = 𝑃(𝑀 ≤ π‘š) βˆ’ 𝑃(𝑀 ≤ π‘š-1) = (π‘š/𝑛)^π‘˜ βˆ’ ((π‘š-1)/𝑛)^π‘˜. The same formula is valid at π‘š = 1 as well, where it reduces to the (1/𝑛)^π‘˜ of part 1b.
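
Because 𝑛^π‘˜ is small for small 𝑛 and π‘˜, the PMF can be verified exactly by enumerating all equally likely outcomes. A minimal Python sketch (the values of 𝑛 and π‘˜ are arbitrary):

from itertools import product

n, k = 4, 3               # small illustrative values
for m in range(1, n + 1):
    # brute force: fraction of the n^k equally likely outcomes whose max is m
    exact = sum(max(x) == m for x in product(range(1, n + 1), repeat=k)) / n**k
    formula = (m**k - (m - 1)**k) / n**k
    print(m, exact, formula)   # the two columns agree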

1d. For 𝑛=2, the random variable 𝑀 can only take the values 1 and 2. From part 1b, 𝑃(𝑀=1) = (1/2)^π‘˜, so 𝑃(𝑀=2) = 1 βˆ’ (1/2)^π‘˜ (the same result follows from part 1c with π‘š = 2).

𝐄[𝑀] = 1 Β· (1/2)^π‘˜ + 2 Β· (1 βˆ’ (1/2)^π‘˜) = 2 βˆ’ (1/2)^π‘˜

To find Var(𝑀), first compute 𝐄[𝑀^2]:

𝐄[𝑀^2] = 1^2 Β· (1/2)^π‘˜ + 2^2 Β· (1 βˆ’ (1/2)^π‘˜) = 4 βˆ’ 3 Β· (1/2)^π‘˜

Var(𝑀) = 𝐄[𝑀^2] βˆ’ 𝐄[𝑀]^2 = 4 βˆ’ 3 Β· (1/2)^π‘˜ βˆ’ (2 βˆ’ (1/2)^π‘˜)^2 = (1/2)^π‘˜ Β· (1 βˆ’ (1/2)^π‘˜)
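
The two closed forms can be checked against a simulation. A sketch in Python, with an arbitrary choice of π‘˜:

import random
import statistics

k = 5
p1 = 0.5 ** k                     # P(M = 1) = (1/2)^k
print(2 - p1, p1 * (1 - p1))      # E[M] and Var(M) from the formulas

trials = 200_000
samples = [max(random.randint(1, 2) for _ in range(k)) for _ in range(trials)]
print(statistics.fmean(samples), statistics.pvariance(samples))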

1e. As π‘˜ β†’ ∞, 𝑃(𝑀 ≤ π‘š) = (π‘š/𝑛)^π‘˜ β†’ 0 for every π‘š < 𝑛 (each ratio π‘š/𝑛 is strictly less than 1), so the probability mass concentrates on 𝑛: 𝑃(𝑀 = 𝑛) = 1 βˆ’ ((𝑛-1)/𝑛)^π‘˜ β†’ 1. Intuitively, with enough samples the largest possible value 𝑛 is almost certain to appear. Therefore 𝐄[𝑀] β†’ 𝑛.
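
One way to see this numerically uses the tail-sum identity 𝐄[𝑀] = Σ_{π‘š=0}^{𝑛-1} 𝑃(𝑀 > π‘š), which with the CDF from part 1a gives 𝐄[𝑀] = 𝑛 βˆ’ Σ_{π‘š=1}^{𝑛-1} (π‘š/𝑛)^π‘˜. A short Python sketch (𝑛 = 6 is an arbitrary choice) tabulates how quickly the sum vanishes:

n = 6                              # illustrative value
for k in (1, 5, 20, 100):
    # E[M] = n - sum_{m=1}^{n-1} (m/n)^k
    e_m = n - sum((m / n) ** k for m in range(1, n))
    print(k, e_m)                  # rises from 3.5 toward 6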
