In the following problem, please select the correct answer.

Let X be a non-negative random variable. Then, for any a>0, the Markov inequality takes the form
P(X≥a)≤(a^c)E[X^5].

What is the value of c?

c= unanswered

Suppose that X_1,X_2,⋯ are random variables with mean E[X_i]=0 and the same finite variance var(X_i)=σ^2. Let M_n=(X_1+⋯+X_n)/n. In which of the following cases does M_n converge in probability to 0?

(i): X_1,X_2,⋯ are independent and identically distributed.

(ii): X_1,X_2,⋯ are identically distributed and cov(X_i,X_j)≤(1/2)^|i−j| for all i,j.

(iii): X_1,X_2,⋯ are such that cov(X_i,X_j)≤(1/2)^|i−j| for all i,j (but are not necessarily identically distributed).

(i)
(ii)
(iii)
(i)(ii)
(i)(ii)(iii)
unanswered

Let X be a Bernoulli random variable with unknown parameter p∈[0,1]. Does there exist an unbiased estimator Θ^ of 1/p? In other words, does there exist a function g such that Θ^=g(X) and that the E[Θ^]=1/p? (Recall that the function g cannot depend on p).
unanswered

Hint: E[Θ^]=E[g(X)]=(1−p)⋅g(0)+p⋅g(1).

Let X_i (for i≥1) be independent and identically distributed Poisson random variables with parameter λ=1. Let S_n=∑_{i=1}^n X_i. Compute the following limits:

lim_(n→∞)P(S_n≥n)= unanswered

lim_(n→∞)P(S_n≥n+sqrt(n))= unanswered

4.

lim_(n→∞)P(S_n≥n)= 0.5
lim_(n→∞)P(S_n≥n+sqrt(n)) = 1−Φ(1) ≈ 0.1587

1. c=-5

3. No

2. (i), (ii) and (iii)

To answer the first question, we need to find the value of c in the inequality P(X≥a)≤(a^c)E[X^5]. The Markov inequality states that for any non-negative random variable Y and any constant a>0, P(Y≥a)≤E[Y]/a.

Since X is non-negative, the events {X≥a} and {X^5≥a^5} are identical, so applying the Markov inequality to the non-negative random variable X^5 gives P(X≥a)=P(X^5≥a^5)≤E[X^5]/a^5=(a^(−5))E[X^5]. Comparing with the given form, c=−5.
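As a sanity check on the c=−5 form, the bound P(X≥a)≤E[X^5]/a^5 can be verified by simulation. A minimal sketch in Python, using a hypothetical choice of X ~ Exponential(1) and a=3 (any non-negative distribution would do):

```python
import random

random.seed(0)
# Hypothetical example: X ~ Exponential(1), a = 3.
# Markov applied to X^5 gives P(X >= a) <= E[X^5] / a^5.
xs = [random.expovariate(1.0) for _ in range(200_000)]
a = 3.0
lhs = sum(x >= a for x in xs) / len(xs)           # empirical P(X >= a)
rhs = sum(x ** 5 for x in xs) / len(xs) / a ** 5  # empirical E[X^5] * a^(-5)
print(lhs, rhs)  # the left side should not exceed the right side
```

Here the bound is loose (for Exponential(1), P(X≥3)=e^(−3)≈0.05 while E[X^5]/3^5=120/243≈0.49), which is typical of Markov-type inequalities.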

For the second question, we are given a sequence of random variables X_1, X_2, ... with mean E[X_i] = 0 and the same finite variance var(X_i) = σ^2. We are asked in which of the three given cases, M_n = (X_1 + X_2 + ... + X_n)/n converges in probability to 0.

(i) In this case, the random variables X_1, X_2, ... are independent and identically distributed with finite variance, so the weak law of large numbers applies directly: M_n converges in probability to the mean, which is 0.

(ii) In this case, independence is replaced by the covariance bound cov(X_i,X_j)≤(1/2)^|i−j|. Then var(M_n)=(1/n^2)∑_{i,j}cov(X_i,X_j)≤(1/n^2)∑_{i,j}(1/2)^|i−j|≤3/n, which tends to 0, so by Chebyshev's inequality M_n converges in probability to 0.

(iii) The same variance bound holds here: the calculation in (ii) uses only the covariance condition, not identical distributions. Hence M_n converges in probability to 0 in this case as well.

So the correct answer is (i)(ii)(iii): in each case var(M_n)→0, and Chebyshev's inequality then gives convergence of M_n in probability to the mean 0.
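How quickly the covariance condition forces var(M_n) to 0 can be seen numerically. A short sketch that evaluates the upper bound (1/n^2)∑_{i,j}(1/2)^|i−j| on var(M_n):

```python
# Upper bound on var(M_n) when cov(X_i, X_j) <= (1/2)^|i-j| for all i, j:
# var(M_n) = (1/n^2) * sum_{i,j} cov(X_i, X_j)
#         <= (1/n^2) * sum_{i,j} (1/2)^|i-j|, which shrinks roughly like 3/n.
def var_bound(n):
    return sum(0.5 ** abs(i - j) for i in range(n) for j in range(n)) / n ** 2

for n in (10, 100, 1000):
    print(n, var_bound(n))  # the bound decreases toward 0 as n grows
```

Since the double sum is at most 3n (each row of the covariance matrix contributes less than 1 + 2·∑_{d≥1}(1/2)^d = 3), the bound is O(1/n), which is all Chebyshev's inequality needs.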

For the third question, we are asked if there exists an unbiased estimator Θ^ of 1/p for a Bernoulli random variable X with unknown parameter p in the range [0, 1]. To find the answer, we can use the given hint: E[Θ^] = E[g(X)] = (1−p)⋅g(0) + p⋅g(1).

For an estimator to be unbiased, its expected value must equal the quantity being estimated. Therefore, we need a function g such that E[g(X)]=(1−p)⋅g(0)+p⋅g(1)=1/p for every p∈(0,1].

However, for any fixed values g(0) and g(1), the expression (1−p)⋅g(0)+p⋅g(1) is an affine (degree at most 1) function of p, while 1/p is not: it is unbounded as p→0, whereas the affine expression stays bounded. No choice of g(0) and g(1) can make the two sides agree for all p.

Hence, there does not exist an unbiased estimator Θ^ of 1/p for a Bernoulli random variable X with unknown parameter p.
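One way to see the obstruction numerically: for any fixed g(0) and g(1), the map p ↦ (1−p)⋅g(0)+p⋅g(1) traces a straight line in p, while 1/p does not. A small sketch with hypothetical values g(0)=7, g(1)=−2 (any choice exhibits the same failure):

```python
# E[g(X)] = (1-p)*g(0) + p*g(1) is affine in p, so it cannot equal 1/p
# for all p in (0, 1]. Hypothetical values for g(0), g(1):
g0, g1 = 7.0, -2.0
for p in (0.1, 0.5, 0.9):
    est_mean = (1 - p) * g0 + p * g1  # always on the line from g0 to g1
    print(p, est_mean, 1 / p)         # 1/p bends; the line cannot follow it
```

The three target values (p, 1/p) = (0.1, 10), (0.5, 2), (0.9, 1.11...) are not collinear, so no affine function, and hence no unbiased estimator of the form g(X), can hit all of them.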

For the fourth and fifth questions, we are asked to compute the limits of certain probabilities involving a sequence of independent and identically distributed Poisson random variables.

To compute the limits, we need to use the properties of Poisson random variables. Specifically, the sum of independent Poisson random variables with the same parameter is also a Poisson random variable with the parameter equal to the sum of the individual parameters.

In this case, we have X_i ~ Poisson(1) for all i ≥ 1, so S_n = ∑_{i=1}^n X_i ~ Poisson(n), with E[S_n]=n and var(S_n)=n. The limits then follow from the central limit theorem: (S_n−n)/sqrt(n) converges in distribution to a standard normal random variable Z.

For the first limit, P(S_n ≥ n) = P((S_n−n)/sqrt(n) ≥ 0) → P(Z ≥ 0) = 1−Φ(0) = 0.5.

For the second limit, P(S_n ≥ n+sqrt(n)) = P((S_n−n)/sqrt(n) ≥ 1) → P(Z ≥ 1) = 1−Φ(1) ≈ 0.1587.
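Since S_n ~ Poisson(n), both tail probabilities can also be computed exactly for a moderately large n by summing the Poisson pmf, working in log space to avoid underflow. A minimal sketch with n=400:

```python
import math

def poisson_tail(n, k0, kmax):
    """P(S >= k0) for S ~ Poisson(n): sum exp(log pmf) for k0 <= k <= kmax."""
    total = 0.0
    for k in range(k0, kmax + 1):
        logp = -n + k * math.log(n) - math.lgamma(k + 1)  # log of n^k e^-n / k!
        total += math.exp(logp)
    return total

n = 400
upper = n + 40 * int(math.sqrt(n))  # the tail beyond this point is negligible
p1 = poisson_tail(n, n, upper)                      # P(S_n >= n)
p2 = poisson_tail(n, n + int(math.sqrt(n)), upper)  # P(S_n >= n + sqrt(n))
print(p1, p2)  # should be near 0.5 and 1 - Phi(1) ~= 0.1587
```

Even at n=400 the exact values sit close to the limiting answers 0.5 and 0.1587; the remaining gap is the O(1/sqrt(n)) discreteness correction that vanishes as n→∞.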
