Let X be a binomial random variable with parameters n and p, where 0 < p < 1.

Show that lim_{n→∞} E(1 + X) · E(1/(1 + X)) = 1.

To show that lim_{n→∞} E(1 + X) · E(1/(1 + X)) = 1, we will use properties of the binomial distribution and the definition of expected value.

The expected value of a random variable X, denoted as E(X), is a measure of the center of the distribution. For a binomial random variable X with parameters n and p, the expected value is given by E(X) = np.
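As a quick sanity check of the formula E(X) = np, here is a minimal sketch that computes the mean directly from the binomial PMF using only the standard library (the values n = 20 and p = 0.3 are arbitrary example parameters):

```python
from math import comb

def binom_pmf(n, p, x):
    """P(X = x) for a Binomial(n, p) random variable."""
    return comb(n, x) * p**x * (1 - p)**(n - x)

n, p = 20, 0.3  # arbitrary example parameters
mean = sum(x * binom_pmf(n, p, x) for x in range(n + 1))
print(mean, n * p)  # the two values agree up to floating-point error
```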

Now let's expand the expression E(1 + X) · E(1/(1 + X)). By linearity of expectation,

E(1 + X) · E(1/(1 + X)) = (E(1) + E(X)) · E(1/(1 + X))

Since E(1) = 1, this simplifies to

(1 + E(X)) · E(1/(1 + X))

Substituting E(X) = np for a binomial random variable gives

(1 + np) · E(1/(1 + X))

Now consider the behavior of the first factor as n approaches infinity. Since p is fixed with 0 < p < 1, E(X) = np grows without bound, and therefore (1 + np) approaches infinity as n approaches infinity. The product can converge to 1 only if the second factor decays to 0 at exactly the matching rate, so a limit alone is not enough: we need an exact expression for E(1/(1 + X)).

Now let's compute E(1/(1 + X)) exactly. By the definition of expectation,

E(1/(1 + X)) = Σ_{x=0}^{n} (1/(1 + x)) · P(X = x)

Substituting the binomial probability mass function P(X = x) = C(n, x) p^x (1-p)^(n-x):

E(1/(1 + X)) = Σ_{x=0}^{n} (1/(1 + x)) · C(n, x) p^x (1-p)^(n-x)

The key identity is

(1/(1 + x)) · C(n, x) = C(n+1, x+1) / (n + 1),

which follows directly from the factorial form of the binomial coefficient. Substituting it in, and multiplying and dividing by p:

E(1/(1 + X)) = (1/((n + 1)p)) · Σ_{x=0}^{n} C(n+1, x+1) p^(x+1) (1-p)^((n+1)-(x+1))

Re-indexing with y = x + 1, the sum runs over y = 1, ..., n+1, which is the Binomial(n+1, p) distribution summed over everything except the y = 0 term:

Σ_{y=1}^{n+1} C(n+1, y) p^y (1-p)^(n+1-y) = 1 - (1-p)^(n+1)

Therefore

E(1/(1 + X)) = (1 - (1-p)^(n+1)) / ((n + 1) p)

Since 0 < 1 - p < 1, the term (1-p)^(n+1) vanishes as n → ∞, so E(1/(1 + X)) does approach 0, but at the precise rate 1/((n + 1)p).
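The closed form E(1/(1 + X)) = (1 - (1-p)^(n+1)) / ((n + 1) p) can be verified numerically against direct summation over the PMF; a minimal sketch (n = 25 and p = 0.4 are arbitrary example values):

```python
from math import comb

def e_inv(n, p):
    """E(1/(1+X)) by direct summation over the Binomial(n, p) PMF."""
    return sum(comb(n, x) * p**x * (1 - p)**(n - x) / (1 + x)
               for x in range(n + 1))

def e_inv_closed(n, p):
    """Closed form: (1 - (1-p)^(n+1)) / ((n+1) * p)."""
    return (1 - (1 - p)**(n + 1)) / ((n + 1) * p)

n, p = 25, 0.4  # arbitrary example parameters
print(e_inv(n, p), e_inv_closed(n, p))  # the two values agree
```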

Finally, let's put everything together. Substituting the closed form E(1/(1 + X)) = (1 - (1-p)^(n+1)) / ((n + 1) p):

E(1 + X) · E(1/(1 + X)) = (1 + np) · (1 - (1-p)^(n+1)) / ((n + 1) p)

Take the two pieces separately. For the ratio, dividing numerator and denominator by p,

(1 + np) / ((n + 1) p) = (1/p + n) / (n + 1) → 1 as n → ∞,

and (1 - (1-p)^(n+1)) → 1 since 0 < p < 1. Multiplying the two limits:

lim_{n→∞} E(1 + X) · E(1/(1 + X)) = 1 · 1 = 1.

Note that (1 + E(X)) → ∞ and E(1/(1 + X)) → 0 individually, so the product has the indeterminate form "∞ · 0"; it is the exact rates of growth and decay that pin the limit down to 1.
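Numerically, the product converges to 1 as n grows; a minimal sketch using the closed form for E(1/(1 + X)) (p = 0.3 and the listed n values are arbitrary choices):

```python
def product(n, p):
    """E(1+X) * E(1/(1+X)) for a Binomial(n, p) random variable."""
    e1x = 1 + n * p                                  # E(1 + X)
    einv = (1 - (1 - p)**(n + 1)) / ((n + 1) * p)    # E(1/(1 + X))
    return e1x * einv

p = 0.3  # arbitrary example parameter
for n in (10, 100, 1000, 10000):
    print(n, product(n, p))  # values approach 1 as n increases
```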