We consider a 1-dimensional logistic regression problem, i.e., we assume that data \(X_i \in \mathbb{R}\), \(i = 1, \dots, n\), is given and that we get independent observations of

\[Y_i \mid X_i \sim \textsf{Ber}\!\left( \frac{e^{\beta X_i}}{1 + e^{\beta X_i}} \right),\]

where \(\beta \in \mathbb{R}\).

Moreover, recall that the associated log-likelihood for \(\beta\) is then given by

\[\ell(\beta) = \sum_{i=1}^{n} \left( Y_i X_i \beta - \ln(1 + \exp(X_i \beta)) \right).\]

Calculate the first and second derivatives of \(\ell\). Instructions: The summation \(\sum_{i=1}^{n}\) is already placed to the left of the answer box. Enter the summands in terms of \(\beta\), \(X_i\) (enter “X_i”) and \(Y_i\) (enter “Y_i”).

\(\displaystyle \ell'(\beta) = \sum_{i=1}^{n}\)

\(\displaystyle \ell''(\beta) = \sum_{i=1}^{n}\)

What can you conclude about \(\ell'(\beta)\)?

To calculate the first derivative of \(\ell(\beta)\), we differentiate each summand with respect to \(\beta\), using \(\frac{d}{d\beta} \ln(1 + e^{X_i \beta}) = \frac{X_i e^{X_i \beta}}{1 + e^{X_i \beta}}\):

\[\ell'(\beta) = \sum_{i=1}^{n} \left(Y_i X_i - \frac{X_i e^{\beta X_i}}{1 + e^{\beta X_i}}\right)\]
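
As a quick numerical sanity check, this closed form can be compared against a central-difference approximation of \(\ell\). Below is a minimal Python/NumPy sketch; the arrays X, Y and the evaluation point beta0 are synthetic, made up purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=50)                          # synthetic covariates (illustration only)
Y = rng.binomial(1, 1 / (1 + np.exp(-1.5 * X)))  # Y_i ~ Ber(e^{beta X_i}/(1+e^{beta X_i})), beta = 1.5

def log_lik(beta):
    # l(beta) = sum_i ( Y_i X_i beta - ln(1 + exp(X_i beta)) )
    return np.sum(Y * X * beta - np.log1p(np.exp(X * beta)))

def score(beta):
    # l'(beta) = sum_i ( Y_i X_i - X_i e^{beta X_i} / (1 + e^{beta X_i}) )
    return np.sum(Y * X - X * np.exp(beta * X) / (1 + np.exp(beta * X)))

beta0, h = 0.7, 1e-6
fd = (log_lik(beta0 + h) - log_lik(beta0 - h)) / (2 * h)  # central difference
print(score(beta0), fd)                          # the two values should agree closely
```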

To calculate the second derivative of \(\ell(\beta)\), we need to differentiate each term in the first derivative with respect to \(\beta\) and sum them up:

\[\ell''(\beta) = \sum_{i=1}^{n} \left(-\frac{X_i^2 e^{\beta X_i}}{1 + e^{\beta X_i}} + \frac{X_i^2 e^{2\beta X_i}}{(1 + e^{\beta X_i})^2}\right) = -\sum_{i=1}^{n} \frac{X_i^2 e^{\beta X_i}}{(1 + e^{\beta X_i})^2}\]
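
Both derivatives, including the simplification above, can also be checked symbolically. A short sketch using SymPy (the variable names are my own choice, not part of the problem):

```python
import sympy as sp

beta, X, Y = sp.symbols('beta X Y', real=True)
summand = Y * X * beta - sp.log(1 + sp.exp(X * beta))   # one term of l(beta)

d1 = sp.diff(summand, beta)       # first derivative of the summand
d2 = sp.diff(summand, beta, 2)    # second derivative of the summand

# Both differences simplify to 0, confirming the closed forms above.
print(sp.simplify(d1 - (Y * X - X * sp.exp(beta * X) / (1 + sp.exp(beta * X)))))
print(sp.simplify(d2 + X**2 * sp.exp(beta * X) / (1 + sp.exp(beta * X))**2))
```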

From the first derivative, we can conclude that \(\ell'(\beta)\) is the score function: it is the gradient of the log-likelihood \(\ell(\beta)\), i.e., the direction of steepest ascent. Its sign tells us in which direction \(\beta\) should be adjusted to increase \(\ell(\beta)\), and the maximum-likelihood estimator is found by solving \(\ell'(\beta) = 0\).
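
Concretely, this is the update rule of gradient ascent on \(\ell\). A minimal sketch, reusing X, Y, and score from the NumPy check above (the step size and iteration count are arbitrary illustrative choices):

```python
beta, lr = 0.0, 0.05
for _ in range(500):
    beta += lr * score(beta)   # step in the direction of steepest ascent
print(beta, score(beta))       # score(beta) should be close to 0 at the maximizer
```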


To calculate the first derivative of the log-likelihood function, we differentiate each term in the summation with respect to \(\beta\).

The first derivative of the log-likelihood function is given by:

\[\ell'(\beta) = \sum_{i=1}^{n} \left( Y_i X_i - \frac{X_i e^{\beta X_i}}{1 + e^{\beta X_i}} \right)\]

To calculate the second derivative of the log-likelihood function, we differentiate each term in the first derivative with respect to \(\beta\).

The second derivative of the log-likelihood function, after simplification, is given by:

\[\ell''(\beta) = -\sum_{i=1}^{n} \frac{X_i^2 e^{\beta X_i}}{(1 + e^{\beta X_i})^2}\]

From the first derivative, we can conclude that:

\(\ell'(\beta) = 0\) exactly when \(\sum_{i=1}^{n} \left( Y_i X_i - \frac{X_i e^{\beta X_i}}{1 + e^{\beta X_i}} \right) = 0\); it is the sum that must vanish, not each individual summand. This is the score equation, and since \(\ell''(\beta) \le 0\) for all \(\beta\), the log-likelihood is concave, so any solution \(\hat\beta\) of the score equation is a global maximizer of \(\ell\).
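
Because \(\ell''(\beta) < 0\) whenever some \(X_i \neq 0\), \(\ell\) is strictly concave and the score equation can be solved efficiently with Newton's method, which uses both derivatives computed above. A self-contained sketch under the same synthetic-data assumptions as before (data and starting point are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=50)                          # synthetic covariates
Y = rng.binomial(1, 1 / (1 + np.exp(-1.5 * X)))  # synthetic responses, true beta = 1.5

def score(beta):
    # l'(beta) = sum_i ( Y_i X_i - X_i e^{beta X_i} / (1 + e^{beta X_i}) )
    return np.sum(Y * X - X / (1 + np.exp(-beta * X)))

def curvature(beta):
    p = 1 / (1 + np.exp(-beta * X))              # logistic probabilities
    return -np.sum(X**2 * p * (1 - p))           # l''(beta) <= 0 for every beta

beta = 0.0
for _ in range(25):
    step = score(beta) / curvature(beta)
    beta -= step                                 # Newton update: beta - l'(beta)/l''(beta)
    if abs(step) < 1e-10:
        break
print(beta, score(beta))                         # MLE beta-hat; score(beta-hat) ~ 0
```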