Let f : [0, ∞) -> R be a differentiable function such that f'(x) >= f(x) > 0 for all x in [0, ∞). Prove that f'(x) >= f(x) for all x in [0, ∞).
You say
f'(x) >= f(x) > 0 for all x in [0,∞)
show that
f'(x) >= f(x) for all x in [0,∞)
Is that not the hypothesis already?
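The comment's point can be made precise by formalizing the statement: the goal is literally the first conjunct of the hypothesis, so the proof is a one-line projection. A sketch in Lean 4 with Mathlib, treating f' as an arbitrary function standing in for the derivative (an assumption for illustration, since nothing about differentiation is actually used):

```lean
import Mathlib.Data.Real.Basic

-- The hypothesis bundles f' x ≥ f x together with f x > 0;
-- the goal is just the first part, so we project it out.
example (f f' : ℝ → ℝ)
    (h : ∀ x : ℝ, 0 ≤ x → f' x ≥ f x ∧ f x > 0) :
    ∀ x : ℝ, 0 ≤ x → f' x ≥ f x :=
  fun x hx => (h x hx).1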
The conclusion to be proved is exactly the first inequality in the hypothesis, so no machinery such as the Mean Value Theorem is needed; the result is immediate. Written out as a proof by contradiction:

Assume, to the contrary, that there exists a point c in [0, ∞) such that f'(c) < f(c). Since c lies in [0, ∞), the hypothesis applies at c and gives f'(c) >= f(c). This directly contradicts f'(c) < f(c), so no such c can exist.

Therefore, f'(x) >= f(x) for all x in [0, ∞).

(If the intended exercise was something stronger: since f(x) > 0, the hypothesis f'(x) >= f(x) is equivalent to (ln f(x))' >= 1, and integrating from 0 to x gives the nontrivial bound f(x) >= f(0) e^x for all x >= 0.)
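As a quick sanity check, here is a small numerical illustration with the concrete choice f(x) = e^(2x) (my own example, not from the thread), which satisfies the hypothesis since f'(x) = 2e^(2x) >= e^(2x) > 0:

```python
import math

def f(x):
    # sample function satisfying the hypothesis: f(x) = e^(2x)
    return math.exp(2 * x)

def f_prime(x):
    # its exact derivative: f'(x) = 2 e^(2x)
    return 2 * math.exp(2 * x)

# verify f'(x) >= f(x) > 0 at sample points in [0, 10]
for i in range(101):
    x = i * 0.1
    assert f_prime(x) >= f(x) > 0

print("f'(x) >= f(x) > 0 holds at all sampled points")
```

Of course, a finite grid of sample points is no substitute for the proof; it only illustrates the inequality on one example function.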