Let X and Y be independent random variables, with X uniformly distributed on [0,1] and Y uniformly distributed on [0,2]. Find the PDF fZ(z) of Z = max{X,Y}.
For z<0 or z>2 :
fZ(z)=
For 0≤z≤1 :
fZ(z)=
For 1≤z≤2 :
fZ(z)=
a. 0
b. 1/4
c. 3/4
For z < 0 or z > 2:
fZ(z) = 0
For 0 ≤ z ≤ 1:
fZ(z) = z
For 1 ≤ z ≤ 2:
fZ(z) = 1/2
(These follow from FZ(z) = P(X ≤ z)P(Y ≤ z), which equals z²/2 on [0,1] and z/2 on [1,2]; differentiating gives z and 1/2.)
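The piecewise CDF and PDF above can be written out as small helper functions. This is a sketch, not part of the original answer, and the names `cdf_Z` and `pdf_Z` are mine:

```python
# Piecewise CDF and PDF of Z = max{X, Y}, using the fact that for
# independent X, Y: F_Z(z) = P(X <= z) * P(Y <= z).

def cdf_Z(z: float) -> float:
    if z < 0:
        return 0.0
    if z <= 1:
        return z * (z / 2)   # P(X<=z) = z, P(Y<=z) = z/2
    if z <= 2:
        return z / 2         # P(X<=z) = 1 once z >= 1
    return 1.0

def pdf_Z(z: float) -> float:
    if 0 <= z <= 1:
        return z             # d/dz (z^2 / 2)
    if 1 < z <= 2:
        return 0.5           # d/dz (z / 2)
    return 0.0

print(cdf_Z(1.0), pdf_Z(0.5), pdf_Z(1.5))  # → 0.5 0.5 0.5
```

Note that the CDF is continuous at z = 1 (both pieces give 1/2), even though the density jumps there from 1 down to 1/2.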
To find the PDF of Z, first find its CDF. Because Z = max{X, Y}, the event Z ≤ z happens exactly when both X ≤ z and Y ≤ z, and since X and Y are independent:
FZ(z) = P(Z ≤ z) = P(X ≤ z) · P(Y ≤ z)
For z < 0, FZ(z) = 0, and for z > 2, FZ(z) = 1, so fZ(z) = 0 outside [0,2].
For 0 ≤ z ≤ 1:
Since X is uniformly distributed on [0,1], its CDF is Fx(x) = x, so P(X ≤ z) = z. Since Y is uniformly distributed on [0,2], P(Y ≤ z) = (z-0)/(2-0) = z/2.
Therefore FZ(z) = z · (z/2) = z²/2, and
fZ(z) = dFZ(z)/dz = z.
For 1 ≤ z ≤ 2:
Here X ≤ 1 ≤ z always holds, so P(X ≤ z) = 1, while P(Y ≤ z) = z/2.
Therefore FZ(z) = 1 · (z/2) = z/2, and
fZ(z) = dFZ(z)/dz = 1/2.
(Note that FZ(z), a probability, can never exceed 1, and the PDF is the derivative of the CDF, not the CDF itself.)
In summary, the PDF of Z is given by:
For z < 0 or z > 2: fZ(z) = 0
For 0 ≤ z ≤ 1: fZ(z) = z
For 1 ≤ z ≤ 2: fZ(z) = 1/2
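As a sanity check (my addition, not part of the original answer), the CDF values z²/2 on [0,1] and z/2 on [1,2] can be estimated by simulation:

```python
import random

# Estimate F_Z(z) = P(max{X, Y} <= z) by Monte Carlo and compare with
# the closed form: z^2/2 for 0 <= z <= 1, and z/2 for 1 <= z <= 2.
random.seed(0)
N = 200_000
samples = [max(random.uniform(0, 1), random.uniform(0, 2)) for _ in range(N)]

for z, exact in [(0.5, 0.5**2 / 2), (1.0, 1.0 / 2), (1.5, 1.5 / 2)]:
    est = sum(s <= z for s in samples) / N
    print(f"z={z}: empirical F_Z={est:.3f}, exact F_Z={exact:.3f}")
```

With 200,000 samples the empirical estimates land within about ±0.005 of 0.125, 0.5, and 0.75.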
To find the probability density function (PDF) of Z, which is the maximum of X and Y, we need to consider different cases based on the value of z.
Case 1: For z < 0 or z > 2
In this case, Z can never be less than 0 or greater than 2, as it is the maximum of X and Y. Therefore, the PDF fZ(z) is equal to 0 for z < 0 or z > 2.
fZ(z) = 0 for z < 0 or z > 2
Case 2: For 0 ≤ z ≤ 1
If z is between 0 and 1, the maximum of X and Y is at most z only when both X ≤ z and Y ≤ z. Since X is uniformly distributed on [0, 1], P(X ≤ z) = z, and since Y is uniformly distributed on [0, 2], P(Y ≤ z) = z/2. By independence:
P(Z ≤ z) = P(X ≤ z) · P(Y ≤ z) = z · (z/2) = z²/2
The PDF fZ(z) is the derivative of the cumulative distribution function (CDF) of Z, so we differentiate P(Z ≤ z) with respect to z to get fZ(z).
fZ(z) = d/dz (z²/2)
= z for 0 ≤ z ≤ 1
Therefore, for 0 ≤ z ≤ 1, fZ(z) is equal to z.
fZ(z) = z for 0 ≤ z ≤ 1
Case 3: For 1 ≤ z ≤ 2
If z is between 1 and 2, then X ≤ 1 ≤ z always holds, so P(X ≤ z) = 1 and the event Z ≤ z reduces to Y ≤ z. Since Y is uniformly distributed on [0, 2], the probability that Y is less than or equal to z is z/2.
P(Y ≤ z) = z/2
Again, we differentiate P(Y ≤ z) with respect to z to get fZ(z).
fZ(z) = d/dz (P(Y ≤ z))
= d/dz (z/2)
= 1/2 for 1 ≤ z ≤ 2
Therefore, for 1 ≤ z ≤ 2, fZ(z) is equal to 1/2.
fZ(z) = 1/2 for 1 ≤ z ≤ 2
In summary, the PDF of Z, fZ(z), is:
- 0 for z < 0 or z > 2
- z for 0 ≤ z ≤ 1
- 1/2 for 1 ≤ z ≤ 2
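The density itself can also be checked directly (again my addition, a stdlib-only sketch) by binning simulated values of Z and comparing bin frequencies against the piecewise formula:

```python
import random

# Bin samples of Z = max{X, Y} and compare the empirical density in each
# bin with f_Z(z) = z on [0, 1] and f_Z(z) = 1/2 on [1, 2].
random.seed(1)
N = 400_000
width = 0.25
counts = [0] * 8  # 8 bins of width 0.25 covering [0, 2]
for _ in range(N):
    z = max(random.uniform(0, 1), random.uniform(0, 2))
    counts[min(int(z / width), 7)] += 1

for i, c in enumerate(counts):
    mid = (i + 0.5) * width          # bin midpoint
    f_exact = mid if mid <= 1 else 0.5
    f_emp = c / (N * width)          # empirical density in this bin
    print(f"z≈{mid:.3f}: empirical f_Z={f_emp:.3f}, exact f_Z={f_exact:.3f}")
```

Because f_Z is linear on [0, 1], the exact average density over each bin equals the value at the bin midpoint, so the comparison is clean: the empirical density rises linearly up to z = 1 and then sits flat at 1/2.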