A worker on a roof, 20.8 m above the ground, wants to drop a brick onto the back of a moving truck. The truck is moving at a constant velocity of 5.0 m/s alongside the building. The back of the truck is 1.2 m above the ground. How far from the point of impact must the truck be, the instant the brick is dropped, so that the brick will land on the back of the truck?

How long is the truck?

Figure out how long it will take the brick to fall to the truck, then figure out how far the truck will travel in that time.

To solve this problem, we can break it down into two parts:

1. Calculate the time it takes for the brick to fall from the roof to the bed of the truck.
2. Calculate the horizontal distance the truck travels during that time.

1. Calculate the time it takes for the brick to fall from the roof to the bed of the truck:
The brick does not fall all the way to the ground; it stops at the truck bed, so the drop height is

h = 20.8 m − 1.2 m = 19.6 m

For an object falling from rest through a height h, the fall time is

t = √(2h/g)

Where:
t = time in seconds
h = drop height in meters
g = acceleration due to gravity (approximately 9.8 m/s^2)

With h = 19.6 m:

t = √(2 × 19.6 / 9.8) = √4.0
t = 2.0 seconds
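The fall-time step can be checked with a short Python snippet (variable names are my own). Note that the drop height is the roof height minus the truck-bed height, since the brick stops at the bed, not the ground:

```python
import math

roof_height = 20.8   # m, roof above the ground
bed_height = 1.2     # m, truck bed above the ground
g = 9.8              # m/s^2, acceleration due to gravity

# The brick falls only to the truck bed, so subtract the bed height
drop = roof_height - bed_height      # 19.6 m
t = math.sqrt(2 * drop / g)          # fall time from rest
print(f"fall time: {t:.1f} s")       # 2.0 s
```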

2. Calculate the horizontal distance the truck travels during that time:
The truck moves at constant velocity, so the distance it covers equals its velocity multiplied by the fall time.

distance = velocity × time

With the truck's velocity of 5.0 m/s and a fall time of 2.0 seconds:

distance = 5.0 × 2.0
distance = 10 meters

So the truck must be 10 meters from the point of impact at the instant the brick is dropped for the brick to land on the back of the truck.
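The whole calculation can be sketched end to end in a few lines of Python (again, variable names are my own; the drop height is measured from the roof to the truck bed):

```python
import math

v_truck = 5.0                  # m/s, constant velocity of the truck
g = 9.8                        # m/s^2, acceleration due to gravity
drop = 20.8 - 1.2              # m, roof height minus truck-bed height

t = math.sqrt(2 * drop / g)    # fall time of the brick (2.0 s)
distance = v_truck * t         # horizontal distance the truck covers while the brick falls

print(f"required separation: {distance:.1f} m")  # 10.0 m
```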