A plane leaves the airport on a heading of 110 degrees at 300 mph; the wind out of the southwest is 42 mph.
Locate the plane after 4 hours.
Find the distance back to the airport and the angle from due east.
The resultant (ground) velocity is
320.0 mph @ 103°
so now you know how far it goes in 4 hours: 320.0 mph × 4 hours = 1280 miles.
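The resultant quoted above can be checked with a short script. This is a minimal sketch (the function name is my own); compass bearings, measured clockwise from due north, convert to east/north components with sin and cos respectively:

```python
import math

def to_components(speed, bearing_deg):
    """East and north components of a velocity given as a compass bearing
    (degrees clockwise from due north)."""
    rad = math.radians(bearing_deg)
    return speed * math.sin(rad), speed * math.cos(rad)

# Plane's air velocity: 300 mph on a heading of 110 degrees.
air_e, air_n = to_components(300, 110)

# Wind "out of the southwest" blows toward the northeast (bearing 45 degrees).
wind_e, wind_n = to_components(42, 45)

# Ground velocity is the vector sum of air velocity and wind.
gnd_e, gnd_n = air_e + wind_e, air_n + wind_n
speed = math.hypot(gnd_e, gnd_n)
bearing = math.degrees(math.atan2(gnd_e, gnd_n)) % 360
print(f"{speed:.1f} mph @ {bearing:.0f} deg")  # prints "320.0 mph @ 103 deg"
```

Note the argument order in `atan2(east, north)`: swapping the usual (y, x) order yields the angle measured from north instead of from the x-axis, i.e., a compass bearing.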
To locate the plane after 4 hours, we need its ground velocity: the vector sum of the plane's air velocity and the wind.
Step 1: Resolve the plane's air velocity into components.
A heading of 110 degrees is measured clockwise from due north, so:
Eastward Component = 300 mph × sin(110°) ≈ 281.9 mph
Northward Component = 300 mph × cos(110°) ≈ −102.6 mph
Step 2: Resolve the wind into components.
"Out of the southwest" means the wind comes from 225 degrees and blows toward the northeast, a bearing of 45 degrees, so both components are positive:
Eastward Component = 42 mph × sin(45°) ≈ 29.7 mph
Northward Component = 42 mph × cos(45°) ≈ 29.7 mph
Step 3: Add the components to get the ground velocity:
Eastward: 281.9 + 29.7 ≈ 311.6 mph
Northward: −102.6 + 29.7 ≈ −72.9 mph
Speed = √(311.6² + 72.9²) ≈ 320.0 mph, on a bearing of about 103 degrees.
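The component arithmetic in Steps 1 through 3 can be verified in a few lines. A sketch under the same sign convention (east and north positive, bearings clockwise from north); the variable names are my own:

```python
import math

def components(speed_mph, bearing_deg):
    # Eastward = speed * sin(bearing), northward = speed * cos(bearing),
    # with the bearing measured clockwise from due north.
    b = math.radians(bearing_deg)
    return speed_mph * math.sin(b), speed_mph * math.cos(b)

plane_e, plane_n = components(300, 110)  # ~ (281.9, -102.6) mph
wind_e, wind_n = components(42, 45)      # ~ (29.7, 29.7) mph
print(round(plane_e + wind_e, 1), round(plane_n + wind_n, 1))  # ground velocity
```

A common mistake is to plug in 225 degrees (the direction the wind comes from), which flips both wind components negative and pushes the plane the wrong way.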
Step 4: Calculate the final position of the plane.
Multiply the ground-velocity components by the flight time. Since we don't have the starting position of the plane, we take the airport as the origin (0, 0):
Eastward Position = 0 + 311.6 mph × 4 hours ≈ 1246.4 miles
Northward Position = 0 + (−72.9 mph) × 4 hours ≈ −291.6 miles
Therefore, after 4 hours, the plane's approximate position is (1246.4 miles, −291.6 miles) relative to the airport: about 1246.4 miles east and 291.6 miles south.
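Step 4 is a single scalar multiplication per component. A quick check, assuming the ground-velocity components computed above:

```python
gnd_e, gnd_n = 311.6, -72.9  # ground velocity in mph (east, north)
hours = 4

# Displacement = velocity component * time, per axis.
pos_e, pos_n = gnd_e * hours, gnd_n * hours
print(pos_e, pos_n)  # ~ 1246.4 miles east, -291.6 miles north (i.e., south)
```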
To find the distance back to the airport, we can use the Pythagorean theorem:
Distance = √(Eastward Position² + Northward Position²)
Distance = √((1246.4 miles)² + (−291.6 miles)²)
Distance ≈ √(1,553,513 miles² + 85,031 miles²)
Distance ≈ √(1,638,544 miles²)
Distance ≈ 1280 miles
Therefore, the distance back to the airport is approximately 1280 miles, consistent with 320.0 mph × 4 hours.
To find the angle from due east, we can use the inverse tangent function:
Angle = arctan(|Northward Position| / Eastward Position)
Angle = arctan(291.6 miles / 1246.4 miles)
Angle = arctan(0.234)
Angle ≈ 13.2 degrees
Therefore, the plane lies approximately 13.2 degrees south of due east of the airport; to return, it would fly about 13.2 degrees north of due west.
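Both final answers follow from the position components alone. A minimal sketch, assuming the position (1246.4, −291.6) miles found in Step 4:

```python
import math

pos_e, pos_n = 1246.4, -291.6  # plane's position after 4 hours, miles (east, north)

# Straight-line distance back to the airport (Pythagorean theorem).
dist = math.hypot(pos_e, pos_n)

# Angle measured from due east, with "south of east" taken as positive.
angle = math.degrees(math.atan2(-pos_n, pos_e))
print(f"{dist:.0f} miles, {angle:.1f} deg south of due east")
```

`math.hypot` computes √(x² + y²) directly, and `atan2` handles the quadrant automatically, so no sign juggling of the arctangent argument is needed.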