Two planes flying in opposite directions (one north, one south) pass each other 80 miles apart at the same altitude. The northbound plane is flying at 200 mph and the southbound plane at 150 mph. When are the planes 200 miles apart? (Round your answer to one decimal place.)

Suppose the planes are 200 miles apart t hours after passing each other.

Then EB = 200t
and BC = 150t,
so EC = EB + BC = 350t.
It's given that DC = 200 miles and DE = 80 miles.
By the Pythagorean theorem (right angle at E),
DC² = EC² + DE²
200² = (350t)² + 80²
40000 = 122500t² + 6400
Dividing through by 100:
400 = 1225t² + 64
1225t² = 336
t² = 336/1225 ≈ 0.2742857
t ≈ 0.5237 hours ≈ 31.4 minutes
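As a sanity check on the arithmetic, here is a short Python sketch of the same computation (the variable names are mine; the speeds, the 80-mile separation DE, and the 200-mile target DC come from the problem statement):

```python
import math

# Data from the problem statement.
v_north = 200.0   # mph, northbound plane
v_south = 150.0   # mph, southbound plane
de = 80.0         # miles, fixed lateral separation DE
dc = 200.0        # miles, target separation DC

# EC = (v_north + v_south) * t and DC^2 = EC^2 + DE^2,
# so t = sqrt(DC^2 - DE^2) / (v_north + v_south).
t = math.sqrt(dc**2 - de**2) / (v_north + v_south)

print(f"t = {t:.4f} hours")                    # 0.5237 hours
print(f"t = {t * 60:.1f} minutes")             # 31.4 minutes
print(f"to one decimal place: {t:.1f} hours")  # 0.5 hours
```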
This is wrong. Where did I go wrong?
