posted by manny.
A dog runs back and forth between its two owners, who are walking toward one another. The dog starts running when the owners are 10 m apart. If the dog runs at a speed of 3 m/s and the owners each walk at a speed of 1.3 m/s, how far has the dog traveled when the owners meet?
When I did this problem, I divided half of the total distance (5 m) by 1.3 m/s, because I thought that if the owners were going to meet each other, each would travel 5 m. The way this problem is worked out, though, the total distance is divided by the owners' combined speed. I got the right answer, but I was wondering if anyone knew whether my way is also correct or if I just got lucky. Thanks-
How long does it take for the owners to meet? time = 10/(2*1.3) seconds. Notice that your way is the same calculation: 5/1.3 = 10/2.6, since by symmetry each owner covers half the gap. So it wasn't luck.
What distance did the dog cover during this time? It runs at a constant 3 m/s the whole while, so distance = 3 * 10/(2*1.3) ≈ 11.5 m.
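A quick numeric sketch of the arithmetic above, confirming that the two ways of finding the meeting time agree (variable names are my own):

```python
# Dog-and-owners problem: check both ways of computing the meeting time.
gap = 10.0   # initial separation between the owners, in m
walk = 1.3   # each owner's walking speed, in m/s
dog = 3.0    # dog's running speed, in m/s

# Worked-solution way: total distance over combined closing speed.
t_meet = gap / (2 * walk)

# Manny's way: each owner covers half the gap at their own speed.
t_half = (gap / 2) / walk

# The dog runs at constant speed for the whole interval.
dog_distance = dog * t_meet

print(t_meet, t_half, dog_distance)
```

Both times come out to 10/2.6 ≈ 3.85 s, and the dog's distance is about 11.5 m.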