A motorist makes a trip of 180 miles. For the first 90 miles, she drives at a constant speed of
30 mph. At what constant speed must she drive the remaining distance, if her average speed for
the total trip is to be 40 mph?
The average speed for the whole trip is given by:

Va = distance traveled / time taken = D/T

where Va = average speed, D = total distance traveled, and T = total time taken.
It is tempting to write Va = (v1 + v2)/2, but this is incorrect: a plain average of the two speeds ignores the fact that the two legs take different amounts of time. Use the formula above instead. From the problem, the total time must be T = D/Va = 180/40 = 4.5 hours. The first 90 miles at 30 mph take t1 = 90/30 = 3 hours, leaving t2 = 4.5 - 3 = 1.5 hours for the remaining 90 miles. The required speed is therefore v2 = 90/1.5 = 60 mph.
This solution shows how to determine the constant speed required on a remaining leg from the speed already driven and the desired overall average.
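The arithmetic above can be checked with a short script; the variable names here are my own, not from the problem text:

```python
# Verify the remaining-leg speed needed to hit a target average speed.
total_distance = 180.0   # total trip, miles
first_distance = 90.0    # first leg, miles
v1 = 30.0                # speed on first leg, mph
v_avg = 40.0             # required average speed for the whole trip, mph

total_time = total_distance / v_avg            # T = D/Va = 4.5 hours
t1 = first_distance / v1                       # time on first leg = 3.0 hours
t2 = total_time - t1                           # time left = 1.5 hours
v2 = (total_distance - first_distance) / t2    # required speed on second leg

print(v2)             # 60.0 mph
print((v1 + v2) / 2)  # 45.0 -- the naive average of speeds, not 40
```

The last line illustrates why Va = (v1 + v2)/2 fails: the trip spends 3 hours at 30 mph but only 1.5 hours at 60 mph, so the slower leg dominates the true average.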