The length of time it takes to drive 100 miles depends on the speed driven and whether that speed is held constant. The basic formula for rate of travel is distance equals rate multiplied by time.

If a driver travels at a constant 50 miles per hour, the driving time is found by setting up the equation as "100 miles = 50 miles per hour × t." Solving for time gives time equals distance divided by rate, so the time is two hours. Of course, this assumes no variation in speed and no stopping for traffic signals or other vehicles.
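The arithmetic above can be sketched as a small helper function; the function name and units here are illustrative, not from the original answer:

```python
def travel_time_hours(distance_miles, speed_mph):
    """Time in hours to cover a distance at a constant speed: t = d / r."""
    return distance_miles / speed_mph

# 100 miles at a constant 50 mph takes 2 hours.
print(travel_time_hours(100, 50))  # → 2.0
```

Changing the speed shows the trade-off directly: at 60 mph the same trip takes about 1.67 hours, and at 25 mph it takes 4 hours.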
