A cat starts to walk straight across a 100-meter-long field to her favorite tree. After 20 m, a dog sees the cat and chases it across the field and up the tree. If the average speed of the running cat is 10 m/s, how much time did it take for the cat to get to the tree?
A. 0.8 s
B. 2 s
C. 8 s
D. 10 s


Answer:

8 seconds, Answer choice C.

Explanation:

The 10 m/s given in the problem is the cat's average running speed, and it applies only from the moment the dog starts the chase, after the cat has already walked 20 m.

Notice that the distance the cat actually runs is 100 meters minus 20 meters (100 - 20 = 80 meters). Since we know both the distance covered by the running cat (80 m) and its average speed (10 m/s), we can use the definition of speed to find the time it took the cat to reach the tree:

[tex]\text{speed}=\frac{\text{distance}}{\text{time}} \\ 10\ \frac{\text{m}}{\text{s}} = \frac{80\ \text{m}}{\text{time}} \\ \text{time}=\frac{80\ \text{m}}{10\ \text{m/s}}=8\ \text{s}[/tex]

Since all the given quantities are in SI units, the answer also comes out in the SI unit of time: seconds.
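
If you want to double-check the arithmetic, here is a minimal Python sketch (not part of the original solution) that applies time = distance / speed with the numbers from the problem; the variable names are just illustrative:

total_field_length_m = 100.0   # full length of the field
head_start_m = 20.0            # distance the cat walked before the chase began
running_speed_m_per_s = 10.0   # average running speed given in the problem

running_distance_m = total_field_length_m - head_start_m   # 80 m
time_s = running_distance_m / running_speed_m_per_s        # 80 / 10 = 8 s

print(f"Time to reach the tree: {time_s} s")  # prints 8.0 s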