A radio signal travels at 3.00 × 10^8 meters per second. How many seconds will it take for a radio signal to travel from a satellite to the surface of Earth if the satellite is orbiting at a height of 3.54 × 10^7 meters? Show all the steps that you used to solve this problem.



Answer:

0.118 seconds

Step-by-step explanation:

Data given in the question:

Speed of the radio signal = 3.00 × 10^8 meters per second

Distance (orbital height) = 3.54 × 10^7 meters

Based on the given information, the number of seconds required for the radio signal to travel from the satellite to the surface of the Earth is computed as follows.

As we know that

[tex]\text{Speed} = \frac{\text{Distance}}{\text{Time}}[/tex]

So,

[tex]\text{Time} = \frac{\text{Distance}}{\text{Speed}}[/tex]

[tex]= \frac{3.54 \times 10^7 \text{ meters}}{3.00 \times 10^8 \text{ meters per second}}[/tex]

= 0.118 seconds

We simply applied the general relationship between speed, distance, and time to determine the number of seconds the signal takes.
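
If you want to double-check the arithmetic, here is a minimal sketch in Python (the variable names are illustrative; both values come straight from the question):

    # Values given in the question
    speed = 3.00e8     # speed of the radio signal, in meters per second
    distance = 3.54e7  # orbital height of the satellite, in meters

    # Time = Distance / Speed
    time = distance / speed
    print(f"{time:.3f} seconds")  # prints: 0.118 seconds

Running this confirms the result of 0.118 seconds.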