A radio signal travels at 3. 00  10 8 meters per second. How many seconds will it take for a radio signal to travel from a satellite to the surface of Earth if the satellite is orbiting at a height of 3. 54  10 7 meters? Show your work.


Answer:

[tex]0.118\ \text{s}[/tex]

Step-by-step explanation:

[tex]v[/tex] = Velocity of the radio signal = [tex]3.00\times 10^8\ \text{m/s}[/tex]

[tex]d[/tex] = Distance from the satellite to the surface of Earth = [tex]3.54\times 10^7\ \text{m}[/tex]

Time is given by

[tex]t=\dfrac{d}{v}[/tex]

[tex]\Rightarrow t=\dfrac{3.54\times 10^7}{3.00\times 10^8}[/tex]

[tex]\Rightarrow t=0.118\ \text{s}[/tex]

The time taken for the signal to travel from the satellite to the surface of Earth is [tex]0.118\ \text{s}[/tex].
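
For anyone who wants to double-check the arithmetic, here is a minimal Python sketch of the same calculation; the variable names v, d, and t simply mirror the symbols used above and are only illustrative.

```python
# Minimal sketch verifying the arithmetic: t = d / v
# (variable names are illustrative, not part of the original problem)

v = 3.00e8   # speed of the radio signal in m/s
d = 3.54e7   # satellite altitude in m

t = d / v    # travel time in seconds
print(f"t = {t:.3f} s")  # prints: t = 0.118 s
```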