A radio signal travels at 3.00*10^8 meters per second. How many seconds will it take for a radio signal to travel from a satellite to the surface of Earth if the satellite is orbiting at a height of 3.54*10^7 meters?

Answer:

Speed = distance/time 
so time taken = distance/speed 

Time = 3.54x10^7 / 3x10^8 
Time = 0.118 seconds 

The time taken by the radio signal to reach Earth is 0.118 seconds.

The relation between distance, time, and speed is given as,

                [tex]Distance=Speed \times time\\ \\ time=\frac{Distance}{Speed} [/tex]

Given that the speed of the radio signal is [tex]3*10^{8} m/s[/tex] and the distance between the satellite and Earth is [tex]3.54*10^{7} m[/tex],

the time taken by the radio signal to reach Earth is,

                                  [tex]time=\frac{3.54*10^{7} }{3*10^{8} } =0.118s[/tex]

Therefore, the time taken by the radio signal to reach Earth is 0.118 seconds.
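As a quick sanity check of the arithmetic above, the same division can be sketched in a few lines of Python (variable names here are illustrative, not from the problem statement):

```python
# Given values from the problem
speed = 3.00e8     # m/s, speed of the radio signal (speed of light)
distance = 3.54e7  # m, height of the satellite above Earth's surface

# time = distance / speed
time = distance / speed
print(f"time = {time:.3f} s")  # prints: time = 0.118 s
```

This confirms the result: 3.54×10^7 m divided by 3.00×10^8 m/s gives 0.118 s.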

Learn more:

https://brainly.com/question/4931057