Time it takes to transmit signal to satellite (modern physics)

1. The problem statement, all variables and given/known data

A communication satellite is orbiting 36 000 km above Earth’s surface. Two cities, 3500 km apart, are transmitting signals to and receiving signals from each other via the satellite. The cities are equidistant from the satellite. Find the time required to transmit a signal from one city to the other.

h = 36 000 km = 3.6 x 10^7 m
d = 3500 km = 3.5 x 10^6 m
t = ?

2. Relevant equations

Pythagorean theorem to find l, but I’m not sure what equation to use to find the time! Any help?
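One hedged suggestion (the problem doesn’t state it, so this is an assumption): a radio signal travels at the speed of light c, so once the slant distance l from a city to the satellite is known, the time is just distance over speed, with a factor of 2 because the signal goes city → satellite → city:

```latex
t = \frac{2l}{c}, \qquad c \approx 3.0 \times 10^{8}\ \mathrm{m/s}
```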

3. The attempt at a solution

I found the length of the hypotenuse between each city and the satellite:

l = √((3.6 x 10^7)^2 + (3.5 x 10^6)^2) ≈ 3.62 x 10^7 m.

How do I proceed from there?
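A quick sketch of one way to finish (with two assumptions not stated in the attempt above: the signal travels at the speed of light, and since the satellite is equidistant from the two cities, each slant leg of the triangle uses half the city separation, d/2, rather than the full d):

```python
import math

c = 3.0e8   # speed of light in m/s (assumed signal speed)
h = 3.6e7   # satellite altitude in m
d = 3.5e6   # separation between the two cities in m

# Each slant path is the hypotenuse of a right triangle with
# legs h and d/2 (satellite assumed above the midpoint).
l = math.hypot(h, d / 2)

# The signal travels city -> satellite -> city: two slant paths.
t = 2 * l / c
print(f"slant distance l = {l:.3e} m")
print(f"transit time   t = {t:.3f} s")
```

With these numbers the transit time comes out to roughly a quarter of a second, which is the familiar geostationary-satellite delay.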
