Some people are very concerned that the radio-wave power emitted by transmitter antennas will harm them. In particular, people worry that mobile phone masts near their homes will cause long-term health problems.
I just want to make clear how quickly a transmitter’s power diminishes as it spreads out from the antenna. I’m sure some feel it must follow a simple linear law such as:
- double the distance and get half the power,
- triple the distance and get a third of the power,
- quadruple the distance and get a quarter of the power.
In fact it follows an inverse square law, which is nonlinear and gives this type of result:
- double the distance and get a quarter of the power,
- triple the distance and get a ninth of the power,
- quadruple the distance and get a sixteenth of the power.
This means the power (and any potentially damaging energy) at a given distance is much lower than might be expected. Even within a short distance the power can drop considerably.
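The fall-off above can be sketched with a few lines of code. This is a minimal illustration assuming an ideal isotropic source, where the power spreads evenly over a sphere of area 4πr²; the 10 W transmit power and the distances are illustrative values, not figures from any real mast.

```python
import math

def power_density(transmit_power_watts, distance_m):
    """Power density (W/m^2) at a given distance from an ideal isotropic source.

    The power spreads over a sphere of area 4*pi*r^2, hence the 1/r^2 fall-off.
    """
    return transmit_power_watts / (4 * math.pi * distance_m ** 2)

# Illustrative 10 W source; compare each distance with the density at 1 m.
reference = power_density(10, 1)
for d in (1, 2, 3, 4):
    ratio = power_density(10, d) / reference
    print(f"at {d}x the distance: {ratio:.4f} of the power")
```

Running it prints 1.0000, 0.2500, 0.1111 and 0.0625 — exactly the halves-become-quarters pattern described above.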