Written By: Ryan McCaughey, PhD.
Modern wireless technology, including cell phones, cordless phones, Bluetooth devices, Wi-Fi networks, smart utility meters, and radio and TV transmitters, relies on radio frequency (RF) fields to communicate. Medical scanners, radar systems, and microwave ovens also use radio frequency fields.
RF broadly refers to electromagnetic waves oscillating between one hundred thousand times per second (100 kilohertz, or kHz) and three hundred billion times per second (300 gigahertz, or GHz). Cell phones transmit and receive signals in the range of 600 megahertz (MHz) to 3 GHz.
The intensity of electromagnetic radiation can be expressed in microwatts per square centimeter (µW/cm²), which measures the power passing through a given surface area. A microwatt is one-thousandth of a milliwatt.
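As a quick sketch of the units involved, the conversion between the two power-density units used in this article is just a factor of one thousand (the function name here is illustrative, not from any standard library):

```python
def mw_per_cm2_to_uw_per_cm2(mw_per_cm2):
    """Convert a power density from milliwatts per square centimeter
    to microwatts per square centimeter (1 mW = 1,000 µW)."""
    return mw_per_cm2 * 1000.0

# A 5 mW/cm² reading is the same as 5,000 µW/cm²:
print(mw_per_cm2_to_uw_per_cm2(5))  # 5000.0
```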
Microwave ovens use RF waves at 2.45 GHz to cook food in a process known as dielectric heating. Ovens are designed to confine the RF energy within the cooking chamber, but a small fraction can leak out; the FDA sets a safety limit of 5 milliwatts per square centimeter (5 mW/cm²) at a distance of 2 inches from the oven.
The intensity of an RF field decreases rapidly as you move farther from the source. Generally, it is inversely proportional to the square of the distance from the source, i.e., when you are twice as far away, you receive one-quarter the energy. Two feet from a microwave oven you could be exposed to about 125 µW/cm². At 120 µW/cm² of RF exposure, experiments have shown leakage of the blood-brain barrier. The exposure from a cell phone held right next to your head can be over 20 times higher than that, and biological effects of RF exposure have been observed at levels as low as 0.00034 µW/cm² (reduced sperm count).
The metric for the rate at which RF energy is actually absorbed by the body is the Specific Absorption Rate (SAR). The FCC sets a SAR limit of 1.6 watts per kilogram (W/kg) for cell phones.
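Because SAR is a rate per unit of tissue mass, multiplying it by a mass gives total absorbed power in watts. The 1 kg figure below is a hypothetical mass chosen purely to make the arithmetic visible, not a value from the article:

```python
def total_absorbed_power_w(sar_w_per_kg, tissue_mass_kg):
    """Total RF power (watts) absorbed by a given tissue mass
    exposed at a given SAR (watts per kilogram)."""
    return sar_w_per_kg * tissue_mass_kg

# At the FCC limit of 1.6 W/kg, a hypothetical 1 kg of tissue
# would absorb 1.6 watts:
print(total_absorbed_power_w(1.6, 1.0))  # 1.6
```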