In my experience, one of the problems with much of the research on the biological effects of electromagnetic energy at any wavelength is that most of the testing is performed by biologists who often don't understand the electromagnetic technology they're working with. The result is that the conclusions they draw are often misleading, if not downright wrong. Then there's the general hype that tends to accompany any such topic. Both issues are evident in the above-mentioned article.
That said, Bluetooth uses non-ionizing radiation (meaning there's not enough energy present to break molecular bonds, so claims of direct impact on DNA or the like are generally nonsensical). However, Bluetooth operates at the same frequency as your microwave oven (the 2.4 GHz ISM band), which was reserved for that application because water molecules absorb energy at that frequency readily, and thus water heats up. Bluetooth and Wi-Fi are unlicensed technologies that are allowed to operate at that frequency as long as they can live with the interference produced by microwave ovens and other unlicensed products. So yes, the RF energy from Bluetooth will heat your cells somewhat. That's the purpose of the SAR testing mentioned in the article: at the frequencies used by wireless devices, the goal is to limit the heating the device produces in tissue.

In general, radiation exposure is governed by level and duration; reducing either one reduces the exposure accordingly. The good news is that a relatively small percentage of the energy produced by a device penetrates the skin. The device is intended to radiate its energy away to communicate with something else. Energy lost inside the user is lost to the network and thus of benefit to no one (unless we're talking about implanted radios, which do exist!).
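To make the "level times duration" point concrete, here's a minimal back-of-the-envelope sketch. All the numbers are illustrative assumptions I've picked for the example, not regulatory limits or measured values:

```python
# Crude "dose" proxy: incident RF energy per unit area.
# Exposure scales with both the power density (level) and the time (duration);
# halving either one halves the dose. Numbers below are illustrative only.

def exposure_dose(power_density_mw_cm2: float, duration_s: float) -> float:
    """Incident energy per unit area in mJ/cm^2 (power density x duration)."""
    return power_density_mw_cm2 * duration_s

# Hypothetical scenarios: a radio held against the head vs. one across the room.
close_dose = exposure_dose(power_density_mw_cm2=0.05,   duration_s=3600)
far_dose   = exposure_dose(power_density_mw_cm2=0.0005, duration_s=3600)

print(f"1 hour with the radio at the ear:  {close_dose:.1f} mJ/cm^2")
print(f"1 hour with the radio across room: {far_dose:.2f} mJ/cm^2")
```

The takeaway is just the proportionality: distance (which drops the level) and shorter use (which drops the duration) both cut exposure directly.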
At any rate, back to the original question: since Bluetooth Low Energy (BLE) is designed to reduce the energy consumption of Bluetooth communication and increase battery life, by definition it reduces the exposure level compared to a standard Bluetooth device. If you're consuming less energy, you're radiating less power over the same amount of time.
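A rough sketch of why that holds: a Class 2 Bluetooth radio and a BLE radio peak at similar transmit power (on the order of 2.5 mW), but BLE spends far less time actually transmitting. The duty-cycle figures below are assumptions I've chosen for illustration, not measurements of any particular device:

```python
# Time-averaged radiated power = peak transmit power x fraction of time on-air.
# Peak power of 2.5 mW corresponds to a Class 2 Bluetooth radio; the duty
# cycles are illustrative assumptions, not measured values.

def average_radiated_power_mw(peak_tx_power_mw: float, duty_cycle: float) -> float:
    """Average power over time for a radio that transmits in bursts."""
    return peak_tx_power_mw * duty_cycle

classic_bt = average_radiated_power_mw(peak_tx_power_mw=2.5, duty_cycle=0.25)
ble        = average_radiated_power_mw(peak_tx_power_mw=2.5, duty_cycle=0.005)

print(f"Classic Bluetooth (assumed 25% on-air):  {classic_bt:.3f} mW average")
print(f"BLE (assumed 0.5% on-air):               {ble:.4f} mW average")
```

Same peak level, but roughly two orders of magnitude less time-averaged radiated power, which is exactly the "less energy over the same amount of time" argument above.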
One footnote mentioned in the referenced article is the potential for non-ionizing radiation to influence the migration of ions and somehow hinder a particular cellular process. I've heard this concern expressed for many years in relation to power lines and the ELF (extremely low frequency) emissions of traditional CRT monitors. The concept does at least sound plausible, which may be why so many people latch onto it, but I've never seen the results of any study that proves such an effect exists. If it happens, it should be measurable. Statements like the one in the article appear to be mere conjecture. We can certainly test and measure the heating effects of microwaves, but the referenced paper from 2008 that mentions a "plausible theory" is still being cited in 2014 without any measured results to back it up. That 2008 paper then goes on to ascribe every common malady of the electronic age to this theoretical link to electromagnetic waves; thus the hype.