Try not to be insulted by the headline. I wrote this because it is slowly becoming mainstream to be scared of 5G, which is – honestly – really stupid. Most people complaining about the harmful effects of 5G, 4G, microwaves, wireless internet and so on have had little to no education in basic physics. I’ll try to explain why in simple, easy-to-understand language:
“How can we be so sure that many diseases are not being caused by the radiation poisoning now occurring from cell phones, 5G towers, EMFs in the home (wifi, microwave ovens, refrigerators, digital meters etc) at the office, on a ship (many now have onboard 5G), at a concert (with tens of thousands of cell phones in use in close proximity)?”
We can be quite sure of that, because the radiation you refer to can easily be measured, both in strength and in the frequencies used by these devices.
Prior to the growth of cable TV, digital radio, wireless internet, smartphones and 2G/3G/4G/5G cellular networks, the air around us was literally loaded with over-the-air TV signals (VHF/UHF) and FM/AM radio transmissions, everywhere. Really powerful ones, from local radio and TV stations all the way up to nationwide transmitters. Most of those transmitters pumped anywhere from a kilowatt (1,000 watts) up to 200 kW (200,000 watts) of radio-frequency (RF) power into their antennas, and antenna gain would often concentrate that power in specific directions, raising the effective radiated power even higher to cover certain areas of land or sea. Note that EACH TV channel or radio station required ONE of those transmitters, and our countries were full of them. The frequencies used for those signals do not differ that much from the ones we absorb from our smartphones or wireless routers today. In fact, most battery-operated devices, like our smartphones, have one main goal: to be as efficient as possible with their power source. They transmit only when it is absolutely required for them to operate, and only with as much power as needed, minimizing their output wherever possible. This is one of the advantages of 5G over its predecessors: it uses less RF power to achieve the same result.
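To make that comparison a bit more concrete, here is a rough back-of-the-envelope sketch using the free-space inverse-square law. The transmit powers and distances (a 100 kW broadcast transmitter received at 1 km, a 0.1 W Wi-Fi router received at 3 m) are illustrative assumptions, not measurements, and real-world propagation, antenna gain and duty cycles will shift the numbers, but it shows the ballpark:

```python
# Back-of-the-envelope sketch: compare the free-space power density of a
# classic high-power broadcast transmitter with that of a typical Wi-Fi
# router, using the inverse-square law S = P / (4 * pi * r^2).
# The transmit powers and distances below are illustrative assumptions.
import math

def power_density(watts: float, metres: float) -> float:
    """Free-space power density in W/m^2 at a given distance from an isotropic source."""
    return watts / (4 * math.pi * metres ** 2)

# A 100 kW FM/TV broadcast transmitter, received 1 km away:
broadcast = power_density(100_000, 1_000)

# A Wi-Fi router transmitting ~0.1 W, received 3 m away:
wifi = power_density(0.1, 3)

print(f"Broadcast tower at 1 km  : {broadcast * 1000:.2f} mW/m^2")  # ~7.96 mW/m^2
print(f"Wi-Fi router at 3 m      : {wifi * 1000:.2f} mW/m^2")       # ~0.88 mW/m^2
print(f"Ratio (broadcast / Wi-Fi): {broadcast / wifi:.0f}x")        # ~9x
```

Even in this simplified model, the distant broadcast transmitter delivers roughly nine times the power density of the nearby router.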
And precisely that is where all arguments against these radiation sources fall flat: why was nobody complaining back when we were being “radiation poisoned” at roughly tenfold the strength of what we face now? The HF (high-frequency) and LF (low-frequency) radiation we had to endure in the 1980s was far worse than what we face today, because many of those TV and radio stations no longer use those high-power open-air transmitters. And that is not even mentioning the radiation coming from CRT screens (old TV tubes), tube amps, tube radio receivers, oscilloscopes, and so on. It was all pretty horrible compared to what we measure today. And interestingly: the devices we measure those radiation levels with are still the same, still work the same way they did back then, and still cover the same frequencies. So unless you deny that those instruments work, or deny physics itself, a fair comparison makes sense.
Smartphones, IoT and other devices emit tiny amounts of radio-frequency (RF) radiation compared to those kilowatt transmissions from back in the day. Humans absorb some of this radiation when a smartphone is being used or is lying dormant anywhere near their body. The parameter used to quantify phone radiation exposure is the Specific Absorption Rate (SAR): the rate at which electromagnetic energy is absorbed by the body when using a mobile device, expressed in watts per kilogram (W/kg). The Council of the European Union has set the limit for cell phones at 2 W/kg, averaged over the 10 grams of tissue absorbing the most energy. SAR values are measured at the ear (speaking on the phone) and at the body (phone kept in your pocket). You can check the SAR values for your own phone model right here.
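To put that 2 W/kg limit in perspective, here is a minimal sketch of the arithmetic behind SAR; the handset value of 0.8 W/kg used below is an illustrative assumption, not a figure from any specific phone:

```python
# Minimal sketch of the SAR arithmetic: SAR (W/kg) is the power absorbed per
# kilogram of tissue, so the EU limit of 2 W/kg averaged over 10 g of tissue
# corresponds to a very small absolute amount of absorbed power.
# The example handset SAR value below is an illustrative assumption.

EU_SAR_LIMIT_W_PER_KG = 2.0   # Council of the EU limit, averaged over 10 g
TISSUE_MASS_KG = 0.010        # the 10 g of tissue absorbing the most energy

# Maximum power that may be absorbed in that 10 g of tissue:
max_absorbed_w = EU_SAR_LIMIT_W_PER_KG * TISSUE_MASS_KG
print(f"Absorbed power at the limit: {max_absorbed_w * 1000:.0f} mW")  # 20 mW

# A handset reporting e.g. 0.8 W/kg at the ear (illustrative value) absorbs:
example_sar_w_per_kg = 0.8
print(f"Example handset at the ear : {example_sar_w_per_kg * TISSUE_MASS_KG * 1000:.0f} mW")  # 8 mW
```

In other words, even at the legal maximum, only about 20 milliwatts is being absorbed in those 10 grams of tissue.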
Important to know: these SAR values are measured in a shielded room with thick steel walls, in order NOT to include other, interfering sources of radiation in the measurement. If we had done the same measurement outdoors in, say, a random village in 1984, we would have seen SAR values ranging from roughly 0.2 to 4 W/kg almost everywhere, 24/7. Because there was no internet back then, it is hard to even find reliable measurements, but there were agreements on what was allowed as a maximum SAR, and those norms were far more lenient and, most importantly, *less carefully monitored* than they are now.
So, when you say “we’re being bombarded with IoT, wireless and smartphone signals everywhere now”, just consider what that actually means compared to the free-to-air radio and TV signals that have been bombarding us with far more power for most of our lives.
https://antenneregister.nl/Html5Viewer/Index.html?viewer=Antenneregister%5Fextern