There is no definitive answer to this question, as the effects of Wi-Fi on newborn babies are still being studied. Some worry that exposure could interfere with a baby's brain development or hearing, or cause problems with sleep quality and concentration. So far, however, there is no scientific evidence that using Wi-Fi during the early stages of development harms a baby's health. Still, much remains unknown about this technology and its effects on children, so until more research is done it is difficult to say for certain either way.


Wi-Fi has become such an ingrained part of our everyday lives that we tend not to give it much thought unless it has stopped working. But what if your family has a newborn baby in the house? Are there any dangers that new parents should be aware of?

Today’s Question & Answer session comes to us courtesy of SuperUser—a subdivision of Stack Exchange, a community-driven grouping of Q&A web sites.

The Question

SuperUser reader avy wants to know if Wi-Fi could actually be harmful to his family’s newborn baby:

Could Wi-Fi pose a danger to a newborn baby or is it just a bit of unnecessary paranoia?

Now before people start writing that Wi-Fi is safe because it is used in hospitals and schools, let me be clear: I'm aware of all that. But the idea of having it on 24/7 for years to come around this little person we're responsible for makes me want a definitive answer on the subject.

I will put on my tin foil hat and await some well thought out/educated answers.

The Answer

SuperUser contributors NothingsImpossible and Bob have the answer for us. First up, NothingsImpossible:


Radiation can be separated into two categories: ionizing radiation and non-ionizing radiation.

In layman’s terms, ionizing radiation is radiation that can “break” the molecules that make up things.

Non-ionizing radiation, on the other hand, just passes through objects or is converted to heat when it hits them.

Wi-Fi networks operate on roughly the same frequency as a microwave oven. Wi-Fi uses non-ionizing radiation; when it hits an object, it is simply converted into heat and does not change the composition of the object itself. It is harmless – at most it will heat your body, and by a very, very, very tiny amount that is not even measurable.

Ionizing radiation is dangerous. Examples of it are ultraviolet rays and nuclear radiation. They not only heat you, but they also change the composition of the molecules that make up your body. They can modify the DNA of your cells, causing cancer.

Example: sunburn. Your skin burns after long, unprotected exposure to the sun not because it got hot – the UV rays of the sun damaged the DNA of the skin cells, and the body reacts with a burning sensation.

Conclusion: Wi-Fi is harmless.

Followed by the answer from Bob:

The term “radiation” is often used to scare people. Let’s get it straight. There are two factors – frequency and intensity. Frequency has a far larger effect on how damaging radiation is. Wi-Fi and other radio communications use a very low frequency – far below visible light.

Radiation that actually causes issues – that could potentially cause cancer, etc. – is usually ionizing radiation. It has a very high frequency and can cause mutations in DNA, possibly leading to cancer. The frequency required to be ionizing? At least 1,000,000 GHz. That is literally around 400,000 times the frequency Wi-Fi transmits on at 2.4 GHz (and 200,000 times the 5 GHz band). Non-ionizing radiation, which Wi-Fi falls under, does little more than transfer heat.
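To put rough numbers on Bob's figures, here is a minimal sketch in Python (the threshold and band frequencies are the approximate ones quoted above) comparing photon energies via E = h × f:

```python
# Photon energy E = h * f. Only photons carrying a few electron-volts
# can knock electrons off molecules and damage DNA directly.
PLANCK_H = 6.626e-34     # Planck's constant, in joule-seconds
EV_PER_JOULE = 6.242e18  # electron-volts per joule

def photon_energy_ev(freq_hz):
    """Energy of a single photon at the given frequency, in eV."""
    return PLANCK_H * freq_hz * EV_PER_JOULE

IONIZING_HZ = 1e15  # ~1,000,000 GHz, the threshold quoted in the answer

for label, freq_hz in [("Wi-Fi, 2.4 GHz", 2.4e9),
                       ("Wi-Fi, 5 GHz", 5e9),
                       ("ionizing threshold", IONIZING_HZ)]:
    print(f"{label:20s} {photon_energy_ev(freq_hz):10.2e} eV")

# A 2.4 GHz photon carries ~1e-5 eV; ionization takes a few eV.
# The frequency ratio works out to roughly 417,000x:
print(f"threshold / 2.4 GHz = {IONIZING_HZ / 2.4e9:,.0f}x")
```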

Did you know light is also EM radiation? Yup. In fact, visible light (~430,000 GHz at the near-infrared end, ~750,000 GHz at the near-ultraviolet end) is much closer to ionizing radiation than Wi-Fi. Sunlight actually contains some ionizing radiation (UVB and UVC; UVA can also cause DNA damage, though not in the same way). But you’re not going to hide in your house for the rest of your life, are you?
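As a cross-check on those light figures, frequency follows from wavelength as f = c/λ. A quick sketch (the wavelengths are textbook approximations, not figures from the answer):

```python
# Frequency from wavelength: f = c / wavelength.
C_M_PER_S = 3.0e8  # speed of light in m/s, approximate

bands = {
    "red edge of visible (~700 nm)": 700e-9,
    "violet edge of visible (~400 nm)": 400e-9,
    "UVB (~300 nm)": 300e-9,
}

for name, wavelength_m in bands.items():
    freq_ghz = C_M_PER_S / wavelength_m / 1e9
    print(f"{name:34s} ~{freq_ghz:,.0f} GHz")

# Prints ~428,571 / ~750,000 / ~1,000,000 GHz. Wi-Fi's 2.4 GHz sits
# about five orders of magnitude below the bottom of this range.
```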

Apart from frequency, there is intensity. Non-ionizing radiation can also be damaging, but really only at much higher intensities. And ionizing radiation is not always dangerous – our bodies can cope with low intensities, which is why we don’t all die in the sun (vampires are another matter). Wi-Fi has a transmitting power usually far under 1 watt (I’ve seen figures of 200 mW). And most of that energy never reaches you: by the inverse square law, the energy spreads out equally in all directions, so the power you receive falls off with the square of the distance. Ten meters away? Roughly 1/100 × 200 mW = 2 mW. That’s nothing.
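Bob’s 1/distance² shorthand can be made a little more precise: an isotropic transmitter’s power spreads over a sphere of area 4πr², so the power density at distance r is P/(4πr²). A minimal sketch, assuming the 200 mW figure he quotes:

```python
import math

TX_POWER_W = 0.2  # 200 mW, the transmit power quoted in the answer

def power_density_w_m2(distance_m):
    """Power per square meter at a given distance, assuming the router
    radiates equally in all directions (an isotropic source)."""
    return TX_POWER_W / (4 * math.pi * distance_m ** 2)

for d in (1, 5, 10):
    print(f"{d:2d} m: {power_density_w_m2(d) * 1000:8.4f} mW/m^2")

# At 10 m this is ~0.16 mW/m^2, even smaller than the answer's rough
# 2 mW figure, and about six million times weaker than direct sunlight
# at ~1,000 W/m^2.
```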

Microwave ovens (which operate on a similar frequency to Wi-Fi) run at ~1,000 watts, highly focused inside that metal box. Maybe 1 watt leaks out through the shielding, and even that is considered perfectly safe. To put all this in perspective, sunlight (which is higher frequency, and therefore more energetic per photon) delivers about 1,000 watts per square meter at ground level, half of which is visible light or higher.
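Putting the whole comparison in one place (the oven and sunlight figures are the rough ones quoted above; the leakage area for the oven door is an assumed value, purely for illustration):

```python
import math

# Back-of-the-envelope exposure comparison, in watts per square meter.
# The 0.25 m^2 "leakage area" for the oven door is an assumed figure.
exposures_w_m2 = {
    "Wi-Fi router, 200 mW, at 10 m": 0.2 / (4 * math.pi * 10 ** 2),
    "microwave leakage, 1 W over ~0.25 m^2 (assumed)": 1.0 / 0.25,
    "direct sunlight at ground level": 1000.0,
}

for source, level in sorted(exposures_w_m2.items(), key=lambda kv: kv[1]):
    print(f"{source:48s} ~{level:10.4f} W/m^2")

# Sunlight outdoes the Wi-Fi figure by more than six orders of magnitude.
```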

You might also find some interesting sources and studies cited on a similar question on Skeptics.SE.

Make sure to look through the rest of the lively discussion on the topic at SuperUser via the link below!

Have something to add to the explanation? Sound off in the comments. Want to read more answers from other tech-savvy Stack Exchange users? Check out the full discussion thread here.