May 01, 2018

Radiation from imaging tests

Putting risk into perspective

Modern X-ray imaging technology, which uses radiation to generate images of the inside of your body, offers wide-ranging benefits. It can provide a quick and painless diagnosis or guide treatment, such as the placement of an artery-opening stent.

With so many benefits, it's not surprising that the use of X-ray imaging has increased dramatically in the past 30 years, mainly because of technical advances and the growing use of computerized tomography (CT) and positron emission tomography (PET) scans. As a result, the average amount of lifetime radiation exposure has increased as well.

The downside is that radiation can damage and mutate DNA, which might lead to the development of certain cancers. But at what level does radiation exposure from medical imaging increase the risk of future cancer? The answer isn't clear-cut, and it's important to weigh the potential risk against the known benefits of medical imaging.

Radiation levels

Radiation is naturally present in the environment, coming from sources such as the sun and radon in rocks and soil. The average annual exposure in the U.S. from all sources of natural radiation is estimated to be about 3 millisieverts (mSv) per person, though you may be exposed to more or less depending on where you live: across the U.S., annual exposure from natural radiation ranges from about 1 to 20 mSv.

It's estimated that the average annual radiation exposure in the U.S., natural and medical combined, has roughly doubled since the 1980s, to about 6.2 mSv. This figure is an average over the entire population; if you haven't had any medical exams that use radiation, your own exposure hasn't increased.
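To see how these numbers combine, here is a rough back-of-the-envelope tally in Python. It adds approximate effective doses for a few imaging exams to the average natural background. The dose values are commonly cited averages (they vary by machine and protocol), and the exam list and function names are illustrative assumptions, not a clinical tool.

    # Sketch: estimating one year's effective radiation dose (in mSv)
    # from natural background plus medical imaging. Dose values are
    # approximate, commonly cited averages; actual doses vary.

    NATURAL_BACKGROUND_MSV = 3.0  # average annual U.S. background dose

    # Approximate effective dose per exam, in millisieverts (illustrative).
    TYPICAL_EXAM_DOSE_MSV = {
        "chest_xray": 0.1,
        "mammogram": 0.4,
        "ct_head": 2.0,
        "ct_chest": 7.0,
        "ct_abdomen_pelvis": 10.0,
    }

    def annual_dose_estimate(exams):
        """Sum the background dose plus the listed imaging exams for one year."""
        medical = sum(TYPICAL_EXAM_DOSE_MSV[e] for e in exams)
        return NATURAL_BACKGROUND_MSV + medical

    # Example: one chest X-ray and one CT of the abdomen/pelvis in a year.
    total = annual_dose_estimate(["chest_xray", "ct_abdomen_pelvis"])
    print(f"Estimated annual dose: {total:.1f} mSv")  # about 13.1 mSv

Even this simple sum illustrates the point: a single CT scan can contribute several times the annual natural background dose, which is why each scan's benefit should justify the exposure.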

It's not known at what levels radiation begins to significantly increase cancer risk. Below 100 mSv, an increase in risk has not been shown...