Children Who Watch Lots Of TV May Have Poor Bone Health Later In Life
Last updated July 25, 2016
Approved by: Maulik P. Purohit MD, MPH

Consistently watching high levels of television during childhood and adolescence was linked with lower peak bone mass at age 20 years in a recent study.
In the Journal of Bone and Mineral Research study, hours of television watching per week were recorded by parental or self-report at 5, 8, 10, 14, 17 and 20 years of age in 1181 participants.
Those who consistently watched ≥14 hours/week of television had lower bone mineral content than those who watched less television, even after adjusting for height, body mass, physical activity, calcium intake, vitamin D levels, alcohol, and smoking (all at age 20).
"Since attainment of optimal peak bone mass is protective against osteoporosis later in life, reducing sedentary time in children may have long-term skeletal benefits," the authors wrote.
The above post is reprinted from materials provided by Wiley. Note: Materials may be edited for content and length.
Disclaimer: DoveMed is not responsible for the accuracy of adapted news releases posted to DoveMed by contributing universities and institutions.
Primary Resource:
McVeigh, J. A., Zhu, K., Mountain, J., Pennell, C. E., Lye, S. J., Walsh, J. P., & Straker, L. M. (2016). Longitudinal Trajectories of Television Watching Across Childhood and Adolescence Predict Bone Mass at Age 20 Years in the Raine Study. Journal of Bone and Mineral Research.