Medical Definition of WHI
WHI: The Women's Health Initiative, a long-term health study sponsored by the National Institutes of Health (NIH) that focuses on strategies for preventing heart disease, breast cancer, colorectal cancer, and osteoporosis in postmenopausal women.
Last Editorial Review: 9/14/2016