Medical Definition of Western medicine

Reviewed on 3/29/2021

Western medicine: Conventional medicine, as distinct from an alternative form of medicine such as Ayurvedic or traditional Chinese medicine.
