Medical Definition of Mainstream medicine
Mainstream medicine: Medicine as practiced by holders of M.D. or D.O. degrees and by their allied health professionals, such as physical therapists, psychologists, and registered nurses. The term "mainstream medicine" implies that other forms of medicine are outside the mainstream.
Last Editorial Review: 5/13/2016