Medical Definition of Informed consent

Informed consent: The process by which a patient learns about and understands the purpose, benefits, and potential risks of a medical or surgical intervention, including clinical trials, and then agrees to receive the treatment or participate in the trial. Informed consent generally requires the patient or responsible party to sign a statement confirming that they understand the risks and benefits of the procedure or treatment.

Reviewed on 12/11/2018