Health care, or healthcare, is the improvement of health through the prevention, diagnosis, treatment, amelioration, or cure of disease, illness, injury, and other physical and mental impairments in people. Health care is delivered by health professionals and allied health fields.