History of medicine in the United States

The history of medicine in the United States encompasses a variety of approaches to health care, spanning from colonial days to the present and from home remedies to the formalization of modern biomedicine. This history also includes systemic medical abuse inflicted on marginalized communities as a result of eugenics and scientific racism, beginning in the 18th century. It likewise encompasses the work of midwives and nurses of color.