Widely considered the definitive history of the American healthcare system, The Social Transformation of American Medicine examines how the roles of doctors, hospitals, health plans, and government programs have evolved over the past two and a half centuries.