I’ve just begun reading The Social Transformation of American Medicine, Paul Starr’s Pulitzer Prize-winning history of the rise and triumph of the medical profession in America. According to Starr, doctoring was viewed quite differently not too long ago:
Doctors in America were not always the powerful and authoritative profession that they are today. A century ago they had much less influence, income, and prestige. “In all of our American colleges,” a professional journal commented bitterly in 1869, “medicine has ever been and is now, the most despised of all the professions which liberally-educated men are expected to enter.” Although a few eminent doctors made handsome fortunes, many before 1900 could hardly scrape together a respectable living. […]
Beginning in the 1760s, some educated doctors took the initial steps to reproduce in America the professional institutions that in England gave physicians a distinct and exclusive status. They succeeded in organizing medical schools, and in some fields of work, such as obstetrics, doctors gained ground against rival practitioners. But they failed in their larger efforts to establish themselves as an exclusive and privileged profession. The licensing authority doctors secured had little more than honorific value, and during the Jacksonian period in the 1830s and 1840s, their claims to privileged competence evoked a sharp backlash that crippled their ambitions for the next half century. State legislatures voted to do away with medical licensure entirely. No profession was being allowed, Oliver Wendell Holmes told the graduating class at Harvard in 1844, “to be the best judge of its own men and doctrines.” Lay practitioners, using native herbs and folk remedies, flourished in the countryside and towns, scorning the therapies and arcane learning of regular physicians and claiming the right to practice medicine as an inalienable liberty, comparable to religious freedom.
What exactly was it that changed? I’m hoping this book will tell me.