Sunday, September 18, 2016

The First Female Medical College: "Will you accept or reject them?"

From: doctordoctress.org


How were the first women physicians of the 19th century perceived?

The 19th century was a period of rapid social change and experimentation. The reform movements that swept through American society after 1820 were built on a new vision for the young nation and were reactions to a range of factors: slavery; the abuse of alcohol; the transformation of the American economy through industrialization; urbanization; and lingering goals of the American revolutionary period. These reform movements included: the Public Schools Movement to ensure public education for all; better care and treatment for the mentally ill; the Temperance Movement to control alcohol abuse; and most famously, the abolition of slavery and the promotion of women’s rights.

Two significant reform movements of the period, the abolition of slavery and the promotion of women’s rights, were inextricably tied together. Women were very active in the anti-slavery movement, even though their contributions were limited by their legal and societal status as second-class citizens. By participating in the abolition movement, women moved beyond their traditional domestic world of home and child-rearing and entered public life. They attended meetings, strategized, spoke out, and raised money for the anti-slavery cause.

The experience of advocating for equal rights for African Americans taught women the power of organizing and acting for change. Women used what they learned in the abolition movement to publicly fight for their own equal rights, including access to education and employment opportunities.

Prior to 1850, women practiced medicine in their communities and worked, informally, as nurses and midwives, but there was no opportunity for formal medical education. Many thought it improper for women to study medical subjects alongside men and opposed women entering medicine as professionals. Others, though, thought it more appropriate for a woman physician, rather than a male physician, to treat women patients.

It was at this time that a small group of Philadelphia Quakers began to imagine a future where women were professional physicians, appropriately educated, bearing a medical degree, and serving their communities equally with their male counterparts. In 1850, they founded the Female Medical College of Pennsylvania, the first medical school in the nation for women.

19 comments:


Apparently, the first medical college for women was actually founded in Boston in 1848. The college in Philadelphia was founded shortly afterwards.

https://www.wgbh.org/news/2016/11/04/how-we-live/doctresses-medicine-worlds-1st-female-medical-school-was-established-boston
