Monday, December 7, 2015

Nursing Men

Because nursing is perceived as a nurturing field, it comes as no surprise that it is deemed a predominantly female career. However, men are again surprising society by actually showing an interest in what these jobs have to offer. Though many are not aware of it, most of the duties a nurse takes on are never even reviewed by a doctor. Keeping the patient alive and meeting their basic medical needs are just some of the responsibilities they must handle. The term 'nursing' itself carries feminine, inherently motherly connotations. Ironically, even though society is still reluctant to admit that men can in fact be emotional and nurturing, men are moving into this "nursing" labor field anyway.

Below is an article I found on just that: 
