The constant theme of comments as of late goes something like, “Amy, you must really enjoy taking care of those mamas and babies. You have such a fun job.” I have to bite my tongue before I blurt out that I don’t really like taking care of babies—I make sure they are stable and send them to the nursery, OR, if they are crunking out, I get the NICU peeps in the room to do their thing. I do, however, LOVE to take care of laboring women, and I make sure they don’t die during the process. Sweet, huh?
Then there was the friend who told me my job is so great because I can “just do it anywhere” and “drop it if you need to so you can raise your kids or even do something else like become a doctor.” Yeah, it would be great if all nurses just dropped their jobs—and better yet, stopped nursing to go to med school! (Insert sarcasm.)
Amazingly, people still do not see nursing as a profession! We are so misunderstood.
This makes me wonder how much the media is responsible for the portrayal of nurses. After a mom gives birth, I am constantly told, “Wow, that was nothing like it is in A Baby Story” or some other such “reality” show about childbirth. I love how on those shows, labor takes about an hour, you never see the nurses, and everything turns out peachy in the end.
It would be wonderful to do a twelve-hour real “reality” TV show and go through a nurse’s day, hour by hour. I think it would surprise the public. After all, my patients also tell me over and over, after they have their babies, that they had no idea how much impact the nurse has on the whole experience. But that sentiment tends to get lost; I don’t think lay people really comprehend our role. How can they? Even my husband says he feels disconnected from my work because he has never seen me in action. He just does not know what I really do!
It would be great to hear more about nurses as the true heroes they are—and it’s not that I want people to sing my praises. We need more awareness so we can better do our jobs as the professionals we are, don’t you think?