How the Public’s Perception of Nursing Impacts the Profession
Nurse.com
SEPTEMBER 14, 2023
The public's perception of nursing originated from the idea that women were natural caretakers, and nursing became an extension of that societal standard. In 2019, members of the Royal College of Nursing told their annual congress that the portrayal of nursing in mainstream media undermined their professionalism and increased the risk of abuse from the public.