This question has been asked for years within our culture. The one constant has been that American men have seen it as their job to "protect" our women, as much as possible, from the horrors and physical deprivations of war.