First off, romance is dead.
All my life I've heard the words "Where are all the nice guys?" from various people in various places. Each time, I've had to play the role of the reassuring guy, the one who says, "Oh, they're around, just keep looking." Meanwhile, I've been told by the men around me that the only way to hold a woman's interest is to treat her like shit.
I resisted this idea very doggedly, because I am an optimist at heart and I hate the idea that there are people out there who simply want nothing to do with those who will treat them kindly. I attributed it to bitter men who were already assholes to begin with; now I wonder if they may have been right.
Women do not, in my experience, like "nice" men, and they certainly don't want them, in spite of what they say. I will tell you where all the nice guys are: they are dying off, very rapidly. Why? Because it isn't beneficial to be nice. Whether you can attribute it to daddy complexes or a simple lack of interest in the "boring" nice guys is anyone's guess. These nice guys were forced to become assholes because the women around them did not give them the time of day when they were caring and considerate. This is a highly misogynistic post, I know, but I've yet to see any evidence to prove me wrong.
You listen to their problems, you do things to brighten their day, and you are forced into the friend zone, because any sign of care or concern seems to equate to "not sexual". Men can't be nice, because women don't give a shit about them if they are. All my life, I've tried to be the gentleman, and I've often been shot down because I was simply "nice". I hate this realization, and I hate that I've lost my faith in women to the point where I am even considering this to be the truth.
Please, someone, tell me why the hell this has to be. What happened to people actually being civil to each other instead of calling each other bitches?