Romantics, we are a dying breed. "Horny" has seemingly destroyed any romance left in society. We live in a society that prefers one-night stands to relationships, yet we complain that we are treated unfairly when this happens. Sex is now a commodity, a statement, a symbol of how "cool" we are, instead of a symbol of love. Women love rom-coms for the "romance", yet when faced with the prospect of love in real life, the majority choose the easier option of a simple "shag". Women choose to go after the "bad boy" instead of a man who will love them and keep them safe. Guys choose to go after the girl who is "easy" instead of the girl who truly cares for them. There is a distinct lack of trust, respect and love between men and women in today's society.
Whatever happened to the idea of "wooing", of love and respect? Do books and Hollywood lie to us? Part of me believes there are still people like myself out there who believe in true love, in "making love" instead of "fucking", who would rather have a relationship with someone they care for than a simple lust-filled one-night stand. The ideas of romance have seemingly been destroyed by the superficial day and age we live in. The number of times I've sat and listened to my friends talk about sexual acts as if they were as simple and carefree as turning on a light really makes me wonder: where has the romance gone? Is it really dead?