I have no problem with men and women who wish to buy and sell sex. Heck, the more chauvinistic are firmly of the belief that that is what dinner and a movie amounts to (or a diamond ring, for the traditional, Christian chauvinist). I have no desire to take part, and I believe that those who do, by and large, have issues.

Mostly, however, I find it very strange how affronted our nation is by the notion of sex for money, or, for that matter, sex in general--especially when you consider that the vast majority of Americans have had sex. A large majority have even had sex with someone to whom they are not married. So, I don't get it. Why is it more offensive to see a penis or vagina (or even a friggin' breast) than it is to see someone get their head blown off? Why is it ok for a president to send our troops off to war, to lie to the American people, and to disregard the Constitution, but morally repugnant to the point of impeachment for a president to get a blowjob?

Theories abound, but none really explains it. There is something horribly wrong and backwards about the way this country (and, really, most "religious" individuals and countries) views sex--especially when compared with violence--and I just don't understand why.
That's pretty much it... this was kicked off by this article about prostitution, Germany, and the World Cup.