I just saw a commercial on TV about men teaching young boys that violence towards women is wrong, and it got me thinking... Why did that idea become so widespread?  How is violence towards women somehow worse than violence towards men?  Violence of ANY kind is bad, and it doesn’t somehow become worse just because the victim is female and the perpetrator is male.  That double standard REALLY pisses me off.