I don’t normally get political, especially on social media, but this one... I’ve got something to say.
It’s really disgusting, some of the conversations I’ll be having with my daughter about HER rights, not just as a woman (girl), but as a fucking human. My entire family is 95% women, and it makes me really disappointed in our country. Land of the free? Psh! Land of the sheep.

It’s truly sickening that (some) men still feel they have rights to a woman’s mind, body, and choices! They tell us what to think, how to look, how to act. RAPE US! And then tell us to live with whatever consequences came from that nonconsensual “sex”! They call us crazy, or overly emotional. Too emotional to make a decision for ourselves, but expected to make them for a child we did not want to bear in the first place?! We carry the weight of everything, and yet you see us as unfit to choose what we do with our own bodies?

I’m just completely overwhelmed by how fucking ridiculous this country has gotten. We as women, as a human race, have worked SO HARD to get where we are now, and we are still having to fight for equality. And now you’re pushing years, decades of that fight backwards. Back to when women weren’t allowed to work or even wear fucking pants?! Like what the actual fuck is going on in America?