I believe it's time that women had a choice in birth in America, and I'm happy that we, as a society, are talking about it again. Either out of fear or ignorance (or both), we have given up our right to make choices in birth. To experience birth is a rite of passage for women; it should be considered sacred and handled respectfully, as should the newborn child. We all agree that we want our babies born safely; every mother wants that, regardless of where she gives birth. Every woman should give informed consent for every procedure performed on her or her baby.
How has giving birth changed you?