Moto
I agree they need to know it, but I don't think it is the school's place to teach it. It should be done in the home.
It is absolutely the school's place to teach it.
It is awkward as hell to talk about sex with your parents. If it had been left completely to my mom and dad to teach me about sex, I wouldn't have gotten all of the information I needed. Plus, listening to my parents explain the process would have creeped the shit out of me.
Instead, I got to hear about everything from my science teacher in 7th grade and again from my health teacher in 9th grade. It wasn't awkward at all, save for some giggling here and there, and everybody got a solid education. Believe it or not, students asked some good questions and the teacher answered them. We all walked away more informed. Knowledge is power!
Seriously, though, what I learned in school I wouldn't have learned from my parents for a couple of reasons. First, I was too afraid to ask at that point in my life. It's actually still pretty weird to talk about it with my parents, even at 19. Second, my parents are not qualified to teach me everything about sex, and neither are most parents. My parents are not even religious, and they still didn't tell me anything about it when I was younger. I'd imagine a lot of religious parents would have extreme difficulty discussing it, or simply wouldn't discuss it at all in an effort to keep their children "pure" or something like that.
So the schools teach it and everybody wins. Teens don't get corrupted and sinful when schools teach them about sex; they get informed. They use that knowledge to their advantage, to their parents' advantage, and to society's advantage (although, of course, some choose to ignore it). The idea that schools shouldn't teach it is absurd.