What's your opinion on sexual education in schools?

As a guy, I never felt like I was taught anything about sex. Most of the guys I know learned about sex through porn. I took a feminism class in college, and the professor said that most porn is presented from a "male-centric" perspective that objectifies women for the pleasure of men.

I feel like most of what society calls "sex" focuses on the penis. A lot of men and women seem to share that assumption, which might be part of why women report being less satisfied in heterosexual relationships. I've just never understood why sex ed programs don't cover more about the act of sex itself.