As more women graduate from college, the teaching profession becomes more female

One of the great accomplishments of the late 20th century was to bring women onto a more equal footing in the labor market. Salaries became more equal. Employers opened up jobs for women. Educational opportunities became more gender-equal. And for college-educated women, all of this meant that careers outside teaching and nursing became possible.