It’s time to declare the end of adolescence. As a social institution, it’s been a failure. The proof is all around us: 19% of eighth graders, 36% of tenth graders, and 47% of twelfth graders say they have used illegal drugs, according to a study by the National Institute on Drug Abuse and the University of Michigan. One of every four girls has a sexually transmitted disease, according to a recent study by the Centers for Disease Control and Prevention. A methamphetamine epidemic among the young is destroying lives, families, and communities. And American students are learning far more slowly than their Chinese and Indian counterparts.
The solution is dramatic and unavoidable: We have to end adolescence as a social experiment. We tried it. It failed. It’s time to move on. Returning to an earlier, more successful model of children rapidly assuming the roles and responsibilities of adults would yield enormous benefit to society.
Prior to the 19th century, it’s fair to say that adolescence did not exist. Instead, there was virtually universal acceptance that puberty marked the transition from childhood to young adulthood. Whether through the Bar Mitzvah and Bat Mitzvah ceremonies of the Jewish faith, confirmation in the Catholic Church, or any of hundreds of rites of passage in societies around the planet, it was understood that you were either a child or a young adult.
In the U.S., this principle of direct transition from the world of childhood play to the world of adult work was clearly established by the time of the Revolutionary War. Benjamin Franklin exemplified this kind of young adulthood: at age 13, he finished school in Boston, was apprenticed to his brother, a printer and publisher, and moved immediately into adulthood.
John Quincy Adams attended Leiden University in Holland at 13, and at 14 was employed as secretary and interpreter by the American Ambassador to Russia. At 16 he was secretary to the U.S. delegation during the negotiations with Britain that ended the Revolutionary War.