Ask most students on campus why they came to college and the knee-jerk response is to get a job that pays well and that (ideally) we don’t hate. There are other expectations, too: meeting new friends, exploring your identity and seeking new experiences. But, at the end of the day, it seems impossible to separate what you study from finding your future path.
Has this school-career relationship always been indivisible? What’s the societal purpose of higher and public education in the United States—and what does this reveal about American culture?
These are the questions that have haunted me for the last six months as I burned through historical, political, economic, sociological and poetic texts on the development of formal schooling in our country.
The oversimplified rundown is this:
In our colonial days, the first American colleges were established primarily to serve the sons of well-off, white families. Not surprising, right? White men got college first, but what is striking is that higher education originally had nothing to do with job prospects.
Instead, it was a place for boys to become well-read, free-thinking young men groomed for civic leadership.
After graduating—or not, no biggie at this point—the men moved on to professional training by holding apprenticeships under doctors, lawyers or religious leaders. Educational reformers like University of Nashville President Philip Lindsley would try to break this mold by advocating for overtly “practical” courses, but the simple format of studying dead languages and philosophy held up through the Civil War.
Obviously, times have changed: now, being in a major without a linear path to employment yields half-hearted parental grins littered with confusion and fear, not to mention political scorn from across the ideological spectrum, everywhere from Kentucky Governor Matt Bevin to former U.S. President Barack Obama.
The contempt for English and Anthropology even shows up in our architecture. Take a glance at Landrum compared to the BAC and Griffin Hall.
The “education as job training” paradigm began to develop with the passage of the Morrill Land Grant Act of 1862, which appropriated land to state legislatures under the condition that proceeds from selling said land were used to build colleges that included agricultural and mechanical studies. The bill’s sponsor, Justin Morrill, stressed the urgency of adjusting school to meet the needs of our industrializing economy.
Because you don’t get federal funds when you secede from the Union, Congress passed a second Morrill Land Grant Act in 1890 for the South with an additional provision that administrators prove race played no factor in the admissions process, or that they offered “separate but equal” campuses.
The A&M programs gained some support, but what colleges really needed to boost admissions was a public education system that readied the masses for advanced studies.
Although our current K-12 system is still a work in progress (at best), every state offered public schooling by the turn of the 20th century. Paired with the closing of the frontier, the new “party culture” on campus and the rise of football, colleges were for the first time considered “cool” in the public eye.
Magazines, memoirs and even board games depicted collegiate life as this now-familiar wild ride, where classes are merely obstacles on the journey. However, even with their new reputation, admissions rates remained extremely low. Then came World War I.
By using campuses as military training sites and calling on professors to guide war strategy, the war turned out to be great PR for higher education.
Enrollment jumped dramatically, further straining the already-ailing professor-student relationship. This period between the wars is the first time honors colleges and admission caps emerged as hiccups of resistance to academic hedonism, but these small forces could not trump the enrollment explosion that followed World War II.
Congress sent two million veterans to school via the GI Bill and the government became the unofficial research contractor of the nation, investing billions in military advancements.
But as the university attempted to juggle fire, it lost steam. In 1975, enrollment dropped for the first time since WWII. Administrative panic manifested in pre-professional majors, career service programs and extravagant gymnasium and dorm renovations. Schools started reaching for nontraditional students by setting up satellite campuses, which is how NKU got its start: as an extension of the University of Kentucky.
Colleges essentially started running like businesses; in effect, students were treated as customers and campuses like shopping malls. Why else do you think we have a 40-person hot tub when we can’t provide our professors job security and decent salaries?
Let me be clear: this school-as-business issue is not unique to NKU, or even higher education. Public education bleeds all the same.
We must consider who wins and who loses in this setup. Who profits when public schools are overcrowded classrooms of standardized material and state colleges pride themselves on graduating laborers into the workforce?
There’s no doubt universities end up being more than job prep, just as school teachers go miles above and beyond monotonously piling testing strategies into students’ brains. But imagine the results if personal development, healing and critical discussion were the learning outcomes rather than just an occasional byproduct.
Would young people’s skyrocketing rates of anxiety and depression prevail? Would young white guys continue to shoot their peers on a semi-regular basis?
Sociologically speaking, education serves as a primary vehicle of socialization, but be wary and wise of what type of person the culture and curriculum rewards you for becoming—and maybe dare to ask again: why did I do college?