Certainly we’re all aware of the scrutiny higher education has been under lately. Along with the ongoing “budget fatigue” many of us have experienced here at SIUE, it seems as if there is a never-ending host of issues beleaguering American universities, public and private: rising tuition and fees, falling public support, administrative bloat, soaring student loan debt, underpaid adjuncts—the list goes on. For those of us in CAS, you might also add concerns about enrollments and sagging numbers of majors in the liberal arts and sciences. If you are a regular reader of the Chronicle of Higher Education, or follow these issues on social media, a new “crisis” emerges in higher ed seemingly every week. (As a historian, I would be remiss if I did not point out that very little of this is new: as long as there have been universities in North America, there have been persistent questions about their utility and viability.)
There are numerous explanations for the current climate surrounding higher education. As Linda Markowitz observed in her column for this space, increasingly it seems that education, and especially higher education, has become a transaction, with students as consumers. Thus, students and their families (and politicians) view college as an investment that must yield a return, and students face the dreaded “What are you going to do with THAT?” question for any major that is not deemed practical (i.e., one leading to an immediate and obvious career path). Further, as debates over state funding for higher education grow more intense, politicians openly question the value of humanities and social sciences degrees, as Florida Governor Rick Scott did in 2011, when he asked, “Is it a vital interest of the state to have more anthropologists? I don’t think so.” Scott went on to summarize this transactional view of higher education perfectly, explaining, “If I’m going to take money from a citizen to put into education then I’m going to take that money to create jobs. So I want that money to go to degrees where people can get jobs in this state.” Even President Obama has made similar comments, joking in 2014 with an audience in Wisconsin, “I promise you, folks can make a lot more potentially with skilled manufacturing or the trades than they might with an art history degree” (Obama later apologized for the remark, to the Art History Department at the University of Texas, no less).
Increasingly, critics are calling into question the very roles that full-time, tenure-track faculty play in our universities. Earlier this year, in defending his proposed substantial cuts to the University of Wisconsin system, Governor Scott Walker suggested that “Maybe it’s time for faculty and staff to start thinking about teaching more classes and doing more work and this authority frees up the [University of Wisconsin] administration to make those sorts of requests.” Walker is not alone. A recent bill proposed in the North Carolina state legislature would require all faculty at state universities in North Carolina to teach what amounts to a 4/4 teaching load (at least 8 academic courses per year). Another proposed bill in Iowa would require all university faculty to teach at least one academic course per semester (thus eliminating any form of paid research leave). Taking the transactional nature of higher education to its logical extreme, this bill would also require that professors, regardless of tenure status, meet a minimum threshold on student evaluations or lose their jobs (never mind that factors like gender, age, and physical appearance can influence how students evaluate their professors, or that student participation in such evaluations is routinely low). Most outrageous of all, this bill would then require Iowa universities to publish the names of the five professors with the lowest scores who exceeded the minimum bar, and then allow students to vote for who should keep their job.
Clearly, higher education, and especially the humanities, arts, and sciences, face numerous challenges going forward. We are asked to increase affordability and accessibility and to find new revenue streams, all without sacrificing quality in teaching or research, while simultaneously promising a greater return on students’ investment in the form of future economic success. Despite these challenges, I think we are in fact doing many of those things. For those of us in CAS, for example, this has included streamlining programs, revising curricula, offering more opportunities like internships, putting more courses online, and so on.
In the current political climate, all of these things are probably necessary and important ways in which we have to adapt. Yet I would also urge, and urge strenuously, that we simply cannot forget the things we do well. Furthermore, it is up to us to remind the world outside of higher education—politicians, students, parents—what those things are, and why they are important. Let me give you one quick example.
I teach in the Department of Historical Studies here at SIUE, and as I am sure many of my colleagues also experience, I have to continually fight the assumption among students that studying history means memorizing long lists of names, dates, and events. The study of history, of course, is much more than that. I do have exams in my courses, but they are usually in the form of written essays that ask students to argue interpretive points about the past, and to support those arguments with evidence drawn from course materials. That said, being able to argue about the past requires some knowledge of the past, so there is some of that dreaded memorization involved. In past semesters, I have often provided my classes with a study guide of key terms to help students prepare for exams, but I found that students fixated so much on memorizing those terms that it was counterproductive.
So, one semester, I tried a different approach. I still provided students with a list of key terms, but I told them the exam would be “open notes.” My hope was that this approach would ease their anxiety over memorization and help them focus on broader themes while studying. As I began grading the exams, I was surprised by the students’ answers. I started to notice that the answers included a good deal of extraneous information, entirely irrelevant to the questions I had asked on the exam. Furthermore, many of the answers included word choices and phrases that were not only extraneous, but identical to the language on other students’ exams.
At first I suspected some students had been cheating, but I quickly realized what had happened: because they were allowed to use their notes, students had prepared for the exam by essentially printing off whatever online materials they could locate, whether from Wikipedia or from other online articles on the topics I had told them to study. The extraneous information and identical words and phrases I found were contained in the first couple of sentences of the Wikipedia entry on the subject at hand.
This experience was instructive for me in several ways (not least of all in pushing me to continually adapt how I approach my assessment strategies). Among other things, what it told me was that even having the wealth of knowledge of the Internet in front of them did not do much to help students perform well on a written exam. In fact, all of that information appeared to be overwhelming. Students were frequently unable to discern what was and was not important or relevant to the question at hand, or relevant to building an argument to answer an essay prompt.
In other words, it showed me that what we do is more than simply passing on the collected knowledge of our chosen discipline. Anyone with an Internet connection can access Wikipedia and obtain knowledge. We teach students how to gauge and interpret (and question) knowledge; how to discern what is and what is not important when asking and answering questions. All of the disciplines contained in CAS—humanities, arts, hard and social sciences—teach students to grapple with complexity, express themselves clearly, and think analytically. (As luck would have it, these are all among the most desired characteristics employers look for when hiring college graduates.)
As we are encouraged (and required) to think outside the box, stretch our boundaries, and adapt to changing times, we must also continue to stand up for the value of what it is we do, and why it is important. Most important, it is up to us to make the case that we are not stodgy academics who like the way we do things because we are stubborn and old-fashioned, but that there are valid and important reasons why traditional, classroom learning works, and often works best. As we continue our focus on experiential learning, let us not forget that a face-to-face discussion or a lecture is also an experience. As we race to put more and more classes online, as Arizona State University recently announced it would, perhaps we should stop and ask why so many students indicate (in my experience) that they dislike online courses, and, more important, why the completion rate of online courses is substantially lower. As we embrace e-textbooks, we should stop and question why the sale of e-books has plateaued in the publishing industry, and consider the compelling research demonstrating that retention and learning improve markedly when students read physical print rather than digital text.
Finally, we also have to make clear the value of research, and its connection to teaching. When asked if we should be working harder, or teaching more classes, it is up to us to demonstrate what we do—how research informs teaching, and vice versa. Back in the 1990s, the Virginia state government called into question what it is professors do with all of their time if they are only in front of a class for 6 or 9 hours a week, posing questions about teaching loads similar to those we are hearing today (see, none of this is new). One of my mentors from graduate school, Edward L. Ayers (now the outgoing President of the University of Richmond), wrote an eloquent summary of how he spent his time each week, from teaching, to research, to advising, mentoring, and committee work. Twenty years later, his essay still holds up, and I am reminded of his closing thoughts:
“What’s the common denominator, then, in what professors do all day? Translation. We translate from a field of knowledge to people who want to know about it. . .We all live in at least two worlds. One of those worlds is a world of ideas, of print and numbers, a world almost limitless and impossible to master, growing every time we turn our backs. The other world is the immediate and human world of classes, committees, office hours, deadlines, budgets, advising. Without being a citizen of both worlds, an active participant in both worlds, we are diminished, our ability to teach diminished. The dichotomy between teaching and research is no dichotomy at all if we understand that a professor journeys back and forth between two worlds, translating among many people.”
I would suggest that the importance of translation applies not only in translating our chosen disciplines to our students, but also in translating the value of what we do, and why we do it, to those who would call it into question. Our very survival may depend upon it.