The problems plaguing higher education run deeper than the way learning is measured
The credit hour that dictates our calendars and structures our days traces back to the Carnegie Foundation at the turn of the last century. Now “Cracking the Credit Hour,” a report released by two non-partisan think tanks, the New America Foundation and Education Sector, aims to take down the time-honored unit.
The report suggests the credit hour ought to be replaced given the changing texture of higher and online education. But the credit hour is a secondary concern to the greater issues that plague higher education, and the report is unconvincing as to why changing this standard would improve our education system.
As academic currency, one credit hour typically equates to one hour per week of lecture attendance over a 15-week semester. Critics have railed against this notion of “seat time” as a basis for credit. Credit hours, they argue, do not recognize alternative forms of instruction occurring outside the typical classroom. Critics contend that students who cannot afford to attend college, and working adults who don’t have the time, should still be able to earn credit by a metric not based on hours in class. Others have advocated for a self-paced education in which students gain credit for mastering certain amounts of material.
Online courses, which are often self-paced or do not require a certain “seat time,” have made the debate more urgent. How can credit hours certify learning done independent of time spent in the classroom?
In 2010, the Department of Education tried to answer this question by redefining an hour of credit. One credit hour could mean one hour of class attendance or “the equivalent amount of work over a different period of time,” according to this definition. More than 70 college associations protested, calling the move federal interference that resulted in a “complex, ambiguous and unworkable definition.” The regulation was revoked, and “Cracking the Credit Hour” is the latest to take up the mantle.
“Cracking the Credit Hour” contains the usual objections to the concept of “seat time.” Moreover, it notes that credit hours do not always translate equally between schools, resulting in a loss for transferring students. Finally, the report marshals a host of research about college graduates — from employer dissatisfaction to poor standardized test performance — to inveigh against the credit hour as a fair or accurate measurement.
In its place, the report considers new options. Essentially, it argues that credit should be based not on hours of “seat time” but on the material learned. To assess this, more standardized tests could be administered. Or, colleges could upload syllabuses and classwork so employers might know what a student learned in each individual class. Or, an agency could set a uniform curriculum, such that a student would receive certification for mastering a particular body of knowledge. The report calls for more experimentation; none of its ideas are concrete.
The report is right that credit hours are not an ideal measurement. But the report doesn’t acknowledge that credit hours are already placeholders that often have little to do with “seat time.” Secondary students get college credit for doing well on standardized tests. And, ask any undergraduate — taking a three-credit course does not mean attending lecture for three hours a week. It means learning three hours’ worth of material, in or outside of the classroom.
“Cracking the Credit Hour” details the faults of higher education only to blame them on something as tangential as credits. Credit hours, like any measurement, are imprecise and to a large extent arbitrary. But they are the tools, not the problem itself. Studies like this should grapple more with why standards are lagging, employers are complaining and students are dissatisfied, instead of offering a quick but irrelevant fix.