How colleges can measure up in teaching ‘critical thinking’

A new project shows professors can design ways to assess how well colleges teach ‘critical thinking skills.’ Yet early results suggest colleges must do better at producing that important ‘learning outcome.’

Michael Bonfigli/The Christian Science Monitor
Purdue University President Mitch Daniels speaks at a Monitor-hosted breakfast for reporters in Washington, D.C., in 2013.

After he became president of Purdue University in 2013, Mitch Daniels asked the faculty to prove that their students have actually achieved one of higher education’s most important goals: critical thinking skills. Two years before, a nationwide study of college graduates had shown that more than a third had made no significant gains in such mental abilities during their four years in school.

Mr. Daniels, a former governor of Indiana, needed to justify the high cost of attending Purdue to its students and their families. After all, the percentage of Americans who say a college degree is “very important” has fallen from 75 percent in 2010 to 44 percent today.

Purdue now has a pilot test to assess the critical thinking skills of students as they progress. Yet like many college teachers around the United States, the Purdue faculty remain doubtful that their work as educators can be measured by a “learning outcome” such as a graduate’s ability to investigate, reason, and put facts in context. Many still prefer the traditional system of course grades in specific fields or overall grade averages, despite serious concerns by employers about “grade inflation.”

The professors need not worry so much. This week, the results of a nationwide experiment were released showing for the first time that professors can use standardized metrics to assess the actual coursework of students – across many schools and disciplines – and measure how well they do in three key areas: critical thinking, written communication, and quantitative literacy.

The project involved more than 125 professors judging 7,000 samples of students’ class work from 59 institutions in nine states. It was initiated by the Association of American Colleges & Universities (AACU) and the State Higher Education Executive Officers.

The idea partly derives from the frustration among colleges over the many attempts by “outsiders” – from U.S. News & World Report to the Obama administration’s new “College Scorecard” – to rank or rate schools for consumers of higher education. Rather than continue to have professors remain on the defensive, the AACU took the offensive to show that faculty can define the generalized “rubrics” of what students should be learning.

Despite the success of the project in showing that teachers can design such assessments, the actual results are worrisome, and mostly confirm earlier studies. Of the students who attended a four-year institution and who completed most of their coursework, fewer than a third of their finished assignments earned a high score for a critical thinking skill (“using evidence to investigate a point of view or reach a conclusion”). Less than half of their coursework drew “appropriate conclusions based on quantitative analysis of data.”

The project organizers summed it up this way: “Of the three outcomes evaluated, far fewer students were achieving at high levels on a variety of dimensions of critical thinking than did so for written communication or quantitative literacy.” And that conclusion is based only on students nearing graduation. The project did not measure the work of students who have completed less than 75 percent of their coursework.

American universities, despite their global reputation for excellence in teaching, have only begun to demonstrate what they can produce in real-world learning. Knowledge-based degrees are still important. But given the pace of discoveries in many fields, employers are demanding advanced thinking skills from college grads.

If faculty can now work with college presidents, government leaders, and others to accurately measure the intellectual worth of a college degree, more people will seek higher education – and come out better thinkers.

