Are We Smarter Yet? How Colleges Are Misusing the Internet

by Akim Reinhardt

We should all probably be a lot smarter by now.

The internet, more or less as we know it, has been around for about fifteen years. So if this magical storehouse of instantly accessible information were going to be our entrepôt to brilliance, we should all be twinkling like little stars. You and I should be noticeably smarter, and anyone under the age of twenty should be light years ahead of anyone who was ever under the age of twenty prior to the 21st century.

But, ya know, en masse, we're all about as fuckin' stupid as we've always been. After all, if we'd been getting smarter these last 15-plus years, you'd expect that humanity might have formed new and deeper insights into the nature of existence, and used those insights to update our collective goals: world peace, eliminating hunger, and flying taxis driven by cats wearing little chauffeur's caps. But not only haven't we gotten wiser and developed new collective goals, we haven't even gotten any cleverer and moved closer to achieving the same old ones we've always pined for. There's still the endless butchery of war and the terminal ache of starvation.

Of course, none of it's a surprise. There are at least two obvious reasons why the existence of a cheap, even free, storehouse of knowledge, the likes of which could not have even been imagined by most people a generation ago, has done little to make us all a whole helluva lot smarter.

For starters, people can be lazy and superficial. Whether you prefer a Marxist interpretation, an existential one, or something equally incisive but less Eurocentric, the conclusion is the same: Lots of people are largely obsessed with chasing pleasure and shirking meaningful work. They'd rather read about celebrity gossip than learn about mechanical engineering or medicine. They'd rather indulge a neurosis or compulsion than work towards the common betterment. And they'd rather watch funny cat videos than try to figure out how those ghastly little beasts can better serve us.

This is why when you plop an unfathomably rich multi-media warehouse of knowledge in front of them, they'll mostly use it to wile away the hours on Facebook and Twitter. In much the same way that if you give them an actual book, and eliminate the social stigma that says books are sacred, instead of reading it they might be inclined to rip out the pages and make paper airplanes. The creative ones might set them on fire before pitching them out the window, in a quest to create a modern, aerial Viking funeral.

This helps explain why the internet is dominated by low-grade pornography.

That assessment is partly tongue in cheek, of course. Many people, perhaps most, really do treasure a lifetime of learning, at least to some degree. But putting that aside, there's another reason, beyond the too easy targets of human sloth and gluttony, which helps explain why the world wide web isn't making us that much smarter.


It's because simply handing someone a library card, or an internet connection for that matter, and expecting them to just “get smarter,” is a really shitty pedagogy.

Beyond the various motivational techniques a good teacher can use to inspire pupils, students can also profit immensely from curricular guidance. This is why teachers don't just turn you loose in the library for a semester, but instead actually assign materials to read. Turning someone loose in the library can of course result in genuine wonderment. It can and often does lead to all sorts of learning at the intersection of curiosity and chance. But there are real limits to autodidacticism, which is why we have teachers to begin with.

But if a teacher-student relationship can be essential for gainful learning by students, then that raises the question: What is good teaching?

Pedagogy, the technical craft of teaching, is continually developing. It must react to and complement the historical and cultural circumstances of any given place and time. What passed for good teaching in Ming China (1368-1644) or the late Ottoman Empire (turn of the 20th century) may not cut it in modern America. And for that matter, what we see as good teaching in modern America might not have even been possible in Ming China or the late Ottoman Empire, much less welcome.

So what then is good teaching in modern nations like the United States, which have a wealth of resources to devote to education, including dedicated professional teachers, healthy students subject to reasonably high expectations, established infrastructure, and the latest in available technologies?

To some extent the answer is up for grabs, and that's actually a good thing. A truism in any discipline, ranging from the humanities to the sciences, is that debate and questioning are essential to learning. The same holds true for what we know about learning itself. And so sound pedagogy must always be subject to rigorous examination, analysis, and reconceptualization.

However, all of the disciplines also inform us that truly sound knowledge can be aggregated, often in a multitude of ways, and that some fallacies can be dismissed. So while pedagogy must be flexible and open to new developments and improvements, it's also important to pinpoint those approaches that do not work well enough to be kept, or that at the very least need to be substantially improved upon.

I won't address K-12 pedagogy. In many respects, I'm simply not qualified to talk about, for example, the best and worst ways of teaching basic science to a roomful of twelve-year-olds. Instead, I want to discuss something I have quite a bit of experience with: teaching college students, particularly at the undergraduate level.

While ivy-infested institutions like Harvard and Yale date back to the 17th century, and some European schools such as Bologna (1088) and Cambridge (1209) are so ancient as to be downright creepy, the modern university as we understand it is actually not all that old. Modern universities are really a 19th century development, with Germany leading the way in Europe and the United States doing the same in the Americas.

That means modern, post-secondary education and accompanying pedagogies are not very old in the big scheme of things. Initial pedagogies in modern universities were of course inherited from an earlier era. And they could be quite mind-numbing. In an age when actual books were prized, cherished, and even rare, “learning” might consist of sitting down and writing furiously as a professor read aloud from pages students may not have otherwise had any access to. For these and other reasons, rote memorization was also considered a vital pedagogic technique.

This older pedagogy, which dominated 19th and early 20th century universities, gave rise to the popular image of the stuffy old professor and his droning, or possibly entertaining, lectures, depending on your view and his oratorical skills.

Come, sit at the foot of a great learned man, and pay rapt attention as he shares his wealth of knowledge with you.

There's actually something to be said for the wise old man (or woman) on the mountain approach to teaching, and lectures are still certainly a part of modern pedagogy. But they're only a part. Pedagogy has come a long way over the last several decades, and good college instructors now incorporate a variety of techniques, inside and outside of the classroom, to facilitate student learning.

Integral to most of these modern pedagogical developments are forms of student-teacher interaction that move beyond the rather passive model of students listening to an uninterrupted lecture. Walk into a modern college classroom, and it won't be the least bit surprising to see students talking to the professor instead of vice versa, or even to see students talking to each other. A variety of approaches to in-class writing assignments are also used to facilitate interactions between students and teachers and among students.

There have been ebbs, flows, and gradual transitions in the culture of college pedagogy. But the vast majority of today's professors understand that there are far more effective ways to teach than just the ancient practice of lecturing, and most of them experiment with a variety of techniques and incorporate the ones that work best for them and their students.

Sigh. In a perfect world, maybe. But it's never really that simple, is it?

There is a tension within academia. Truly innovative and superior pedagogical techniques tend to be expensive. Why? Is it because they rely on expensive technology? No. It's because many of the best pedagogies are most effective, or sometimes only even possible, in smaller classrooms. And that gets pricey. It's just a lot cheaper to cram hundreds of students into a single lecture hall and have them all listen to the wise old professor spout forth his or her nuggets of knowledge and pearls of wisdom.

This labor cost helps explain the rise of various non-tenure track instructors, including a veritable flotilla of part-timers, but that's another issue altogether. Either way, it has been common knowledge in higher education for decades now, both among administrators and professors, that strict adherence to lecturing is an inferior pedagogy.

No wonder then that lectures are often complemented by discussion sections. The typical formula is two or three hours of lectures per week in a large hall, and then one hour per week in which those hundreds of students are broken down into smaller discussion groups.

At first, this seems like sound, innovative pedagogy: mixing lectures with intimate discussions of the course material. Except those discussion groups are generally not led by the great man or woman who officially teaches the course. Instead they are typically led by other students. Graduate students, to be sure, but students nonetheless, who are certainly not as qualified from a content or pedagogical standpoint. In fact, it's often the very first time they've ever taught at all. And of course they get paid next to nothing, which is what makes it so cost effective.

That model, as it is often executed, is a bit half-assed. Or to put it more politely, wildly inconsistent. The truth is, there's no getting around the fact that direct interaction with your actual professor is one of the hallmarks of a superior college education.

Want more proof? Just ask yourself this: Aside from a smattering of exceptionally vain and wealthy people, who on earth would ever consider paying fifty- to a hundred-thousand dollars a year to send their kid to a small, private college if they could get the same quality of education at a large public college for a fraction of the cost? Yet year after year, thousands of poor and middle class families work extra jobs, take on second mortgages, and have their children go into hock so that they can attend very small and very expensive colleges. And the reason is simple. Students really do typically get a better education at a small school with small classes.

I mean if you really have any doubt at all about this, just consider that graduate courses, where there's a lot on the line and everyone involved knows better, are typically capped at about a dozen. Half that is hardly unusual. Twenty would be considered an absolute monstrosity of a course. And one-on-one tutorials are available for credit in most programs.

All of that brings us to the rise of MOOCs. For those of you not up on your academic and/or pedagogical jargon, which at times can be almost as bad as military jargon, a MOOC is a Massive Open Online Course. It's very new, but it's all the rage. It's already even got a Wikipedia page.

In a nutshell, MOOCs are a return to the quaintness of the dark ages, at least from a pedagogical point of view. First, film a professor delivering lectures to a massive lecture hall. Then, make those lectures available on that shiny new device which is supposedly making us all so much smarter: the internet.

Now, before I eviscerate MOOCs, let me say that there are some truly amazing things about them. In the same way that modern printing presses made books readily available, MOOCs can make really excellent lectures by top-notch scholars readily available. That's absolutely wonderful. It really is great that anyone can watch a great professor lecture. Just like it's great that anyone can have a library card.

But a library card, or an internet connection, isn't enough by itself.

The problem, and oh boy is it a problem, comes in using MOOCs to award college credit. Doing so is nothing short of taking a giant pedagogical step backwards. Why? Because a MOOC is composed solely of lectures, usually with no actual interaction whatsoever between student and instructor. Aside from an online bulletin board or chat, there's also no discussion. And the grading is conducted by . . . actually, in the world of MOOCs, it's not even clear yet who's going to be doing the grading. In a lot of cases, the students are actually grading each other. I shit you not. It's called “peer evaluation.”

In the 21st century, that's disgraceful. I mean, let's be perfectly honest about this. Awarding people college credit for a MOOC is not very different from giving them college credit for watching TV. It's about using a shiny new technology to peddle a dated and decrepit pedagogy. And using degraded assessment (grading) techniques, such as “peer review” or at-home multiple-choice quizzes that are entirely vulnerable to cheating, is utterly lacking in credibility. To pretend otherwise is fairly reprehensible.

Now here's the real kicker. This kind of thing sort of goes on at some colleges anyway, just without the TV part of it. There really are classes where students simply go to large lectures, have no discussion sections, and no meaningful interactions with the instructor. Though it's important to note that student exams and papers in large on-campus classes are typically graded by graduate student teaching assistants, rather than through shady and even corrupt practices like peer- or self-grading. Not that graduate students grading undergraduates doesn't pose its own set of problems, but at least it's not fraud.

The ongoing use of this thoroughly dated pedagogy is disgraceful, but it happens, usually at the freshman level and particularly at large research schools (i.e., a lot of the famous ones) that have many thousands of students, and professors who research more than they teach. Historically, this hard little bit of reality is something that university administrators have not bragged about, because it really is quite shameful and, in some respects, represents a fleecing of tuition money from people.

But that open secret is now becoming less secretive. Some administrators are overreaching. Some elite institutions, in league with upstart for-profit corporations, are trying to legitimize this backwards pedagogy in the form of MOOCs.

Why? Because there's potentially a boatload of money in it for them. The overhead costs are a fraction of actual teaching. And what about on the consumer side? Yes, many college administrators have been referring to students as “consumers” or “customers” for some time now. And the speculation is that millions of “customers” would jump at the chance to “purchase” college credit from Stanford or Harvard or some other prestigious school, simply by paying tuition, watching internet TV, and engaging with shoddy and even crooked grading rubrics.

Sign me up!

And none of this even addresses the ethical or economic concerns and stunning hypocrisy of MOOCs potentially wreaking horrific degradation upon the professional labor market. On that count, see this open letter from the San Jose State University Philosophy Department, which offers a more cogent, eloquent, informed, and insightful summary of that aspect of this issue than I could ever muster. Or read this clever satirical piece about the potential for MOOAs: Massive Online Open (college) Administration.

Thanks to modern pedagogy, we're long past the point of conceptualizing quality education as mere rote memorization, as little more than cramming a bunch of knowledge into one's head. Or at least we're supposed to be. Of course acquiring new knowledge is a bedrock of learning. But we now subscribe to an educational ideal that says we as professors should also teach students how to teach themselves; how to think critically and creatively; how to investigate a library with purpose, instead of on a lark. We are not here to tell students what to think. Rather, we're here to teach them how to think so that they have the skills to decide for themselves what to think, and to do so in a rigorous and sophisticated way. That's how you produce responsible and informed citizens, skilled and productive workers, educated and thoughtful people. And maybe even cat chauffeurs of the future.

Akim Reinhardt blogs regularly at ThePublicProfessor.com