Thursday, May 2, 2013

Teaching, and the educational role of the university in the 21st century



When I went to college from 1988 to 1992, my experience was similar to that of many science-oriented students today.  I went to classes, where a professor stood in front of the room and told us facts and perspectives about the subject.  In most biology courses, my role was clear-- listen, take notes, and occasionally (in genetics, especially) learn approaches to solving particular kinds of problems.  Students asked questions occasionally, but probably over half the class never asked the professor a single question all semester, and no one had the audacity to ask the professor questions after every class.  They may have asked the teaching assistants a few questions, but these TAs were often only marginally more familiar with the material than the students they oversaw.  Honestly, much (though not all) of this so-called learning involved rote, short-term memorization of facts.  If we students got something wrong on an assessment, it was our fault for not having understood the material from its single presentation (irrespective of how well it was presented or reinforced).

The world is different today.  There are very few facts that I cannot find in mere seconds using my computer or smartphone.  Virtually everything known is accessible to the world on the internet, though the internet also hosts a lot of misunderstandings portrayed as facts.

As I see it, universities have two potential educational roles in this new era (I'm not addressing research roles here).  The first is a service role to the community.  Universities have always been storehouses of knowledge and understanding, but it's both arrogant and short-sighted for universities to treat this role as exclusive to their students or in-field colleagues, particularly given the amount of public funding they receive.  University faculty can serve their communities and the world by providing or "authenticating" facts, evidence, and diverse perspectives in their areas of study through the internet and other media.  This dissemination role is not a symbol of arrogance-- university faculty are already regularly consulted by the media, given their expertise and training, to interpret new findings or perspectives in their areas; this role merely asks them to be more proactive.  The public's interest in such reliable sources is there-- if one were diagnosed with cancer, would one prefer to just Google "cancer" and take information from whatever site comes up (perhaps "homecancerremedies.com")?  Or would one prefer to get information from the National Cancer Institute of the NIH?  Presumably the latter, and this example illustrates the public's perceived value of "authenticated" information.  Similarly, if I wanted to learn about genetics or psychology or economics or art history more broadly, I'd love to take a free online course (or "MOOC") from a practitioner who holds an advanced degree in the area and was hired by a university as an expert in it.  Neither of these credentials guarantees that the information will be presented coherently, that the presenter won't be wrong, or that better resources don't exist, but it's a safer way to start down the road to learning than a random internet search.  MOOCs are not the only means of public dissemination, but they are a good one, both effective and engaging.
Freely providing knowledge is not only an important gesture by universities to their communities, but arguably an obligation, and it can also facilitate learning of their on-campus students (see below).

The second role I discuss is the university's primary one-- helping its enrolled students learn.  This learning can no longer be rote, short-term memorization of facts-- such "learning" trivializes the role of the university relative to the internet.  Instead, we need to engage with students directly, and in a manner that far exceeds what is possible through the internet or free online MOOCs.  Our courses need to go beyond fact dissemination-- we need to engage students both individually and in groups to assess how well they are interpreting and applying the concepts we present.  The flipped class is one means of achieving this goal-- students get the primary content in some way outside the class period, and their understanding is then assessed.  This assessment step is critical-- students learn which elements of the material they didn't correctly interpret or apply the first time, and faculty receive feedback that reveals frequent student misinterpretations and misapplications.  The faculty then spend the class period clarifying areas of confusion directly in response to the student feedback, and reinforcing true understanding of the material with new problems, applications, and engaging discussions.  The format forces faculty and students to interact bidirectionally in the learning process, and this bidirectionality has obvious benefits both for student understanding and for faculty teaching strategies.  It's also personally satisfying for both parties: as faculty become less "lecturers" and more "facilitators" in the classroom, they work with the humanity of students rather than treating students as consumers of prepackaged products.
Relatedly, I've become a firm believer in "open-book" assessments, too, for two reasons-- 1) the world is essentially "open-book," so assessing in a situation where simple facts cannot be quickly checked is (usually) unrealistic, and 2) it forces the faculty member to write questions that are not mere regurgitation of facts presented in the course, and thus to better assess student "understanding" at a higher level.

None of what I've said above is novel or revolutionary.  However, many faculty and students are too comfortable with the standard lecture format (especially in the sciences and social sciences-- less so in the humanities and interpretive social sciences) and are resistant to changing roles, particularly given the upfront work involved.  While our time is limited, the goal of everyone at a university (students and faculty alike) should be to promote the best learning possible, so isn't this worth the investment?  Similarly, many people view MOOCs as a threat to our universities-- we're giving away for free what students paid thousands of dollars to receive.  Some have said that MOOCs are a means for "elite" universities to secure their position and displace others by disseminating content that may be (often incorrectly) perceived by the broader public as "better" than what a good state or liberal arts school provides.  I argue that, if colleges or universities fail to provide opportunities for MUCH more mentoring and learning in their on-campus classes than happens in topically equivalent MOOCs, they're wasting their students' time and money.  MOOCs can educate the public and can enhance or supplement on-campus classes, but they are no replacement for what an on-campus undergraduate education should be.

Times have changed, and forward-thinking universities are beginning to change accordingly.  It's up to universities and their faculty to keep up with these changing times.  If universities don't change quickly, prospective students will soon figure out which schools are least likely to provide a return on their investments...


2 comments:

  1. Hi Mohamed,
    I've been teaching in a "flipped classroom" since I started as faculty at U. Iowa, and it's absolutely my favorite thing about teaching. The students read outside of class, then are assessed (usually with IF-AT quizzes; http://www.epsteineducation.com/home/about/default.aspx), and I can use those assessments and the work they've done outside of class to build up their long-term conceptual knowledge of any given topic. I lecture for 10 minutes at most, then we all work on problems or activities while I circulate among the groups. At the end of an activity, we go through the big take-home messages, clear up confusion, and summarize. I get to interact with every student in class every day, so I know which students are struggling and can actually help them work through their problems.

    Iowa is investing heavily in what they call TILE classrooms (our promotional video: http://www.youtube.com/watch?v=yvEN4jJ4WUM), which really facilitate these teaching styles. I'm VERY happy to have been on the ground floor for this transition, and it has really made me question the value of straightforward lecture-based classes.

    That said, I admit to being wary of MOOCs, but I think most of that feeling comes from a fear of the unknown. I have many, many concerns about whether a MOOC might improve learning, as opposed to just exacerbating some of the problems that large lecture-based classes have. I don't really know how one "successfully" implements a MOOC, and, given my experiences in these TILE rooms, I wonder what the effect of increasing the physical divide between professor and student might be. But if what you're advocating is that MOOCs represent a way for the general public to learn about topics, and not necessarily a way for us to teach our undergraduates, then maybe I feel differently. My sister (a lawyer) took the first iteration of your MOOC and reported that she really did learn some things about genetics and evolution that she hadn't known before.

    Anyway, great blog post.

  2. Great post, Mohamed! I agree with you that the educational landscape is changing and that, to remain competitive, top-tier universities must offer interactive courses that do not just rely on lecture-based memorization.

    That said, here's what I'm wondering:

    What do you think it will take to transition university courses into more interactive, learner-centered environments? My impression is that the lecture format takes less preparation time for the instructor than an interactive format. Given the pressures of research on university faculty, I find it hard to believe that merely training our faculty in interactive teaching strategies will be enough. If they don't NEED to, why would faculty go to the extra lengths to redesign their courses and put in more energy to make them interactive? How can the university, which relies so heavily on funds from research grants, incentivize good, interactive teaching & learning?
