Inside Higher Ed reports today on Teaching with Classroom Response Systems: Creating Active Learning Environments (Jossey-Bass, 2009), a new book by Derek Bruff, Assistant Director of the Center for Teaching at Vanderbilt. More precisely, IHE interviews Bruff about the book's topic, classroom response systems or "clickers," rather than reviewing the book itself.

The interview raised some important questions for teachers and institutions using clickers, such as the matter of standardization. Teaching with Classroom Response Systems was just released on February 17, so I haven't seen the book, much less had time to read it, and I won't pretend to comment on the book itself. Two items in the interview, however, caught my eye and call for a quick response.

When asked about brand standardization, Bruff replied (in part),

Most faculty and staff members with whom I talk about clickers are concerned with the cost to students of the devices. This has led many campuses to adopt particular brands of clickers so that students need not purchase two or three clickers for different courses. Not only does this save students money, but it makes it easier for staff to provide technical and pedagogical support for faculty members using clickers.

I agree that this is an important concern, but read on. When asked what features he'd like to see added to existing clicker systems, Bruff answered,

Clickers do a great job of collecting and aggregating student responses to multiple-choice questions. Existing technology does not, however, work quite as well with free-response questions. I am hoping to see the development of input devices that allow students to quickly and easily respond with words, phrases, or sentences.

Devices that allow students to quickly and easily respond with words, phrases, or sentences already exist, and most students have already incurred the cost of purchasing these devices before they ever show up for the first day of their first college course:

Having used classroom response systems (clickers) for the past couple of years, I recently abandoned them in favor of a polling and free-response system that students can access online or using SMS. Our institutional research people here at Pepperdine tell me that 95% of our incoming students bring laptops with them to college, and virtually all bring cell phones with text-messaging (SMS) plans. Using these existing tools addresses both of the issues above: students (or more likely their parents) have already purchased these tools, thus addressing the cost issue, and free-response questions are easy to capture, thus addressing the usage issue. Such schemes must always contend with poor cell reception, of course, though an institution that wanted to standardize could always roll an iPhone into the standard entry package, as Abilene Christian University has done (and others have since followed suit). Using laptops assumes that the college or university maintains a reliable wireless network, too, but that assumption should hold good at reputable schools these days. (I mean “should” as a moral claim, not an ontological claim, for those of you keeping score at home.)

Bruff knows about these solutions, of course:

I have spoken with several instructors who have started to use systems that allow students to submit responses via various mobile devices — cell phones, smart phones, and laptops — that make it easier for students to do so. These developments are exciting, but there is a need for tools that will help instructors quickly make sense of responses to open-ended questions. Development of such tools would open up a lot of possibilities for these systems.

No "tool" can substitute for actually reading free-text responses, of course, but some strategies already exist for quick processing, and others are surely coming. Consider the popular "word cloud." For this example, I asked my students to supply one word that could complete the sentence, "The Bible is …" They texted their responses to a short code issued by my polling service provider, PollEverywhere. Using PollEverywhere's tools, I immediately generated a .csv file listing the students' responses. I copied the responses out of the .csv file and pasted them into Wordle, and got the following result (the size of each word indicates how frequently it was submitted):

[Wordle word cloud: "The Bible Is"]
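In case it helps to see how little machinery a word cloud really requires, here is a minimal sketch of the same frequency counting done locally with a short Python script. It assumes the exported file is named responses.csv and keeps each free-text response in the first column; both details are hypothetical placeholders rather than PollEverywhere's actual export layout.

```python
import csv
import re
from collections import Counter

def response_frequencies(csv_path):
    """Count how often each word appears in a column of poll responses."""
    counts = Counter()
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.reader(f):
            if not row:
                continue
            # Assumes the free-text response sits in the first column;
            # adjust the index (and skip any header row) to match the
            # actual export layout.
            for word in re.findall(r"[A-Za-z']+", row[0].lower()):
                counts[word] += 1
    return counts

if __name__ == "__main__":
    # Print the twenty most common responses, most frequent first.
    for word, n in response_frequencies("responses.csv").most_common(20):
        print(f"{n:3d}  {word}")
```

A word cloud is just a picture of these counts, so a sorted list like this is often enough for a quick read of the room.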

Naturally, I would prefer a tool that didn't require me to export, copy, and paste the data, but I imagine that such tools will become widely available before long. And I expect that Bruff's book, at 240 pages, includes discussion of "best practices" for clicker use that would apply to any sort of live polling, whether done with clickers or with text messaging. Maybe I'm jumping the gun here, but it seems to me that narrow-band, highly specialized gadgets like clickers are the trailing end of the instructional technology wave, and that the future of IT lies in finding new ways to turn students' existing gadgets from liabilities (distractions) into assets.