

F-LIGHT


E-Newsletter for the Flashlight Program

March 2004 Issue

Increasing Response Rates for Online Surveys – A Report from the Flashlight Program’s BeTA Project

Robin Etter Zúñiga, Principal Investigator, BeTA, and Associate Director, Flashlight

The Flashlight Program’s BeTA Project is developing a powerful new online approach to gathering student feedback about courses and faculty. As part of that effort, we are describing a set of best practices for increasing response rates to online surveys.

 

There are very real advantages to moving from paper to online surveys. For example, you save printing costs, students are not restricted to answering in a classroom, many different forms of a survey can be created and administered, and, because there is no data entry, the data can be analyzed and used far more quickly. Unfortunately, respondents, whether they are students, faculty, or the general public, seem to find it easier to ignore an online survey than a paper one, so response rates can be a problem.

 

We are still studying strategies for increasing response rates, but we can already share some insights that should be useful to campuses.

 

Some of the things we have learned thus far include:

  1. Push the survey – we have found that sending an e-mail containing the link to the online survey works better than expecting students to go to a web site on their own.
  2. Frequent reminders – one of our ten BeTA pilot sites, St. Edward's University, set up databases using Flashlight Online, Access, and Word Mail Merge to send e-mail reminders to those who have not yet responded. At least three reminders go out during the survey period. Look on the TLT Group web site soon for a step-by-step description of how to do this for your own surveys (a rough sketch of the idea also appears after this list).
  3. Faculty involvement – nothing helps more than regular reminders to students from faculty. As part of its pilot work on BeTA, St. Edward's gives faculty a script to use when talking to their classes. Faculty are asked to tell students to fill out the surveys at the beginning of the survey period, and are prompted to remind their classes again each time an e-mail reminder goes out to non-respondents.
  4. Rewards help – many institutions have found that a drawing for a prize of general interest works well (make sure the gift isn't something that will bias your responses), as does a single point of course credit, even though one point is not enough to change any individual student's grade. Sometimes the reward goes to individuals, and sometimes to the whole class if more than a certain percentage of students respond. (See the article below for an interesting example of such rewards.)
  5. Respondents need to believe that their responses will be used – we’ve seen a number of cases of faculty getting higher response rates when they (a) regularly seek feedback through mid-term evaluations and classroom assessment techniques and then (b) tell students about what they’ve learned and how those insights are being used to improve the course. BeTA will work with institutions to help create procedures not only for using data but also for publicizing the ways findings have been used to improve courses.
  6. Help students understand how to give constructive criticism – a frequent lament of faculty is that student evaluations are more a popularity contest than a genuine evaluation of teaching and learning. One reason is that students are rarely taught how to give constructive criticism. Orienting students early (e.g., in new-student orientation or through freshman studies programs) is one way to develop trust and knowledge, and to lay the groundwork for constructive criticism of courses by students.
  7. Create surveys that seek constructive criticism: feedback forms ought to help students describe what has been happening in teaching and learning, and how they think those activities could be improved, rather than asking for summary judgments about satisfaction. If students can see that their advice may help improve things, response rates may improve. BeTA will help create surveys that can be tailored to each individual course where that's desired, while also including some questions common to the department and others common across the institution.
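
A note on item 2 above: the step-by-step Access/Word Mail Merge description is still to come on the TLT Group web site. In the meantime, here is a minimal sketch of the same general idea in Python. It is not St. Edward's actual setup; the file names, the "email" column, the addresses, the survey URL, and the SMTP host are all assumptions, and you would substitute whatever your survey tool and campus mail server provide.

    # Minimal sketch: e-mail reminders only to students who have not yet responded.
    # Assumes your survey tool can export the roster and the respondent list as CSV
    # files with an "email" column, and that your campus runs an SMTP server.
    import csv
    import smtplib
    from email.message import EmailMessage

    SURVEY_URL = "https://example.edu/course-survey"   # hypothetical survey link
    SMTP_HOST = "smtp.example.edu"                     # hypothetical campus mail server

    def load_emails(path):
        """Read the 'email' column from a CSV export into a set of addresses."""
        with open(path, newline="") as f:
            return {row["email"].strip().lower() for row in csv.DictReader(f)}

    def send_reminders():
        roster = load_emails("roster.csv")              # everyone who should respond
        respondents = load_emails("respondents.csv")    # exported from the survey tool
        non_respondents = roster - respondents

        with smtplib.SMTP(SMTP_HOST) as smtp:
            for address in sorted(non_respondents):
                msg = EmailMessage()
                msg["From"] = "course-evals@example.edu"
                msg["To"] = address
                msg["Subject"] = "Reminder: please complete your course survey"
                msg.set_content(
                    "We have not yet received your feedback on this course.\n"
                    "The survey takes about ten minutes: " + SURVEY_URL + "\n"
                )
                smtp.send_message(msg)   # only non-respondents receive the reminder

    if __name__ == "__main__":
        send_reminders()

Something like this (or an equivalent mail merge) would be run on a schedule, for example three times during the survey period, as item 2 suggests.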

Look for more information from the BeTA Project on ways to improve course evaluation questions and processes, working with students to educate them on their role in evaluating teaching and learning, developing systems for formative faculty evaluation, and using course evaluations to diagnose faculty development needs. Primary BeTA subcontractor Washington State University is developing the software for this project, which will also become the platform for the new version of Flashlight Online, coming in 2005.


More Tips on Increasing Response Rates to Surveys

I just stumbled across this web page on SUNY Stony Brook's site (if that link no longer works, I took the precaution of making a copy of the page on our site). I don't know anything about the survey itself, but several features of the announcement caught my eye as examples of good practice that I rarely see:

  • There's a brief explanation of the purpose of the data gathering (instead of "please respond by xx")
  • The authors assume (I'm guessing) that responding offers too little immediate benefit to yield the response rates they want, so they're paying respondents for their time by making this a raffle. There's also a clever formula that gives respondents a little incentive to persuade their colleagues to respond: the more people respond, the better each respondent's chances of winning.

We've built a web page of strategies for increasing response rates. Please send us ideas for improving and extending it!

-Steve Ehrmann, Editor


Upcoming Assessment-Related Online Workshops and Conferences

Online Workshops on Assessment and Distance Learning

The TLT Group is considering whether to offer a series of online workshops for faculty and administrators. The general theme: how to use surveys and other data to improve distance/distributed courses. One workshop would be aimed at individual courses, the others at program improvement. Two would describe a variety of models for gathering and using data to pinpoint how to improve learning, a third would focus on our benchmarking program in nursing, and the fourth would deal with cost analysis.

Want to get an e-mail if we do offer them? Equally important, do you want to give us advice on the timing and format, and on whether we should arrange for academic credit? If you'd like to learn more, please click here and tell us what you think; responding will also put you on the mailing/waiting list.

Thanks! If you're at all interested, your response will help us give you what you need. If we do offer these workshops, they won't start before June 2004.

Flashlight Online training - Subscribers Only!

We'll continue to webcast periodic training sessions for Flashlight Online users, administrators, and trainers. The next online training session is scheduled for April 5. Click here for more information. If you're not sure whether your institution is a current subscriber, click here.

For details on this and other Flashlight and TLT Group events, both face-to-face and online, keep an eye on the TLT Group calendar.


TLT/Flashlight Subscription Programs and Subscribers

  • The TLT/Flashlight Basic Collection includes site licenses for dozens of program development and assessment tools and resources, discounts on online workshops, and one hour of free consulting.
  • The Comprehensive Collection includes all of that, plus unlimited institutional use of Flashlight Online, our powerful web-based survey system with validated items and peer-reviewed templates for typical educational studies; Comprehensive subscribers get two hours of consulting.
  • Network membership includes all the Comprehensive benefits plus two free days of consulting/training and sharply reduced rates for additional days; the consulting can be used for planning, external evaluations, staff training, participation in projects, and a variety of other purposes.

Benefits are added almost weekly. This web page links to recent notices we've sent to subscribers about updates and additions.

Over 130 institutions, systems, boards of regents, and multi-institution projects now subscribe to these TLT/Flashlight services. One benefit for you, the reader: their subscriptions support F-LIGHT, and we thank them for that. Is your institution one of them? Check our list of participating institutions. Institutions subscribing, or resubscribing, since March 1 are:

  • Bethel College, Minnesota

  • Bucks County Community College

  • Butler University College of Pharmacy

  • California Lutheran University

  • California State University - Fullerton

  • California State University - Northridge

  • California State University - Sacramento

  • Clark Atlanta University

  • Colby-Sawyer College

  • Fox Valley Technical College

  • George Washington University

  • Gettysburg College

  • Hibbing Community College

  • Houston Community College

  • Howard University

  • Lewis University

  • Loras College

  • Louisiana Board of Regents

  • Mount Royal College

  • Niagara College of Applied Arts & Technology

  • Nicolet Area Technical College

  • Olivet Nazarene University

  • South Dakota State University

  • Stanford University

  • SUNY Stony Brook

  • Tulane University

  • University of Detroit Mercy

  • University of Florida

  • University of Kansas Medical Center

  • University of Maryland University College

  • University of Missouri - St. Louis

  • University of South Florida

  • University of Tennessee - Knoxville

  • University of Texas - Austin

  • University of Vermont

  • Vanderbilt University, School of Nursing

  • Washington & Lee University



Ehrmann's Web Log ('Blog)

I've been in our Takoma Park office much of the time since my last log entry.  I've begun working on some new themes:

  • Formative evaluation of ePortfolio initiatives (in other words, what kinds of data should a program collect to help assure that its investment in ePortfolios actually results in improved learning, without undue cost?). If you're interested enough to promise to offer me your criticisms and ideas, e-mail me and I can send you a draft paper.

  • Smart Classrooms, Course Management Systems, and Other Learning Spaces: How to Do (Formative) Evaluations of Them. Those of you who have followed the Flashlight Program won't be surprised to hear that the analysis begins with a description of the activities that are supposed to be carried out in the learning space. Any given learning space will make some such activities particularly easy or fruitful while almost certainly hindering others. For example, a lecture hall with rows of chairs bolted to the floor makes lecturing easier while making it rather cumbersome to split the class into five-person groups. The evaluation tools we'll develop will begin with questions about the ways in which particular classrooms (physical or virtual) support and hinder specific activities, and we'll go from there. If you want to see the beginnings of the framework that will shape this evaluation strategy, see "In What Ways Can 'Classrooms' Be 'Smart'?"

  • The new workshop series on assessment and distance/distributed learning described earlier in this issue of F-LIGHT.

All three of those themes, and F-LIGHT itself, share a central conviction: we can be safer, less stressed, and more successful if we selectively ask the right kinds of questions and gather data to help guide what we're doing. It's especially ironic that ePortfolios – themselves an assessment tool – are typically implemented (so far) with little or no formative evaluation. People plan new learning spaces without evaluating the old ones, and with no budget or plan to evaluate the new ones. (Again, when I say something like "evaluate the new ones," I mean "gather the kinds of data that will help faculty and students get more value out of the new facilities.")

Do you agree that this perception of assessment as a threat (or a waste of time) is a serious problem?

- Steve Ehrmann

 


About Flashlight (including free demonstration accounts),
The TLT Group, and F-LIGHT
(starting and stopping subscriptions)

The Flashlight Program for the Study and Improvement of Educational Uses of Technology is part of the non-profit TLT Group, Inc. Flashlight was created by Annenberg/CPB in 1993. The TLT Group is headquartered in Takoma Park, Maryland, just outside Washington, DC, with additional staff in Texas, Richmond, VA, and Pittsburgh, and Senior Associates around the world.

Our thanks to Washington State University for its many ways of supporting Flashlight, including developing and administering Flashlight Online and providing the listproc for distribution of F-LIGHT notices. We are also grateful to St. Edward's University for extensive support of Flashlight; to the corporate sponsors of The TLT Group; and to funders whose dedication to higher education has aided the TLT Group's work, including Annenberg/CPB, the birthplace of the Flashlight Program.

If your institution wants a better look at Flashlight Online, the best way is for someone at your institution to request a temporary, free demonstration account. Send e-mail to Flashlight@tltgroup.org with the subject line "Free Demo Account" to ask for details. One account per institution, please.

The TLT Group publishes F-LIGHT every month or two. Click here to see case studies and other articles in back issues.

If you know someone else who would like to be alerted to new issues of F-LIGHT, please suggest that they send e-mail to LISTPROC @ LISTPROC.WSU.EDU  with the one line message
   SUBSCRIBE F-LIGHT (the subscriber's first and last name)
Make sure that your e-mail is set to send plain text, not HTML or RTF.

Do the same for yourself if you have changed e-mail addresses.

To stop receiving the bulletin about F-LIGHT, please send e-mail to LISTPROC @ LISTPROC.WSU.EDU with the one line message
   SIGNOFF F-LIGHT
There should be no other text in the message (e.g., no signature file) and no subject line. If you have problems signing off, make sure your e-mail is set to send plain text, not HTML or RTF.



Stephen C. Ehrmann, Ph.D.
Director of the Flashlight Program and
  Editor, F-LIGHT
The Teaching, Learning and Technology Group
One Columbia Avenue
Takoma Park, MD 20912
http://www.tltgroup.org 
301-270-8311 (v)  
 

 

 

 
