Return to "Beyond Computer Literacy: Implications of Technology for the Content and Outcomes of a College Education"
The single best source of
guidance for improving a liberal education is close study of
student work (projects, online discussion, etc.). Those
insights can be complemented and deepened through feedback from
students. The TLT Group’s
Flashlight Program
has been working since 1992 to help faculty members learn how to
gather data so they can better understand and improve their own
courses, while also helping departments and institutions gather
evaluative data about how their programs as a whole are
operating so that improvements can be made at that level.
One type of survey of special importance to
the improvement of entire programs of liberal education, and of
each course in such programs, is the student feedback survey. With
the support of the Fund for the Improvement of Postsecondary
Education (FIPSE), The TLT Group is working with a dozen of its
subscriber institutions to create a new model for such surveys:
the
BeTA (Better Teaching through Assessment) Project.
BeTA is intended to
strengthen the use of student feedback about courses and faculty
in at least three ways:
a)
Designing feedback:
To help faculty, administrators and students plan the content of
course feedback surveys, BeTA project workshops and materials
will help them reach agreement about key issues. These sessions
will help faculty decide, among other things, which questions (if
any) should appear in all student feedback surveys. BeTA
surveys will usually include a mix of questions: some common to
all courses, some designed for specific types of courses, some
from specific institutional programs or colleges, and some
authored by the faculty member for his or her own students.
b)
System for Designing and
Administering Online Surveys with Single or Multiple Authors:
The new survey system being designed for BeTA (Flashlight Online
2.0) will allow multiple authors to collaborate in developing a
survey; each author can keep some of the resulting data private
from other authors of the same survey, if need be. For example,
a faculty member could add a question to the course feedback
survey that asks her students, and only her students, to provide
feedback on her use of PowerPoint without needing to worry
whether that data might hurt her chances for promotion.
Meanwhile, the same survey might include questions from the
writing program about whether feedback on student writing is
helping students become more skilled writers and/or helping them
with the content of their written work. Other questions might be
added by the department. BeTA is designed to encourage people
and programs to ask the kinds of risk-taking questions needed
for real improvement.
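The per-author data visibility just described can be pictured with a small sketch. Everything below is a hypothetical illustration, not the actual Flashlight Online 2.0 design: the class names, fields, and example questions are invented to show how one survey can carry questions from several authors while keeping some authors' results private.

```python
from dataclasses import dataclass, field

@dataclass
class Question:
    text: str
    owner: str             # author who contributed the question (hypothetical field)
    private: bool = False  # if True, responses are visible only to the owner

@dataclass
class Survey:
    questions: list = field(default_factory=list)
    responses: dict = field(default_factory=dict)  # question text -> list of answers

    def add_question(self, q: Question) -> None:
        self.questions.append(q)

    def record(self, q_text: str, answer: str) -> None:
        self.responses.setdefault(q_text, []).append(answer)

    def results_for(self, author: str) -> dict:
        """Return only the responses this author is entitled to see."""
        return {
            q.text: self.responses.get(q.text, [])
            for q in self.questions
            if not q.private or q.owner == author
        }

# One survey combining questions from two authors
survey = Survey()
survey.add_question(Question("Was feedback on your writing helpful?",
                             owner="writing_program"))
survey.add_question(Question("Did the PowerPoint slides aid your learning?",
                             owner="prof_smith", private=True))

survey.record("Was feedback on your writing helpful?", "yes")
survey.record("Did the PowerPoint slides aid your learning?", "somewhat")

# The writing program sees only its own (shared) question;
# the faculty member sees both.
print(sorted(survey.results_for("writing_program")))
print(sorted(survey.results_for("prof_smith")))
```

The design choice this models is simple but important: privacy is attached to each question, not to the survey as a whole, so risk-taking questions can ride along on a shared instrument without exposing their results beyond their author.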
c)
Increasing response rates:
BeTA surveys will typically be
delivered online. Because low response rates to online surveys
are a widespread concern,
BeTA is developing strategies to increase response rates.
One reason for low response rates: as far as the typical
respondent is concerned, spending time and thought on a typical
course feedback survey produces no visible result. So BeTA
will recommend that individual faculty, departments, and the
institution report frequently to students on what each is doing
as a result of student feedback.