These materials are for use only by institutions that subscribe to The TLT Group, by participants in TLT Group workshops that feature this particular material, and by invited guests. The TLT Group is a non-profit whose existence is made possible by subscription and registration fees. If you or your institution are not yet among our subscribers, we invite you to join us, use these materials, help us continue to improve them, and, through your subscription, help us develop new materials! If you have questions about your rights to use, adapt, or share these materials, please ask us (info @ tltgroup.org).
A student response system (SRS) provides a way for a faculty member (and usually the students themselves) to get information from all students in a course, usually in real time. SRSs have special value in larger courses, where students can't be questioned quickly one at a time. Responses may be gathered in many ways. In physical classrooms, the feedback is often provided through handheld consoles ("clickers"), but laptops and even cell phones can be used. In virtual classrooms, conferencing systems, survey systems, shared whiteboards, and other software provide options for student feedback and voting. These options are discussed, with examples, in TLT Group subscriber materials on learning space design.
This Handbook section describes evaluative studies that can help increase the benefits of an investment in personal response systems. Sections I and II of this chapter summarize some of the material on student response systems from the Learning Space Design part of our web site.
A Flashlight approach to formative
evaluation of personal response systems begins with this
question: what's the educational activity? There are
several families of activities for which SRSs are
commonly used:
- Stimulating deeper exploration of difficult ideas. This pattern of using an SRS combines a) conceptually challenging questions that can't be answered by simple memorization or calculation with b) peer instruction (students are asked the question, see the class's responses, confer with a partner, and then respond again). It's this pattern of using 'clickers' that has produced impressive, lasting learning gains. (One of many attributes of clickers that makes them a good tool for this is the safety of anonymity or semi-anonymity, which is especially useful when the questions are politically or socially controversial.) A sketch of this ask-discuss-ask-again cycle appears after this list. Note: many faculty use SRSs to poll students on questions that have no single right answer, including questions of opinion and judgment (e.g., how credible was the evidence cited in the student presentation you just heard?).
- Asking questions to see who remembers what they read in the homework or what they just heard in the lecture (e.g., "Which of the following is the equation of simple harmonic motion? Which of the following motions can be described with the equation of simple harmonic motion?").
- Grading student
progress with paperless quizzes and tests (when the
system creates a record of each student's response)
- Taking
attendance (ditto)
- Gauging progress and agreement/disagreement:
Allowing students to see how their
understanding and opinions relate to those of the other
students in the course.
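To make the first pattern above concrete, here is a minimal sketch, in Python, of the ask / discuss / ask-again cycle. The function collect_responses is a stand-in for whatever polling interface your SRS software actually provides; it and the other names here are illustrative assumptions, not a real vendor API.

    from collections import Counter

    def percent_correct(responses, correct_choice):
        # Percentage of students who picked the right answer.
        counts = Counter(responses)
        return 100.0 * counts[correct_choice] / max(len(responses), 1)

    def show_histogram(responses):
        # Display the spread of answers so the class can see it.
        for choice, n in sorted(Counter(responses).items()):
            print(f"{choice}: {'#' * n}")

    def peer_instruction_round(question, correct_choice, collect_responses):
        # 1. Each student answers individually.
        first = collect_responses(question)
        show_histogram(first)
        # 2. Students confer with a partner, then answer the same question again.
        second = collect_responses(question)
        show_histogram(second)
        # 3. Report how much the class moved toward the correct answer.
        gain = (percent_correct(second, correct_choice)
                - percent_correct(first, correct_choice))
        print(f"gain after discussion: {gain:+.0f} percentage points")

    # Example run with canned responses standing in for a live poll:
    canned = iter([["A", "B", "B", "C"], ["B", "B", "B", "C"]])
    peer_instruction_round("Which equation describes SHM?", "B",
                           lambda q: next(canned))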
Most of the foregoing activities can have two
simultaneous goals:
- Encourage the student to do the activity that is being measured (e.g., do the reading, think about what has been read or heard, pay attention to the lecture) and
- Provide guidance to help the instructor
figure out
what to do next (e.g., if a point seems well
understood, build on it; if students have differing
interpretations of a reading, stage a debate or some
other test of those interpretations).
Here are a few types of studies that could be useful in increasing the benefits of SRS use at your institution:
1. Test your assumptions about the activities in use. Do students interpret and react to the activity as you hope? As you fear? Click here to see a prototype Flashlight Online 2.0 survey for gathering feedback for this purpose. Click here to see materials for a faculty workshop. Its goal: to help faculty quickly and easily do such studies with their own students in order to improve their own use of personal response systems.
2. Look for factors that may be hindering some students from engaging fully and effectively in the activity. Does the student have the SRS? Does it work? If the SRS is being used to discover whether the student remembers what was read, a key question is whether the student did the reading and, if not, why not. This kind of diagnosis is designed to help instructors change some of the factors that hinder involvement and thereby increase the percentage of students who can successfully engage in the activity. In a future rewrite of this chapter, we'll include more examples of such feedback questions (perhaps questions that you send us!).
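As a sketch of how such diagnostic feedback might be tallied once collected (the question and response options below are invented for illustration, not drawn from a Flashlight instrument):

    from collections import Counter

    # Invented responses to "If you didn't do the reading, why not?"
    reasons = [
        "too busy", "didn't know it was assigned", "too busy",
        "text too difficult", "too busy", "didn't expect it to matter in class",
    ]

    # Tally the reasons so the instructor can see which hindrances are
    # common enough to be worth changing.
    for reason, count in Counter(reasons).most_common():
        print(f"{count:2d}  {reason}")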
3. Study the suitability of the technology for the activity: if some people at your institution are already using SRSs in one of these ways, ask faculty and students to evaluate the strengths and weaknesses of the technology for this particular activity, or for components of the activity. For example, faculty and students doing the 'interactive engagement' activity described above might be asked to evaluate the strengths and weaknesses of the current technology as a way of displaying patterns of student thinking, and to suggest ways of displaying those patterns better.
4. Compare a course or courses using a personal response system with a course or courses that don't use the technology. The value of an SRS will be realized only if students act and think differently in a course. So it should be possible to compare such activities across courses, to see if those activities really are influenced. This Flashlight Online 1.0 item bank is designed for that purpose (ZS70578).
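A minimal sketch of one such comparison, assuming per-student responses to the same item (say, a 1-5 rating of how often the course led students to think about its ideas outside class) have been exported from an SRS section and a comparison section; the numbers here are invented:

    from scipy.stats import mannwhitneyu

    # Invented 1-5 ratings from matched SRS and non-SRS sections.
    srs_section    = [4, 5, 3, 4, 4, 5, 2, 4, 3, 5]
    no_srs_section = [3, 2, 4, 3, 2, 3, 3, 2, 4, 2]

    # Survey ratings are ordinal, so a rank-based test is safer than a t-test.
    stat, p = mannwhitneyu(srs_section, no_srs_section, alternative="two-sided")
    print(f"U = {stat:.1f}, p = {p:.3f}")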
5. Study the role of SRS use in learning gains. Many studies have illustrated important learning gains from the 'interactive engagement' activity described above. To get ideas for study designs, do a web or library search on 'interactive engagement.'
- Micro studies: These are studies that occur within a course and focus on the immediate value-added of this use of an SRS. Dennis Jacobs of Notre Dame has been studying how student responses converge toward correct answers as discussions continue. (A sketch of such a convergence tally appears after this list.)
- Macro studies: These studies
might focus on the long-term development of
understanding of a set of concepts and look at SRS use
along with other instructional activities as ways of
developing understanding. For example, the investigator
might compare learning gains as they relate to the frequency of SRS use for interactive engagement over a period of time. Or the investigator might ask students to evaluate the strengths and weaknesses of such activities for learning this set of skills and insights. The investigator could also see whether different types of students respond differently: the students with the greatest and least learning gains? By gender? By standardized test score?
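As a sketch of the micro-study idea, here is how convergence toward a correct answer might be tallied across successive polls of the same question; the tallies are invented:

    # Invented tallies for one conceptual question, polled three times:
    # before discussion, after peer discussion, and after a whole-class debrief.
    rounds = [
        {"A": 12, "B": 20, "C": 8},
        {"A": 8,  "B": 27, "C": 5},
        {"A": 4,  "B": 33, "C": 3},
    ]
    correct = "B"

    for i, tally in enumerate(rounds, start=1):
        pct = 100.0 * tally[correct] / sum(tally.values())
        print(f"poll {i}: {pct:.0f}% chose the correct answer")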
6. Study reliability and costs. During a pilot test, it's especially important to look at what it takes to keep the technology working at high levels of reliability with low levels of support. The investigator should gather information about 'down time' from the student point of view as well as from faculty. It's also likely to be useful to study uses of time that faculty, students, and IT support staff see as wasteful.
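A sketch of the simple bookkeeping such a study needs, with an invented incident log; real figures would come from help-desk records and from the faculty and student reports described above:

    # Invented incident log: (class sessions affected, minutes lost per session).
    incidents = [(2, 10), (1, 25), (3, 5)]

    sessions_per_term = 40
    minutes_per_session = 50

    lost = sum(sessions * minutes for sessions, minutes in incidents)
    total = sessions_per_term * minutes_per_session
    print(f"class time lost to SRS problems: {lost} of {total} minutes "
          f"({100.0 * lost / total:.1f}%)")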
We would like to develop this chapter by adding case studies from institutions using personal response systems. Network member institutions might like to use some of their consulting time to have us work with them, or for them, to do such studies. In the process we can develop survey templates and other research tools that can be re-used, as well as publish our findings.
- Stephen C. Ehrmann, Flashlight Program