Matrix Survey Examples: Workshops & Courses | SoTL | Information Literacy | ePortfolios | Student Polling | Classroom Technologies
One of the simplest uses of
matrix surveys
is to produce feedback forms for workshops or courses
offered over a period of time, where reports are needed for
each workshop and also for summaries of data from many
workshops. The questions and text could be identical
for every workshop: the same question group and text for
each respondent pool.
Or, as Cal State Sacramento did in one of
the earliest uses of Flashlight Online 2.0, one can tailor
the evaluation form a bit by including the name of the
workshop and instructor in each form. Notice how
this version of the form,
for a workshop on WebCT, differs from
this version
for a workshop on Microsoft Word. Each
respondent pool (i.e., the definition of a particular
response form, tailored for one group of people) can be
created one at a time, or metadata defining respondent
pools can be uploaded as a batch, defining response forms
for dozens, or thousands, of courses at a time.
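To make the batch idea concrete, here is a minimal sketch in Python. This is not Flashlight's actual upload format; the field names (pool_id, workshop, instructor) and the question wording are our invention, just to show how respondent-pool metadata might be read in bulk and turned into tailored forms:

```python
import csv
import io

# Illustrative respondent-pool metadata; the real Flashlight/Skylight
# batch format will differ -- this only sketches the idea.
POOL_CSV = """pool_id,workshop,instructor
101,WebCT Basics,R. Smith
102,Microsoft Word,J. Lee
"""

# The shared question group; pool-specific text is filled in per form.
CORE_QUESTIONS = [
    "How useful was the {workshop} workshop led by {instructor}?",
    "What should we change next time?",
]

def build_forms(csv_text):
    """Create one tailored response form per respondent pool."""
    forms = {}
    for row in csv.DictReader(io.StringIO(csv_text)):
        forms[row["pool_id"]] = [
            q.format(workshop=row["workshop"], instructor=row["instructor"])
            for q in CORE_QUESTIONS
        ]
    return forms

for pool_id, questions in build_forms(POOL_CSV).items():
    print(pool_id, questions[0])
```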
Similarly,
here is a basic workshop evaluation form in use by The
TLT Group. This response form was used for our MERLOT 101
workshop. The name of the workshop is automatically inserted,
but only the core questions are asked. By contrast,
here is a different response form, part of the same
matrix survey, with an added question (on diversity) specific
to this particular workshop. And here is a
third
response form that uses even more matrix features. It
includes one question group that can go to any event in our
FridayLive series, plus another question group that would
only be seen by participants in our FridayLive on April 3 on
matrix surveys. This third response form also directs
respondents to a report on data gathered so far from this
particular group. Each time the TLT Group runs an
online event, we can use tailored response forms created by
this same matrix, and add more data to this survey's
database. Because this is a matrix survey, we can pool
responses to the question about whether people are attending
in groups, getting a bigger "N" in order to see better
what's going on. We can also analyze the same data over time, to
see whether group participation is increasing
(for all events, or just for FridayLive).
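As a rough sketch of that pooling idea (hypothetical records and field names, not the actual Flashlight database), responses from every event's tailored form land in one collection, where a shared question can be counted across all pools for a bigger "N" and tracked term by term:

```python
from collections import Counter, defaultdict

# Hypothetical response records: every tailored form feeds the same store.
responses = [
    {"event": "FridayLive 2009-04-03", "term": "2009-S", "in_group": "yes"},
    {"event": "FridayLive 2009-04-03", "term": "2009-S", "in_group": "no"},
    {"event": "MERLOT 101",            "term": "2009-S", "in_group": "yes"},
    {"event": "FridayLive 2009-10-02", "term": "2009-F", "in_group": "yes"},
]

# Pool across all events for a bigger N on the shared question.
overall = Counter(r["in_group"] for r in responses)
print("All events:", overall)

# Track the same question term by term to see whether group
# participation is rising.
by_term = defaultdict(Counter)
for r in responses:
    by_term[r["term"]][r["in_group"]] += 1
for term in sorted(by_term):
    c = by_term[term]
    print(term, round(c["yes"] / (c["yes"] + c["no"]), 2))
```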
Washington State University,
developer of the Skylight Matrix Survey engine that powers
Flashlight Online, is the first large-scale user of the
system for course evaluation. The university takes advantage of
the matrix survey's ability to offer different questions to
different respondent pools.
For more information on using
matrix surveys for student course evaluation,
click
here.
Information Literacy: Feedback about Course Activities
Other assessment efforts in information
literacy focus on outcomes: the skills, knowledge, and
values that comprise information literacy. By contrast, our
prototype matrix survey gathers feedback from students about
course activities that are designed to develop their
information literacy. Faculty can request creation of
a feedback form for their students by filling in a form like
this brief prototype. That data is fed into the matrix
survey to create student response forms. If one faculty
member requests a form, then one form is created. If 100
faculty fill out the form, each describing the activities in
3 courses, then the system creates 300 response forms, each
tailored for one of those different courses.
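Here is a minimal sketch of that fan-out, with invented course names and activity keys rather than the prototype's real fields: each faculty request lists a course's activities, and one tailored form is assembled per course from a shared question bank.

```python
# Hypothetical faculty request records, one per course, as might be
# collected by a brief request form like the prototype described above.
requests = [
    {"course": "HIST 210", "activities": ["terminology", "library_databases",
                                          "mashup", "team_research"]},
    {"course": "ENGL 355", "activities": ["annotated_bibliography",
                                          "terminology", "team_research"]},
]

# Question bank keyed by activity; only the matching question groups
# are pulled into each course's response form.
QUESTION_BANK = {
    "terminology": "Did you understand the terminology used in assignments?",
    "library_databases": "How helpful were library databases for your paper?",
    "annotated_bibliography": "How did the annotated bibliography go?",
    "mashup": "What did the Google Earth mashup assignment teach you?",
    "team_research": "How credible did the class find your team's sources?",
}

forms = {
    req["course"]: [QUESTION_BANK[a] for a in req["activities"]]
    for req in requests
}
for course, questions in forms.items():
    print(course, "->", len(questions), "questions")
```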
For example, this
first response form is tailored for a course whose
instructor is concerned about:
- whether all students understood the required terminology well enough to follow information literacy assignments;
- an assignment to use library databases to do a research paper;
- a Google Earth mashup assignment; and
- a team activity in which student teams were asked to research a question, present an argument using sources, and then have the credibility of their argument and sources rated anonymously by the whole class (using clickers, for example).
The same prototype matrix survey was also
used to create
this second response form for a different course, whose
instructor is concerned about:
- an assignment to create an annotated bibliography;
- required terminology (the same issue as for the previous course); and
- that same activity to do research, create a presentation, and rate the presentations of others.
Notice that the name of the
topic for the research and presentations differs in
this second response form. The first form asked about
"Causes of the War of 1812," while this second form asks
about research and presentations on "How Austen
influenced Dickens."
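That per-course wording is essentially template substitution. A minimal sketch (the placeholder name topic and the question text are our own, not the prototype's):

```python
from string import Template

# One shared question group; only the topic varies by respondent pool.
question = Template(
    "How confident are you in the sources your team used to "
    "research '$topic'?"
)

print(question.substitute(topic="Causes of the War of 1812"))
print(question.substitute(topic="How Austen influenced Dickens"))
```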
For more on our work on information
literacy, click
here.
ePortfolios: Feedback
for Faculty and Program Leaders
The more empowering an innovation, the more
ways it can be used (differently) in different courses.
Electronic portfolios (ePortfolios), for example, can be
used for over a dozen different activities (reflection,
focusing attention on program outcomes, gathering feedback
on student learning from professionals in the field...). But
in any given course or setting, it's unlikely that the
ePortfolio is supporting more than a handful of those
activities.
This section of the Flashlight Evaluation Handbook
shows two contrasting response forms created with the
same matrix survey for students in two different courses.
The two forms pose different questions and in several places
use different text (the name students use for their
ePortfolio, for example).
Another application of
matrix surveys for ePortfolios is collaborative authoring
and response. Many stakeholders are involved in implementing
use of ePortfolios:
- faculty;
- students;
- technology support staff;
- teaching center/faculty development staff;
- professionals in the field who supply feedback about student portfolios;
- employers;
- graduate school representatives;
- etc.
Representatives of any of these groups can
participate in authoring a matrix survey aimed at members
of any of these groups. For example, students could author
questions for other students, or even for the professionals who
provide feedback. Faculty can participate in
authoring, and can also be respondents. Aside from the
benefits of the survey findings themselves, this kind of
collaborative approach to designing the matrix survey can help
to empower and engage the groups involved in the evaluation.
To see our thinking about evaluating,
planning and implementing ePortfolios,
click here.
Student Polling (e.g.,
Clickers): Feedback for Faculty and Program Leaders
For
this class's feedback form, the instructor calls the innovation "clickers" and wants feedback on their use for
peer instruction. The same matrix survey
produced
this very different feedback form for a
different class, whose instructor wanted
feedback on whether to use the results as part
of students' grades (so student responses
wouldn't be anonymous); this class referred to
the technology as a 'polling system,' as you'll
see.
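One way to picture these per-class options, with invented names rather than actual Flashlight settings: each respondent pool carries its own label for the technology and its own anonymity flag.

```python
from dataclasses import dataclass

@dataclass
class PollingPool:
    """Per-class settings for one respondent pool (illustrative only)."""
    course: str
    device_label: str   # what this class calls the technology
    anonymous: bool     # False if results count toward grades

pools = [
    PollingPool("BIO 101", device_label="clickers", anonymous=True),
    PollingPool("ECON 220", device_label="polling system", anonymous=False),
]

# Render one tailored prompt per pool; non-anonymous classes
# collect an identifier, anonymous ones do not.
for p in pools:
    id_field = "" if p.anonymous else " Please enter your student ID."
    print(f"[{p.course}] Rate how the {p.device_label} helped you learn."
          f"{id_field}")
```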
For
more on our use of matrix surveys to guide use of student
response systems,
click here.
Surveying Faculty about Classroom Technologies
If you compare these two instructor feedback forms, both
created with the same matrix survey but tailored for
different classrooms (actually, for all the instructors
teaching in each of those rooms), you'll get a sense of what
we can now do with a matrix survey.
Because it's a matrix survey, all the instructor
responses will stream to the same database (even if the
opinion sampling goes on over several terms), so that an
institution could, for example, see whether its smart boards
were being used more productively as the years go by.
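Because everything lands in one store, that longitudinal question becomes a simple query. A sketch using a hypothetical table (classroom_feedback, with a 1-to-5 smartboard_rating column) rather than the real schema:

```python
import sqlite3

# Hypothetical schema: every instructor response, from every room and
# every term, streams into the same table.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE classroom_feedback (
                 term TEXT, room TEXT, smartboard_rating INTEGER)""")
con.executemany(
    "INSERT INTO classroom_feedback VALUES (?, ?, ?)",
    [("2008-F", "Smith 110", 2), ("2008-F", "Smith 112", 3),
     ("2009-F", "Smith 110", 4), ("2009-F", "Smith 112", 4)],
)

# Average the smart-board rating term by term to see the trend.
for term, avg in con.execute(
    """SELECT term, AVG(smartboard_rating)
       FROM classroom_feedback GROUP BY term ORDER BY term"""):
    print(term, round(avg, 2))
```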
Surveying Students about Classroom Technologies
Here's a pair of response forms, both generated from a
different matrix survey, this one aimed at students.