Matrix Survey for Improving the Use of an Innovation:
Example of SRS


Matrix Surveys Explained l Flashlight Online 2.0 l Evaluating Student Response Systems

Matrix surveys can be used to gather feedback from students about an innovative practice being tried in dozens of courses: feedback that can be used by each instructor to improve the course, and also by the staff who support the academic program.

In this example, imagine that we need information about an innovative practice -- using technology to poll all students in a class quickly and anonymously.  Such technologies are sometimes called student response systems (SRS).  An innovative practice of this sort is quite difficult to study with a traditional survey, because SRSs are often used differently in different courses. The more empowering and generative the instructional idea, the more variety there will be in its implementation across courses, and the more difficult it will be to evaluate using the same measures for all those courses.  As often happens with such innovations, SRSs are even called different things (e.g., "clickers", "PRS", "cell phone polling", "the polling system") in those different courses. A further complication: instructors may have different decisions to make and therefore need different kinds of feedback. For example, some instructors may be deciding whether or not to grade student responses next term and want feedback on that topic; meanwhile, other instructors who also use an SRS may not be concerned with that issue.

All these variables require that the student feedback forms be tailored to each class, while still allowing data to be analyzed across courses. For example, suppose there are 20 courses using some form of SRS, and 12 of them are using the SRS to foster discussion by pairs of students after they've been polled. It would be useful to analyze the shared experience of those 12 courses, even though 8 of them are using clickers in a classroom and the other 4 are online classes using a polling system (and the remaining 8 of the 20 courses aren't using SRSs in this way at all).  We can diagram what we need as a table (matrix).

Step 1 in creating tailored feedback forms is to ask interested faculty to fill out a short online form for each class from which they'd like feedback about the SRS.  Here is a demo version of such a form for faculty.  The instructor of Course 1 indicates that issues A and B are relevant for his class, but not issue C. Meanwhile, the instructor of Course 2 has flagged only issue B, and the instructor of Course 3 wants questions for her students about issues A and C.

              SRS Issue A          SRS Issue B          SRS Issue C          SRS Issue ...
              (questions about)    (questions about)    (questions about)    (questions about)
Course 1           X                    X
Course 2                                X
Course 3           X                                         X
Course ...

(It was a table like this that led us to christen this kind of inquiry a "matrix survey.")

Those faculty choices can then be directly uploaded into the student matrix survey in order to automatically generate response forms tailored for each course.  Matrix surveys can be tailored using many kinds of metadata about respondents, e.g., their course, where they live, gender, technology experience, ... any kind of information that one has in advance and that is useful in deciding which questions to ask that person and how to word the response form.
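To make the mechanism concrete, here is a minimal sketch (not the Flashlight Online implementation) of how faculty choices in the matrix could drive automatic generation of tailored student forms. The issue names, question wordings, and course terminology below are hypothetical illustrations, not the actual survey content.

```python
# Hypothetical question bank: each SRS issue maps to a question template.
# The {term} placeholder lets each course use its own name for the SRS.
QUESTION_BANK = {
    "A": "How often did responding with {term} help you stay engaged?",
    "B": "Did discussing {term} questions with a partner deepen your understanding?",
    "C": "Would grading your {term} responses change how you participate?",
}

# The faculty matrix from Step 1: which issues each instructor flagged,
# plus the terminology that course uses for its SRS.
FACULTY_MATRIX = {
    "Course 1": {"issues": ["A", "B"], "term": "clickers"},
    "Course 2": {"issues": ["B"], "term": "the polling system"},
    "Course 3": {"issues": ["A", "C"], "term": "PRS"},
}

def build_form(course):
    """Generate the tailored list of questions for one course."""
    meta = FACULTY_MATRIX[course]
    return [QUESTION_BANK[i].format(term=meta["term"]) for i in meta["issues"]]

for course in FACULTY_MATRIX:
    print(course)
    for question in build_form(course):
        print("  -", question)
```

Notice that the same question bank produces different forms with different terminology for each course, which is exactly the tailoring the matrix makes possible.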

Using our demo matrix survey, here are several different versions of the student response form. Notice that the questions are somewhat different in each form, as is the terminology:

Matrix surveys have additional, powerful features for evaluating innovative practices: 

  • Different stakeholders in the innovation can contribute their own questions. For example, a vendor could contribute questions that only go to classes using that vendor's products. IT support staff might have different questions than do the teaching center staff. Or faculty who teach in large lecture courses using an SRS might have some questions only for those colleagues who also teach a large lecture.  Each stakeholder can receive a report of relevant findings.

  • That ability of different stakeholders to participate needn't be limited to SRS.  Perhaps there are three different innovations being studied, each used by an overlapping set of faculty.  Those studies could be combined into a single matrix survey. Or, if matrix surveys are used for end-of-course evaluation at the institution, all three inquiries could add their questions to the end-of-course evaluation; obviously, only courses using an SRS would have the SRS questions added to their end-of-course survey.

  • The different response forms don't need to all be answered simultaneously. The survey of SRS-using courses could go on for several years.  Each time a faculty member asked for an SRS response form, the new data would flow into the same database. At any time along the way, stakeholders can analyze the data accumulated thus far, making it much easier to study trends, whether problems are being solved, and whether opportunities are being pursued.
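The reason tailored forms can still be analyzed together, and accumulated over time, is that every response carries its course's metadata. A minimal sketch of that pooling, with hypothetical course names, issues, and ratings:

```python
# Each response records which course and which SRS issue it belongs to,
# so stakeholders can pool answers across every course that shared a
# question, no matter when the response was collected. All data here is
# hypothetical illustration.
responses = [
    {"course": "Course 1", "issue": "B", "rating": 4},
    {"course": "Course 2", "issue": "B", "rating": 5},
    {"course": "Course 1", "issue": "A", "rating": 3},
    {"course": "Course 3", "issue": "A", "rating": 4},
]

def pooled_average(responses, issue):
    """Average rating for one issue across all courses that asked about it."""
    ratings = [r["rating"] for r in responses if r["issue"] == issue]
    return sum(ratings) / len(ratings)

# Pools Courses 1 and 2, the courses whose forms included issue B.
print(pooled_average(responses, "B"))
```

As new response forms are answered over the years, new records simply join the same list, and the pooled analyses can be rerun at any time.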

