Flashlight Evaluation Handbook
Faculty Development/Support

Subscriber Materials
These materials are for use only by institutions that subscribe to
The TLT Group, by participants in TLT Group workshops, and
by invited guests. The TLT Group is a non-profit whose
existence is made possible by subscription and registration
fees. If your institution is not yet among our subscribers,
we invite you to join us, use these materials, and help us
continue to improve them! If you have questions about
your rights to use, adapt, or share these materials, please
ask us (info @ tltgroup.org).
What kinds of evidence can help guide improvement of
faculty development/support services whose goal (in whole or
in part) is to improve teaching and learning with
technologies? Here are some suggestions, guided by the
Flashlight Approach.
- Ideas and Concepts for using data to improve faculty development/support
- Cases - examples of useful studies of faculty development/support
- Flashlight services to help plan and carry out your evaluation
I. Ideas and Concepts for Using Data to Evaluate and Improve Faculty Development/Support
If you're going to evaluate a program that
helps faculty use technology to improve teaching and
learning, your answers to a few of the following questions
should help you focus
your plan. (The most overwhelming fact about designing such
a study is how many things you could potentially study
and how few you can actually afford to examine.)
This section of the Flashlight Guide is intended to help you
consider some of your options.
1.1 What's the organizational frame of the study?
Is this evaluation being done by and for one unit? Would
it be better to work in collaboration with one or more other
units that also support teaching and learning with
technology (e.g., the library, teaching center, IT unit,
facilities, audio-visual, department heads or deans, faculty
learning communities)? Your collaborators' needs and
views may affect the answers to the questions below.
1.2 Summative? Formative? Needs Assessment?
- Summative evaluation: what has been the impact or
effectiveness of your program? How have teaching
activities changed as a result of your service? (The
ultimate, and most difficult, summative questions have to
do with whether learning activities have changed as a result of
changes in teaching, and whether learning outcomes have
changed as a result of those changes in learning
activities. They're "difficult" because the causal link is
harder to establish as you move out along the chain of
cause and effect. It's hard to prove that a faculty
development/support service caused a change in teaching
activity, harder to argue the linkage to learning
activity, and hardest of all to establish a causal
connection between faculty support/development and changes
in learning outcomes. That's trebly true if the faculty
development/support was not focused on some specific
change in learning outcomes; see 1.3.)
- Formative evaluation: what evidence could help you
decide how to improve the service?
- One kind of formative evaluation is a needs
assessment, a study focused on asking faculty what kinds
of services they most need in the future. Here is a
first draft set of questions for faculty focus groups,
written with Flashlight Online 2.0 (if you'd like to
import a copy into your Flashlight account, email us at
flashlight @ tltgroup.org). We'd also love to have your
help in improving this and other sample surveys linked to
this page!
- There is typically only a small overlap between
evidence needed for summative evaluation and evidence
needed for formative evaluation. And it's rare to
choose just one of those goals. So deciding the relative
importance of summative and formative evaluation is a
crucial decision to make early in the design of your
evaluation plan.
1.3 Are the desired outcomes the same
for all faculty? Do all your clients use the service in the same
ways?
Faculty development and support
programs usually have two faces, each of which requires its
own approach to evaluation (using data to improve the
program's effectiveness).
- Uniform impact: to some
degree, all participating faculty are being helped for
the same purpose with the same kinds of expected
outcomes, e.g., satisfaction, skill at using a course
management system. And they all use the service in the
same way (e.g., they all attend a workshop on how to
create syllabi using the course management system).
Evaluation begins by developing a way to assess the
impact or value-added, e.g., how good are they at using
the CMS to create a syllabus? Or how much better have
they become as a result of the training?
- Unique uses: to some degree,
each faculty member is being helped to do different
things (e.g., some faculty are improving presentation
skills while others are learning how to facilitate
online discussion and still others are working on topics
of their own choosing). Evaluation begins with case
studies of individual faculty to discover how each
individual has benefited from the support. Only after
each case has been documented does the evaluation look
for patterns across subjects.
If the program's effort is split evenly
between these two emphases, then the evaluation
effort ought to be split evenly as well. [For more
on these ideas and their implications for assessment of
impact and evaluation of process, subscribers can consult
the
Flashlight Evaluation Handbook.]
1.4 What's important to study?
Issues likely to be pertinent to
evaluation of a faculty development or support program
depend on its goals. Some are relatively generic; they could
be featured in the evaluation of almost any program:
- Because you'll need to make choices,
you will probably need to decide the relative importance
of a) estimating whether and how faculty are teaching
differently versus b) estimating how much your
services and facilities have helped them.
- In which ways should your service have helped
academic staff teach more "effectively"?
What teaching/learning activities should be on the
upswing if your service has been effective? One option: use the
seven principles of good practice (or ideas of
comparable general importance in evaluating quality of
instruction). Flashlight Online contains several
hundred questions for students about these kinds of uses
of technology. The Flashlight Faculty Inventory contains
parallel questions aimed at faculty.
- Formal versus informal learning by the academic
staff: many evaluations focus only on changes in
teaching resulting from organized support. But an
evaluation could also include informal processes of
faculty support (e.g., progress academic staff make on
their own, using resources they find online; the role
played by compassionate pioneers, i.e., helpful
colleagues). (Here is a
draft survey, created with Flashlight Online 2.0, to
evaluate and enhance the role of compassionate pioneers.
If you're a Flashlight user, let us know and we'll help
you import this survey to your account.)
- Does the scale of your service fit
the scale of the needs? Or is it serving
only a small fraction of the faculty who want
and need it?
- Why are some faculty not using your
service in a given year? (The reasons may be different
for different people, and can provide clues about how to
increase the level of use, if that's one of your goals.)
- Is it important to consider the costs of your
service and how to get more value from available
resources? Faculty support tends to be time-intensive,
which means a study of costs requires gathering evidence
of how people spend their time (including the faculty
themselves - they too are participants in the process of
problem solving and teaching improvement).
Is it more important for you to assess the total impact
of your program? Or to zero in on one or more of its
elements?
- If, for example, the program uses
student technology assistants, part of the inquiry might
well focus on whether students and faculty are learning
in distinctively valuable ways because of their work
together.
- If the program is focusing on
'broadcasting' low-threshold applications and activities
to large numbers of faculty, part of the evaluation
would focus on what fraction of the messages are
reaching various members of the faculty, and the issues
that affect whether each message is heard and acted
upon.
- If your program has a specific goal
for improving teaching/learning in a specific way (e.g.,
online learning; writing across the curriculum; student
and faculty research), then you'll want to focus a
significant part of your effort on those outcomes and
how you hope to achieve them.
- One way to use evaluation to discover how to get
more value from resources is to ask your clients about
their views of the different units that support them.
Here is a
first draft of such a survey, also created with
Flashlight Online 2.0.
Make sure your study design also addresses any concerns
that you or your stakeholders may have about the program.
- For example, does the program attract mainly faculty
who are already excellent teachers, without affecting
the skills of typical instructors?
- Does the program have any influence on the quality
of the teaching and its likely outcomes? Or is it just
helping faculty teach in the old ways (probably with
unchanged outcomes) but using new and expensive tools to
do so (e.g., putting their yellowing course notes onto
PowerPoint slides)?
One way to make sure that your study addresses such
concerns is to work closely with your stakeholders in
designing the study.
1.5 Whom will you study?
- Is your focus only on full-time tenure-track staff?
Other faculty members? Adjuncts? Teaching assistants?
1.6 Time Frame
- Most evaluations focus on the past,
sometimes several years' worth of the past, in order to
learn lessons for the future.
- Some evaluations might focus just on
the present, gathering data that can be used as a
baseline and aid to future planning. Such a study might
focus on how all faculty have recently been improving
their teaching (regardless of whether they have used
your service). Such findings could be used both to
adjust your strategy and to serve as a point of comparison
when you do a similar study in a year or two. Such
evaluations can also function as needs analyses, to help
create the case for fresh budgets and new energy to be
applied to current problems.
1.7 What Else Do You Need to Know? Why?
- Think hard about why you're doing the
study. For example, you might decide that it's
important, in order to justify a budget, to get a vivid
sense of how much faculty have learned to value your
service. If you suspect they do value it, you might
offer a survey that briefly names and describes your
services and then asks the respondent a question or two
like these:
- If you were trying to persuade
someone to consider a faculty position at this
institution, would you mention this service? What
would you say about its value?
- The budget provided for this
service has been reasonably stable for the past few
years. But imagine for a moment that there was a
real question about doubling it, keeping it the
same, or eliminating it. The provost has asked a
few faculty, including you, for an opinion. What
would you say and what examples from your own
experience with our service would you mention to
make your case about the future budget for this
service?
II. Cases: Examples of Useful Studies of Faculty Development/Support
We're collecting successful formative
evaluations of programs that help faculty use technology in
their teaching. We're especially interested in uses of data
that guide efforts to improve faculty development or
support, and/or help it become more cost-effective. If you
know of a study that might be linked here, please e-mail
ehrmann@tltgroup.org.
Instructional Design at Washington
State is helping improve distance learning courses while
cutting costs
Some institutions see "instructional
design" as an expense that may need to be cut in times
of tight budgets. But Tom Henderson, Gary Brown, and
Carrie Meyers found in a series of studies at Washington
State University that up-front instructional design both
improved teaching-learning practices in courses and
helped control development and delivery costs. These
findings are helping define policy and practice at WSU,
one of the founding institutions of the Flashlight
Program.
Click here to see what they discovered.
Faculty-Librarian Partnerships
Effective in Developing Information Literacy in Minnesota,
Dakotas
Project
JSTOR was a three-year grant initiative from 1999-2002,
supporting 35 public and private colleges and
universities in Minnesota, North Dakota and South
Dakota. Its goals were to strengthen digital library use and
scholarly research, particularly through the acquisition
and use of the JSTOR
digital library collection. Through the program, 20
colleges and universities became participating JSTOR
members, joining a network of 15 other member
institutions in the region. The Flashlight Program
conducted an external evaluation which included
extensive interviews and surveys of program
participants.
Perhaps
the most significant finding was the power of
Faculty-Librarian Instructional Partnership (FLIP)
grants. Small grants helped at least one faculty member
and a librarian at an institution to team up and improve
a course's ability to develop information literacy among
students. The grants seemed to help advance a new
working relationship between faculty and librarians,
while triggering substantial institutional increases in
the use of the JSTOR collection and other online
resources.
One
role of faculty support at SE Missouri State is to
prepare faculty to teach online. David Starrett and
Michael Rodgers studied who was being served by their
institution's online courses. The University's
investment in helping faculty use technology had been
justified in large part by the hope that the resulting
courses would serve students across the University's
service area, students not close to campus. Their data
indicated that online courses were serving precisely
these students.
Click here to see a summary of their study, and an
e-mail address to get more data.
University of Missouri, St. Louis case
study
Cheryl Bielema notes that
these reports, produced annually, are used as fodder for
planning faculty development.
III. Flashlight Services to Help Plan and Carry Out Your Evaluation
Almost every TLT Group subscribing institution has some
prepaid consulting time, a good way to get some help in
planning your evaluation of faculty support/development
services and planning improvements. We can help you develop
a plan and, if you like, help you carry it out. Need a
visiting committee that can review your reports, interview
faculty, and provide an outside view? We can help arrange
that, too.
We have extensive experience in evaluation and planning
for faculty support. In addition to the examples cited above,
this web page summarizes more of our previous
consulting.