Contents: What is an Institutional Portal? Why Study It? | Educational Goals | Laying the Foundation | Baseline Data | Evaluation for Debugging (including cost control) | Monitoring Outcomes | Ways in Which the Flashlight Program Might Help You | References | Notes
These
materials are for use only by institutions that subscribe to
The TLT Group, to participants in TLT Group workshops that
feature this particular material, and
to invited guests. The TLT Group is a non-profit whose
existence is made possible by subscription and registration
fees. If you or your institution are not yet among
our subscribers,
we invite you to
join us, use these materials, help us
continue to improve them, and, through your subscription,
help us develop new materials! If you have questions
about your rights to use, adapt or share these materials,
please ask us (info @ tltgroup.org).
This text is adapted from a chapter written by Stephen C. Ehrmann and published in Designing Portals: Ideas and Challenges (Jafari and Sheehan, eds.).
Institutional portals serve many purposes. In this chapter of the Flashlight Evaluation Handbook, we will look only at the educational support role of the portal, in order to develop a framework for formative evaluation.
What kinds of evidence are most likely to help guide changes
in the symbiotic relationship between portal and institution
in order to improve the educational value of that portal?
This chapter should be of interest to any educational
institution that has, or is considering building, a portal,
if that investment is being partly justified by the claim
that the portal will be valuable for improving education at
that institution. For a summary of the fundamental
ideas used in this and other topical chapters, see "The
Flashlight Approach."
WHAT IS AN ‘INSTITUTIONAL PORTAL’[i]? WHY BOTHER TO STUDY IT?
Let's define an
“institutional portal” as a tailorable user interface that
provides efficient access to an extensive set of
institutional resources, communications channels, and
external resources.
Without a study no
one can really tell whether a portal is educationally
valuable. So (skeptics might argue)
it’s safer and cheaper not to do the study and to simply
assert that your portal is educationally successful.
Besides (their argument might continue), if you do a study and find out that your portal has been a waste of money and effort, it might cost you your job.
Read this chapter and
then decide for yourself whether to do a study. As you’ll
see, the chapter argues that evaluation can play the same
role for a portal that headlights play for a car driving on
a twisting road at night: the right kinds of
evaluation can help increase the portal’s chances of success
and efficiency.
EDUCATIONAL GOALS
Like a cabinet full of
flasks, test tubes, and chemicals, a portal can potentially
be used for several different educational purposes,
depending on choices made by the institution and the users.
Those choices will determine the shape of any studies, so we first need to define the portal's purpose. Which goals are most important for your institution? Here are a few candidates:
- Enable faculty to offer instruction that is more spontaneous, flexible, and adaptive (because they know that all their students are logging on at least once a day)
- Create a foundation for learning communities (by providing effective groupware and providing multiple reasons for people to log on at least once a day)
- Help the users and providers manage an increasingly large and diverse constellation of information for the purposes of teaching, learning, and research
- Save users time and/or increase their use of services (due to gains in personal efficiency)
- Reduce institutional costs of service delivery by consolidating, reducing, or eliminating traditional ways of providing services and using the portal instead (e.g., offering online registration rather than staffing to handle face-to-face registration of all students)
- Help the institution reduce the costs of system change by creating an operating environment that allows systems old and new to interact smoothly with one another
- Strengthen the bonds with alumni and others outside the community; increase support from these groups for the institution
- Change student, faculty, and staff attitudes toward the institution (the institution is seen as transparent, helpful, and supportive rather than opaque and a barrier)
Of course, the portal alone cannot achieve any of these goals. The relation of portal to purpose is somewhat analogous to the relationship of yeast to bread. It’s hard to bake bread without yeast, just as it’s hard to communicate daily with students if they don’t log on, but neither yeast nor portals are the only ingredients in those recipes.[ii]
It’s tempting to claim “all of the above” as goals for your institutional portal. But remember that actually reaching each of these goals requires a different series of action steps (“ingredients”) and a different set of studies to guide the effort. The more goals your portal seeks to achieve, the greater the expense will be.
The rest of this chapter
describes the different kinds of studies that, in
combination, can provide a useful and efficient way to guide
your institutional portal to functional success.
Select those studies that make the most sense for
your institution.
LAYING THE FOUNDATION
If your institution is
still considering whether to create (or totally revamp) its
portal, it makes sense to find out what other institutions
are learning from their experiences with portals. If you
can’t find a study on this topic, you could do your own.
For example, you could send an initial set of candidate goals to peer institutions that have had portals for a year or more. Follow up with phone
interviews. Ask the respondents to assess the success of
their portals in each of those areas. What evidence do they
have for citing such a success? (Unless that institution is
doing an exceptional job of helping its staff do studies,
expect anecdotal information here; it can at least be
suggestive, even if it is rarely compelling.)
Also ask them about areas of stress and cost during
development and operation of their portals.
Studies such as these
can help you develop an action plan for your portal project
and guide your early work on the other ingredients needed to
achieve the highest priority goals. You
might learn from this study, for example, that learning
communities can be supported and even created with the help
of a portal. You might also discover
that successful learning communities require many other
ingredients, too, some of which may not currently be present
at your institution. These might include ways of
coordinating student registration in multiple courses,
faculty development on how to grade work done by students in
teams, or creation of new courses. The non-portal ingredients for a learning community, such as those listed above, can take longer to put in place than the portal itself. If learning communities are a
major reason for creating the portal, it makes sense to
begin putting the other ingredients in place as soon as
possible so that, as soon as the portal is in operation, it
can help create and support learning communities.
The necessity of other ingredients such as faculty development, new online services, or new course designs may seem obvious, but many colleges have invested in technology and found disappointing results because they followed this route:
- Some peer institutions bought a new technology (let's call it Technology "A"); there was lots of buzz about it. Enthusiasts reasoned that Technology A could be used to support learning communities; learning communities are important; therefore it would be important to use Technology A, just as the competitors did.
- So this institution bought Technology A, too; discussions of learning communities were then put on the back burner until the system could be made operational.
- Two years later, after Technology A was deployed and reasonably reliable, discussion returned to learning communities. Someone pointed out that faculty development would be necessary, so, after a few more months, the first small workshops were offered.
- A year later, other needs had become apparent: new recruitment brochures were drafted, for example, to try to attract students who liked learning communities. Some fixes were needed in space scheduling systems. Change was slow and uneven, however. Money for these investments was in scarce supply; no one had thought to raise such funds, and the new technology had soaked up most of the available money.
- Two years later, interest in Technology A had almost disappeared. It seemed slow and outdated. The attention of technology enthusiasts had turned to Technology B, which had ‘visualization’ as a strength. Learning communities had never really gotten off the ground. Those who noticed this failing tended to blame Technology A, which (compared with Technology B) seemed old-fashioned and weak.[iii]
To put this another way, studying what has happened at other institutions can help you define just what the innovation is that you need to plan and evaluate. Usually, the innovation isn't (just) Technology A; in the hypothetical example above, the innovation also included the learning communities and the institutional context supporting them. So, to learn how to make successful educational use of Technology A, it's also important to study that context, including faculty skills, goals, and support; recruitment of students; and space planning. For planning your next step in portal deployment, your study of other institutions ought to help you understand the portal-enabled educational activities you value most.
Such a study should also help you discover what problems other institutions encountered. Your findings can help you avoid some of those problems while preparing users for difficulties that (you discover) are inevitable; people are more likely to endure problems if they have been warned in advance!
BASELINE DATA
It is always nice to be
able to report that, “we have evidence that our institution
is doing (something) much better than it did three years
ago,” but such statements require that a similar study have
been done three years earlier: the
“before” part of the “before and after” comparison. The
“before” picture is called a “baseline study.”
Baseline studies are ideally done before the portal
effort begins, or at least before the portal has had time to
begin influencing the outcome of interest. But it’s never
too late to do a baseline study if gains in the outcome are
intended to continue. (Some people may
object to the baseline because it’s likely to show bad news.
But that’s the point of taking a “before” picture – to see
if the system can help transform ‘bad’ to ‘good’, or ‘good’
to ‘better.’)
The baseline study
should focus on the behaviors and attitudes that portal
availability is intended to influence.
That’s what determines the ultimate benefits and costs of a portal: what students, faculty, and staff choose to do with it.
For example, if one
important benefit is to help instruction become more
adaptive and spontaneous (because faculty can communicate
with students on a daily basis, for example), how adaptive
and spontaneous is instruction before the portal goes into
use? How frequently and how effectively do faculty
communicate with students before the portal is available?
The point here is to identify the educational activities that you expect will be carried out differently when the portal is in full, effective use. Start gathering evidence as soon as possible about the state of those activities. Then, as the portal or its support change, you can get some idea of whether the activity is benefiting from those changes. That's your value proposition.
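To make this concrete, here is a minimal sketch (in Python, and not part of the Flashlight toolset) of how a baseline measure might later be compared with a follow-up measure of the same activity. The survey item, the responses, and the dates are invented placeholders.

from statistics import mean, stdev

# Hypothetical responses to the same survey item ("How many times per
# week do you communicate with students outside class?"), asked once
# before the portal launched and once a year later. Illustrative only.
baseline = [1, 2, 0, 3, 1, 2, 1, 0, 2, 1]
followup = [3, 5, 2, 4, 1, 6, 3, 2, 4, 5]

for label, sample in (("baseline", baseline), ("follow-up", followup)):
    print(f"{label}: mean={mean(sample):.1f} sd={stdev(sample):.1f} n={len(sample)}")

# The gap is suggestive, not conclusive: other ingredients (training,
# course redesign) change over the same period and share the credit.
print(f"change in mean: {mean(followup) - mean(baseline):+.1f} contacts/week")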
EVALUATION FOR DEBUGGING (INCLUDING COST CONTROL)
When a program fails to work, there are
bugs that need to be fixed (debugging). The same thing is
usually true about a technology and the educational
activities it is to support. Between the technology, the
activity, and the goals for the activity, there are always
bugs: problems that, when fixed, will allow the activity to
make more successful use of the technology in the
achievement of the goals.
Debugging studies ought
to begin early in portal development and operation. Some
debugging focuses on the software and its operational
support. These studies attempt to identify system
malfunctions, interface problems, problems in training
people to use the system, etc. A system
debugging study ought to focus on finding bugs that would
be:
- Important barriers to one or more of the goals of the portal,
- Uncertain (they may happen, or they may not), and
- Invisible without a study.
Educational debugging refers to problems
in using the portal to accomplish an educational purpose.
A portal may appear to work smoothly and yet be found
to be buggy when users try to employ it for a specific
educational purpose. Such bugs are important to discover
because portals only have an educational benefit when they
enable actual (not just potential) changes in the nature of
educational activities.
For example, one
educational goal for an institutional portal might be to
help instruction to become more responsive and adaptive (because the portal has helped ensure that students check their Web sites and mail on a daily basis).
If students are discovered not to be using the portal
daily, the next step is to investigate potential causes for
this educational bug. Likely candidates
in this case:
- Not enough important services are easy to use on the portal, so some students are not logging on;
- A small number of students are having problems with their Internet service providers: enough students to disrupt faculty plans that depend on quick interaction with all students in their courses;
- Some faculty have not yet realized how they could modify the basic structures and strengths of their courses once they begin to use the portal to interact rapidly with students between class meetings.
Some of the bugs
discovered may be easy enough to fix. Other cases may be so
severe and stubborn that the goal itself must be revisited,
redefined, or eliminated.
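As a concrete illustration of how the symptom itself might be detected, here is a minimal Python sketch that estimates, from a simplified access log, what fraction of enrolled students log on every day. The log format, IDs, and dates are assumptions made for the example; a real portal's logs would look different.

from collections import defaultdict

# Hypothetical records: (student_id, date of at least one portal login).
logins = [
    ("s01", "2003-04-01"), ("s01", "2003-04-02"), ("s01", "2003-04-03"),
    ("s02", "2003-04-01"), ("s02", "2003-04-03"),
    ("s03", "2003-04-02"),
]
term_days = {"2003-04-01", "2003-04-02", "2003-04-03"}
enrolled = ["s01", "s02", "s03", "s04"]  # s04 never logs on at all

days_seen = defaultdict(set)
for student, day in logins:
    days_seen[student].add(day)

# A student "uses the portal daily" if a login appears on every term day.
daily = [s for s in enrolled if days_seen[s] >= term_days]
print(f"{len(daily)} of {len(enrolled)} students logged on every day")
# A low ratio flags the bug; the candidate causes listed above still
# have to be investigated by other means (surveys, interviews).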
Debugging to control costs and stresses: Portals are likely to create a shifting pattern of stresses on time and budgets. What’s most dangerous about a ‘stress bug’ is that it can lie hidden, masked by the enthusiasm of early adopters and the expectation that things will be difficult at first. Studies can provide important early warning. Without a cost study, users may have become exhausted and resentful, and budgets drained, by the time the problem becomes obvious.
The aim of such studies
is to “unstretch” resources: to provide early warning of
activities that are demanding disproportionate and
unsustainable amounts of money, time, or good will. You then
can use the study’s insights to redesign those activities
before it’s too late.
The typical approach to such studies is called activity-based costing.[iv] The study’s objective is to gauge all the resources required to carry out a particular activity, no matter which budget and institutional unit those resources come from. For example, a study might focus on the costs of registering online for, and dropping, courses. These costs might be distributed among the offices of the registrar, bursar, IT services, student affairs, and others.
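A worked miniature of such a study, with invented figures, may help. The units, hours, and rates below are placeholders; the point is the method: gather every resource the activity consumes, whichever unit's budget it comes from, and total it per unit and overall.

# Hypothetical activity-based costing sketch for one activity
# (online registration for, and dropping of, courses). All figures
# are invented placeholders; only the method is the point.

# (unit, staff hours per term, hourly rate in $, direct costs per term in $)
resources = [
    ("registrar",      120, 25.0,  500.0),  # form upkeep, exception handling
    ("bursar",          40, 25.0,    0.0),  # fee adjustments
    ("IT services",     80, 35.0, 2000.0),  # server share, license fraction
    ("student affairs", 30, 22.0,    0.0),  # advising questions routed online
]

total = 0.0
for unit, hours, rate, direct in resources:
    unit_cost = hours * rate + direct
    total += unit_cost
    print(f"{unit:<16} ${unit_cost:>9,.2f}")
print(f"{'activity total':<16} ${total:>9,.2f}")
# Repeated each term, the per-unit totals give the early warning the
# chapter describes: a unit whose share keeps climbing is a stress point.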
MONITORING OUTCOMES
This type of study is
perhaps the single most important way to improve the
benefits of investments in an institutional portal: track
and analyze the activities that the portal is intended to
enable.
For example, an
institution might study whether portal use is contributing
to community building. Here are some key
activities and outcomes the institution will want to track
over time:
- Do users employ portal features to find or work with other people?
- With whom? People they would have worked with before?
- Does use of the portal seem to alter the interaction in ways important to community building? For better? For worse? For example, do the communications seem to help build an appropriate feeling of obligation among those who work together?
- Are there barriers hindering or preventing this type of communication?
One crucial point:
even if the portal does help people work and play together
in ways that build community, those changes in behavior will
probably be apparent months or years before desired
community outcomes appear (e.g., increased alumni giving).
For that reason, early
studies will focus more on activities (behavior) while later
studies will begin to collect more data on outcomes that can
then be compared with baseline data.
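One way to keep such activity measures comparable over time, sketched below in Python with invented numbers, is simply to track the same indicator each term against the baseline. The indicator, terms, and percentages are hypothetical.

# Hypothetical sketch: tracking one community-building indicator (the
# share of survey respondents who used portal features to work with
# others that term) against the baseline measure. Numbers are invented.
baseline_share = 0.22  # from the baseline study, before the portal

term_shares = {
    "Fall 2002":   0.25,
    "Spring 2003": 0.34,
    "Fall 2003":   0.41,
}

for term, share in term_shares.items():
    change = share - baseline_share
    print(f"{term}: {share:.0%} worked with others via portal ({change:+.0%} vs. baseline)")
# Behavior usually moves first; outcome measures (e.g., alumni giving)
# are compared with their own baselines years later.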
WAYS IN WHICH THE FLASHLIGHT PROGRAM MIGHT HELP YOU
Some of the tools of The
Flashlight Program, which I direct, may be helpful in
studying portals. Flashlight currently
offers several kinds of tools to subscribing institutions
including the Flashlight Current Student Inventory (almost
500 validated questions for use in surveying or interviewing
students currently enrolled in a course), the Flashlight
Faculty Inventory (items for surveying or interviewing
faculty), and Flashlight Online (a web-based system for drawing on such items to create surveys, which can then be administered either on paper or online).
Flashlight Online, for example, could be used to create studies about the portal that could be offered both through the portal and on paper.
Site licenses for the Flashlight Cost Analysis Handbook are
also given free to subscribing institutions.
Flashlight also works with interested subscribing
institutions to help them develop tailored studies; by the
time you read this chapter, Flashlight may be working with
subscribers to develop study packages for improving
institutional portal use. Check our
Web site.
All too often in the past, an institution bought a technology because the technology was ‘in’ and enthusiasts demanded it. “We can’t compete without it,” they might have said.
The educational goals (or other institutional goals)
for the investment may never have been made clear. And often there was no evaluation to help the innovation navigate safely through the shoals of implementation.
Technical failures are sometimes easy to detect and fix.
Educational bugs are often more subtle, and may be
experienced by people who don’t have the information or
budgets to fix the problems.
It’s
difficulties such as these that have sometimes prevented
previous innovations from having much impact on
institutional teaching and learning. Portals have an interesting combination of features:
- they are expensive and time-consuming to create and support;
- they have many purposes, with educational support and improvement often mentioned among them;
- yet the actual value of a portal for supporting and improving education is exceptionally difficult to discern, evaluate, or improve because it is so diffuse.
Evaluations of the sort sketched in this chapter can go a long way toward ensuring that the educational value of an institutional portal is real, and growing.
REFERENCES
Ehrmann, Stephen C.
(2002), “Viewpoint: Improving the Outcomes of Higher
Education: Learning From Past Mistakes,” EDUCAUSE
Review (January-February), pp. 54-55. The article is
also available online at
http://www.tltgroup.org/resources/Visions/Improving_Outcomes.html.
Ehrmann, Stephen C.,
Joseph Lovrinic, and John Milam, The Flashlight Cost
Analysis Handbook, Washington, DC: The TLT Group, 1999.
NOTES
[i]
I do not ordinarily use the term “campus” portal.
I don’t think that “campus” should be used as
a synonym for institution, for the same reasons that
“classroom” is not a synonym for “course”: many of the important resources are not located, and much of the important activity does not take place, within the physical space of the campus or the class’s room.
[ii]
For more on implementation and evaluation of long-term, technology-enabled educational improvements, see Ehrmann (2002).
[iv]
For a handbook and cases on how to do activity-based cost models of educational uses of technology, see
the Flashlight Cost Analysis Handbook. For
information on the current edition and how to obtain
it, see
http://www.tltgroup.org/programs/fcai.html