These materials are for use only by institutions that subscribe to The TLT Group, by participants in TLT Group workshops that feature this particular material, and by invited guests. The TLT Group is a non-profit whose existence is made possible by subscription and registration fees. If you or your institution are not yet among our subscribers, we invite you to join us, use these materials, help us continue to improve them, and, through your subscription, help us develop new materials! If you have questions about your rights to use, adapt, or share these materials, please ask us (info @ tltgroup.org).
Abstract
Part I of this chapter of the Flashlight Evaluation Handbook is designed for people who lead and support ePortfolio initiatives at an institutional or wider level. It describes an activity-focused approach to guiding and accelerating ePortfolio use at an institution. It includes suggestions for how to
a) select which activities to improve;
b) accelerate the pace at which these activities can develop; and
c) limit the costs, stress, and risk associated with carrying out those activities.
Part II is aimed more at academic staff and those who educate and support them; it describes how individual instructors can use student feedback to figure out how to fine-tune the use of ePortfolios in individual courses. (This section is still being written.)
Definitions
An electronic portfolio (ePortfolio) is a collection of the author's works,
- stored online and accessible for viewing (and sometimes commentary);
- often accompanied by reflective commentary about how those works collectively provide evidence of what the author can do and/or about what the author has learned; and
- organized to support one or more activities, each involving a particular audience (e.g., showing instructors whether the author has met their learning goals, stimulating the author to reflect about past learning in order to motivate and guide future learning, helping readers such as instructors or accreditors understand what an academic program’s strengths and weaknesses are, helping the author get a job, ...).
Notice that we define an ePortfolio as the product of an author (in some ways similar to an anthology: a work that has its own character but is also a collection of other works) that has been designed as an ingredient for one or more activities. We do not define "ePortfolio" as software. A word processor can be used to create files that are then posted on a web site with a file transfer program. And so-called ePortfolio software can be used for purposes other than creating ePortfolios. This distinction between product and software influences almost every element of this chapter.
“Formative evaluation” is
defined here to mean any and all studies designed to
discover information that can guide improvement of a program
and its outcomes. (In contrast, “summative evaluation” is defined to
mean any study designed to indicate whether a program has
succeeded or failed.)
A Cautionary Tale
We have heard the following story too
many times in recent years:
- A few faculty members, offices, and students each begin to use something they call an ePortfolio (or some synonym for that term). These early users each employ different software packages and, in fact, use them for different activities: personal development, job applications, projects within courses, tracking student development toward a degree, fostering reflection, work with potential employers of their students, improving articulation with schools, program evaluation and accountability, ...
- The IT staff begins to support a few of the
software packages, providing technical support and
training. Others are ignored.
- Responding to complaints about software inadequacy and/or to ensure links to other institutional systems, the IT staff might also begin building its own ePortfolio software.
- As use of ePortfolios and the support burden grow, someone, quite possibly Information Technology, decides to standardize on one ePortfolio software package. There are several sensible reasons for this decision. IT can focus its support resources on just one system. IT also believes that ePortfolio software, like email, is probably destined to be an integral part of the institution's internal communications and information infrastructure. For support, records security, and articulation with other systems, one software system is selected.
- Some academic staff accuse the IT department of
making arbitrary decisions without adequate consultation.
Their anger is fueled by the belief that the chosen
software actually is a poor fit for what they've been
doing.
- Years go by, and the sense grows that, while
ePortfolio software has figured in some real
achievements, it has not had the transformative impacts
that its various pioneering supporters had hoped for.
There are many reasons for this tale of frustration. Here
are three:
- The people in this story believed that the key
ingredient in ePortfolio success was software: get the
right software and their purposes could be achieved.
That assumption is almost certainly false.
- The people in this story assumed that the right
ePortfolio software package can serve all ePortfolio
purposes. That assumption is also almost certainly
false.
- The people in this story didn't see a need to
collect any data, except perhaps about the experience of
other institutions with available software.
A Series of Formative Evaluations:
Summary
Part I of this chapter describes a series of formative evaluations designed to help institutions use ePortfolios to improve important activities. Several of the studies below may be carried out simultaneously, if you so choose. You may also choose to do just a few of these studies.
1. Identify activities that are the primary uses of the ePortfolio system – the major reasons for the investment in this use of software. Are there particular outcomes that improved activities are supposed to foster?
2. Assess wants and needs: Of the various activities and goals that could be advanced with ePortfolios, which are needed most widely? Most deeply?
3. Study “recipes” for improving activities, without and with ePortfolios: How are those activities carried out without ePortfolios at your institution and elsewhere? What are the forces, strategies, and factors that influence success (the “recipe”)? What kinds of outcomes seem to result, good and bad, as the activity changes?
4. Measure key activities periodically and, when appropriate, measure their outcomes. Such studies can help focus attention and guide investment of resources over the years. If possible, begin before the ePortfolio is implemented so that, later, you can see whether, when, and how the activities and outcomes improve.
5. Debug the activities, i.e., discover factors that frustrate most or all users. Discover how to increase the incentives for those activities.
6. Develop and use diagnostics to reduce barriers to 100% participation, i.e., discover factors that frustrate individual users and develop a process that can assess and aid such users so that participation and success rates with ePortfolio use approach 100%.
7. Study use of time and of money in order to reduce stress on staff and budgets as portfolio use widens and deepens.
8. Test your theories: Are the activities indeed improving? Because of ePortfolio use? Is there evidence that this portfolio-aided improvement in the activity is indeed aiding the desired improvements in outcomes?
Ideally you should take steps 1-5
before your first ePortfolio pilot implementation begins,
even before you select the software. However, if
you’ve already begun, you can still begin step 1 at any
time. In fact, you’ll probably change software in a
few years, so you can look at this work as preparation for
the next generation of technology while simultaneously
helping you get more value from the software you’re already
using!
1. Select Activities
Which activities – which patterns of use of ePortfolios – should most influence your planning and your formative evaluation? The Attachment to this Flashlight Guide provides a collection of candidate activities.
First, notice that each activity has its own ingredients for success. Software is often neither the most expensive nor the most difficult of these ingredients.
Second, notice that each activity needs somewhat different functionality from its software. For example, some uses (e.g., ePortfolio use in a single course) could use simple software for creating an online project and reflection, software that might disappear the following year with no adverse consequences. For this activity, important criteria for picking software might be whether students already know how to use it and whether they can use it at no additional cost. Meanwhile, other activities imply use of the ePortfolio for permanent academic records; for these activities, the ability to interoperate with other academic systems and vendor independence might be crucial.
One implication of this list of activities: the institutional administration has a bigger stake in some activities than in others. Some ePortfolio uses can be handled at the departmental or course level with only modest needs for central support. Others will need to be led, or at least coordinated, centrally.
One way to gather data and to stimulate
discussion about where to focus attention is to do a survey
of potentially interested academic staff. Here's a
crude first draft of such a survey, using a few of the
activities from the attachment to
this chapter.
2. What other ingredients are needed?
It's rare that the right ePortfolio software is the only missing ingredient: find the package and, voila!, your activities will succeed perfectly! It's far more likely that, in addition to software, other moves will be needed: training? curricular change? attracting students or staff who need and like the activity? partnerships outside the institution? changes in policy? The attachment suggests just a few of these other ingredients for your recipe. Whether you're granting funds, planning support, or developing ePortfolio activities yourself, how can you best plan? How can you get help in figuring out what these other ingredients are?
A. Other institutions' uses of ePortfolios
For many institutions, the first step
is to support some pilot tests, and then observe (more or
less closely) how they succeed and fail. This is sometimes a
rather wasteful and time-consuming way to start.
It's a lot cheaper and much
faster to evaluate what happened when other institutions ran
their own pilot tests. First identify
institutions with similar cultures, students, and IT
infrastructure – institutions that have already tried
ePortfolio pilots for various activities. Which pilots began
to grow? Which faded out?
Failures are
sometimes the easiest place to discover that an ingredient
was important - because lack of attention to that ingredient
is what caused the failure. It's probably easier to
get a sense of the ingredients needed for an activity to
succeed by taking a quick look at pilots using ePortfolios
for that activity in several kinds of programs.
What do we mean by other ingredients in the recipe? Let's think about using an ePortfolio that includes videoclips to assess and guide student development of performance skills. What factors need to be in place for this activity to succeed?
- A curriculum to develop these skills;
- Faculty and external experts who can assess these skills; this kind of learning and assessment takes time, so some other themes may need to be de-emphasized or eliminated in order to make ‘space’ for this work;
- Audiences who are prepared, substantively and technically, to see and understand this material (e.g., the employers and colleges where students might be taking these portfolios);
- Technologies and facilities for producing and editing the digital video: cameras, studios with good sound and lighting, and server space for the video;
- Training to use those facilities;
- Standards for creating the video, to decrease the chance that, a decade from now, no one will be able to view the video or the annotations because technology has changed;
- Policies for assuring that everyone has access to the technology (lending equipment for use off-campus?);
- Rights agreements covering people other than the author who appear in the video or who helped create it.
Note: study issues that occur when a pilot “scales up” to
operation across courses, across departments, and over
decades.
Here’s an essay by Steve Acker of Ohio State on several
problems that can grow as more and more programs and
students use ePortfolios: the students’ intellectual
property, student motivation, and faculty time.
Note: some ePortfolio activities involve more than just the academic program itself. The ePortfolio might be used for job applications, by companies planning their future professional development programs, for sharing records between schools and colleges, and by external agencies with responsibilities for program review and accountability. To identify the ingredients for a successful activity, you'll need to get information from these other organizational units and individuals.
B. Study Your Own Uses
You wouldn't be reading this chapter if your institution
didn't already have some people using ePortfolios: faculty,
students (some of whom almost certainly have ePortfolios
they've created on their own), and staff. And you
might have resources to fund some additional pilot tests.
So learn from your own people's experiences, too, activity
by activity:
For example, if in a local use of an ePortfolio one activity was to engage outside professionals in assessing student progress toward a degree, does that practice seem stable? Is the advice of the professionals seen as valuable by students and faculty? Do the professionals seem willing to do this repeatedly? If the activity is fading, what factors interfered with success? What was missing? What got in the way?
C. Ask potential users how important it is to improve
these activities, and what's needed to do so
From these pilot tests, at other institutions and your own, gather some data you can show to people in your own program about how ePortfolios can enhance activities they are likely to find important.
Gather small groups of potential leaders and users. Show them
the evidence - ePortfolios (artifacts, reflections,
assignments, etc.); video testimony from faculty, students,
and support staff. Ask your people to assess the
strengths, weaknesses, opportunities, and threats they see.
Provide some way for them to ask questions of the people
behind the pilot. After the discussion, poll them
about what would need to be done to implement and scale up
such an activity in their own program. And, finally, ask
them what priority should be given to doing just that.
D. What are the important policy questions for your
institution to answer?
(NEW, added April 23, 2008)
During these conversations you can begin to examine
policy questions that are likely to arise as ePortfolios
become more widely used by institutions and learners. Here's
a first draft list of some 'straw man' policies (some of
which conflict with others) that a team of us devised in
preparation for an ePortfolio planning workshop at the
University of Queensland in Australia. It's intended to
suggest some of the policy questions that institutions will
need to consider.
1. The University should store (for at least the lifetime of the learner) certain types of artifacts and reflections, no matter what the learner says, and at no cost to the learner.
2. The University should store (for the lifetime of the learner) certain types of artifacts and reflections, if the learner requests it (and with some charge to the learner?).
3. The learner is responsible for maintaining the artifacts of their learning activities over time (or seeing that they are maintained), independent of any institutional context.
4. The learner has sole control over these artifacts and reflections.
5. The University has sole control (ownership) over these artifacts and reflections.
6. The University needs to support a way of displaying student portfolios that matches regulators' requirements.
7. The University should not rely on proprietary standards or software that would interfere with its ability to carry out the commitments in #1 or #2.
8. Students should be able to selectively display any of their artifacts to their peers, faculty, or external audiences.
9. The University should embed a marker of authenticity in work that the University has certified.
10. When adding an assessment to a student's portfolio, each academic staff member has the right to control whether outside audiences can see that staff member's comments.
This
list of questions reflects the assumption that artifacts and
reflections should be stored separately from any software
used to display the ePortfolio as a whole. That's because
the software used to display and analyze the ePortfolio
("ePortfolio software") seems more likely to be transitory
and local, while many ePortfolio activities require
longevity and, in some cases, use of the artifacts and
reflections as the author moves among educational
institutions, jobs and phases of life.
E. Choosing ePortfolio
Software/Services
There is no reason to assume a one-size-fits-all solution; as the attachment indicates, different ePortfolio activities have different technological requirements, yet everyone needs simplicity and interfaces tailored to their needs in order to get users on board with minimal support costs and opposition.
Equally important, the activities (which ought to be your real focus, not the software) can take many years to develop. For those activities to have a perceptible impact on educational outcomes (the capabilities of your graduates, graduation rates, and so on) can take still more years. The chances are good that, by the time you achieve such outcomes, you'll have had to change software more than once.
Edutools, in collaboration with the Western Cooperative for Educational Telecommunications, did this feature-by-feature comparison of a variety of ePortfolio systems. While this collection of vendors is obviously limited, their site will give you a start in developing your own chart of functionality and vendors.
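One way to turn such a chart into a decision aid is to weight each feature by how much your focal activities depend on it. Here is a minimal sketch in Python; the feature names, weights, and ratings below are invented placeholders (not Edutools data), so substitute the rows of your own chart:

```python
# Minimal sketch of a weighted feature-comparison chart for ePortfolio
# software. Every feature name, weight, and rating below is a
# hypothetical placeholder -- substitute your own chart's data.

# Weight each feature by how much your focal activities depend on it
# (0 = irrelevant ... 3 = essential).
FEATURE_WEIGHTS = {
    "student-controlled sharing": 3,
    "export to open formats": 3,
    "interoperability with other academic systems": 2,
    "rubric-based assessment": 1,
}

# Ratings from your own hands-on review, 0 (absent) to 5 (excellent).
VENDOR_RATINGS = {
    "Vendor A": {"student-controlled sharing": 4, "export to open formats": 5,
                 "interoperability with other academic systems": 2,
                 "rubric-based assessment": 3},
    "Vendor B": {"student-controlled sharing": 5, "export to open formats": 2,
                 "interoperability with other academic systems": 4,
                 "rubric-based assessment": 4},
}

def weighted_score(ratings):
    """Sum of (feature weight x rating) over all weighted features."""
    return sum(weight * ratings.get(feature, 0)
               for feature, weight in FEATURE_WEIGHTS.items())

for vendor in sorted(VENDOR_RATINGS,
                     key=lambda v: weighted_score(VENDOR_RATINGS[v]),
                     reverse=True):
    print(f"{vendor}: {weighted_score(VENDOR_RATINGS[vendor])}")
```

The point of the exercise is not the arithmetic but the weights: if they come from the activities you identified earlier, the software choice stays activity-driven rather than feature-driven.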
3. Monitor Activities
At most institutions, 'ePortfolio initiatives' are defined by a particular software package, so those institutions would begin studying an activity sometime after they acquire the software and get it into use. But the approach recommended here suggests focusing on the activity (e.g., fostering student reflection; helping students document achievement for their job and graduate school applications). So the right time to begin assessing the activity is right now.
One advantage of starting now: for
activities of any scale (e.g., monitoring student progress
toward a degree in order to improve the capabilities of
graduates over the long haul), it is likely to take more
years to improve the activity than the lifetime of any one
software solution. In other words, to make the change in the
activity visible, you'll need to be studying the use of a
series of two or more software solutions over a period of
years. So begin gathering evidence about the activity – how much is it going on? how effectively? what factors affect it? – right now.
Ideally you’d like to start measuring
the most important activities before your new ePortfolio is
implemented. That would provide the baseline so that, after
the ePortfolio has gone into wide use, you could compare
levels to see whether the activity has indeed improved.
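To make that before-and-after comparison concrete, here is a minimal sketch, assuming you track a simple activity measure such as the fraction of surveyed courses in which the focal activity occurred; all counts below are invented:

```python
# Minimal sketch: compare an activity's measured level before and after
# ePortfolio implementation. The activity measure and the counts are
# hypothetical placeholders for your own periodic measurements.

def activity_rate(courses_with_activity: int, courses_surveyed: int) -> float:
    """Fraction of surveyed courses in which the focal activity occurred."""
    return courses_with_activity / courses_surveyed

baseline = activity_rate(courses_with_activity=18, courses_surveyed=120)  # pre-implementation
followup = activity_rate(courses_with_activity=46, courses_surveyed=125)  # one year later

print(f"Baseline:  {baseline:.1%}")
print(f"Follow-up: {followup:.1%}")
print(f"Change:    {followup - baseline:+.1%}")
```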
Whenever you start these measurements,
keep going. Check on the level and effectiveness of the
activity on a regular basis: perhaps once or twice a year.
There are several reasons for periodically measuring
each of the focal activities:
- To make the activity
visible enough so that people can guide it intentionally.
ePortfolio-supported activities are ordinarily almost
invisible: how much and how well do students reflect?
In what ways can employers see examples of student work
and assessments of that work? By making an activity
visible, even in a vague and approximate way, it becomes
more possible for academic staff, students and other
stakeholders to discuss it and alter it.
- To help
maintain attention on this activity for enough years
that the improvements can be even larger. Higher
education suffers from Attention Deficit Disorder. An
annual evaluation report can keep bringing people’s
attention back to an issue for enough years to allow
really important progress.
- To spot areas
where you can boast. Activity measurement can be quite
useful because activities improve earlier than do
outcomes. If, for example, you’re implementing
ePortfolios in part to improve community, as part of a
larger strategy for improving retention and graduation
rates, you would see improved communications and
connectedness using the portfolio a year or more before
you’d see any resulting change in retention, and several
years before you’d see changes in graduation rates.
- To document and dramatize
areas of need (suppose that the
ePortfolio initiative is
college-wide but the activity has improved a lot in some
departments and not at all in others) so that you can
pay more attention to them, and use the data to attract
fresh resources to the problem area.
4. Debug Activities
Discover problems that are preventing everyone, or most people, from using the ePortfolio for this activity, so that you can fix those problems.
Here’s an example of such a ‘bug’: an institution wants to use outside experts to assess student ePortfolios, but it has neglected to develop an adequate procedure for finding, rewarding, and retaining these outside experts. Academic staff and students are left on their own to find help. Worse, the work is so unrewarding that outsiders rarely volunteer twice. That’s a “bug”: a factor that usually prevents the activity from happening as hoped.
This step in formative evaluation is
designed to find such bugs early in a pilot program so that
they can be fixed before people’s energy is sapped and
before the program’s reputation begins to suffer.
Debugging might begin with focus
groups. The facilitator takes participants through each step
of the activity, asking open-ended questions such as “is
anything more difficult in this step than you anticipated?
Is this step easy? Exciting? Confusing? Maddening?
Impossible?” A recent study of ePortfolio initiatives
at Waterloo University contains some good examples of
debugging. (Tosh,
Light, Fleming and Haywood, 2005)
Surveys and user response forms built
into the software can also help you find bugs.
5. Diagnose Barriers to Participation
Diagnostic evaluation can help you identify the reasons why not everyone is yet using the ePortfolio for this activity. Sometimes the reasons are barriers, or lack of incentives; when you act on these findings, participation rates can increase, sometimes with relatively little effort.
Example: Suppose your College of
Education planned its ePortfolio initiative to help faculty
work with supervising teachers in area schools as they
assess student teachers. A focal activity: you’re hoping
faculty and supervising teachers will use these
collaborations to discuss how to improve the teacher
education curriculum. And suppose your tracking studies (section 3, above) indicate uneven levels of this activity: some faculty members and supervising teachers are indeed being stimulated to discuss the curriculum as they analyze student ePortfolios. But, so far, many other faculty and supervising teachers are not having such conversations. Why is that? If you could discover the reasons, your program might be able to move toward 100% use of the ePortfolio for this activity.
Because some people are succeeding, we
can guess that the theory itself is sound: ePortfolios can
indeed be used effectively for this activity. The aim of
‘diagnostic’ studies is to identify barriers and incentives
that result in uneven use of the ePortfolio for this
activity. Here are a few examples of such factors:
Each example below gives a hypothesized barrier to collaboration among faculty members and supervising teachers, sources of data to test that hypothesis, and what you might do if the data support it.

Hypothesis 1: It hasn’t yet occurred to some faculty and teachers that they can use ePortfolios in this way (despite your efforts to publicize this goal). These particular faculty and teachers would love to use ePortfolios this way, but they didn’t think of it, or didn’t remember to do it.
Sources of data: Survey; interviews.
If the data support it: Discover why they didn’t see, understand, or remember your publicity or training about this use of the ePortfolio. Use these findings to improve your outreach strategy.

Hypothesis 2: Many faculty and teachers deeply distrust collaboration of this type, and see it as a waste of time.
Sources of data: Anonymous survey; interviews.
If the data support it: Look for examples of programs elsewhere where this kind of collaboration has been productive and popular. Use that data to help conversations among faculty and teachers confront their doubts and decide together how to put them to the test. Maybe this is a waste of time? How would we prove that? How might we prove that good collaboration is productive?

Hypothesis 3: Some participants are having trouble with relevant features of the software.
Sources of data: Survey.
If the data support it: Improve training? Change software?

Hypothesis 4: The participants are each defining a key term in different ways, leading to needless misunderstandings and arguments.
Sources of data: Focus groups.
If the data support it: Focus groups and larger group discussions may be able to help people develop shared definitions. Or perhaps more carefully written materials will do the job.
That’s the way these kinds of diagnostic studies are designed. You begin by identifying a key activity that is not proceeding as planned. Then think about the elements needed for it to succeed. Study whether those elements are in fact in place. And when you discover the reasons for the problem, fix it if you can.
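If it helps to keep such a study organized, the hypothesis/data/response structure of the table above can be recorded in a simple form like the sketch below. The field names are our own, and the entries merely paraphrase the table; this is an organizational aid, not part of the Flashlight method itself:

```python
# Minimal sketch: a diagnostic-study plan kept as plain records.
# The entries paraphrase the hypothesis table above; 'supported'
# is filled in once the data come back.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Hypothesis:
    barrier: str                      # suspected reason the activity is uneven
    data_sources: List[str]           # how you plan to test it
    response: str                     # what you'd do if the data support it
    supported: Optional[bool] = None  # unknown until the study is done

plan = [
    Hypothesis("Users haven't thought of this use of the ePortfolio",
               ["survey", "interviews"],
               "Improve outreach and training"),
    Hypothesis("Users distrust this kind of collaboration",
               ["anonymous survey", "interviews"],
               "Share examples of productive collaboration elsewhere"),
    Hypothesis("Users have trouble with relevant software features",
               ["survey"],
               "Improve training, or change software"),
]

# After gathering the data, mark each hypothesis and act on the
# supported ones.
plan[0].supported = True
for h in plan:
    if h.supported:
        print(f"Act on: {h.barrier} -> {h.response}")
```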
Suppose, for example, that one activity
that’s important for your program is using the ePortfolio
to document student progress toward graduation competences
for your degree. You might discover that 75 courses
are contributing data but another 50 are not. To
quickly discover the reasons, you might create a survey.
Part of such a survey might look something like this:
This university has implemented an ePortfolio system in order to document student progress in mastering skills required for a degree. To help us evaluate and improve the ePortfolio system, we need some information from you.
1. To what degree did this course use ePortfolios for this purpose? (Check the answers that apply.)
___ Student works done in this course were submitted to the ePortfolio system, so far as I know. (If you check this answer, please sign the form below and submit – you’re done.)
___ Assessments of student work were submitted to the ePortfolio system, so far as I know. (If you check this answer, please sign the form below and submit – you’re done.)
___ I’m not sure whether we did or did not submit works or assessments. (If you check this answer, please sign the form below; you’re done.)
___ This course did not submit student work or assessments of the work to the ePortfolio system. (If you check this answer, please also answer #2.)
2. To what extent was each of the following reasons an important factor in the decision not to use the ePortfolio? (On a scale from 3 = “crucial reason” to 0 = “not a factor”.)
___ The ePortfolio system is important, but not appropriate for this particular course
___ Due to lack of training or poor manuals, I couldn’t figure out how to use the system in the time I had available
___ I have objections to this system and decided not to use it
___ I couldn’t find enough external assessors
___ Students told me they didn’t like the system
___ I didn’t know the system existed. Sorry!
___ I had planned to use the system but we ran out of time
(etc.)
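Once such survey responses come back, tallying question 2 is straightforward. A minimal sketch, assuming each non-using course's ratings arrive as a simple mapping from reason to 0-3 rating (the data below are invented):

```python
# Minimal sketch: rank the reasons non-using courses gave in question 2,
# rated 0 ("not a factor") to 3 ("crucial reason"). The responses are
# invented illustration data, one dict per responding course.
from collections import defaultdict

responses = [
    {"not appropriate for this course": 3, "lack of training": 1, "ran out of time": 2},
    {"not appropriate for this course": 0, "lack of training": 3, "ran out of time": 3},
    {"lack of training": 2, "didn't know the system existed": 3, "ran out of time": 1},
]

totals = defaultdict(int)   # summed ratings per reason
counts = defaultdict(int)   # number of courses rating each reason

for course in responses:
    for reason, rating in course.items():
        totals[reason] += rating
        counts[reason] += 1

# Print reasons by average importance, highest first.
for reason in sorted(totals, key=lambda r: totals[r] / counts[r], reverse=True):
    print(f"{reason}: mean {totals[reason] / counts[reason]:.2f} (n={counts[reason]})")
```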
6. Control Costs
Another reason technology initiatives fail is that, as use grows, workload or expenses grow in ways that are unanticipated and, in crucial ways, unacceptable. Cost studies, if done early, can help anticipate and prevent such burnout.
One hazard that any pilot program faces when it’s about to be scaled up is that the service might create unacceptable loads as it grows: key support staff may become overburdened, workloads that are acceptable for pioneers and early adopters may be unacceptable for some of the faculty or student users, expenses may grow unacceptably, etc. If any of those things happens, the innovation often collapses before a cure can be found. Such failures can sometimes be avoided if the potential for burnout is discovered before people become alienated and budgets are overspent.
Study 1: check on comparable systems
being used to support comparable activities at other
institutions. Which activities might become
insupportable as the system grows?
Study 2: talk with support staff, faculty, and students during the pilot phase, once they’ve had some experience using the system. Do they predict that they themselves will continue to use the system? Do they think all their friends or colleagues would like it and be able to fit it into their schedules and budgets?
If danger signs appear, you may want to
use activity-based costing to create a model of the activity
and then do some “what-if” modifications to see if there are
ways to redesign key elements of the activity so that, if
possible, performance can improve while costs for people and
budgets are reduced. (For a quick course in how to do
activity-based costing, see, for example, the
Flashlight Cost Analysis Handbook; all subscribing
institutions have a copy and a site license to create
copies.)
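As a rough illustration of the kind of "what-if" modeling described above (a generic sketch, not the Handbook's own procedure; all task names, hours, rates, and volumes are invented): model the activity as a list of tasks with time and unit costs, then rerun the totals after a proposed redesign.

```python
# Minimal sketch of activity-based costing with a "what-if" redesign.
# Task names, hours, and hourly rates are hypothetical placeholders;
# see the Flashlight Cost Analysis Handbook for the full method.

def total_cost(tasks):
    """Sum of hours x hourly rate x times performed per term, over all tasks."""
    return sum(hours * rate * times for _, hours, rate, times in tasks)

# (task, hours per occurrence, hourly rate, occurrences per term)
current = [
    ("faculty reviews each portfolio",       1.5, 60.0, 200),
    ("support staff troubleshoots uploads",  0.5, 35.0, 300),
    ("outside expert assesses portfolio",    2.0, 80.0, 200),
]

# What-if: batch troubleshooting into workshops, and have outside
# experts assess a sample of portfolios instead of every one.
redesigned = [
    ("faculty reviews each portfolio",       1.5, 60.0, 200),
    ("support staff runs upload workshops",  2.0, 35.0, 10),
    ("outside expert assesses sampled work", 2.0, 80.0, 50),
]

print(f"Current cost per term:    ${total_cost(current):,.0f}")
print(f"Redesigned cost per term: ${total_cost(redesigned):,.0f}")
```

The value of even a crude model like this is that it makes visible which tasks dominate the budget, so redesign effort goes where it matters.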
Summary
ePortfolio software’s educational value stems only from its use to support improvements in important educational activities. You can improve the chances of your initiative’s success, and help control associated stress and costs, if you focus on those activities. Your institution can create a program of formative evaluation that includes some or all of these elements:
- Identify activities that are the primary uses of the ePortfolio system – the major reasons for the investment in this use of software. Are there particular outcomes that improved activities are supposed to foster?
- Study “recipes” for improving activities, without and with ePortfolios: How are those activities carried out without ePortfolios at your institution and elsewhere? What are the forces, strategies, and factors that influence success (the “recipe”)? What kinds of outcomes seem to result, good and bad, as the activity changes?
- Measure key activities periodically and, when appropriate, measure their outcomes. Such studies can help focus attention and guide investment of resources over the years. If possible, begin before the ePortfolio is implemented so that, later, you can see whether, when, and how the activities and outcomes improve.
- Debug the activities, i.e., discover factors that frustrate most or all users. Discover how to increase the incentives for those activities.
- Develop and use diagnostics to reduce barriers to 100% participation, i.e., discover factors that frustrate individual users and develop a process that can assess and aid such users so that participation and success rates with ePortfolio use approach 100%.
- Study use of time and of money in order to reduce stress on staff and budgets as portfolio use widens and deepens.
In all these areas, starting with building your list of activities, it helps to study the experience of other programs and institutions that have broken trail for you.
- Stephen C. Ehrmann, Director, The
Flashlight Program
References
- Ehrmann, Stephen C., and John Milam (2003), Flashlight Cost Analysis Handbook: Modeling Resource Use in Teaching and Learning with Technology (version 2.0), Takoma Park, MD: The TLT Group.
- Tosh, D., T. Penny Light, K. Fleming, and J. Haywood (2005), “Engagement with Electronic Portfolios: Challenges from the Student Perspective,” Canadian Journal of Learning and Technology, 31(3).