Stephen C. Ehrmann
Director, The Flashlight Program
Vice President, The TLT Group

Originally published in Assessment Update, IX:4, July-August 1997, pp. 3, 10-11, 13. Revised January 2005.
Abstract:
In 1992, the Flashlight™ Program began developing an evaluation "tool kit" of validated survey items, cost analysis methods, and other resources that educational institutions could use to study and steer their own uses of technology. The creation of such a tool kit is made possible by the fact that a) certain hopes and fears about specific uses of technology are quite universal, and b) many of those same activities (uses of technology) tend to produce better learning outcomes, according to decades of educational research. This essay sketches the conceptual and historical roots of the Flashlight Program.
Flashlight tool kits have several features that may be of wider interest to the field of assessment and program evaluation:
- focusing on choices about learning and teaching made by students and educators (how they use technology) as a way of explaining the outcomes of technology investment;
- focusing on the practices that tend to produce better learning outcomes as a way of reinforcing data on outcomes or, if data on outcomes are not directly measurable, as a way of estimating quality;
- surveying and summarizing changes in teaching and learning practice across the large number of courses that are typically needed to create substantial improvements in programmatic outcomes;
- focusing on negative hypotheses about technology as well as positive ones; and
- developing an evaluation tool kit that is easy enough to understand and use that the necessary numbers of faculty and staff can be involved in designing studies, gathering data, and using results.
Illuminating The Elephant
Educational institutions of all types are investing enormous effort, money and risk capital in computing, video and telecommunications. (As are their students.) They hope thereby to change educational strategies and, through them, educational outcomes. For example, institutions may invest in Internet connectivity partly to help support more collaborative learning and more use of information resources off-campus; this may in turn help achieve better retention, economies of scale, and graduates who are more able to apply what they've learned.
Each institution would usually like to know whether its investment is working. And, if not, the staff would like to know what the barriers to success might be.
The technology per se is relatively easy to "assess" -- it's relatively obvious whether the e-mail is operating or not, and it is sometimes feasible to measure its volume. But two years later, is there indeed more collaborative learning? Are graduates now working more competently in teams? If so, has the e-mail played any sort of role in that success?
The Flashlight: The act of program evaluation in education is like using a small, dim flashlight to decide what sort of animal might be in front of you in a pitch-black cave. (We'll assume for metaphorical purposes that you can't hear or smell!) The relative brightness (rigor) of the flashlight (evaluation) is less important than where one points the beam (asking the right evaluative question). Each evaluative question is the equivalent of pointing the tiny beam in a particular direction and waiting to see what walks into the light. It may seem a hopeless task -- a pitch-black cave, a narrow and wavering beam of light, and in that beam occasional flickering impressions of light and dark. What 'rough beast' is really out there?
Fortunately, in this case, the task is more feasible than it first appears. Imagine that your curiosity is quite focused. You are vitally interested in knowing whether there is an elephant in front of you, perhaps because you are hoping to see an elephant and have reason to think one might be around. Whether a mouse is (also) around is of little concern to you -- just elephants. Because you have that specific question in mind, you would probably shine your flashlight high and look for signs of tusks or floppy ears in the narrow beam of light. Or you might have some other idea for how to use your light to identify an elephant.
The Elephant: As it happens, many technology-using educators are looking for "elephants" these days: an elephant-sized technological revolution in their instructional programs. And the cave is indeed huge and dark: ordinarily we don't see major changes in who can learn, what graduates can do, or what education costs unless change is broad, deep and diffused into the fabric of the program. It is even harder to see whether there are widely diffused changes in the fabric of teaching and learning practice in an institution. The patterns are hidden in the relatively private activities of large numbers of students and staff.
The investigation might well be impossible but for one thing: different types of institutions and disciplines seem to be adopting similar technologies and using them in comparable ways for similar purposes. They also have similar anxieties about what might be going wrong out there in the dark. Schools, two-year colleges, research universities, and large-scale corporate training programs; geographers, psychologists, and chemists -- their wishes and worries about technology are strikingly similar. This consensus set of hopes and fears is what we have been referring to as the elephant. And the fact that so many educators are wondering whether this elephant is in their cave has given birth to the Flashlight Program.
This set of consensus wishes about technology, to be described in more detail below, includes support for good practices such as collaborative learning, faculty-student interaction, active learning (e.g., through work on realistic, complex projects) and increased student time on task, as well as outcomes such as more extensive and equitable access to an education, graduates who can apply what they have learned, and costs that are under control.
The consensus worries include issues such as inadequate support, ineffectual pedagogy, and technology that might be hindering learning.
Describing the Elephant -- The Flashlight Planning Project, 1993
The Flashlight Project was conceived in 1992 and took wing with the receipt of a planning grant from the Fund for the Improvement of Postsecondary Education (FIPSE). The goal of the Annenberg/CPB Project's 1993-94 Flashlight Planning Project was to discover whether five very different postsecondary institutions had similar "visions worth working toward" (to borrow Steven W. Gilbert's phrase), i.e., similar intentions about why and how to use technology, and similar worries.
The leadership team included the author (then Senior Program Officer for Interactive Technologies at the Annenberg/CPB Projects), Sally Johnstone and Robin Zúñiga of the Western Cooperative for Educational Telecommunications, and Trudy Banta of Indiana University Purdue University Indianapolis.
Five disparate institutions each delegated a two-member team -- one faculty member and one administrator -- to participate. An initial working paper was prepared, followed by a two-round Delphi study through which the participants fine-tuned the model. The effort culminated in a two-day working meeting in which participants made final decisions about elements of technology use, educational strategy and educational outcomes that were of common concern.
These five distinguished and distinctively different institutions of higher education included:
- one of the largest community college districts in the country (Maricopa Community Colleges);
- a public institution that offers a state-wide, virtual community college program supported by a combination of video, computing, and telecommunications (Education Network of Maine);
- a major land grant institution with innovative programs exploiting technology for students on- and off-campus (Washington State University - WSU);
- an institute of technology with a national record in both distance learning and services for the handicapped (Rochester Institute of Technology - RIT); and
- a public university that exemplifies institutional partnership at virtually every level (Indiana University - Purdue University at Indianapolis - IUPUI).
The Shape of the Elephant: A Consensus View of the Nature of the Educational Revolution Enabled by Information Technology
The aim of our group was not to invent a vision but rather to report clearly on the vision that was already taking shape in the participants' own institutions, the otherwise-invisible elephant that seemed to be gaining size and momentum back home. It is not possible in this brief paper to describe the issues that Flashlight is designed to track -- the shape of the "elephant" -- in great detail, but we can summarize some key points.
The technological foundation for educational improvement: Most of this consensus strategy is based on extensive, sophisticated use of "worldware," i.e., hardware and software that was developed for use in the wider world but that is also used for teaching and learning (e.g., spreadsheets, the Internet, computer-aided design software). In contrast, courseware (i.e., software developed and marketed for specific instructional purposes) plays a more modest role. In 1994 worldware was far more prevalent, and our team saw no prospect for change in the near future. Flashlight tools will help educators learn what sorts of worldware students are using, where and how much, e.g., in course work, in their jobs, at home. The Current Student Inventory also includes a more modest number of questions about the roles that courseware can play. Technologies covered in version 1.0 of the Current Student Inventory include:
- Audio conferencing: use of multi-party live audio, usually by telephone lines but also through the Internet;
- Commercial software, from spreadsheets to research computer applications (other than word processors) -- students learn to think and act with these tools as part of their education for thinking and acting with them later on;
- Courseware (e.g., computer-aided instruction, computer tutorials);
- Electronic communication (e.g., e-mail, newsgroups, listservs, "chat rooms," real-time writing). If a researcher wants to study several such electronic communications media separately, these items can be reworded appropriately;
- Graphing and scientific calculators;
- Internet: creation of Web pages and other Web materials by students;
- Internet: using the Internet and World Wide Web for a combination of purposes in support of an entire course, for distance, distributed, or campus-based learning;
- Internet: using the Internet and World Wide Web for research (compared with the traditional library);
- Multimedia: creation of multimedia materials by students;
- Multimedia: use of multimedia texts or course modules by students;
- Multimedia: use of multimedia presentations and lecture support by faculty;
- Televised (live) lectures;
- Videotaped lectures or video course materials;
- Voice mail; and
- Word processing.
Changes in teaching and learning: One of the most important assumptions underlying Flashlight's design is that technology does not itself cause changes in learning, or access, or costs. Rather, it is how the technology is used that matters.
Today's technologies, especially worldware, are empowering, i.e., they widen the options available to educators and learners. Thus three institutions might invest in the same computer conferencing software, with one achieving more collaborative learning for commuting students, another disrupting classes and increasing attrition, and the third experiencing no perceptible changes in process or outcomes. The difference stems from the choices made by faculty and students about how to use the opportunities offered by the conferencing system.
Flashlight focuses on whether faculty and students find the available technology useful (or a hindrance) when they try to implement each of the "seven principles of good practice in undergraduate education" (Chickering and Gamson, 1987; Chickering and Ehrmann, 1996):
- Interaction between the student and teacher (or tutor, or other expert);
- Student-student interaction;
- Active learning;
- Time on task;
- Rich, rapid feedback;
- High expectations of the student's ability to learn; and
- Respect for different talents and ways of learning.
Because so much research indicates that these practices support better learning, it would be significant to discover that they were being implemented and that technology was playing an important role. By the same token, these objectives are mentioned so often by technology-using educators (especially the first five) that it would be significant to discover that an institution investing heavily in technology was not implementing these principles.
Notice that our ability to focus on research-based conditions supporting good performance is a "big win." Many people assume that evaluation of the outcomes of technology investments is easy: "Just see whether students are learning more!" But the real value added from technology usually comes because instructional objectives change. For example, the use of computers in music enables (among other things) courses in music composition that employ computers as the instruments of composition and performance. Changes in content like this mean that tests must change, too, so one cannot compare this year's test scores with those of five years ago to see if computers aided learning. It's useless to discover that students scored 80% on one test and 90% (or 80% or 75%) on a different test administered three years ago. On the other hand, if you can discover that:
- the conditions for good learning have improved (as measured by increased implementation of the seven principles); and
- the faculty and the students believe that their use of technology was substantially helpful,
then you've learned something important. Similarly, it would be useful to discover that collaborative learning is down and that e-mail has been problematic. Or even that collaborative learning is extensive but e-mail is widely seen as alienating.
Flashlight has a myriad of focused questions about the most common hopes and fears about technology and the seven principles: particular conjectures about how specific technologies might be used in ways that help or hinder the implementation of each of the seven principles. It also includes questions about other teaching and learning issues. A list of the issues now part of the Current Student Inventory is included below:
A = Active learning
C = Collaborative learning (and other forms of student-student interaction)
D = Using time productively
E = High expectations for all students regardless of learning style
F = Rich and rapid feedback
G = Engagement in learning
I = Faculty-student interaction
N = Cognitive and creative outcomes (including encouraging creativity)
O = Accessibility
P = Positive addiction to technology
S = Prerequisites for using technology (technical skill deficiencies)
T = Time on task
U = Respect for diversity
X = Application to the "real world"
Access issues in this consensus strategy include student location (relative to the campus), time demands, and native language. In other words, many educators hope that the ways they use technology will open their instructional programs to students regardless of their location, regardless of their job schedules (so long as they have sufficient time to study) and regardless of their native language (so long as they speak English). Flashlight will also help institutions interpret retention at the course and course-of-study levels.
Flashlight should also help institutions investigate reasons for retention and attrition: student engagement (or lack of it), barriers to access (or the lack of them) for various sorts of students, and the intellectual accessibility of the instruction (e.g., are our mediated courses providing a more equal opportunity for learning for students whose native language isn't English?).
Our teams also identified four learning outcomes for which there ought to be perceptible improvement, so long as the foregoing changes in technology and teaching/learning practices had been sufficiently widespread for enough years. These learning outcomes following completion of a course of study include the ability:
- to apply what was learned in the instructional program (i.e., what was learned was not sterile or shallow -- the 'graduate' would be seen to use the learning in real situations after completing the instructional program);
- to work in teams;
- to use information technology appropriately and creatively in one's work; and
- to manage one's own process of continuing learning.
Structural changes of interest are described in Ehrmann (1996). Briefly, our first instrument in this area will focus on changes in the ways that the institution supports academic work by students and staff when they're on-campus and when they're off-campus. At many institutions, faculty and students need to do more of their work off-campus, while still needing to do some of their work on-campus. But institutions are never rich enough to support every kind of work everywhere for everyone. Choices need to be made. This self-study guide would provide some quick clues about the institution's progress in thinking about which kinds of academic work to support on-campus and which kinds to support off-campus.
Cost outcomes of greatest interest include (for large distance learning programs) capital and operating costs relative to comparable programs on one's own campus, and savings in costs per graduate resulting from (hoped-for) increases in retention. Other issues of interest include how the costs of education may vary for the student (e.g., income lost due to time spent commuting, costs of acquiring equipment and network connections used in part for education).
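Why retention matters so much for cost is worth a minimal worked illustration, with hypothetical numbers not drawn from any Flashlight study. The underlying arithmetic is simply

    cost per graduate = total program cost / number of graduates

so a program spending $1,000,000 a year that graduates 200 students costs $5,000 per graduate; if improved retention yields 250 graduates at the same total cost, the cost per graduate falls to $4,000, even though spending itself has not changed.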
Using the Vision to Create an Elephant Detector: Flashlight
Armed with this sense of the kinds of goals, strategies, hopes and fears that were quite common across disciplines and across many types of educational institutions, our team began developing questions that could be used in surveys, interviews and focus groups to detect what was really happening in instructional programs.
Leading the development of the student, faculty, alumni, and alumni-supervisor survey items and interview guides was Robin Zúñiga of the Western Cooperative for Educational Telecommunications (WCET) [and now Associate Director of the Flashlight Program]. Leading the development of the cost analysis measures was Joe Lovrinic of IUPUI.
With support from the Annenberg/CPB Projects, the WCET worked with the author to develop a survey item bank and interview guide to get information from currently enrolled students on the use of information technology in specific courses of study (the "Current Student Inventory").
An early and crucial design change was the decision not to develop standardized instruments but instead to create tool kits (item banks, handbooks). Focus groups had consistently frowned on questions like "how frequently (or how well) is available technology helping you learn collaboratively?" They had pointed out that a given course or course of study might make many technologies available simultaneously, some of which helped a lot (or were used a lot), some a little, some of which were problematic, and some simply irrelevant to collaborative learning. So how should a student answer, and how could the answer be interpreted? Similarly, "collaborative learning" was not a single behavior, they told us, but rather an umbrella term covering many behaviors. A student might want to say, "The e-mail is helping me study for this course with students from other universities, but the heavy reliance on video and multimedia in class leaves little time for student-student work." A question like "how frequently (or how well) is available technology helping you learn collaboratively?" was clearly inadequate, and was eventually replaced by over forty items that were each far more specific. Since no single evaluation could or should ask about all those issues, Flashlight became a construction kit that could be used to create studies tightly focused on issues of local importance.
In 1996 the beta version of the Flashlight Current Student Inventory was used to create surveys that were sent to 4,200 students at our five institutional partners. The data were used to aid final revision of this part of the tool kit. Two of the institutions wrote reports on how they used the resulting data; these reports are included in the Flashlight Evaluation Handbook that, along with the Flashlight Current Student Inventory, is now being site licensed.
Using the Current Student Inventory
Because Flashlight is a tool kit, not a standardized instrument, the Current Student Inventory comes with an extensive Evaluation Handbook to help users through the process of building their own research models and research tools. The key to using Flashlight is to create a model of the teaching and learning practices that make a difference, and then to find, adapt or create questions about technologies that are meant to support those practices. The Current Student Inventory provides questions about a variety of valued teaching and learning practices, especially about student perceptions of their frequency, and an even larger number of questions about the roles technologies might be playing in aiding or hindering those practices.
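To make the construction-kit idea concrete, here is a minimal sketch, in Python, of how an item bank tagged by issue code (see the letter codes listed earlier) and by technology could be filtered into a tightly focused survey. The data structure and the sample items are invented for illustration; they are not the licensed Current Student Inventory items, nor the actual Flashlight software.

    from dataclasses import dataclass

    @dataclass
    class Item:
        code: str        # issue code, e.g., "C" = collaborative learning
        technology: str  # technology the item asks about
        text: str        # question wording shown to students

    # A tiny stand-in for an item bank (a real bank holds hundreds of items).
    ITEM_BANK = [
        Item("C", "e-mail", "How often did you use e-mail to study course material with other students?"),
        Item("C", "conferencing", "Did computer conferencing help or hinder collaborative work with students at other campuses?"),
        Item("I", "e-mail", "How often did you use e-mail to ask your instructor questions about the course?"),
        Item("T", "word processing", "Did word processing change the amount of time you spent revising assignments?"),
    ]

    def build_survey(codes, technologies=None):
        """Select items matching the chosen issue codes and, optionally, technologies."""
        return [item for item in ITEM_BANK
                if item.code in codes
                and (technologies is None or item.technology in technologies)]

    # A study focused on collaborative learning via e-mail selects only those items:
    for item in build_survey(codes={"C"}, technologies={"e-mail"}):
        print(item.text)

Even in this toy version the point of the design is visible: instead of one broad question per goal, the evaluator assembles many specific items, keeping each survey short and locally relevant.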
Flashlight has a variety of applications, including:
- guiding improvement of courses and courses of study (e.g., majors, minors, freshman year skills development, writing across the curriculum) and strengthening the roles played by technology in such efforts;
- evaluating major grant-funded projects;
- improving technology-based services (e.g., libraries, computing services, telecommunications and Internet connectivity) and their leverage in educational improvement;
- supporting strategic thinking about the curriculum and technology services;
- preparing for accreditation;
- helping faculty, departments or institutions compare their uses of technology and outcomes; and
- redesigning student evaluations of faculty.
Because Flashlight consists of tool kits rather than standardized instruments, we have not yet created national or international norms. However, we urge interest groups (institutions, faculty in similar fields, people in the same discipline) to collaborate in choosing or adapting the same questions so that they can develop their own norms. The TLT Group can provide assistance in this area by providing training to all members of the group that includes time to devise shared questions and methods for pooling data.
The second major developmental step came in 1997. As one faculty member put it during an early Flashlight workshop, "You're teaching us the research behind these tools but you're not teaching us how to think when we use them." The most difficult challenge in "how to think" about doing a study is the process of figuring out what to study. Since 1997, the major focus of Flashlight workshops and consulting has been helping individuals and institutions identify and fine-tune those issues that are most vital to study.
Also in 1997, Washington State University, one of the original Flashlight institutions, began design studies for a Web-based system that would enable users to draw far more easily on the Current Student Inventory. Eventually called Flashlight Online, this service has been made available internationally to institutions subscribing to the Flashlight Tool Series Plus. Users can mix CSI items and their own as they create surveys and, if they wish, have students respond online. The WSU hosting service, called CTLSilhouette, also enables Flashlight Online to provide simple analyses of results; users can also download the raw data to their own statistical software or database.
In 1998, work on the Cost Analysis Handbook accelerated, thanks to help from the Andrew W. Mellon Foundation. Like the Current Student Inventory, the Cost Analysis Handbook focuses attention on what people choose to do with available resources: their activities. The first edition of the Handbook was published in 1999. Also in 1999, a beta version of the Faculty Inventory was made available to institutions that had joined the Flashlight Network. At this writing (Nov. 2000), over 50 institutions are members of the Network, and they are taking on a larger role in guiding the future development of the Flashlight Program.
Program Status and Next Steps (Update: January 2005)
The Flashlight Program is now operating under the aegis of The TLT Group, a non-profit that works with almost 150 subscribing educational institutions around the world. Flashlight has developed a wide range of evaluative tools, many of them quite specialized. Subscribing institutions receive site licenses for all of them, as well as for all other TLT Group resources. Many of these resources are also visible on The TLT Group's web site.
Flashlight publishes a free electronic newsletter called F-LIGHT, with news about workshops, product releases, and changes to our Web site. We put out around six issues a year. To subscribe to F-LIGHT, address e-mail to LISTPROC@LISTPROC.WSU.EDU with the one-line message:
SUBSCRIBE F-LIGHT (your name)
For information about site services and products (including a video about research, technology, education and Flashlight), please send e-mail to Flashlight@tltgroup.org.