This page is an annotated bibliography of free articles and Web sites on Flashlight and on the larger subject of studying educational uses of technology. You may also want to look at our collection of Flashlight-related case studies.
Table of Contents for This Page
I. Summarizing Flashlight
II. Flashlight Articles on Methods
III. Case Studies (F-LIGHT Back Issues)
IV. Dealing with Resistance to Evaluation
V. Applying Flashlight Thinking to Specific Teaching and Institutional Policy Issues
VI. Other Evaluation-Related Sites Useful for Studies of Teaching, Learning, and Technology
+ Other Examples of Studies Whose Findings Improved Local Educational Uses of Technology
+ Interesting Evaluations of Educational Uses of Technology
+ Methods and Tools for Evaluation of Courseware
+ Studies of Costs; Methodology of Cost Studies
+ Computer-Aided Evaluation and Classroom Research
+ Computer-Aided Assessment of Learning
+ Comparisons of Distance Education versus Campus-Bound Programs
+ Other Resources Related to the Evaluation of Educational Uses of Technology
About the Flashlight Program and Its Approach to Studying Technology Use and Learning
Interview by Distance Educator with Steve Ehrmann, describing the history, achievements, and prospects of the Flashlight Program. (2001)
(NEW) "On
the Necessity of Grassroots Evaluation of Educational Technology"
(2000). The Flashlight Program's goal is to equip as many faculty members,
administrators and students as possible with the skills and tools to guide
their own uses of technology in support of learning. This article,
originally published in Technology Source and recently added to the
TLT Group's web site, explains why.
TLT/Flashlight Web site on learning space design and evaluation - This web site, created by Steve Ehrmann, summarizes the Flashlight approach to evaluating physical, blended, and virtual learning spaces (e.g., classrooms, course management systems, libraries, ...). (2004- )
TLT Flashlight Web site on formative evaluation of ePortfolio initiatives. (2004- ). Sample of subscriber materials, created by Steve Ehrmann. The Flashlight approach begins by identifying the different activities that the use of the technology is intended to improve. The center from which the Flashlight inquiry spreads is that set of activities: Are they changing? Where? Why or why not? With what consequences? What are the strengths and weaknesses of eportfolios for carrying out these specific activities? How does the use of eportfolios alter the costs and stresses associated with these activities? And so on.
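To make that inquiry pattern concrete, here is a minimal sketch (ours, not part of the Flashlight subscriber materials; the activity names are invented examples) showing how the same recurring questions fan out across each activity an eportfolio initiative is meant to improve:

```python
# Illustrative only: a Flashlight-style inquiry applies a recurring set of
# questions to each activity that eportfolios are supposed to improve.
# The activity names below are hypothetical, not Flashlight content.
QUESTIONS = [
    "Is this activity changing? Where? Why or why not?",
    "With what consequences?",
    "What are the strengths and weaknesses of eportfolios for this activity?",
    "How does eportfolio use alter the costs and stresses of this activity?",
]

activities = [
    "students reflecting on their own work over time",
    "advisors reviewing student progress",
    "programs documenting learning for accreditation",
]

for activity in activities:
    print(f"\nActivity: {activity}")
    for question in QUESTIONS:
        print(f"  - {question}")
```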
TLT/Flashlight web site on evaluation of institutional portals (2003- ). Sample of subscriber materials, created by Steve Ehrmann. Uses the same model described above for eportfolios, but centered on the activities that the use of institutional portals is supposed to improve.
Ehrmann, Stephen C. (1999), "Studying Teaching, Learning and Technology: A Tool Kit from the Flashlight Program," revision of an article originally published in Active Learning IX (December 1998), pp. 38-42. This somewhat technical essay shows how to design Flashlight-style studies whose findings can show a program how its technology can foster better educational outcomes. The essay is illustrated with questions drawn from the Flashlight Current Student Inventory. The article explains key Flashlight concepts such as 'blob,' 'triad,' and 'scenario.'
Ehrmann, Stephen C. (2000), "Studying and Improving the Use of Technology to Support Collaborative Learning: An Illustration of Flashlight Methods and Tools". This essay tells the story of a study team at a fictional college that is developing a study to track and improve the use of technology to support collaborative learning by students and educational outcomes for graduates. It was written to illustrate Flashlight thinking in action and includes a sample survey created with Flashlight Online. This article may also be of special interest to Teaching, Learning, and Technology Roundtables, to teams preparing for accreditation self-studies, and to others interested in analyzing the use of technology to support large-scale patterns of educational improvement.
Ehrmann, Stephen C. (1997), "The Flashlight Project: Spotting an Elephant in the Dark". This article gives a general description of the design of the Flashlight Program, arguing that a tight focus is essential to seeing anything when studying teaching and learning with technology. The essay summarizes the history of the Flashlight Program and briefly describes the content and organization of the Flashlight Current Student Inventory.
Ehrmann, Stephen C. and Robin Etter Zuniga (1994), The Flashlight Project Planning Grant: Final Report. This report on the original planning grant from the Fund for the Improvement of Postsecondary Education (FIPSE) describes the roots of the Flashlight Program's tools in more detail.
Frequently Asked Questions About Flashlight
Flashlight Articles on Methods
Ehrmann, Stephen C. (2002), "Evaluating (and Improving) Benefits of Educational Uses of Technology." This draft chapter for a volume on cost analysis methods serves as a primer of issues to consider when evaluating the benefits of educational uses of technology.
Ehrmann, Stephen C. (2000) - narrated slideshow explaining the idea of a triad. "Does [technology X] improve learning?" That's one of the most frequently asked questions about evaluation. It's also fallacious -- there is no direct relationship between any materials or tools of learning (computers, paper, etc.) and learning outcomes, or costs. This narrated slideshow explains why an investigation that seeks to link technology and outcomes must also study at least one more thing: the activity that makes use of the technology in order to produce the outcome. This combination of technology-activity-outcome is called a triad.
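As an informal illustration (our sketch, not from the slideshow; the example content is invented), a triad can be modeled as a three-part record, and a researchable question must name all three parts:

```python
from dataclasses import dataclass

@dataclass
class Triad:
    """A technology-activity-outcome triad: the unit of a Flashlight-style study."""
    technology: str  # the tool or material (e.g., a discussion board)
    activity: str    # the activity that makes use of the tool
    outcome: str     # the outcome the activity is meant to produce

# "Does the discussion board improve learning?" skips the middle term and
# cannot be answered directly; a researchable question names the activity too.
triads = [
    Triad("online discussion board",
          "students critique one another's drafts",
          "improved writing quality"),
    Triad("online discussion board",
          "instructor posts weekly summaries of the discussion",
          "better grasp of core concepts"),
]

for t in triads:
    print(f"Does using a {t.technology} for '{t.activity}' "
          f"lead to {t.outcome}?")
```

Note that the same technology appears in two triads with different activities and outcomes: the point of the triad idea is that each combination must be studied on its own.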
Chickering, Arthur and Stephen C. Ehrmann (1996), "Implementing the Seven Principles: Technology as Lever," AAHE Bulletin, October, pp. 3-6. This essay outlines the kinds of technology use that can help faculty and students implement Chickering and Gamson's "Seven Principles of Good Practice in Undergraduate Education." The Flashlight Current Student Inventory helps focus attention on technology uses for supporting good instructional practice.
Ehrmann, Stephen C. (1995), "Asking the Right Questions: What Does Research Tell Us About Technology and Higher Learning?" in Change: The Magazine of Higher Learning, XXVII:2 (March/April), pp. 20-27. This essay from Change Magazine gives a brief overview of the evaluation literature on teaching, learning, technology, and costs. Some of the key assumptions of the Flashlight Program are developed here.
Ehrmann, Stephen C. (1998), "How (not) to Evaluate a Grant-Funded Technology Project," in Stephen C. Ehrmann and Robin Etter Zuniga, The Flashlight Evaluation Handbook (1.0), Washington, DC: The TLT Group. This essay lays out some of the most common assumptions underlying evaluations of grant-funded technology projects and argues why all of them need to be rethought.
Ehrmann, Stephen C. (2000), "Finding a Great Evaluative Question: The Divining Rod of Emotion." Do all your potential topics seem merely 'interesting'? Maybe you need to look further. An intense sense of dread or excitement can signal that a topic is important enough, and uncertain enough, to be worth the effort to answer, as the examples in this paper illustrate.
Ehrmann, Stephen C. (1998), "What Outcomes Assessment Misses". Many people assume that, to evaluate a technology-based innovation in education, it is necessary and sufficient to use uniform testing to discover gains in student learning. This essay argues that attending to outcomes is desirable, but it is not always possible and it is almost never sufficient. The essay directs attention to outcomes that uniform tests miss, to the processes that produce the outcomes, and to the problems that programs encounter, some of which can be deeply revealing about the program's goals and strategies.
Ehrmann, Stephen C. (1997), "The Student as Co-Investigator," from Stephen C. Ehrmann and Robin E. Zuniga, The Flashlight Evaluation Handbook, Washington, DC: The TLT Group. Educators usually assume that the student's role in program evaluation is to answer questions; the reasons why the questions are being asked aren't explained, for fear of biasing the answers. This essay argues that students should be involved with the design of evaluations of teaching, learning, and technology, and that their own theories of using technologies for learning should be brought to the surface and put to the test.
"Increasing
Student Participation in Studies." A brief essay by Steve Ehrmann
on how to increase response rates to surveys and interviews by demonstrating
that the study is actually worth people's time and thought.
Other Resources for Studying Educational Uses of Technology
Dealing with Resistance to Evaluation
Diagnosing and Responding to Resistance to Evaluation. These notes, drawn from a panel at the American Evaluation Association meeting in 2001, summarize some thoughts about uncovering and dealing with the natural distrust between the evaluator and the rest of the world.
"Frequently Made Objections (FMO's) to
Evaluation, and Some Suggested Responses." The ideas in this list
were drawn from participants in the Flashlight Leadership Workshop in July
2001 held at Syllabus/TLT Group Summer Institute in Santa Clara,
California.
Applying Flashlight Thinking to Specific Teaching and Policy Issues
Most of our guides and sample materials are available only to the faculty, staff, and students of subscribing institutions. The following materials are publicly available.
(NEW!) Distance and Distributed Learning: Examples of Flashlight Studies That Can Be Used to Improve Program Effectiveness
Getting More Value from Course Management Systems
Evaluating Institutional Portals (using data to get more educational value from portals). Free sample of TLT/Flashlight material.
"Using
Technology to Make Large-Scale Improvements in
The Outcomes of Higher Education: Learning From Past Mistakes,"
by Stephen C. Ehrmann. Observers have been expecting an imminent
computer-enabled transformation of teaching and learning in higher
education ever for almost 40 years. Dr. Ehrmann argues that past
effort have often been frustrated inappropriate strategies, not the
technology itself. This brief article outlines a five part strategy for
institutions, systems, and nations that want to use technology to make
valuable, visible improvements in the outcomes of higher learning. This column
was published in the January-February 2002 issue of Educause
Review.
Ehrmann, Stephen C. (2000), (Draft) "An Evaluation Plan for the Year". This essay suggests a two-pronged strategy for helping an institution make the best use of Flashlight methods and tools in order to improve education. Comments welcome!
For applications of Flashlight thinking and tools, see our collection of case studies.
Other Evaluation-Related Sites Useful for Studies of Teaching, Learning, and Technology
We are continually expanding our site. Please send us URLs to include here, along with a brief description of each site.
Other Examples of Studies Whose Findings Improved Local Educational Uses of Technology
Ehrmann, Stephen C. (1999), "Asking the Hard Questions About Technology Use and Education". This article describes a set of evaluative questions -- study designs -- that institutions have used to good effect in studying the role of technology in the performance and costs of their academic programs. A revised version was published in the March/April 1999 issue of Change Magazine, pp. 24-29.
The goal of Teachers 2000 at Gloucester County College is to infuse technology into the education of future teachers. The program is divided into four student cohorts (learning communities), each mentored by a different faculty member. Evaluation has helped them improve the learning communities, drop a telecourse that wasn't sufficiently helpful, and demonstrate that the program was helping to improve learning.
Penn State's Stat 200 course has been restructured to emphasize guided self-study by individuals and groups, improved assessment, and fewer but more interactive lectures. Early evaluation work is helping them fix a problem with the assessment process and pinpoint early cost savings.
This recent study of web-based distance education helped improve faculty development for distance learning. Like Flashlight's tools, their survey focused on the seven principles of good practice in undergraduate education.
Interesting Evaluations of Educational Uses of Technology
The preceding citations are studies that report both their findings and the influence those findings had on practice. The following evaluations strike us as worth reading for various reasons. You will probably find both ideas you can use and things you think you can do better. Please let us know if there are studies you think should also appear on this page.
Gerald Schutte of California State University, Northridge wrote this widely-discussed evaluation of his own 1996 statistics course, taught both on campus and online. His report does not say whether the findings affected his teaching, but we suspect that they did. He found that the online students did better on exams, apparently because their study environment did a better job of encouraging collaboration than the on-campus environment did. The online students didn't like being left so much alone, but the lack of faculty attention appears to have encouraged them to rely more upon one another.
This 1996 article by D. R. Newman, Chris Johnson, Clive Cochrane, and Brian Webb at Queen's University Belfast reports on a seminar split in two: half discussed face-to-face while the other half conducted their seminar online. The authors document in their course what some other instructors have reported anecdotally: students doing more critical thinking in their online exchanges than in their face-to-face discussions.
A study that compared two chemistry courses, one using collaborative inquiry and the other organized around interactive lectures, and that used faculty from other departments to assess student learning: J. C. Wright, S. B. Millar, S. A. Kosciuk, D. L. Penberthy, P. H. Williams, and B. E. Wampold (1998), "A Novel Strategy for Assessing the Effects of Curriculum Reform on Student Competence," Journal of Chemical Education, v. LXXV, pp. 986-992 (August). Abstract on the Web at http://jchemed.chem.wisc.edu/Journal/Issues/1998/Aug/abs986.html.
Methods and Tools for Evaluation of Courseware
Georgia Tech has a set of generic tools for evaluating courseware projects (part of an even larger library of tools for managing the development of courseware projects). The evaluation tools can be found here. We especially liked the evaluation matrix.
The United Kingdom's TILT Project has reached its end, but the team left behind some good articles and references. During the three years of the TILT project, the Evaluation Group performed about 20 evaluation studies of teaching software at Glasgow University, across a very wide range of subject disciplines. Some exercises were single-episode field trials; others looked at computer-aided learning material in use within a full degree course over two or more years. The Group employed a variety of instruments, three of the most important in recent practice being student confidence logs, short quizzes, and learning resource questionnaires.
Another older but still valuable reference is not itself on the Web. It's a book by Paul Morris, Stephen C. Ehrmann, Randi Goldsmith, Kevin Howat, and Vijay Kumar (1994), Valuable, Viable Software in Education: Case Studies and Analysis, New York: Primis Division of McGraw-Hill. This study described the kinds of courseware and other nationally distributed software that have had a widespread, long-term influence on the curriculum, and why. One of the key findings: the destructive impact of the rapid progress in computers and operating systems. A summary of some of the findings is on the Web.
Studies of Costs; Methodology of Cost Studies
The Flashlight Program has its own Cost Analysis Handbook, which includes instruction on methods, examples, and seven detailed case studies. By cost study, we mean an analysis of how educators use time, money, space, and other scarce resources as they try to educate people; the aim of the studies of most interest to us is to help the people involved "unstretch" those resources - to make key processes easier, more fulfilling, and more rewarding. One of the most overlooked, and most important, reasons to do such studies is to prevent burnout. The Flashlight Cost Analysis Handbook teaches its readers how to analyze their own use of resources in particular activities, in order to improve those activities; we also periodically offer online workshops on how to do such studies. Check our calendar to see if one is coming up.
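As a rough illustration of what activity-based cost analysis looks like at its simplest (the activities and numbers below are invented, not drawn from the Handbook), the core bookkeeping is tallying a scarce resource, here faculty hours per term, by activity, before and after a change:

```python
# Illustrative sketch with invented numbers: compare how faculty time is
# spent on each activity before and after a course redesign. The point is
# not only the total but where the time moved.
hours_before = {
    "preparing lectures": 60,
    "grading": 40,
    "answering individual student questions": 30,
}
hours_after = {
    "preparing lectures": 35,                      # fewer, more interactive lectures
    "grading": 25,                                 # partly automated quizzes
    "answering individual student questions": 45,  # more online discussion to tend
}

for activity, before in hours_before.items():
    after = hours_after[activity]
    print(f"{activity}: {before}h -> {after}h ({after - before:+d}h)")

net = sum(hours_after.values()) - sum(hours_before.values())
print(f"net change: {net:+d} hours per term")
```

A net saving that nonetheless piles time into an activity instructors find draining can still produce the burnout such studies aim to prevent, which is why the per-activity breakdown matters more than the total.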
Good cost studies are rare, and their methods are often similar. We work closely, and want to work more closely, with others doing cost studies, and developing such methodologies, all over the world. Here is our list so far, in the order we found these resources.
The Western Cooperative for Educational Telecommunications, with support from the Fund for the Improvement of Postsecondary Education, is now at work on the second edition of its Technology Costing Methodology (TCM) project, which extends previous work on costing done by NCHEMS. TCM includes a handbook, case book, and tools; TCM focuses mainly on comparing the costs of two or more types of instructional strategies.
SCALE at the University of Illinois, Urbana, recently published evidence from nine "Efficiency Projects" that were SCALE's focus in the 1997-98 academic year. The Efficiency Projects were specifically aimed at using asynchronous learning networks (ALNs) to achieve higher student/faculty ratios without sacrificing instructional quality. The study concentrates on data amassed for the fall 1997 semester. Evidence was collected on the cost side, for ALN development and delivery, and on the performance/attitude side, from both student and faculty perspectives. The study supports the view that when a sensible pedagogic approach is embraced, one that affords students avenues to communicate about their learning, ALN can produce real efficiency gains in courses without sacrificing the quality of instruction.
The Pew-funded Center for Academic Transformation supported a large number of projects in the United States; each project redesigned a large (often introductory) course in ways that trimmed costs while often improving outcomes. Their Web site contains links to a growing number of reports from their projects. For example, Carol Twigg's monograph, "Innovations in Online Learning: Moving Beyond No Significant Difference," summarizes many of the lessons of their projects for cutting costs while maintaining or improving outcomes.
The work on costs is becoming increasingly international, and Flashlight is working closely with efforts in other countries as well. One such program is the project at Sheffield Hallam University in the United Kingdom that has been developing activity-based methods of estimating the costs of networked learning. Click here for their Web page of other efforts on estimating the costs of technology use.
To repeat, we would like to add other references and sites to this list, including studies, methodologies, and resource sites in other countries. Please send your suggestions to Ehrmann@tltgroup.org.
Computer-Aided Evaluation and Classroom Research
The Student Assessment of Learning Gains site at the University of Wisconsin helps faculty craft their own online studies of the effectiveness of different elements of their courses.
Computer-aided assessment. This site in Great Britain is one source of information on the use of computers for assessing student learning. (Here "assessment" refers to gauging what a student has learned, as opposed to "evaluation," by which we mean studies of how well an educational program is working. Assessment data is one useful input for evaluations.)
In a March 2000 article in Education Policy Analysis Archives, Russell and Haney argue that, for school children accustomed to using computers, state exams limited to paper and pencil seem to underestimate achievement. Their research raises interesting questions for course-based, college-level assessment.
Comparisons of Distance Education versus Campus-Bound Programs
North Carolina State completed this study ("Project25") comparing Web-based courses and campus courses in early 1998.
Tom Russell has assembled abstracts of some 300 studies to refute the often-heard contention that "distance learning can't possibly be as good as campus programs." He was looking for studies that showed, at worst for distance education, 'no significant difference' in quality of outcomes when compared to programs on campus. As you'll see, he found hundreds of such studies.
Other Resources Related to the Evaluation of Educational Uses of Technology
Web Center - Learning Networks Effectiveness Research. Developed in conjunction with the Sloan-funded Asynchronous Learning Networks projects.
Evaluation results - Program in Course Redesign: summary of the Pew self-study of 30 projects redesigning large-enrollment courses. Findings on costs and effectiveness.
The Coalition for Networked Information (CNI) has a long-term project on assessing the academic networked environment. The McClure-Lopata volume and the case studies deserve a special look.
Other Resources on Assessment and Evaluation
Internet Resources for Higher Education Outcomes Assessment (North Carolina State University)
Field-tested Learning Assessment Guide (FLAG) - an indexed collection of assessment tools for faculty (especially faculty in math, science, engineering, and technology)
Nine Principles of Good Practice for Assessing Student Learning (AAHE)