Flashlight Evaluation Handbook
Confusors and Dangerous Discussions
These materials are for use only by institutions that subscribe to The TLT Group, by participants in TLT Group workshops that feature this particular material, and by invited guests. The TLT Group is a non-profit whose existence is made possible by subscription and registration fees. If you or your institution are not yet among our subscribers, we invite you to join us, use these materials, help us continue to improve them, and, through your subscription, help us develop new materials! If you have questions about your rights to use, adapt, or share these materials, please ask us (info @ tltgroup.org).
'Assessment' is many things. One of them: the exercise of control, even when one is assessing one's own work. That's just one reason why discussions of assessment can be sticky. This page of materials has been created for assessment specialists, people involved in accreditation and program review, faculty development specialists, and others who need to work with colleagues on assessment: training them to do it, helping them do it, or assessing them.
Our approach to the dangerous discussion of assessment begins with a working assumption: despite what it may feel like, it is probably safer and better to bring fear, suspicion, or anger into the light than to allow them to work beneath the surface. [For a 4.5-minute "Low Threshold Activity" - a narrated slideshow about this issue - click here.]
Confusors: An important step in beginning a discussion of 'assessment' (or most of its allied functions, such as evaluation, accreditation, testing, faculty evaluation, or program review) is to check for confusors. A confusor is a term that a) has two or more conflicting definitions, which b) sometimes lead to unnecessary arguments when people don't realize they've defined the term differently. For example, people might argue about whether regional accreditation should, or shouldn't, insist on assessment of learning outcomes without ever realizing that the argument has been accidentally caused by their conflicting, unspoken definitions of 'learning outcomes' or 'assessment.' Person #1 silently assumed that a typical 'learning outcome' was, for example, student graduation rates, while another thought 'learning outcomes' meant requiring all seniors to take a common written test. For examples of confusors and their conflicting definitions, many of them associated with assessment, click here.
Shining a Flashlight on Confusors:
Suppose you're convening a meeting or a workshop. Even before people come, you may want to survey them about some common confusors, either offering multiple-choice definitions (such as the definitions on our confusor list) or asking them to fill in the blank, describing what they mean when they say "assessment," "learning outcomes," "evaluation," or other terms likely to be important in your discussion. If you're a Flashlight Online user, we have developed templates that you can adapt. Template ZS62098 asks respondents to fill in the blanks, writing their own definitions of six key terms. Template ZS62098 asks about the same terms, but offers multiple-choice definitions, asking respondents to pick the one that most closely resembles the way they most often use the term. The templates don't have introductions.
Here is an introduction you can adapt and add to your
survey:
In our upcoming discussions, most of the following
terms are likely to be used. If you were using each of
the following terms in discussions with us, what would
you personally mean by the words? This survey will help
us see if our definitions clash. By alerting ourselves
in advance, we can avoid some unnecessary arguments that
might otherwise be triggered. (For example, two people
with identical values might get into an unnecessary
argument about whether assessment distorts teaching if
they don't realize that they mean different things by
the term 'assessment.') All these definitions are
commonly used and 'correct.' Just pick the definition
that most closely resembles how you use the term.
If, as is almost inevitable, you find that your colleagues have conflicting definitions, you've got at least two choices when the group gets together, after you show them the results of your survey:
- Suggest that, every time someone uses one of these confusors, they briefly state their definition, and/or
- Suggest that, for the purposes of your work together, you all use the same definition for each confusor.
Frequently Made Objections (FMOs) to
Assessment, and How to Respond: We've created a list of
frequently made objections and some suggested responses for
each of them (subscriber-only version of our list; earlier, public version of our list). Please email Steve Ehrmann
(ehrmann@tltgroup.org)
with suggestions for how to improve the list: new responses
and new objections.
Shining a Flashlight on Frequently Made
Objections: As part of your survey, or perhaps in a second survey, ask people to indicate which of the frequently made objections they'd like to discuss.
This
Flashlight Online survey (also available as template
ZS61934) asks people which
FMOs have been voiced by people they know. (If you're a
Flashlight Online user you can alter that question if you
want, add your own FMOs, and customize the survey in other
ways before using it.)
Suppose this survey, or a discussion, has resulted in a list of objections. You now have at least three options:
1. Respond to each objection (see our guide, above) and continue that conversation until it ends in agreement, or in agreement to disagree.
2. Do #1, but also have a discussion with each objector about assessment itself. Some people with objections to assessment might also say that they don't do 'assessment' themselves, don't know how, and don't care to learn. Many of these same people are (by other definitions) already pretty good, or very good, at some kinds of assessment. If you can ask people how they come to understand how their students are learning (achievements, problems) and what they do with those insights, you may be able to create some common ground. We know of examples where people who believed they knew nothing about assessment were, several months later, leading workshops on assessment.
3. Sometimes, even after #1 and #2, you may still have a sense of hidden disagreement and subterranean conflict that feels dangerous to broach. Perhaps you're guessing that the objector feels threatened by assessment because of what it might reveal, or that the objector hates someone associated with the assessment, or something else that has seemed too delicate or too threatening to describe. You'll need to make your own decision about whether it's more dangerous to assume you know what the real issues are, or to probe further to see whether other conflicts can be brought into the light. Click here to read more about "Diagnosing and Responding to Resistance to Evaluation."
Case Studies Needed
We'd love to upgrade this chapter of the Flashlight Evaluation Handbook by interweaving examples of surveys or other ways you have used feedback to avoid 'hot button' arguments and have constructive discussions that lead to win-win solutions. This could be a strategy you develop alone, or perhaps develop in collaboration with us. (If you're interested in working with us, one good way is through a TLT/Flashlight Network membership, which includes two days of consulting/training on topics of your choice.) Whether or not you or your institution are TLT Group subscribers, we would welcome information about studies you've done that have yielded useful findings, findings valuable enough to justify the work you put into discovering them. We can publish your account in F-LIGHT and include it in a rewrite of this chapter. Thanks for thinking about working together!