Assessment, Technology, and General Education in the First Year Experience

Invited Article for the FYA List
This article by Stephen C. Ehrmann of The TLT Group was invited by the First Year Experience Assessment listserv. If you'd like to participate in discussion of the article, you may want to sign up for the listserv; there is no fee.

Return to Home Page on the Implications of Technology for General Education

Using Portfolios and Surveys to Improve General Education

Stephen C. Ehrmann, Ph.D. (ehrmann@tltgroup.org)
Director, The Flashlight Program, The TLT Group
July 18, 2004

Note: An earlier version of this essay had a first section on the evolving goals of general education. That material, enlarged, will be at the heart of a new article, now being researched and due for publication in Liberal Education in Fall 2004. When a draft is ready, a link will be placed here.

        General education, as many people define it, is what all students are supposed to learn from their college educations.  Unfortunately, in many institutions, assessment tools for improving general education are quite limited. To guide their design and teaching of courses, many faculty members have little more than their eyes, ears, and the paper on which their quizzes and homework are written.  Worse, when curriculum committees consider how to improve the general education program, they often have no tools for gathering data. Are students making satisfactory progress toward the varied goals of general education? If so, what are we doing that’s working so well? If not, what’s the problem? At many institutions, those questions are often unanswerable.

        In recent years, two technology-supported approaches have widened the faculty’s ability to see what’s going on, within and across courses: electronic portfolios and online surveys.

I. Electronic Portfolios

        A “portfolio” is a thoughtfully organized collection of student work, usually including work other than, or in addition to, traditional academic papers. For example, web projects can be stored in portfolios, as can video recordings of student performances (oral presentations, participation in teams, dances).  Portfolios also usually include student reflections about how the project provides evidence of their developing skills.  These reflective statements are one way in which portfolio use is intended to deepen student learning.

“Electronic portfolios” store those projects, or recordings of them, plus reflections and feedback, on computers so these records can be accessed online.  The software sometimes enables students to organize the work in several different ways: one “view” organized for an individual course, another view organizing the content to show progress toward goals of general education, another showing progress in the major, and yet another that might be used for employment or graduate school applications.  The work can be used over a period of time by the student, by faculty, and, at some institutions, by people outside the institution (e.g., potential employers). This ability to revisit a project long after the project is completed is one of many distinctive values of electronic portfolios. 
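For readers who like to see the underlying structure, here is a minimal sketch, in Python, of how a single collection of portfolio entries might be filtered into different "views." The class and field names are illustrative assumptions, not the design of any particular portfolio product:

from dataclasses import dataclass, field
from typing import List

@dataclass
class Entry:
    """One artifact in a student's electronic portfolio."""
    title: str
    artifact_url: str          # link to the stored project, paper, or recording
    reflection: str            # the student's comment on what the work demonstrates
    feedback: List[str] = field(default_factory=list)      # comments from faculty (or, at some institutions, others)
    courses: List[str] = field(default_factory=list)       # courses the work was done for
    gen_ed_goals: List[str] = field(default_factory=list)  # e.g., "communication skills", "skills of inquiry"
    show_to_employers: bool = False

@dataclass
class Portfolio:
    """A student's full collection of entries, re-organized on demand into 'views'."""
    student: str
    entries: List[Entry] = field(default_factory=list)

    def course_view(self, course: str) -> List[Entry]:
        return [e for e in self.entries if course in e.courses]

    def gen_ed_view(self, goal: str) -> List[Entry]:
        return [e for e in self.entries if goal in e.gen_ed_goals]

    def employer_view(self) -> List[Entry]:
        return [e for e in self.entries if e.show_to_employers]

# The same essay can appear in both a course view and a general-education view.
p = Portfolio("Pat Student")
p.entries.append(Entry(
    title="Summer placement essay",
    artifact_url="https://portfolio.example.edu/pat/essay1",
    reflection="This essay shows my ability to organize an argument around evidence.",
    courses=["ENGL 101"],
    gen_ed_goals=["communication skills"],
))
print(len(p.gen_ed_view("communication skills")))   # -> 1

The point of the sketch is simply that the work is stored once and the views are computed, which is what lets the same project reappear, years later, in a general education review or an employment portfolio.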

Today there are many types and uses of electronic portfolios.  Let’s focus on just one of those uses: portfolios organized to show student progress over the years toward general education goals such as “communications skills” or “skills of inquiry.” In such portfolios, students store work that represents major stages of progress toward each competence, along with their own comments on how that work represents the skill.  Also stored in the portfolio is feedback from faculty (and sometimes from others) about whether and how the work shows the required level of competence.

Faculty, of course, first must define what the goals and levels of progress are; here, for example, is a statement from faculty at Northern Illinois University about student writing competence goals for the end of the first year of college.  Faculty must then create rubrics to help students and, later, faculty assess whether that level of progress has been achieved.  Then, as student work is submitted, faculty use those rubrics to judge student progress by studying the students’ portfolios. In other words, faculty members finally have a way to look at education in chunks larger than a single course.
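As an illustration of how a rubric can be applied consistently across many portfolios, here is a small hypothetical sketch in Python. The criteria, levels, and threshold are invented for the example; they are not Northern Illinois University's (or any other institution's) actual rubric:

from dataclasses import dataclass
from typing import Dict

# A hypothetical first-year writing rubric: every criterion is judged on the same four-level scale.
LEVELS = ["beginning", "developing", "proficient", "exemplary"]

@dataclass
class RubricJudgment:
    """One faculty member's rubric-based judgment of one portfolio entry."""
    entry_title: str
    rater: str
    scores: Dict[str, str]   # criterion -> level, e.g. {"organization": "developing"}
    comments: str = ""

def meets_first_year_goal(judgment: RubricJudgment, required: str = "developing") -> bool:
    """True if every criterion is at or above the required level."""
    threshold = LEVELS.index(required)
    return all(LEVELS.index(level) >= threshold for level in judgment.scores.values())

j = RubricJudgment(
    entry_title="Summer placement essay",
    rater="Prof. Garcia",
    scores={"organization": "developing", "evidence": "proficient", "mechanics": "developing"},
    comments="Clear thesis; citation format still needs work.",
)
print(meets_first_year_goal(j))   # -> True

Because every judgment is recorded against the same named criteria and levels, faculty teams can later aggregate judgments across courses and years, which is what makes the portfolio useful for looking at education in chunks larger than a single course.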

Provost Doris Helms of Clemson University commented in an interview with the author that this use of electronic portfolios freed her institution “to think about general education as something other than a smorgasbord of courses.” At Clemson, for example, writing done by admitted students during the summer before the first year is the initial entry into student portfolios. 

Using portfolios to record student progress toward general education goals enables everyone to analyze patterns of learning that extend beyond single courses:

  • Students: Students are asked to examine their own projects in light of skills they need to master.  This reflection is designed to help them deepen learning from work they have already done, while simultaneously helping them plan and accelerate future learning.

  • Individual faculty members: Faculty members can look at student portfolios before the term begins in order to help decide how to teach their courses.  The portfolio provides vivid evidence of the strengths and needs of students, helping the faculty member fine-tune plans for the coming term.

  • The institution: When faculty members work in teams to examine student progress, they can also use this information to reconsider how to organize the curriculum.  Provost Helms commented, “We’ll use this as research—where are students learning what they’re learning?  For example, what are they learning while outside the classroom, in jobs, at home, and in extra-curricular experiences? What kinds of learning should we foster, more intentionally, outside the course?” 

II. Online Surveys Used for Course Research and Program Evaluation

        The single best source of guidance for improving learning is close study of student work (projects, online discussion, etc.).  Those insights can be complemented and deepened through feedback from students. The TLT Group’s Flashlight Program has developed banks and templates of survey items and an online system for creating, delivering, and analyzing such surveys. Collectively, these are known as Flashlight Online.

        Flashlight Online enables users to write their own questions or to select questions from almost 500 validated items in the Flashlight Current Student Inventory. Most users are faculty who gather information from their own students. Flashlight Online also allows users to share surveys or data, and it can be used for multi-institution benchmarking studies in which faculty use the same, or similar, surveys to contribute to a common pool of data so that they and their institutions can compare their findings with the rest of the pool (e.g., “Evaluating Educational Uses of the Web In Nursing”).
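To make the idea concrete, here is a brief Python sketch of combining items selected from a shared bank with locally written questions. It is an illustration of the concept only; the item numbers and wording are invented, not actual Flashlight Current Student Inventory items, and this is not the Flashlight Online interface:

from dataclasses import dataclass
from typing import List

@dataclass
class Item:
    """One survey question, either drawn from a shared bank or written locally."""
    item_id: str
    text: str
    source: str   # "item bank" or "local"

# A tiny stand-in for a validated item bank (wording invented for the example).
ITEM_BANK = {
    "CSI-017": Item("CSI-017", "How often did online discussion help you understand course ideas?", "item bank"),
    "CSI-102": Item("CSI-102", "How much time per week did you spend on course work outside class?", "item bank"),
}

def build_survey(bank_ids: List[str], local_questions: List[str]) -> List[Item]:
    """Combine selected bank items with questions written by the faculty member."""
    survey = [ITEM_BANK[i] for i in bank_ids]
    survey += [Item(f"LOCAL-{n}", q, "local") for n, q in enumerate(local_questions, 1)]
    return survey

survey = build_survey(
    bank_ids=["CSI-017"],
    local_questions=["Did the weekly study guides help you prepare for class?"],
)
for item in survey:
    print(item.item_id, "|", item.source, "|", item.text)

Using shared, validated items is also what makes benchmarking possible: when many faculty include the same bank items, their results can be pooled and compared.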

        A new version of Flashlight Online is now being developed with the support of the Fund for the Improvement of Postsecondary Education (FIPSE) as part of the BeTA (Better Teaching through Assessment) Project. BeTA will strengthen the use of student feedback about courses and faculty in three ways:

a)   Designing feedback: To help faculty, administrators, and students plan the content of course feedback surveys, BeTA project workshops and materials will help them reach agreement about key issues.  These sessions will help faculty decide, among other things, which questions (if any) should appear in all student feedback surveys.  BeTA surveys will usually include a mix of questions from several sources: some common to all courses, some designed for specific types of courses, some from specific institutional programs or colleges, and some authored by the faculty member for his or her own students.

b)   System for Designing and Administering Online Surveys with Single or Multiple Authors: The new survey system being designed for BeTA (Flashlight Online 2.0) will enable multiple authors to collaborate in developing a survey; each author can keep some of the resulting data private from the other authors of the same survey, if need be (see the illustrative sketch after this list). For example, a faculty member could add a question to the course feedback survey that asks her students, and only her students, to provide feedback on her use of PowerPoint, without needing to worry whether that data might hurt her chances for promotion.  Meanwhile, on the same survey might be questions from the writing program about whether feedback on student writing is helping students become more skilled writers and/or helping them with the content of the written work. Other questions might be added by the department. BeTA is designed to encourage people and programs to ask the kinds of risk-taking questions needed for real improvement.

c)   Increasing response rates: BeTA surveys will typically be delivered online. Because low response rates to online surveys are a widespread concern, BeTA is developing strategies to increase response rates.  One reason for low response rates: as far as the typical respondent is concerned, spending time and thought on a typical course feedback survey produces no visible result.  So BeTA will recommend that individual faculty, departments, and the institution each report to students frequently on what they are doing as a result of student feedback.
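The multi-author idea in (b) can be pictured with one more hypothetical Python sketch: each question on a shared survey carries an owner, and responses to a question marked private are reported only to that owner. This illustrates the concept of per-author data visibility; it is not the actual Flashlight Online 2.0 design:

from dataclasses import dataclass
from typing import Dict, List

@dataclass
class Question:
    qid: str
    text: str
    owner: str       # who added the question and is entitled to its responses
    private: bool    # if True, only the owner sees responses to this question

QUESTIONS = [
    Question("Q1", "Overall, how well is this course supporting your learning?", "institution", False),
    Question("Q2", "Is feedback on your writing helping you write more skillfully?", "writing program", False),
    Question("Q3", "Are the instructor's slides helping you follow lectures?", "instructor", True),
]

def responses_visible_to(viewer: str, responses: Dict[str, List[str]]) -> Dict[str, List[str]]:
    """Return only the response sets this viewer is entitled to see."""
    visible = {}
    for q in QUESTIONS:
        if not q.private or q.owner == viewer:
            visible[q.qid] = responses.get(q.qid, [])
    return visible

# The writing program sees the shared questions but not the instructor's private one.
all_responses = {"Q1": ["4", "5"], "Q2": ["yes", "somewhat"], "Q3": ["fewer slides, please"]}
print(sorted(responses_visible_to("writing program", all_responses)))   # ['Q1', 'Q2']
print(sorted(responses_visible_to("instructor", all_responses)))        # ['Q1', 'Q2', 'Q3']

Keeping an instructor's own questions private in this way is what lets the same survey carry both low-stakes, risk-taking questions and program-level questions whose results are pooled.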

 

III. Concluding Thoughts

        Faculty at a growing number of institutions can now use two quite different sources of data to help them teach better:

  • Portfolios of current and past projects by students, organized to show student strengths and needs as they move toward educational goals (the institution's, the major's, and their own).

  • Surveys to gather student feedback about the processes of education.

Each of these two sources of data can be used by individual faculty to improve teaching. The same tools can be used by faculty, working together, to observe and improve the larger patterns of teaching and learning that comprise general education.

 

IV. Questions for Discussion:

1.   To what extent do you, or colleagues at your institution, use portfolios or surveys to guide teaching? What do you think about the strengths and weaknesses of the portfolio and survey software available to you? How does your institution support their use?

2.   To what extent do departments, or your institution as a whole, use portfolios and surveys to look at patterns of learning across courses?  Is there any indication of whether this kind of data gathering and analysis has helped guide efforts to improve education there?

TLT Group subscribers who would like an MS Word copy of this article should e-mail the author at ehrmann@tltgroup.org.


 
