Course Evaluations on the Web: Our experiences

Jacqueline Andrews, SUNY New Paltz

Donna Johnson, SUNY Ulster

Lisa Ostrouch, SUNY New Paltz

Julie Rao, SUNY Geneseo

Agenda

Overview of the history of course evaluations
New Paltz transition
Evaluating online courses
Year 3 of being online
Questions & discussion welcome throughout

GENERAL HISTORY OF STUDENT COURSE EVALUATIONS

General History of Student Course Evaluations

First used in the 1920s at the University of Wisconsin
Since the 1960s, used by higher education administration in decisions for tenure and promotion
Traditionally administered in class on paper
Referred to by acronyms: SEI, SET, SOI, etc.

General History of Student Course Evaluations

Late 1990s: a few institutions tested online administration (ca. 2%)

The percentage of institutions implementing online systems is on the rise – the medium is the message

Research

Most common concern with online course evaluations: response rates.

Though most research has shown lower response rates, much research also suggests improvement

In addition, some research suggests response rates are lower in only some courses

Research on Response Rates

Factors that seem to affect response rates:
Technical difficulties
Access to open computers
Students' use of multiple e-mail addresses
When and how the availability of the course evaluation is announced
When and how the importance of the evaluations is addressed
Reminders
Incentives

Research on Response Rates

A study at Northern Arizona University showed that professors who posted information about the course evaluation on a class discussion board produced the best response rates.

In another study, NAU found an average 32% increase in response rates when instructors followed these steps:

1) An announcement, including the evaluation's location, a few weeks prior to the end of class
2) An explanation of how the evaluations are used
3) One e-mail reminder to complete the evaluation following the initial announcement

In addition, NAU switched from Evalajack to Survey Monkey.

Schools Currently Using the On-line Format

Brigham Young University has a site called OnSET, which is dedicated to information on on-line student evaluations.

Fabulous site: http://OnSET.byu.edu

Examples of Schools Using Online Format to Some Degree

University of Idaho
University of Virginia
Northwestern University
Bates College
Yale
Clemson University
University of Cincinnati
UCLA
Columbia
Penn State
University of Michigan
Syracuse
Cornell University
North Carolina State
Ohio State
University of Delaware
University of Massachusetts
Lehigh University
Palm Beach Community College

Commercial Software

In-house programs or vendor product
BYU's OnSET site listed 10 commercial providers. They include Evaluation Kit, OCE, Web eVal, and Class Climate from Scantron, among others.

HISTORY OF COURSE EVALUATIONS AT NEW PALTZ

History of Course Evaluations at New Paltz

Fall 1969: 42 questions
1972 to 1976: college-wide procedure; ETS did the scanning and reports; 24 questions

History of Course Evaluations at New Paltz

1990s, responsibility for scanning and administering reports switched to the Office of Institutional Research.

Results on carbon paper that needed to be separated.

SEI desk attended 7am-9pm.

History of Course Evaluations at New Paltz

Early 1990s: a Task Force on Teaching was formed to examine and revise the course evaluation form

Recommended a form with 22 questions, still used today

In 2004, one survey was given to students and one to faculty regarding course evaluations

History of Course Evaluations at New Paltz The Current Process

Labels are printed for each course
Packets (course/section) are made up for each course
Packets are delivered to:
  Liberal Arts & Sciences – individual departments
  Business – Dean's office
  Engineering – Dean's office
  Education – Dean's office
  Fine & Performing Arts – Dean's office
Packets are returned to Institutional Research

History of Course Evaluations at New Paltz The Current Process

Each packet is matched to a header sheet
Each packet is scanned
Scanned packets are uploaded
Reports are searched for trouble areas
"Cleaned" data sent to Computer Services
Reports generated
Packets returned to faculty with an individual report summary and department summary
Chairs and deans receive a copy of each faculty report, summary, and department summary

Online Tests at SUNY New Paltz

Through Blackboard in 2007
Through OCE in Spring and Summer of 2008

The Current Process: Pros & Cons

Pros:
Done in class – good response rates
Students feel anonymous

Cons:
Very time consuming (preparation before and after administration)
Takes 4-6 weeks for faculty to get results
Lots of room for error (scanning errors; student errors such as using pen; illegible comments; handling errors: students can tamper with data or forget to return forms; forms often put in a packet for the wrong class; etc.)
Students may be apathetic and just fill in anything
Costly (cost of forms, bins, envelopes, work hours, scanner)
Bad for the environment (uses lots of paper)

Online: Pros & Cons

Pros:
Immediate results
Far less room for error (no lost forms, no scanning issues)
Far less time consuming
Far less costly (no scanner, no paper forms, far fewer work hours, etc.)
Green – no need for paper
More student comments
Flexibility for questions/scales
Students who take the time to do them have an opinion

Cons:
Lower response rate (effects of use of incentives?)
Has to be done on the students' time (unless technology allows for in-class)
Anonymity concerns

New Paltz Experiment

SUNY New Paltz conducted two online pilots with the vendor OCE:

Spring 2008: School of Business and School of Science and Engineering

Summer 2008: online SEIs were conducted for all courses

New Paltz Experiment

We compared the mean scores of the paper and online versions of the SEI to determine whether there were statistically significant differences between them.

We calculated a mean SEI score using all the questions on all the SEIs for each school.

We used ANOVA testing to compare means.
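For illustration, here is a minimal sketch of this kind of comparison in Python; the per-course scores and variable names below are hypothetical examples, not the actual New Paltz data:

```python
# Minimal sketch of a paper-vs-online mean comparison, assuming per-course
# mean SEI scores have been collected into two lists. The numbers are
# hypothetical placeholders, not real results.
from scipy import stats

paper_means = [4.2, 3.9, 4.5, 4.1, 3.8]    # hypothetical per-course means (paper)
online_means = [4.0, 3.7, 4.4, 4.2, 3.6]   # hypothetical per-course means (online)

# One-way ANOVA; with exactly two groups this is equivalent to a
# two-sample t-test on the group means.
f_stat, p_value = stats.f_oneway(paper_means, online_means)
print(f"F = {f_stat:.3f}, p = {p_value:.3f}")

# A p-value above the chosen alpha (e.g., 0.05) would indicate no
# statistically significant difference between the two formats.
```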

New Paltz Experiment Results

The results of the significance scores were inconsistent.

Several of the tests showed significant differences between the mean scores for paper between years.

It is unlikely that the mean scores of the online SEIs differ from the paper scores at a statistically significant level because of the change in format.

These results are consistent with the current body of research on online SEIs.

Issues with going online at New Paltz!

Differing POV: OIRP, faculty, faculty governance, Deans, Provost, President
Hard for each to see the POV of the other
Reducing the OIRP workload is not a driver for any of these groups except OIRP
Lack of consistent other means of evaluating teaching puts a heavy weight on the SEIs

Assumptions at New Paltz

Harder courses and tougher graders get lower SEI scores
The current way of doing it is perfect
Students will not go online to complete an SEI
SEIs are easy

The facts about SEIs

A one-semester analysis found no relationship between grades and SEIs

The current way is familiar, but it is methodologically suspect: SEI scores are so uniformly high that it is unlikely the questions are valid or reliable.

Students will go online to do the SEIs if they think it is useful to do so.

Here's that OIRP workload thing again – SEIs take up way too many hours! We handle more than 50,000 sheets of paper multiple times during the year. Surely there is something more useful we could be doing for the college.

More SEI facts

That workload thing – a 30% increase in student responses, i.e., pieces of paper, from fall 1998 to fall 2008

What the faculty get and what they give up by going online

Get:
Immediate results
Flexibility in questions
Ability to add their own questions each semester
Comments in a file – no need to read handwriting
Access to their own data all the time
Their class time back

Give up:
Comfort zone with the present setting
Time to do things now unfamiliar:
  Need to be involved in the process to secure a decent response rate
  Active participation in analyzing the data

What students get and what they give up

Get:
Ability to do an SEI on their own time
Use of a familiar medium – online; no more golf pencils
The class time back
Anonymous responses – no handwriting to be recognized

Give up:
The comfort of the familiar
A designated time for the SEI – will have to use their own time

Possibilities for increasing response rates

Hard (hard to sell) ways:
Hold something of value like grades
Faculty award something for completion (timing tricky)
Faculty put it on the syllabus
Faculty talk about it during the semester
Faculty state often how much they value student opinions

Possibilities for increasing response rates

Soft (maybe still hard to sell) ways:
Pop-ups – every time students log onto the site (intranet or Blackboard), there is a reminder
Direct route to the survey for those who have not completed it (intranet)
E-mail reminders from OIRP
E-mail reminders to faculty from OIRP
Incentives
Paper reminders

Where we hope New Paltz is going next:

In-house software
Offer faculty a choice
Not be constrained by the existing 22 questions
Weight the scale in favor of online
Hope to get to a tipping point wherein 95%+ are online

Where New Paltz is likely headed next:

Summer pilot (few have opted out)
Work out the kinks with the software
Work with the faculty governance system
If possible, test the Academic Affairs Committee questions
Revise the questions
Work with the faculty governance system
Perhaps offer faculty a choice with the 22 questions in the fall
Perhaps offer faculty a choice in the spring with the new questions
Work with the new Provost
Work with the work group on evaluation of teaching to put SEIs in context
Work with various ways of ramping up response rates
Get to a place wherein MOST of the SEIs are online

COLLECTING STUDENT OPINION DATA IN-HOUSE USING ANGEL

the SUNY Ulster experience

Where we started . . .

Tried using Microsoft SharePoint in Fall 2007 & Spring 2008
Mailed logins & passwords
No portal
No standardized student e-mail account
No luck & a possible new expense item

Then we tried . . .

Angel Survey through the course management system in Fall 2008
Still mailing logins & passwords
Still no portal
No standardized e-mail accounts
No luck, BUT better controls and no additional costs

How we’ve changed & why

Switched to using e-mail contact in Spring 2009
Now have a portal
All students now have SUNY Ulster e-mail & are enrolled in Angel classes
Reasonably good response rate – 55%
No added costs – all electronic

Student Evaluation of Instruction for Online Courses

Use a 0-5 frequency scale
Item examples, "Instructor . . ."
  Is well organized
  Enjoys teaching the course
  Explains materials clearly
  Is fair in dealing with students
  Shows command of the subject matter
  Is able to answer questions clearly & concisely

Each course survey can have individual open and close dates attached to it, which allows a reasonable period of time for students to participate. I send a separate e-mail letting them know when the survey is open and encouraging their participation in the process. Most of our students use Angel to access at least some of their course materials in their regular classes.

One thing that is invaluable is that I have the authority to create my own course roster. I use Banner to extract a class roster of IDs and names, and then Batch Enroll the class into my Student Opinion class. My "class" then shows up automatically as one of the courses a student is enrolled in once I add the student to the roster.
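A rough sketch of that roster-to-enrollment step, assuming the Banner extract is a CSV; the file names, column names, and the batch-enroll layout here are all assumptions for illustration, since Angel's actual import format may differ:

```python
# Hypothetical sketch: turn a Banner roster export into a batch-enroll file.
# The input columns ("ID", "LAST_NAME", "FIRST_NAME") and the output layout
# are assumptions, not Angel's documented format.
import csv

with open("banner_roster.csv", newline="") as src, \
     open("batch_enroll.csv", "w", newline="") as dst:
    reader = csv.DictReader(src)
    writer = csv.writer(dst)
    writer.writerow(["user_id", "last_name", "first_name", "role"])  # assumed header
    for row in reader:
        # One enrollment record per student, all with the "Student" role.
        writer.writerow([row["ID"], row["LAST_NAME"], row["FIRST_NAME"], "Student"])
```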

The data exports readily into CSV format

I create an Access database for each course that needs to be analyzed, starting from a standard shell in which all of the questions are predefined along with the proper answer weighting.

Response weighting table

Summary table used to record response counts and calculate means

I use a query to tabulate the results of each individual question

Means are calculated

A standard report format is used to print reports. I just have to add the individual course title, instructor name, and number of students that participated.
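For illustration, here is a minimal Python sketch of the same tabulation the Access shell performs, assuming the CSV export has one row per respondent and one 0-5 coded column per question (the file and column names are hypothetical):

```python
# Minimal sketch of tabulating response counts and weighted means per
# question from a survey export. Assumes one row per respondent, one
# column per question, answers coded "0"-"5". Names are hypothetical.
import csv
from collections import Counter

QUESTIONS = ["Q1", "Q2", "Q3"]            # assumed question column names
WEIGHTS = {str(v): v for v in range(6)}   # response weighting table: "0"-"5" -> 0-5

counts = {q: Counter() for q in QUESTIONS}
with open("angel_export.csv", newline="") as f:
    for row in csv.DictReader(f):
        for q in QUESTIONS:
            if row[q] in WEIGHTS:          # skip blanks / invalid responses
                counts[q][row[q]] += 1

# Summary table: response count and weighted mean per question.
for q in QUESTIONS:
    n = sum(counts[q].values())
    mean = sum(WEIGHTS[v] * c for v, c in counts[q].items()) / n if n else float("nan")
    print(f"{q}: n={n}, mean={mean:.2f}")
```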

ENTERING OUR THIRD YEAR OF ONLINE COURSE EVALUATIONS

the SUNY Geneseo experience . . .

How we got here

Push from both students & OIR to go online with SOFI – Student Opinion of Faculty Instruction

Committee of faculty, administrators & students chose to go with Online Course Evaluations

Piloted Spring 2006
Fall 2006: live for all courses

Where are we now?

Response rates have gone down
Refining what courses go into the system:
  All courses loaded -> some departments opt out of having labs included
  Music lessons frequently excluded due to the low number of students enrolled
  Only 1 load a semester??

Reporting Results

System summarizes instructors' SOFI scores
Summary reports available over the web for faculty to view their own & others' results
Chairs, Deans & students also can view reports
Comments only available to the faculty to whom they are directed

Student Initiatives

Promote using SOFI results in course scheduling
Introduce SOFI process at orientation with new students
Advertise in the student newspaper at pre-registration & once the evaluation period opens
Mention changes & responsiveness

Faculty Initiatives

The challenge is getting faculty to use the system
Loading courses earlier to give more time to review, add questions, ask questions
Exploring a return to paper reports
Trial doing evaluations in class on laptops
Workshops, workshops, workshops
Include a presentation as part of new faculty orientation, the Promotion & Tenure workshop, and the TLC workshop

What we’ve learned

Be responsive
Monitor what is going on
Reach out to faculty
Involve students

Questions, Comments?