Course Evaluations on the Web: Our Experiences. Jacqueline Andrews, SUNY New Paltz; Donna Johnson, SUNY Ulster; Lisa Ostrouch, SUNY New Paltz; Julie Rao, SUNY Geneseo


  • Slide 1
  • Course Evaluations on the Web: Our Experiences. Jacqueline Andrews, SUNY New Paltz; Donna Johnson, SUNY Ulster; Lisa Ostrouch, SUNY New Paltz; Julie Rao, SUNY Geneseo
  • Slide 2
  • Agenda: an overview of the history of course evaluations; the New Paltz transition; evaluating online courses; year 3 of being online. Questions & discussion are welcome throughout.
  • Slide 3
  • Slide 4
  • General History of Student Course Evaluations: First used in the 1920s at the University of Wisconsin. Since the 1960s, used by higher-education administrators in tenure and promotion decisions. Traditionally administered in class, on paper. Referred to by acronyms such as SEI, SET, and SOI.
  • Slide 5
  • General History of Student Course Evaluations: In the late 1990s, a few institutions tested online administration (ca. 2%). The percentage of institutions implementing online systems is on the rise; the medium is the message.
  • Slide 6
  • Research: The most common concern with online course evaluations is response rates. Though most research has shown lower response rates, much research suggests rates improve over time, and some studies suggest response rates are lower in only some courses.
  • Slide 7
  • Research on Response Rates. Factors that seem to affect response rates: technical difficulties; access to open computers; students' use of multiple e-mail addresses; when and how the availability of the course evaluation is announced; when and how the importance of the evaluations is addressed; reminders; incentives.
  • Slide 8
  • Research on Response Rates. A study at Northern Arizona University showed that professors who posted information about the course evaluation on a class discussion board produced the best response rates. Another NAU study found an average 32% increase in response rates when instructors followed these steps: 1) an announcement, including the evaluation's location, a few weeks before the end of class; 2) an explanation of how the evaluations are used; 3) one e-mail reminder to complete the evaluation following the initial announcement. In addition, NAU switched from Evalajack to Survey Monkey.
  • Slide 9
  • Schools Currently Using the Online Format: Brigham Young University has a site called OnSET, which is dedicated to information on online student evaluations. Fabulous site:
  • Slide 10
  • Examples of Schools Using the Online Format to Some Degree: University of Idaho; University of Virginia; Northwestern University; Bates College; Yale; Clemson University; University of Cincinnati; UCLA; Columbia; Penn State; University of Michigan; Syracuse; Cornell University; North Carolina State; Ohio State; University of Delaware; University of Massachusetts; Lehigh University; Palm Beach Community College.
  • Slide 11
  • Commercial Software: either in-house programs or vendor products. BYU's OnSET site listed 10 commercial providers, including Evaluation Kit, OCE, Web eVal, and Class Climate from Scantron, among others.
  • Slide 12
  • Slide 13
  • History of Course Evaluations at New Paltz: Fall 1969, 42 questions. From 1972 to 1976, a college-wide procedure used ETS for the scanning and reports, with 24 questions.
  • Slide 14
  • History of Course Evaluations at New Paltz: In the 1990s, responsibility for scanning and administering reports switched to the Office of Institutional Research. Results came on carbon paper that needed to be separated, and the SEI desk was attended 7 a.m. to 9 p.m.
  • Slide 15
  • History of Course Evaluations at New Paltz: In the early 1990s, a Task Force on Teaching was formed to examine and revise the course evaluation form. It recommended a form with 22 questions, still used today. In 2004, one survey about course evaluations was given to students and one to faculty.
  • Slide 16
  • History of Course Evaluations at New Paltz, the current process: labels are printed for each course; packets (one per course/section) are made up; packets are delivered to the individual Liberal Arts & Sciences departments and to the Business, Engineering, Education, and Fine & Performing Arts deans' offices; packets are returned to Institutional Research.
  • Slide 17
  • History of Course Evaluations at New Paltz, the current process (continued): each packet is matched to a header sheet; each packet is scanned; scanned packets are uploaded; reports are searched for trouble areas; cleaned data are sent to Computer Services; reports are generated; packets are returned to faculty with an individual report summary and a department summary. Chairs and deans receive a copy of each faculty report, summary, and department summary.
  • Slide 18
  • Online Tests at SUNY New Paltz: through Blackboard in 2007; through OCE in spring and summer of 2008.
  • Slide 19
  • The Current Process. Pros: done in class, which yields good response rates; students feel anonymous. Cons: very time consuming (preparation before and after administration); takes 4-6 weeks for faculty to get results; lots of room for error (scanning errors; student errors such as using pen; illegible comments; handling errors in which students can tamper with data or forget to return forms; forms often placed in the packet for the wrong class); students may be apathetic and just fill in anything; costly (forms, bins, envelopes, work hours, scanner); bad for the environment (uses lots of paper).
  • Slide 20
  • Online. Pros: immediate results; far less room for error (no lost forms or scanning issues); far less time consuming; far less costly (no scanner, no paper forms, far fewer work hours, etc.); green, since no paper is needed; more student comments; flexibility for questions and scales; students who take the time to do them have an opinion. Cons: lower response rate (effects of incentives?); has to be done on the students' time (unless technology allows for in-class administration); anonymity concerns.
  • Slide 21
  • New Paltz Experiment: SUNY New Paltz conducted two online pilots with the vendor OCE. In spring 2008, online SEIs were conducted in the School of Business and the School of Science and Engineering; in summer 2008, they were conducted for all courses.
  • Slide 22
  • New Paltz Experiment: We compared the mean scores of the paper and online versions of the SEI to determine whether there were statistically significant differences between them. We calculated a mean SEI score using all the questions on all the SEIs for each school, and used ANOVA to compare the means.
  • Slide 23
  • New Paltz Experiment Results: The significance tests were inconsistent. Several tests showed significant differences between the mean paper scores across years, so it is unlikely that any difference between the mean scores of online SEIs and paper SEIs is due to the change in format. These results are consistent with the current body of research on online SEIs.
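The mean comparison described above can be sketched as follows. This is a minimal illustration of a one-way ANOVA F statistic computed by hand; the scores are hypothetical, not New Paltz data, and the function name is our own, not part of any software used in the study.

```python
from statistics import mean

def one_way_anova_f(groups):
    """One-way ANOVA F statistic for a list of sample groups (hypothetical helper)."""
    k = len(groups)                              # number of groups
    n = sum(len(g) for g in groups)              # total observations
    grand = mean(x for g in groups for x in g)   # grand mean over all scores
    # Between-group sum of squares: spread of group means around the grand mean
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    # Within-group sum of squares: spread of scores inside each group
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    # F = mean square between / mean square within
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical per-section mean SEI scores (1-5 scale), paper vs. online
paper = [4.1, 4.3, 3.9, 4.2, 4.0, 4.4]
online = [4.0, 4.2, 3.8, 4.1, 4.3, 3.9]
f_stat = one_way_anova_f([paper, online])
```

An F statistic large relative to the critical value of the F distribution (here with 1 and 10 degrees of freedom) would indicate a statistically significant difference between the paper and online means; a significance threshold such as 0.05 would then be applied to the corresponding p-value.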
  • Slide 24
  • Issues with going online at New Paltz: differing points of view among OIRP, faculty, faculty governance, deans, the Provost, and the President, and it is hard for each group to see the others' point of view. Reducing the OIRP workload is not a driver for any of these groups except OIRP. The lack of consistent other means of evaluating teaching puts a heavy weight on the SEIs.
  • Slide 25
  • Assumptions at New Paltz: harder courses and tougher graders get lower SEI scores; the current way of doing it is perfect; students will not go online to complete an SEI; SEIs are easy.
  • Slide 26
  • The facts about SEIs: A one-semester analysis found no relationship between grades and SEI scores. The current way is familiar but methodologically suspect: SEI scores are so uniformly high that it is unlikely the questions are valid or reliable. Students will go online to do the SEIs if they think it is useful to do so. And here's that OIRP workload thing again: SEIs take up far too many hours. We handle more than 50,000 sheets of paper multiple times during the year; surely there is something more useful we could be doing for the college.
  • Slide 27
  • More SEI facts, on that workload thing: a 30% increase in student responses (i.e., pieces of paper) from fall 1998 to fall 2008.
  • Slide 28
  • What the faculty get and what they give up by going online
  • Slide 29
  • Get: immediate results; flexibility in questions; the ability to add their own questions each semester; comments in a file, with no need to read handwriting; access to their own data at all times; their class time back.
  • Slide 30
  • Give up: their comfort zone with the present setting, and time spent on things now unfamiliar: being involved in the process to secure a decent response rate, and actively participating in analyzing the data.
  • Slide 31
  • What students get and what they give up
  • Slide 32
  • Get: the ability to do an SEI on their own time; use of a familiar medium, online (no more golf pencils); their class time back; anonymous responses, with no handwriting to be recognized.
  • Slide 33
  • Give up: the comfort of the familiar, and a designated time for the SEI; they will have to use their own time.
  • Slide 34
  • Possibilities for increasing response rates. Hard (hard-to-sell) ways: hold something of value, like grades