
Web Enhanced Course Evaluation at Columbia University

Jack McGourty, Columbia University

Overview

- A little history
- How does course assessment fit into the “big picture”?
- Why use web technology?
- How is it being done?
- Does it work?

History

- Columbia’s Fu Foundation School of Engineering and Applied Science began using the web for course assessment about four years ago, starting with a student-administered web site for results
- Designed and developed a state-of-the-art system using student teams
- Now building on the current infrastructure to include online tutorials and increased flexibility for administration

Student Web Site

- Search by course or faculty
- Current and past results
- No comments displayed

The Big Picture

- Why are we assessing courses and programs?
- Continuous improvement of the education process: what are we doing right, and what can we do better?
- Integral part of our ABET EC2000 compliance: develop a process, collect and evaluate data, close the loop, and document/archive results
- Course evaluation is one of several outcome assessment measures, alongside senior exit surveys, enrolled student surveys, and alumni surveys

How WCES Fits in

SEAS assessment processes timeline, pre-1997 through 2001. Milestones shown include:

- Initiate the course evaluation process
- Conduct the first alumni survey (all alumni)
- Conduct the second alumni survey (1989 and 1994 graduates)
- Benchmarking senior surveys (Class of 2000)
- Start the academic review cycle
- Create the web-based course evaluation process
- Senior surveys (Class of 2001) and alumni survey (1996 graduates)
- Initiate the freshman pre-attitude survey

Using Technology

Pro:
- Students have the time to consider their responses
- Timely feedback
- Responses are easily analyzed, archived, and distributed
- Less paper
- Lower cost and more efficient administration

Con:
- You lose the “captive audience”
- You can’t guarantee a diversity of opinions (motivated vs. non-motivated students; those who like the course vs. those who dislike it)
- Not necessarily less effort

Course Assessment Details

- 10 core items covering course quality and instructor quality
- Relevant ABET EC2000 items, pre-selected by the faculty member
- Customized questions for specific course objectives
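To make the three-part structure above concrete, here is a minimal sketch of how an evaluation instrument might be assembled from core items, faculty-selected EC2000 items, and custom questions. The data model, class names, course identifier, and example questions are illustrative assumptions, not the actual WCES schema.

```python
# Illustrative sketch only: assumed data model, not the real WCES implementation.
from dataclasses import dataclass, field


@dataclass
class Question:
    text: str
    category: str  # "core", "ec2000", or "custom"


# The ten core items (course quality and instructor quality) are the same for
# every course; only two placeholders are listed here.
CORE_ITEMS = [
    Question("Overall quality of the course", "core"),
    Question("Overall quality of the instructor", "core"),
    # ... remaining core items ...
]


@dataclass
class CourseEvaluation:
    course_id: str
    questions: list = field(default_factory=lambda: list(CORE_ITEMS))

    def add_ec2000_items(self, selected):
        """Attach the EC2000 items the faculty member pre-selected."""
        self.questions += [Question(text, "ec2000") for text in selected]

    def add_custom_questions(self, objectives):
        """Attach questions written for this course's specific objectives."""
        self.questions += [Question(text, "custom") for text in objectives]


# Example: a faculty member tailors the survey for one (hypothetical) course.
survey = CourseEvaluation("ENGR-E1102")
survey.add_ec2000_items(["Ability to function on multidisciplinary teams"])
survey.add_custom_questions(["The design project reinforced the lecture material"])
print(len(survey.questions))  # 2 core placeholders + 1 EC2000 + 1 custom = 4
```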

Selecting EC2000 Questions

Monitoring Faculty Usage

One of our culture-change metrics is the percentage of faculty who take advantage of the system by adding custom and EC2000 questions; it is currently around 15%.

Course Evaluation Results

Web page access:
- Current term’s assessment: limited time window, limited access, secure site
- Previous terms’ results: open access to numerical results, but not to comments

Email results:
- Individual results to faculty
- Aggregate data to department chairs
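As a rough illustration of the web access rules just listed, the sketch below encodes the distinction between current-term results (secure site, limited audience, limited time window) and previous-term results (numerical results open to all, comments withheld). The role names, the 30-day window, and the function itself are assumptions for illustration, not the actual WCES implementation.

```python
# Illustrative sketch only: assumed roles and window length.
from datetime import date, timedelta

RESULTS_WINDOW = timedelta(days=30)  # assumed length of the "limited time window"


def can_view(requester_role, term_is_current, results_posted_on,
             wants_comments, today=None):
    """Decide whether a requester may view a set of evaluation results."""
    today = today or date.today()

    if term_is_current:
        # Current term: secure site, limited audience, limited time window.
        within_window = (today - results_posted_on) <= RESULTS_WINDOW
        authorized = requester_role in {"faculty", "department_chair", "dean"}
        return authorized and within_window

    # Previous terms: numerical results are open to everyone,
    # but written comments are never released.
    return not wants_comments


# Example: a student looking up a past term's numerical results is allowed,
# but the same request for comments would be refused.
print(can_view("student", term_is_current=False,
               results_posted_on=date(2000, 12, 20), wants_comments=False))
```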

Reporting

Promoting Responses

- Student-driven results website
- Multiple targeted emails to students and faculty from the Dean
- Announcements in classes
- Posters all over the school
- Random prize drawing

Closing the Loop

Does it Work?

- Student response rates have steadily increased over the past two years, from 72% to 85%
- More detail in students’ written comments in course assessments
- Data is available that we have never had before
- Faculty use of the ABET EC2000 and customized question features is increasing, but still limited (15%)