Peer Evaluation System
Pawan Kumar Singh
under the guidance of
Prof. Deepak B. Phatak
Department of Computer Science and Engineering, IIT Bombay
June 28, 2014
Pawan Kumar Singh (IIT Bombay) June 28, 2014 1 / 39
Outline
1 Motivation
2 Introduction
3 Implementation
4 Conclusion and Future Work
Motivation
Some motivating factors:
Improve the quality of learning
Promote team-based learning
Massive Open Online Courses (MOOCs)
Improvement in evaluation techniques
Traditional Evaluation System
1 Human Expert Evaluation System
+ Belief in marking, seriousness towards feedback
− Infeasible for MOOCs
2 Self Evaluation System
+ A notion of self-directed learning arises
− Inconsistent grading
3 Auto Evaluation System
+ More efficient in terms of time
− Not feasible for subjective answers
4 Peer Evaluation System
+ Team-based learning, saves instructor's time
− Biasing, fairness, and reliability concerns
Peer Evaluation System
An evaluation system driven by peers, for peers. Peers are those who are working with you in the organization.
What it includes:
Evaluation by people of similar competence
Self-directed learning by being an evaluator of one's own work
Grades proportional to the work each member puts into team-based learning
Problems
Removing or reducing bias
Reliability
Fairness
Predictive validity
In the next few sections we will see how these problems are handled.
Peer Evaluation Scheduling
Suppose we have N submissions and we want n evaluators for each submission. Then we need to:
Divide students into groups (student–group pairing)
Allocate submissions to students to perform peer evaluation (student–submission allocation pairing)
Pseudo-code for student–group pairing:
If (n + 1) divides N, simply divide the students into (n + 1) groups; else:
Select the largest fraction M of N such that (n + 1) divides M, and
Schedule another evaluation for the remaining (N − M) students.
Now we have M students; divide them into (n + 1) groups such that each group has k = M/(n + 1) students.
Let the M students be stored in an array student[M] and their submissions in an array submission[M]; the (student, group) pairing is shown in Algorithm 1.
Peer Evaluation Grouping Algorithm
Algorithm 1: Group Pairing Algorithm
Data: M, k, n, student[M]
Result: pair_list(group[n + 1], student[M])
initialization: i = 0, j = 0
while i < (n + 1) do
    j = i * k
    while j < ((i + 1) * k) do
        pair_list(group[i + 1], student[j])
        j = j + 1
    end
    i = i + 1
end
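As a concrete sketch, the grouping step can be written in Python (function and variable names are illustrative, not taken from the system's code base). It assumes M is already a multiple of (n + 1), which the scheduling step above guarantees by splitting off the remainder:

```python
def group_students(students, n):
    """Split M students into (n + 1) equal groups of k = M / (n + 1).

    Assumes len(students) is divisible by (n + 1), as ensured by the
    scheduling step that sets aside the remaining (N - M) students.
    """
    m = len(students)
    k = m // (n + 1)
    assert m == k * (n + 1), "M must be divisible by (n + 1)"
    # Group i holds the students with indices i*k .. (i+1)*k - 1
    return [students[i * k:(i + 1) * k] for i in range(n + 1)]
```

For example, with 12 students and n = 2 evaluators per submission, this yields three groups of four students each.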
Once the students have been divided into groups, the (student, submission) allocation pairing is shown in Algorithm 2.
Peer Evaluation Submission Allocation Algorithm
Algorithm 2: Submission Allocation Algorithm
Data: n, pair_list(group[n + 1], student[M]), k
Result: pair(student[M], submission[M])
initialization: i = 1
while i ≤ (n + 1) do
    student_list_1[k] = pair_list[group[i]]
    j = 1
    while j ≤ (n + 1) do
        if i ≠ j then
            student_list_2[k] = pair_list[group[j]]
            p = 0
            while p < k do
                pair(student_list_2[p], submission[student_list_1[p]])
                p = p + 1
            end
        else
            Do Nothing
        end
        j = j + 1
    end
    i = i + 1
end
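A Python sketch of Algorithm 2 (hypothetical names; a sketch under the stated assumptions, not the system's actual implementation): the student at position p in group j is assigned the submission of the student at position p in every other group i, so every submission receives exactly n peer evaluations and nobody grades their own work.

```python
def allocate_submissions(groups):
    """Map each evaluator to the list of submission owners they grade.

    groups: (n + 1) equal-size lists of student ids, as produced by
    the grouping step. Student p of group j evaluates the submission
    of student p of every group i != j.
    """
    allocation = {}
    for i, owners in enumerate(groups):          # group whose submissions are graded
        for j, evaluators in enumerate(groups):  # group doing the grading
            if i == j:
                continue  # students never evaluate their own group's work
            for p, evaluator in enumerate(evaluators):
                allocation.setdefault(evaluator, []).append(owners[p])
    return allocation
```

With `groups = [[0, 1], [2, 3], [4, 5]]` (n = 2), each student evaluates two submissions and each submission gets two evaluators.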
Normalization
We implement Shannon entropy [Les11, Ent14] to filter the marks given by peers; the Shannon entropy quantifies the expected value of the information contained in the data.
Here entropy measures how evenly the marks for a submission are spread: when the evaluators' marks agree, the distribution P(xj) is close to uniform and the entropy is high (approaching its maximum of 1), while strongly disagreeing marks skew the distribution and give a low entropy value.
Entropy = −Σ_{j=1}^{n} P(xj) log2 P(xj), where P(xj) is the probability mass function for random variable xj, j ∈ [1, n], and lim_{P(xj)→0+} P(xj) log(P(xj)) = 0.
Below we show entropy values for sets of markings, and from them determine the range of entropy to be used for normalization.
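The entropy values in the tables that follow can be reproduced with a short Python function. Note one inferred assumption: to match the tables (where identical marks always give exactly 1.000000), the entropy must be normalized by log2(n), the maximum entropy for n evaluators; this normalization is implicit in the slides rather than stated in the formula.

```python
import math

def normalized_entropy(marks):
    """Normalized Shannon entropy of a tuple of peer marks.

    P(x_j) = marks[j] / sum(marks); the sum is divided by log2(n)
    so that full agreement among n evaluators yields exactly 1.0.
    """
    total = sum(marks)
    h = 0.0
    for m in marks:
        p = m / total
        if p > 0:  # lim p->0+ of p*log(p) is 0 by convention
            h -= p * math.log2(p)
    return h / math.log2(len(marks))
```

For example, `normalized_entropy((1, 2))` gives 0.918296 to six decimals, matching the first "Yes" row of the two-evaluator table.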
Table : Two evaluators for a single submission peer marking case

Marks        Entropy    Discrimination
(1,1)        1.000000   No
(1,2)        0.918296   Yes
(1,3)        0.811278   Yes
. . .
(1,9)        0.468996   Yes
(1,10)       0.439497   Yes
(2,2)        1.000000   No
(2,3)        0.970951   No
(2,4)        0.918296   Yes
. . .
(2,10)       0.650022   Yes
(3,3)        1.000000   No
(3,4)        0.985228   No
(3,5)        0.954434   No
(3,6)        0.918296   Yes
. . .
(4,4)        1.000000   No
. . .
(4,7)        0.945660   No
(4,8)        0.918296   Yes
. . .
(5,5)        1.000000   No
(5,6)        0.994030   No
. . .
(5,9)        0.940286   No
(5,10)       0.918296   Yes
(6,6)        1.000000   No
. . .
(6,9)        0.970951   No
(6,10)       0.954434   No
(7,7)        1.000000   No
. . .
(8,9)        0.997503   No
(8,10)       0.991076   No
(9,10)       0.998001   No
(10,10)      1.000000   No
Table : Three evaluators for a single submission peer marking case

Marks        Entropy    Discrimination
(1,1,1)      1.000000   No
(1,1,2)      0.946395   No
(1,1,3)      0.864974   Yes
. . .
(1,1,10)     0.515273   Yes
(1,2,2)      0.960230   No
(1,2,3)      0.920620   Yes
. . .
(1,10,10)    0.775145   Yes
(2,2,2)      1.000000   No
(2,2,3)      0.982141   No
(2,2,4)      0.946395   No
(2,2,5)      0.905713   Yes
. . .
(2,2,10)     0.724834   Yes
(2,3,3)      0.985057   No
(2,3,4)      0.965634   No
(2,3,5)      0.937231   Yes
. . .
(2,3,10)     0.783581   Yes
(2,4,4)      0.960230   No
(2,4,5)      0.943189   Yes
. . .
(2,10,10)    0.850864   Yes
(3,3,4)      0.991159   No
(3,3,5)      0.971307   No
(3,3,6)      0.946395   No
(3,3,7)      0.919432   Yes
. . .
(3,3,10)     0.838779   Yes
(3,4,4)      0.992215   No
(3,4,5)      0.980834   No
(3,4,6)      0.962947   No
(3,4,7)      0.941735   Yes
. . .
(3,4,10)     0.872639   Yes
(3,5,5)      0.977046   No
(3,5,6)      0.965713   No
(3,5,7)      0.950069   No
(3,5,8)      0.932020   Yes
(3,5,9)      0.912733   Yes
(3,5,10)     0.892935   Yes
(3,6,6)      0.960230   No
(3,6,7)      0.949701   No
(3,6,8)      0.936085   Yes
. . .
(3,10,10)    0.901090   Yes
(4,4,4)      1.000000   No
(4,4,5)      0.994737   No
(4,4,6)      0.982141   No
(4,4,7)      0.965401   No
(4,4,8)      0.946395   No
(4,4,9)      0.926262   Yes
(4,4,10)     0.905713   Yes
(4,5,5)      0.995233   No
(4,5,6)      0.987781   No
(4,5,7)      0.975531   No
(4,5,8)      0.960395   No
(4,5,9)      0.943579   Yes
(4,5,10)     0.925863   Yes
(4,6,6)      0.985057   No
(4,6,7)      0.977036   No
(4,6,8)      0.965634   No
(4,6,9)      0.952090   No
(4,6,10)     0.937231   Yes
(4,7,7)      0.972883   No
(4,7,8)      0.964962   No
(4,7,9)      0.954526   No
(4,7,10)     0.942426   Yes
(4,8,8)      0.960230   No
(4,8,9)      0.952684   No
(4,8,10)     0.943189   Yes
(4,9,9)      0.947795   No
(4,9,10)     0.940725   Yes
(4,10,10)    0.935893   Yes
(5,5,5)      1.000000   No
(5,5,6)      0.996512   No
(5,5,7)      0.987817   No
. . .
(10,10,10)   1.000000   No
Table : Four evaluators for a single submission peer marking case

Marks          Entropy    Discrimination
(1,1,1,1)      1.000000   No
(1,1,1,2)      0.960964   Yes
. . .
(1,1,10,10)    0.719748   Yes
(1,2,2,2)      0.975106   No
(1,2,2,3)      0.952820   Yes
. . .
(1,10,10,10)   0.869714   Yes
(2,2,2,2)      1.000000   No
(2,2,2,3)      0.987469   No
(2,2,2,4)      0.960964   Yes
. . .
(2,2,2,10)     0.774397   Yes
(2,2,3,3)      0.985475   No
(2,2,3,4)      0.968130   No
(2,2,3,5)      0.943959   Yes
. . .
(2,2,10,10)    0.825011   Yes
(2,3,3,3)      0.990413   No
(2,3,3,4)      0.979574   No
(2,3,3,5)      0.961012   Yes
. . .
(2,3,3,10)     0.842489   Yes
(2,3,4,4)      0.975032   No
(2,3,4,5)      0.962087   Yes
. . .
(2,3,10,10)    0.858059   Yes
(2,4,4,4)      0.975106   No
(2,4,4,5)      0.966457   No
(2,4,4,6)      0.952820   Yes
. . .
(2,10,10,10)   0.911596   Yes
(3,3,3,3)      1.000000   No
(3,3,3,4)      0.993887   No
(3,3,3,5)      0.979595   No
(3,3,3,6)      0.960964   Yes
. . .
(3,3,3,10)     0.874386   Yes
(3,3,4,4)      0.992614   No
(3,3,4,5)      0.982798   No
(3,3,4,6)      0.968139   No
(3,3,4,7)      0.950753   Yes
. . .
(3,3,4,10)     0.892738   Yes
(3,3,5,5)      0.977217   No
(3,3,5,6)      0.966402   No
(3,3,5,7)      0.952438   Yes
. . .
(3,3,10,10)    0.889675   Yes
(3,4,4,4)      0.994949   No
(3,4,4,5)      0.988609   No
(3,4,4,6)      0.977124   No
(3,4,4,7)      0.962563   Yes
. . .
(3,4,4,10)     0.911059   Yes
(3,4,5,5)      0.985668   No
(3,4,5,6)      0.977343   No
(3,4,5,7)      0.965648   No
(3,4,5,8)      0.951851   Yes
(3,4,5,9)      0.936781   Yes
(3,4,5,10)     0.920993   Yes
(3,4,6,6)      0.972005   No
(3,4,6,7)      0.963060   Yes
. . .
(3,4,10,10)    0.910898   Yes
(3,5,5,5)      0.985412   No
(3,5,5,6)      0.979649   No
(3,5,5,7)      0.970323   No
(3,5,5,8)      0.958679   Yes
(3,5,5,9)      0.945546   Yes
(3,5,5,10)     0.931489   Yes
(3,5,6,6)      0.976362   No
(3,5,6,7)      0.969354   No
(3,5,6,8)      0.959845   Yes
. . .
(3,5,10,10)    0.925051   Yes
(3,6,6,6)      0.975106   No
(3,6,6,7)      0.970036   No
(3,6,6,8)      0.962337   Yes
(3,6,6,9)      0.952820   Yes
(3,6,6,10)     0.942054   Yes
(3,6,7,7)      0.966832   No
(3,6,7,8)      0.960895   Yes
. . .
(3,6,10,10)    0.934107   Yes
(3,7,7,7)      0.965203   No
(3,7,7,8)      0.960771   Yes
. . .
(3,10,10,10)   0.940186   Yes
(4,4,4,4)      1.000000   No
. . .
(4,4,4,7)      0.975240   No
(4,4,4,8)      0.960964   Yes
(4,4,4,9)      0.945460   Yes
(4,4,4,10)     0.929278   Yes
(4,4,5,5)      0.995538   No
. . .
(4,4,5,8)      0.967358   No
(4,4,5,9)      0.953830   Yes
(4,4,5,10)     0.939413   Yes
(4,4,6,6)      0.985475   No
(4,4,6,7)      0.978033   No
(4,4,6,8)      0.968130   No
(4,4,6,9)      0.956583   Yes
(4,4,6,10)     0.943959   Yes
(4,4,7,7)      0.972830   No
(4,4,7,8)      0.965010   No
(4,4,7,9)      0.955381   Yes
. . .
(4,4,10,10)    0.931560   Yes
(4,5,5,5)      0.996887   No
. . .
(4,5,5,8)      0.974732   No
(4,5,5,9)      0.962898   Yes
(4,5,5,10)     0.950011   Yes
(4,5,6,6)      0.990702   No
. . .
(4,5,6,9)      0.966466   No
(4,5,6,10)     0.955154   Yes
(4,5,7,7)      0.981072   No
(4,5,7,8)      0.974541   No
(4,5,7,9)      0.966119   No
(4,5,7,10)     0.956366   Yes
(4,5,8,8)      0.969735   No
(4,5,8,9)      0.962932   Yes
. . .
(4,5,10,10)    0.945401   Yes
(4,6,6,6)      0.990413   No
. . .
(4,6,6,9)      0.970951   No
(4,6,6,10)     0.961012   Yes
(4,6,7,7)      0.983882   No
(4,6,7,8)      0.978703   No
(4,6,7,9)      0.971555   No
(4,6,7,10)     0.962989   Yes
(4,6,8,8)      0.975032   No
(4,6,8,9)      0.969312   No
(4,6,8,10)     0.962087   Yes
. . .
(4,6,10,10)    0.954306   Yes
(4,7,7,7)      0.982839   No
. . .
(4,7,8,10)     0.965701   No
(4,7,9,9)      0.968471   No
(4,7,9,10)     0.963444   Yes
(4,7,10,10)    0.959517   Yes
(4,8,8,8)      0.975106   No
. . .
(10,10,10,10)  1.000000   No
Normalization Algorithm

PreCondition: for n = 2, Entropy ∈ [0.92000, 1.00]; for n = 3, Entropy ∈ [0.94500, 1.00]; otherwise, for n ≥ 4, Entropy ∈ [0.96500, 1.00]; and roundoff = 0.50
Marks[i][j] = marks given by student j, where j ∈ [1, n], for question i
Total_Marks[i] = total marks given by the n students for question i
Max_Marks[i] = maximum of the n marks given by the n students for question i
Avg_Marks[i] = average of the n marks given by the n students for question i
Data: t, Marks[t][n]
Result: Normalized_Marks[t]

The rest of the algorithm is as follows.
Algorithm 3: Normalization Algorithm
while i ≤ t do
    Entropy = −Σ_{j=1}^{n} P(xj) log2 P(xj), where P(xj) = Marks[i][j] / Total_Marks[i]
    if Entropy in range then
        Avg_Round[i] = Avg_Marks[i] + roundoff
        Diff = Max_Marks[i] − Avg_Round[i]
        if Diff < 1 then
            Normalized_Marks[i] = Max_Marks[i]
        else if 1 ≤ Diff < 2 then
            Normalized_Marks[i] = ⌊Avg_Round[i] + 0.5⌋
        else
            Normalized_Marks[i] = ⌊Avg_Round[i] + 0.5⌋ + 1
        end
    else
        Report Discrimination; break
    end
    i = i + 1
end
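A Python sketch of Algorithm 3 for a single question (names are illustrative; the entropy is normalized by log2(n) to match the tables, which is an inferred assumption rather than something stated in the slides):

```python
import math

def entropy_lower_bound(n):
    """Lower end of the acceptable entropy range (precondition above)."""
    if n == 2:
        return 0.92
    if n == 3:
        return 0.945
    return 0.965  # n >= 4

def normalize_question(marks, roundoff=0.5):
    """Return the normalized mark, or None to report discrimination."""
    n, total = len(marks), sum(marks)
    entropy = -sum((m / total) * math.log2(m / total)
                   for m in marks if m > 0) / math.log2(n)
    if entropy < entropy_lower_bound(n):
        return None  # entropy out of range: report discrimination
    avg_round = total / n + roundoff
    diff = max(marks) - avg_round
    if diff < 1:
        return max(marks)
    if diff < 2:
        return math.floor(avg_round + 0.5)
    return math.floor(avg_round + 0.5) + 1
```

For marks (5, 6) the entropy is about 0.994, in range; Avg_Round = 6.0 and Diff = 0, so the normalized mark is the maximum, 6. For (1, 10) the entropy 0.439 falls below 0.92 and discrimination is reported.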
Our System
We will see each entity in coming slides.
Instructor Side Features
Can create a course and enroll students.
Can upload study materials for a course.
Can create questions and set a paper by selecting questions for an exam.
Schedules an exam, as well as an evaluation for a finished exam.
Views students' performance in a course.
Evaluates reviews and changes grades as they deem fit.
Student Side Features
Views the list of courses and enrolls in a course.
Can write exams, and can re-attempt a running exam any number of times.
Can evaluate the submissions assigned to them, and re-evaluate as many times as they want until the evaluation deadline is reached.
Can leave any course they are enrolled in.
Can raise a crib (re-evaluation request), if any.
Views previous submissions, if any, and the paper's answers for exams not submitted.
Views the marks they gave to peers in any scheduled evaluation of an exam.
Figure : Instructor Create Question Page
Figure : Instructor Create Paper Page
Figure : Instructor Schedule Exam Page
Figure : Peer Evaluation Schedule
Figure : Result Page for Instructor
Figure : Student Exam Page
Figure : Student Attempt Exam Page
Figure : Perform Peer Evaluation
Figure : Result Page for Student
Conclusion
There should be some human intervention to cross-check the effectiveness of the system.
A comparative study of peer evaluation marking and instructor marking on all aspects (i.e., concept understanding, academic performance, activeness towards problem solving) is needed to establish the reliability of the peer evaluation system.
Some guidelines for performing better peer evaluation should be given.
Future Work
Render a list of model answers for a question while performing peer evaluation for that question.
Citation of solutions and sites as feedback to defend the marking.
Integration of other normalization techniques into the system.
Integration of our peer evaluation system as a grading system for MOOCs, alongside edX ORA.
Thank you all for your attention.
References I
[Ent14] Entropy (information theory) — Wikipedia, the free encyclopedia, http://en.wikipedia.org/wiki/Entropy_%28information_theory%29 [Online; accessed 15-May-2014].
[Les11] Annick Lesne, Shannon entropy: a rigorous mathematical notion at the crossroads between probability, information theory, dynamical systems and statistical physics.