Source: stela-project.eu/files/O8-reportCaseStudy2.1b.pdf · 2017. 5. 15.

Case study 2.1b:

Feedback on Academic Achievement First Semester

KU Leuven, 2016-2017

Written by

Tinne De Laet, Tom Broos

KU Leuven, Belgium

STELA Erasmus+ project (562167-EPP-1-2015-1-BE-EPPKA3-PI-FORWARD)


"The European Commission support for the production of this publication does not constitute an endorsement of the contents which reflects the views only of the authors, and the Commission cannot be held responsible for any use which may be made of the information contained therein."


Contents

1 Introduction
2 Situation of this Study
   2.1 Aims and objectives
   2.2 Six critical dimensions framework
3 Self-reflection dashboard regarding academic achievement
   3.1 First-semester academic achievement
   3.2 Dashboard Design
      3.2.1 Introduction tab (Inleiding)
      3.2.2 Grades tab (Scores per vak)
      3.2.3 Global achievement tab (Globaal)
      3.2.4 Tips
      3.2.5 KU Leuven study progress regulation (Regelgeving)
   3.3 Text parameterization
   3.4 Data Sources and System Infrastructure
   3.5 Involvement of practitioners
4 Results
   4.1 Target Group and Data Collection
   4.2 Dashboard Interaction
   4.3 Feedback
   4.4 Student Profile
5 Discussion, Conclusion, and Future Work


1 Introduction

This document reports on a case study performed at KU Leuven at the end of the first semester of the academic year 2016-2017. It discusses the implementation and evaluation of a Learning Analytics (LA) intervention, which resulted in a student dashboard that provided 1600 first-year STEM (Science, Technology, Engineering, and Mathematics) students at KU Leuven with feedback on their academic achievement in the first examination period.

Similar to the other interventions at KU Leuven, the discussed intervention is an attempt to bridge the gap between LA research and the daily practice of supporting students in their first year of higher education. Therefore, the dashboard was developed in close collaboration with practitioners from the involved faculties. While the study advisors from the Faculty of Engineering Science were the main support during the development, study advisors from the Faculty of Bio-Engineering, the Faculty of Engineering Technology, and the Faculty of Science provided feedback in later development phases.

2 Situation of this Study

In this section, we explain the aims and objectives of the study. The study is part of the STELA Erasmus+ project (http://stela-project.eu). It is one of the interventions, as part of case study 2, carried out at KU Leuven within the framework of the project.

2.1 Aims and objectives

The aim of this study is to learn about the use of a dashboard in a realistic context to provide first-year students in higher education with feedback on, and evoke self-reflection regarding, their academic achievements in the first semester. This is translated into the following objectives, which are the same as the objectives of the earlier intervention on learning skills.

(1) To demonstrate and test the feasibility of a scalable approach to learning analytics, targeting a sizable group of students in STEM study programs.

(2) To construct a dashboard based on information that is readily available within the institution, but not yet shared with students.

(3) To collect user metrics and feedback to assess perceived usefulness and usability and to uncover areas for further research.

Furthermore, the goal was to continue working towards the goals set within the KU Leuven context:

(1) Develop a second student-facing learning analytics dashboard, in close collaboration between researchers on learning analytics and practitioners, i.e. study advisors and tutors.

(2) Further develop the local infrastructure in close collaboration with the KU Leuven IT services, embedded within the university systems, such that potential scalability is maximized.

2.2 Six critical dimensions framework

We use the holistic design framework with six critical dimensions for learning analytics by Greller and Drachsler [1] to describe our learning analytics intervention. The application of the framework eases "comparison of context parameters with other similar approaches in other contexts, or for replication of the scientific environment" [1].

The stakeholders, both data clients and data subjects, are first-year students. The intervention uses two populations. Firstly, all students in the first year of a particular bachelor program, such that a student's academic achievement in the first semester can be compared to that of his/her peer students. Secondly, first-year students in previous academic years, at least 5 years ago, such that the relation between a student's academic achievement in the first semester and overall study success (measured as the number of years needed for graduation, or drop-out) can be shown.

Furthermore, and similar to the earlier interventions, study advisors involved in the study received access to the dashboard, which randomly sampled a student from the database. However, the data shown to the study advisors was anonymous: the study advisors could not derive from the dashboard which student's data was shown. This access allowed the study advisors to explore the dashboard as a student would. On each new access, a new student was sampled, such that different 'student profiles' could be explored.

The objective of the dashboard is to unveil information on academic achievement in the first semester to students. The data is available in the central university databases. Students already receive feedback on their own academic achievement in the first semester, but they received no information on their position with respect to peers or the possible impact on their later study pathway. The dashboard combines reflection and prediction [1]. Concerning reflection, students receive feedback on their academic achievement in the first semester and a comparison with peer students (Fig. ??). As such, a student can reflect on the data, start a critical self-reflection, and improve his/her self-knowledge (also see: Quantified Self [3]). Furthermore, the feedback includes tips and links to support for improving academic achievement.

Specifically, the dashboard explicitly points students to existing support within KU Leuven: private appointments with their study advisors, private appointments with the central study advice center, training programs at KU Leuven, and available online information and remediation.

Additionally, the objective of providing access for study advisors was to ensure that study advisors are aware of all information provided to students. The dashboard also unveiled information previously not available to study advisors: the relation between academic achievement in the first semester and overall study success (measured as the number of years needed for graduation, or drop-out).

Concerning prediction, the dashboard uses a "mild" form. To make students reflect on the importance of academic achievement in the first semester, the overall study success of the students of earlier cohorts is shown in relation to their academic achievement in the first semester (Fig. ??).

As the raw underlying data is shown directly, no machine learning is used. As indicated above, this information was new to both students and study advisors.

This intervention takes advantage of linking the data on academic achievement in the first semester and overall academic achievement of students. While all data is available in the university's data warehouse, the link between first-semester academic achievement and overall academic achievement was not made before. Regarding instruments, the intervention does not rely on advanced technology and rather provides a visualization of the underlying raw data to students.

Now, we discuss internal and external limitations. Regarding conventions, both privacy and ethics are important. Approval to collect the data, connect it from different databases, and use it to provide feedback was obtained from the vice-rector of student affairs of KU Leuven. The dashboard was presented to all Program Advisory Committees of the study programs involved in the intervention. They provided approval for the intervention.

The ethical soundness of the intervention was supported by the inclusion of study counselors and advisors in the development of the dashboard.

Regarding the time scale, the intervention aimed at being just-in-time: students received feedback the day after the official first-semester grades were available in the university database. This is also the day after they received an official notification of their grades for the courses of the first semester. Therefore, students already knew their grades before receiving an invitation for the dashboard. The timing differed significantly between programs, as programs upload their grades to the university database at different times. The programs from the Faculty of Science (CBBGG and WIF) in fact deliberately delay the uploading of grades to the official university database. This allows them to personally invite students to communicate their grades, before students receive their grades through the KU Leuven system.

Regarding the limitations of first-year students in interpreting LA data, the dashboard uses simple visualizations complemented with textual explanations. The textual explanations provide additional support for interpreting the simple visualizations, provide additional support on self-reflection, refer to additional support within KU Leuven, and explain important KU Leuven regulation regarding study progress.

3 Self-reflection dashboard regarding academic achievement

3.1 First-semester academic achievement

The dashboard focuses on providing feedback and evoking self-reflection regarding first-semester academic achievement. The first-semester academic achievement that is the subject of this intervention consists of the grades for the courses of the first semester of the first year of the program.

3.2 Dashboard Design

An interactive dashboard was created to provide feedback to students about their first-semester academic achievement, to evoke self-reflection, and to provide them with contextual and actionable information. Fig. 1 provides a screenshot of a student's view on the dashboard. The main components of the dashboard are introduced below.

To accommodate the different programs and their particular needs, all feedback text was parameterized (name of the study program, contact person, URL for additional support). Furthermore, the dashboard allows displaying different texts depending on the program (when the needs are very specific), as elaborated below in the Text parameterization section. Importantly, and new for this intervention, the definition of feedback categories (based on study efficiency in the first semester) is also program-dependent.

The dashboard is divided into five tabs.

3.2.1 Introduction tab (Inleiding)

On access, the first tab (Fig. 1) shows an introductory text, explaining the purpose and components of the intervention and the origin of the data it is based on. More specifically, the introduction mentions that the platform wants to evoke self-reflection. A warning was added that the grades shown in the dashboard are not the official grades. If students observed differences from the official grades, they were requested to send an email to a contact person. Differences are only expected when the grades of students are changed after the official announcement (e.g. due to administrative errors or appeals by students).

3.2.2 Grades tab (Scores per vak)

The second tab (Figure 2) contains a list of all the courses of the first semester of the first year that the student booked in the program, supplemented with the grade obtained for each course.

The student is invited to answer two multiple-choice questions for each course:

(a) The score is (a lot lower, lower, similar, higher, a lot higher) than what I expected after the exam.

(b) I am (very unsatisfied, unsatisfied, neither satisfied nor unsatisfied, satisfied, very satisfied) with this result.

When the student answers these two questions, he/she can click the button 'Show the grades of my peers'. This triggers a visualization that positions the student's grade with respect to the other students that booked the course. The students are divided into four groups depending on their grades.

A simple unit chart (Fig. 2) uses dots to represent the number of students within the respective grade groups of a course. Each dot represents a single student. The grade group that applies to the active student is marked with a blue border and background hatching.

Next, the student can click 'Position my grade on this graph', which results in the highlighting (blinking of unfilled dots) of the students who obtained the same grade for the course.

The decision to show the grades of peers, and to position the grade on the graph, only upon specific request was made after feedback from practitioners. They indicated that students might not be interested in comparing themselves to peers, but might rather judge their results independently of peers. As this is not considered a bad practice, the grades of peers and the positioning with respect to peers are only shown upon request.

Figure 1: Screenshot of a student's view on the dashboard when entering the dashboard.

To ease the interpretation of the unit charts, dots are grouped in clusters of 100 on a 10x10 grid.
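As a sketch of this layout rule, each student index can be mapped to a cluster and a cell on a 10x10 grid. The function name and signature below are illustrative, not taken from the project's code:

```python
from typing import List, Tuple

def unit_chart_positions(n_students: int,
                         cluster_size: int = 100,
                         grid_side: int = 10) -> List[Tuple[int, int, int]]:
    """Map each student to (cluster, row, col): one dot per student,
    grouped in clusters of 100, each laid out on a 10x10 grid."""
    positions = []
    for i in range(n_students):
        cluster, within = divmod(i, cluster_size)   # which 100-dot cluster
        row, col = divmod(within, grid_side)        # cell inside the cluster
        positions.append((cluster, row, col))
    return positions
```

For example, 250 students yield two full clusters and one half-filled one, which makes the group sizes easy to read off at a glance.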

3.2.3 Global achievement tab (Globaal)

The third tab (Figure 3) shows the global academic achievement of the first semester in terms of study efficiency (percentage of obtained credits). The positioning is done only for first-year students that are new to higher education. Students that are not in this category receive a warning and are not positioned on the graphs.

A simple unit chart (Fig. 3) uses dots to represent the number of students within the respective study efficiency groups of the program. Each dot represents a single first-year student. The study efficiency group that applies to the active student is marked with a blue border and background hatching.

A second unit chart (Fig. 4) relates the first-semester academic achievement of students of earlier cohorts (at least 5 years ago) of the study program to their overall academic achievement (drop-out or years needed for graduation). Here, each square represents 1% of the students in that study efficiency group. The color of the squares represents the overall academic achievement of these students using four categories (black = drop-out, red = bachelor in 5 years or still studying after 5 years, orange = bachelor obtained in four years, green = bachelor obtained in three years (nominal duration)). The colors are adapted for students with color vision deficiency [2]. To support the interpretation of the graphs, a textual explanation is provided.
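The four-category color coding could be captured as a simple lookup. The hex values below are taken from the Okabe-Ito color-blind-safe palette (the report cites [2] for its color choices); they are an assumption, since the report does not list the exact values used:

```python
# Hypothetical mapping of the four overall-achievement categories to
# color-blind-safe Okabe-Ito palette colors; the report does not state
# the exact hex values used by the dashboard.
ACHIEVEMENT_COLOURS = {
    "drop-out": "#000000",                               # black
    "bachelor in 5 years or still studying": "#D55E00",  # vermillion ("red")
    "bachelor in 4 years": "#E69F00",                    # orange
    "bachelor in 3 years (nominal duration)": "#009E73", # bluish green
}

def dot_colour(category: str) -> str:
    """Return the unit-chart color for an overall-achievement category."""
    return ACHIEVEMENT_COLOURS[category]
```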

The three categories of study efficiency defined above depend on the program and are defined by minimizing the number of false positives (at most 10%) while still obtaining well-balanced groups. Furthermore, the definition of the study efficiency boundaries can be adapted to better conform with the KU Leuven regulation on study progress.
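One way such program-specific boundaries could be chosen is a brute-force scan over candidate cut-offs: keep only boundary pairs whose top group contains at most 10% eventual non-graduates ("false positives"), and among those prefer the most balanced group sizes. This is a sketch under stated assumptions; the report does not give its exact optimization procedure:

```python
from itertools import combinations

def choose_boundaries(records, candidates=range(10, 100, 10), max_fp=0.10):
    """Pick (lo, hi) study-efficiency cut-offs for three groups.

    records: list of (study_efficiency_pct, graduated_in_time: bool).
    Among pairs whose top group has a false-positive rate <= max_fp,
    return the pair with the most balanced group sizes.
    """
    best, best_balance = None, float("inf")
    for lo, hi in combinations(sorted(candidates), 2):
        low = [r for r in records if r[0] < lo]
        mid = [r for r in records if lo <= r[0] < hi]
        top = [r for r in records if r[0] >= hi]
        if not top:
            continue
        fp_rate = sum(1 for _, ok in top if not ok) / len(top)
        if fp_rate > max_fp:
            continue
        sizes = [len(low), len(mid), len(top)]
        balance = max(sizes) - min(sizes)  # smaller = better balanced
        if balance < best_balance:
            best, best_balance = (lo, hi), balance
    return best
```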

The textual support was constructed by the practitioners involved in the dashboard development. Explicit care was taken to avoid wrong interpretations of the graphs. The particular text differs depending on the study efficiency group of the student. As such, the textual support is individualized to the academic achievement of students: three different feedback texts are therefore available.

To ease the interpretation of the unit charts, dots and squares are grouped in clusters of 100 on a 10x10 grid.

3.2.4 Tips

The fourth tab (Figure 5) provides detailed textual tips for self-reflection on the first-semester academic achievement. The advice includes simple tips and questions to evoke self-reflection, a reference to the earlier dashboard on learning and studying skills, and an invitation to make a personal appointment with a student adviser. Again, the textual support was constructed by the practitioners involved in the dashboard development.

3.2.5 KU Leuven study progress regulation (Regelgeving)

The fifth tab (Figure 6) provides a summary of the relevant KU Leuven regulation on study progress. Links to more elaborate and complete regulations are provided.

3.3 Text parameterization

Similar to the previous intervention, all textual content is adapted to the study program and situation of the student, based on experience from the field, using text parameterization. We invited study counselors from participating study programs to adapt messages based on their expertise. The same process (division into subparts, use of markdown, and text parameterization) was used. Additionally, the study efficiency categories in the global achievement tab are program-specific. Graphs and textual explanations automatically adapt to the different definitions.
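The parameterization described above can be sketched as a markdown template with per-program substitutions. The placeholder names and template text are hypothetical, not taken from the project:

```python
from string import Template

# Illustrative sketch: feedback text is authored once (in markdown) with
# placeholders, and program-specific values (program name, contact person,
# support URL) are substituted for each study program.
FEEDBACK_TEMPLATE = Template(
    "Your first-semester results for **$program** are shown below.\n"
    "For questions, contact $contact or see $support_url."
)

def render_feedback(program: str, contact: str, support_url: str) -> str:
    """Fill the markdown template with one program's parameters."""
    return FEEDBACK_TEMPLATE.substitute(
        program=program, contact=contact, support_url=support_url
    )
```

Program-specific texts (for very specific needs) would then simply be separate templates selected by program.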


Figure 2: Screenshot of a student's view on the second tab of the dashboard containing the grades.


Figure 3: Screenshot of a student's view on the third tab of the dashboard containing the global academic achievement.


Figure 4: Screenshot of the lower part of the third tab of the dashboard containing the global academic achievement.


Figure 5: Screenshot of the fourth tab of the dashboard containing the tips to evoke self-reflection.


Figure 6: Screenshot of the fifth tab of the dashboard containing the KU Leuven regulations on study progress.


program                              number of students
bio-engineering science                     356
biochemistry and biotechnology              143
biology                                     150
chemistry                                    84
physics                                     149
geography                                    67
geology                                      36
informatics                                 178
engineering science                         542
engineering science: architecture           119
mathematics                                  98
total                                      1922

Table 1: Number of students receiving an invitation to the dashboard in the different study programs.

3.4 Data Sources and System Infrastructure

All data in the intervention is automatically retrieved from the central (SAP) ERP infrastructure, which is the system of record for all official data on students, programs, courses, and results. Similar to the previous intervention, the data was loaded into a relational database using an Extract, Load, Transform (ELT) process.
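A minimal sketch of such an ELT step, assuming a hypothetical schema: raw grade records extracted from the system of record are loaded unchanged into a relational database, and the transformation (here, study efficiency per student) is done afterwards in SQL. Table and column names, and the 10/20 pass threshold, are illustrative; the report does not detail the actual schema:

```python
import sqlite3

def load_and_transform(raw_rows):
    """ELT sketch: load raw (student_id, course, credits, grade) rows
    unmodified, then transform inside the database with SQL."""
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE raw_grades "
                "(student_id TEXT, course TEXT, credits INTEGER, grade INTEGER)")
    # Load step: raw extract goes in unchanged.
    con.executemany("INSERT INTO raw_grades VALUES (?, ?, ?, ?)", raw_rows)
    # Transform step: study efficiency = percentage of credits obtained
    # (assuming a pass mark of 10/20), computed in SQL per student.
    return con.execute(
        """SELECT student_id,
                  100.0 * SUM(CASE WHEN grade >= 10 THEN credits ELSE 0 END)
                        / SUM(credits)
           FROM raw_grades GROUP BY student_id ORDER BY student_id"""
    ).fetchall()
```

Doing the transformation after loading (ELT rather than ETL) keeps the raw extract available in the database for later, different transformations.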

Similar to the previous intervention, the dashboard is accessible indirectly through the university's reverse proxy infrastructure, enforcing authentication by a central single sign-on system (Shibboleth).

3.5 Involvement of practitioners

The development of the dashboard was done in close collaboration with practitioners from the involved faculties. While the study advisors from the Faculty of Engineering Science were the main support during the development, study advisors from the Faculty of Bio-Engineering, the Faculty of Engineering Technology, and the Faculty of Science provided feedback in later development phases. Feedback was requested by means of a letter sent to the study advisors, which can be found in Annex 1.

4 Results

4.1 Target Group and Data Collection

1922 students in 11 different STEM programs received a personalized invitation by email to access the dashboard, stating that it provides feedback and self-reflection opportunities regarding their academic achievement in the first semester. All students that booked at least one first-semester course of the first year received an invitation. Table 1 provides an overview of the number of invited students per program.

Similar to the previous intervention, the dashboard contains a short feedback form with three questions, on top of the background recording of user activity. The questions are:

(1) I find this information useful;

(2) I find this information clear;

(3) This information impacts my satisfaction regarding my academic achievement.

4.2 Dashboard Interaction

879 (46%) of the students clicked on the link in the invitation email and entered the dashboard. The click-through rate differs between study programs and ranges from 22% (geology) to 66% (engineering science) (Fig. 7).

Figure 7: Click-through rate per study program, expressed as the percentage of invited students. The 11 study programs are grouped as follows: Bio-Engineering; CBBGG (Chemistry, Biology, Biochemistry-Biotechnology, Geography, Geology); Engineering Science; Engineering Science: Architecture; and MIP (Mathematics, Informatics, Physics). The width of the bars is proportional to the number of students in the grouped study programs.

Most students clicked through using a desktop browser or a smartphone (Fig. 8). The use of tablets and other media devices was limited. Students using a desktop or tablet spent on average more time on the dashboard than students using a smartphone (Fig. 9).

4.3 Feedback

Although the effort required to answer the three survey questions was minimal, only a minority of the students who accessed the dashboard provided feedback on all three questions. Most of the students that provided feedback indicated that they found the dashboard useful and clear (89%). More than half of the students indicated that the dashboard changed their satisfaction with their academic achievement. Figure 10 summarizes the student feedback.

4.4 Student Profile

The click-through behavior depends on the academic achievement of students. While 55% of the top achievers (upper category of study efficiency, see Section 3.2.3) clicked through, this was only 45% for the middle achievers and 34% for the low achievers (Fig. 11).

5 Discussion, Conclusion, and Future Work

In this report we presented a dashboard that provides feedback on, and evokes self-reflection regarding, academic achievement in the courses of the first semester of the first year. Our aims were to study the feasibility of deploying such dashboards in a scalable way, to assess the potential of available data, and to collect feedback and metrics about utilization, usability, and perceived usefulness. Below, we discuss the results from this study with respect to these aims.

Similar to the first dashboard, the dashboard discussed in this report demonstrates the scalability of the approach. Only data that is readily available in digital format within a typical higher-education institution (grades and study duration) is used. While students received their grades through the official institutional systems, they did not receive information regarding their position with respect to peers and the possible impact on their future study pathway.

This intervention was the second intervention from the project at KU Leuven. While this intervention targeted fewer study programs (11 instead of 13), it targeted more students (1922 students instead of 1406). All students that booked a course of the first semester of the first year received feedback.

Figure 8: Devices used by students.

Figure 9: Minutes spent on the dashboard depending on the device used by students.

Figure 10: Survey responses. Students were asked to provide feedback using the scale (-) 1-2-3-4-5 (+).

Figure 11: Click-through rate for different categories of achievers.

The first intervention was immediately executed at a larger scale: 1406 first-year students from 13 different programs and four different faculties (Bio-Engineering Science, Engineering Science, Engineering Technology, and Science). The text parameterization proved to be a necessary tool to accommodate the differences between study programs and faculties.

From a technical perspective, the same infrastructure was used as with the first intervention. As such, the high investment of time and effort in integrating the technical solution within the KU Leuven systems, made when preparing the first intervention, now paid off.

Similar to the first intervention, we involved domain experts and practitioners early on in the process and relied on them for the preparation and distribution of the dashboard. We enabled student counselors to adapt the messages delivered to the student based on the study program and individual learning skills. As highlighted when discussing the previous intervention, we believe that this approach may enhance the acceptance of the dashboard within the institution, while at the same time improving its overall quality.

The click-through rate differs between study programs. Part of the explanation lies in the timing, which differed significantly between programs. Students were invited to the dashboard a day after their grades were uploaded to the university database. The programs from the Faculty of Science (CBBGG and WIF) deliberately delayed the uploading of grades to the official university database. This allowed them to personally invite students to communicate their grades, before students received their grades through the KU Leuven system. As a result, the science students only received the invitation in the second week of the second semester, after having had the opportunity to collect their grades during a personal conversation with their study advisor and having received the grades through the official KU Leuven system. Therefore, for the science students the invitation to the dashboard was the third moment of feedback on their academic achievement. This, together with the later timing, already after the first week of the second semester, might explain the lower click-through rate.

Further involvement of stakeholders of the respective study programs may help to discover additional reasons for the differences between programs. This is the subject of a follow-up study.

Even more than during the development of the first dashboard, friction was noticed during the dashboard development between the drive for sleekness (mainly coming from the learning analytics researchers and dashboard developers) and completeness and nuance (mainly coming from the practitioners). The resulting dashboard is a compromise between the two: it contains appealing visualizations while still adding elaborate textual information and nuance. Even more textual nuance is provided, as feedback on academic achievement was perceived as being more influential than feedback on learning and studying skills.

Similar to the first dashboard, the proportion of students providing feedback is limited. In a subsequent study, feedback gathered from focus groups may help to complement the embedded feedback instrument. In their feedback, students tend to appreciate the usefulness and clarity of the dashboard. More than half of the students also indicate that the feedback influenced their satisfaction with their result. As part of future research, focus group discussions will be held with students and study advisors to get more qualitative feedback on the platform. Furthermore, it should still be studied whether there are differences in the feedback depending on gender, program, and academic achievement. Especially the effect on low achievers is of potential interest, as study advisors are cautious not to scare these students away from any further interaction.

An interesting finding is that students that click through to the dashboard have higher academic achievement on average. Future dashboards should take into consideration that reaching different target groups may require different approaches and levels of effort, especially when targeting students with lower academic achievement. On the other hand, high-achieving students might benefit from the dashboard as well. These conclusions are in line with the conclusions of the first dashboard.


References

[1] Wolfgang Greller and Hendrik Drachsler. "Translating learning into numbers: A generic framework for learning analytics." In: Educational Technology & Society 15.3 (2012), pp. 42-57.

[2] Masataka Okabe and Kei Ito. "How to make figures and presentations that are friendly to colorblind people". In: University of Tokyo (2002).

[3] Gary Wolf. "Know thyself: Tracking every facet of life, from sleep to mood to pain". In: Wired Magazine 365 (2009).
