Design of a student-centered learning management system

Submitted in partial fulfillment for the degree of Master of Science

Jasper Grannetia (12523542)
Master Information Studies, Faculty of Science, University of Amsterdam
2021-07-02

First Supervisor: Dr Bert Bredeweg, UvA, FNWI, IvI, [email protected]
Second Supervisor: Dr Frank Nack, UvA, FNWI, IvI, [email protected]


Design of a student-centered learning management system
Jasper Grannetia, 12523542
University of Amsterdam
[email protected]

ABSTRACT
Student-Centered Learning gives students control over their own learning. Instead of following a predefined curriculum, in student-centered learning students set their own learning goals and create individualized learning paths to achieve these goals.

Current learning management systems are still firmly rooted in the teacher-centered paradigm. Hence, there is a need for a student-centered learning management system that lets students define, plan, execute, and reflect on their personalized learning paths and performance.

This thesis presents a model for a student-centered learning management system. The model is organized around the goals students set for themselves and incorporates learning analytics to give students and teachers actionable insights that support the learning process.

A student-centered learning management system can help students reach their full potential. It can help students and teachers manage the complexity of creating and tracking highly individualized learning paths.

Keywords: student-centered learning, learning management system, learning analytics.

1 INTRODUCTION

1.1 Digital media in education
The introduction of digital systems has irreversibly changed our educational system [30]. Traces of stakeholders' activities in digital systems can be used to perform statistical and other analytical operations. These operations produce hitherto unavailable insights into the student's study progress, attitudes, and dispositions [56]. Learners and teachers can use these insights to improve the student's learning experience. Opportunities are also presented for educators to evaluate the quality of the learning programs they provide [56]. The science and practice of analyzing educational data to provide actionable insights to stakeholders are now generally referred to as learning analytics [55]. Broadly speaking, learning analytics operations either explain what happened in a past learning situation or predict what will happen in a future learning situation [54, 67]. Explanatory learning analytics are generally presented as visualizations of measured and calculated metrics in learning dashboards [8, 63]. Predictive learning analytics generally focus on predicting student success by applying machine learning methods (e.g. decision tree algorithms and neural network analysis) and data mining techniques (e.g. classification, regression, and clustering) to historical data [4, 13].

1.2 Changing paradigms
The philosophical and pedagogical views of how people learn and what motivates them have changed [32, 37]. Since the 1970s, constructivism has gradually replaced objectivism as the dominant philosophical learning paradigm [29]. Constructivism posits that learners actively construct knowledge and skills by interacting with their environment and by combining new information with existing knowledge [36]. In the constructivist view, every learner brings his own knowledge and prior experiences to the learning process, and every learner will learn something different. This means that students are expected to be actively involved in shaping their learning process, and not just in consuming it [29]. This also means that every learner is motivated differently [65]. This is important because, in the constructivist view, learners take responsibility for their learning process, which requires motivation [36]. Student motivation is therefore a central consideration in the design of constructivist learning programs [36].

Self-Determination Theory [15, 16] defines human motivation in terms of being autonomous or controlled. It shows that people who have control over their choices and actions perform better and are more engaged in what they do [15]. In education, control over the learning process is traditionally given to the teacher and the educational institution [44]. However, research [44] shows that students who are given control over their learning perform better. It was also shown that students benefit when teachers support students' autonomy instead of controlling students' behavior. Research shows that engagement and self-regulated learning are positively associated with students' academic success in online courses [9].

The realization that every student's learning and motivation are unique has led to the concept of Student-Centered Learning (SCL). In SCL, students are empowered to take responsibility for their learning. They autonomously set goals and plan how to achieve these [31]. The role of the teacher changes from being the primary source of knowledge to one of guiding the students in their learning experience [36]. In SCL, teachers provide a learning environment that scaffolds the student's learning [36]. As the student progresses, it is expected that the student's autonomy will increase and the level of scaffolding can be decreased [36].

1.3 Student-Centered Learning
A central insight of SCL is that learners actively construct knowledge when they connect new experiences and insights with the knowledge they already have [29]. This process is facilitated in the learning activities the student participates in. The role of the teacher is to present learners with relevant learning activities and to assist the learner in the learning process itself [36].

Learning is also a social activity. It takes place when individuals engage in dynamic discourse with other learners and with teachers. This offers opportunities for critical reflection upon their own and upon their peers' contributions. In doing so, learners create new knowledge [33].

Learning activities must be authentic and relevant to the real-life situations they represent [33]. Problem-based Learning and Inquiry-based Learning are examples of learning methods that simulate real-life contexts in which learners can construct new knowledge in collaboration with other learners [38].

Every learner’s learning process is essentially unique [29]. In SCL,learners are challenged to become aware of the learning process andthe thinking that takes place in it [33]. When the learner becomesaware of the learning process, he will be able to reflect on it andtake responsibility for it [36].

Learning analytics can support students in the SCL approach by offering meta-cognitive, actionable insights into the learning process [61]. Teachers also require new tools to support the unique learning paths students follow in SCL [12]. Prediction of student performance can help teachers identify students at risk in time for successful intervention [2, 26, 40, 41]. Research suggests that learners' acceptance and use of learning analytics are increased if they are allowed to select and customize the analytics that are relevant to them [46].

A digital system central to today's students' learning is the Learning Management System (LMS) [22, 30], which is defined as "a virtual learning environment that is used to deliver course material, track progress, and conduct assessments for e-learning" [45]. For an LMS to support SCL, it must [33]:

(1) Support active construction of knowledge.
(2) Support social collaboration and negotiation.
(3) Be situated in authentic contexts.
(4) Support meta-cognition.

However, LMSs appear to be firmly rooted in a conventional, course-based paradigm that does not adequately support SCL [3, 19]. Furthermore, many learning analytics implementations are specific to one educational context and are not easily transferred to other contexts [64]. This suggests that there is a need for a Student-Centered LMS that supports SCL and offers customizable learning analytics. The following research question is formulated to address this challenge:

• What are the design characteristics of a student-centered learning management system?

2 METHODS
This study consists of two consecutive phases. First, a requirements survey is conducted to consult educational stakeholders about their requirements concerning the research question. The results of this survey are used to develop a learning management model that 1) supports SCL and 2) provides learning analytics to stakeholders.

2.1 Requirements survey
The requirements survey was explorative and inductive [66]. It consisted of a series of semi-structured interviews [52]. During the interviews, educational stakeholders were first asked to name types of information that would improve the education they provide. When a list of such types of information was compiled, these topics were explored further using a questionnaire. The questionnaire was introduced to ensure that all perspectives on the topics being discussed were considered. The questionnaire was constructed based on the "Six W-questions" stated in the Handbook of Learning Analytics [34] and on Greller's learning analytics framework [23]. The questionnaire (in Dutch) is represented in Appendix A.

The data gathered in the interviews were analyzed by open coding as defined in Grounded Theory [20, 21]. For the vocabulary that grew with each consecutive interview, definitions from DBpedia [14] were used where possible, so that generally agreed-upon concepts were used in the analysis.

The goal of the analysis was to reach the third level of conceptual analysis [20], which is the discovery of core categories in the data. Data collection was concluded when the consecutive analyses revealed that the composition and frequency of the core categories no longer changed with new interviews.

The outcomes of the survey were translated into a set of requirements for the development of a model for an LMS that supports SCL.

2.2 Model development
The development of the learning management model proposed in this study was iterative [39] and Agile [1] in nature. It was conducted in close consultation with educational stakeholders and LMS subject matter experts, and it continuously incorporated changing insights and new ideas. During development, the model was discussed iteratively, first mainly with LMS subject matter experts and later also with four educational stakeholders. The technical feasibility of the model was assessed by two software architects knowledgeable in the field of LMS development. The product of this phase is the learning management model described in Section 4.

3 REQUIREMENTS SURVEY
Ten interviews were conducted in the period from 5 March 2021 to 25 March 2021. They were conducted in online sessions that took between 1.5 and 2 hours. The online sessions were recorded with the consent of the interviewees for later reference. The interviews were not transcribed. The distribution of the roles the interviewees held in their organizations is represented in Table 1. The distribution of the education sectors represented by the interviewees is represented in Table 2.

Role                  Frequency
Teacher               3
Educational Designer  3
Learning Consultant   6
Information Manager   2
Total                 14

Table 1: Represented educational roles

The data analysis was performed on the filled-in questionnaires. This means that only concepts that were selected for further exploration during the interviews were used in the analysis. The data gathered in the interviews were analyzed by open coding [20, 21].


Sector                      Frequency
Secondary Education (VO)    2
Vocational Education (MBO)  5
Higher Education (HBO)      1
Public sector               2
Total                       10

Table 2: Represented organizations by education sector

The analysis of the interviews was done after each interview was concluded.

The prevalence of the core categories found in the data analysis is shown in Table 3. The concept of using learning dashboards is found in all interviews. In eight of ten interviews, the core category 'Student-facing dashboard' is found. In seven of ten interviews, the core category 'Teacher-facing dashboard' is found. Another prevalent core category is 'Personalized Learning', which was found in seven of ten interviews. All other core categories were less prevalent.

This analysis suggests that a need exists for a learning management model that is personalized in nature and that informs stakeholders about the state of the student's learning process through learning analytics-enabled dashboards.

Core category                     Interviews (of 10)
Student-facing dashboard          8
Teacher-facing dashboard          7
Personalized Learning             7
Correlation of student results    2
Prediction of student success     2
Adaptive Learning Content         2
Student activity                  1
Student Content Usage             1
Badges                            1
Effect of didactic interventions  1
Formative assessments             1
Management statistics             1

Table 3: Prevalence of core categories in interviews

3.1 Requirements
Based on the results of the requirements survey and related literature, the following set of requirements was defined for the development of a learning management model that supports SCL. The learning management model:

(1) Empowers students to define study goals [31].
(2) Empowers students to choose how to achieve their goals [31].
(3) Helps students to reflect on and improve upon their performance.
(4) Allows teachers to provide appropriate scaffolding for the student's learning process [36].
(5) Allows stakeholders to assess the student's progress during the learning process [17, 49].
(6) Facilitates efficient communication between stakeholders.
(7) Helps teachers to identify opportunities for pedagogic interventions [2, 26, 40].
(8) Allows educational designers to design reusable learning solutions.

These requirements form the basis for the learning management model that is described in Section 4.

4 THE LEARNING PLAN MODEL
A Learning Plan is a collection of related goals set by or for a student or a group of students. Figure 1 shows the learning plan model that is proposed in this study. This section explains the concepts in this model. The learning plan model was designed to meet the requirements defined in Paragraph 3.1. Appendix B shows the enhanced entity-relationship diagram for this model.

4.1 Goals
A goal is an achievement that is set by or for a student. Goals are achieved by completing one or more digitally performed learning activities or by achieving subgoals. A goal can have either learning activities or subgoals. A goal that has learning activities attached to it is also referred to as a 'leaf goal'.

The data structure of a goal is divided into a set of data scopes (see 'Goal A' in Figure 1). The General Properties scope contains the title, status, and optional start and due dates for the goal. When properties in this scope are changed, a copy of this data scope is retained. This creates a historical log of the data scope that can be used for analytical purposes.
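To make this concrete, the sketch below shows one possible way to retain such a historical log. It is a minimal illustration in Python; the class and field names are hypothetical rather than taken from the model.

```python
from dataclasses import dataclass, field, replace
from datetime import datetime, date
from typing import List, Optional

@dataclass(frozen=True)
class GeneralProperties:
    """One snapshot of a goal's General Properties scope (hypothetical field names)."""
    title: str
    status: str                       # e.g. 'new', 'proposed', 'achieved'
    start_date: Optional[date] = None
    due_date: Optional[date] = None
    recorded_at: datetime = field(default_factory=datetime.utcnow)

class Goal:
    """Holds the current General Properties scope plus a log of earlier snapshots."""
    def __init__(self, properties: GeneralProperties):
        self.properties = properties
        self.history: List[GeneralProperties] = []

    def update_properties(self, **changes) -> None:
        # Retain a copy of the current scope before applying the change,
        # so earlier states stay available for analytical purposes.
        self.history.append(self.properties)
        self.properties = replace(self.properties, recorded_at=datetime.utcnow(), **changes)

goal = Goal(GeneralProperties(title="Write a report", status="new"))
goal.update_properties(status="proposed")   # the 'new' snapshot is kept in goal.history
```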

Goals have a set of Stakeholder Groups that represent the stakeholder types that are involved in the educational process that the goal supports. Stakeholder groups can have users and user groups assigned as members. Goal permissions define the actions that members of a stakeholder group can perform (see Table 4 in Appendix C). Stakeholder groups allow educational designers to model the educational context the learning plan will support. Goals can contain two types of stakeholder group. One type of stakeholder group is defined at the learning plan level and is inherited by all goals in the learning plan, including the users and user groups that populate the stakeholder group. In addition to that, local stakeholder groups can be defined to assign stakeholders specifically to one goal.
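A minimal sketch of such a permission check is shown below, assuming a goal simply carries a collection of stakeholder groups; the permission and group names are hypothetical (Table 4 in Appendix C defines the actual goal permissions).

```python
from typing import Dict, Set

class StakeholderGroup:
    """A stakeholder group with its members and the goal permissions it grants."""
    def __init__(self, name: str, permissions: Set[str]):
        self.name = name
        self.permissions = permissions
        self.members: Set[str] = set()   # user or user-group identifiers

def can_perform(user: str, action: str, groups: Dict[str, StakeholderGroup]) -> bool:
    # A user may perform an action on a goal if any stakeholder group they
    # belong to grants the corresponding goal permission.
    return any(user in g.members and action in g.permissions for g in groups.values())

coach = StakeholderGroup("Coach", {"Approve goal", "Set goal status"})
coach.members.add("teacher-42")
print(can_perform("teacher-42", "Approve goal", {"Coach": coach}))   # True
```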

The list of properties of a goal can be extended with metadata sets. Metadata sets are configurable metadata schemas that can include properties of different data types (e.g. text, numerical, etc.). A goal can contain two types of metadata sets. One metadata set describes the goal for general purposes, e.g. reporting. A second metadata set is applied specifically to define the parameters within which the student can search for learning activities to assign to goals.

Figure 1: Learning plan model

A set of Key Metrics is calculated and stored periodically for every goal. Key metrics are calculated values that represent the state of the learning plan. Key metrics are used in analytical operations. They are generated periodically in sets, which are retained when a newer set is calculated. The complete list of key metrics that are calculated and stored for a goal is represented in Table 6 in Appendix C. Different operations may be used to calculate key metrics. Some key metrics are calculated by simple arithmetic operations; other key metrics may be calculated using statistics or machine learning operations.
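As an illustration, a minimal sketch of a periodically calculated key metric set is given below. The metric codes 'LGAW' and 'LTGO' are reused from the examples later in this thesis, but the goal representation and the calculation rules shown here are hypothetical.

```python
from datetime import date, timedelta
from typing import Dict, List, Optional, Tuple

# Hypothetical goal records: (status, due_date, achieved_on)
Goal = Tuple[str, date, Optional[date]]
goals: List[Goal] = [
    ("achieved", date(2021, 6, 1), date(2021, 5, 30)),
    ("achieved", date(2021, 6, 10), date(2021, 6, 9)),
    ("new", date(2021, 5, 20), None),      # past its due date and not achieved
    ("new", date(2021, 7, 1), None),
]

def calculate_key_metrics(goals: List[Goal], today: date) -> Dict[str, int]:
    """Calculate one key metric set; earlier sets would be retained for analysis."""
    week_ago = today - timedelta(days=7)
    return {
        "TOTAL_GOALS": len(goals),
        "LGAW": sum(1 for s, _, a in goals if s == "achieved" and a and a >= week_ago),
        "LTGO": sum(1 for s, d, _ in goals if s != "achieved" and d < today),
    }

print(calculate_key_metrics(goals, today=date(2021, 6, 11)))
# {'TOTAL_GOALS': 4, 'LGAW': 1, 'LTGO': 1}
```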

One or more competencies defined in a competency set can be assigned to a goal. Relationships between the goal and competencies can be used in assessments and analytics.

Finally, a set of Completion Rules defines the requirements for achieving the goal. The completion rules specify either that all subgoals or attached learning activities must be completed, or which specific subgoals or attached learning activities must be completed, for the goal to be considered achieved.
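A minimal sketch of how such completion rules could be evaluated is shown below; the rule representation is hypothetical and only distinguishes the two cases described above ('all' versus a specific subset).

```python
from typing import Dict, Optional, Set

def goal_achieved(children: Dict[str, bool], required: Optional[Set[str]] = None) -> bool:
    """children maps each subgoal or attached learning activity to its completion state.
    If required is None, the rule is 'all children must be completed'; otherwise only
    the listed children must be completed for the goal to count as achieved."""
    if required is None:
        return all(children.values())
    return all(children.get(name, False) for name in required)

# Example: Goal A has two subgoals; the completion rule only requires 'Goal A1'.
print(goal_achieved({"Goal A1": True, "Goal A2": False}, required={"Goal A1"}))  # True
print(goal_achieved({"Goal A1": True, "Goal A2": False}))                        # False
```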

4.2 Learning Plan
A learning plan is a collection of coherent goals set by or for a student or a group of students. The data structure of a learning plan is divided into a set of data scopes (see Figure 1).

The General Properties scope contains general properties like the title of the learning plan and optional start and due dates. When properties in this scope are changed, a copy of this data scope is retained. This creates a historical log that can be used for analytical purposes.

The stakeholder groups defined at the learning plan level serve two purposes. First, they define which operations members of the defined stakeholder groups are allowed to perform at the learning plan level; for example, these permissions define whether a member of a stakeholder group is allowed to add goals to the learning plan. The second purpose of the stakeholder groups is inheritance: all stakeholder groups defined at the learning plan level, including their members, are automatically inherited by all goals in the learning plan. The permissions set for stakeholder groups at the learning plan level are not inherited by the goals in the learning plan.

Metadata can be defined at the learning plan level to extend the properties of the learning plan. A learning plan can contain two types of metadata. One metadata set is for general purposes (e.g. for use in reports). A second metadata set can be defined as global search parameters. These parameters are automatically included in the search criteria when a stakeholder searches for learning activities. This allows educational designers to impose global limits on the search domain within which the student can search for learning activities.

A set of Key Metrics is periodically calculated and stored for the learning plan. Key metrics represent the state of the learning plan. A list of key metrics for learning plans is represented in Table 7 in Appendix C. As with goals, key metrics offer a normalized expression of the state of the learning plan. Some key metrics can be calculated by simple arithmetic, but advanced operations can also be applied to calculate advanced key metrics. Section 7 describes how key metrics are used to inform and alert stakeholders of the state of the learning process.


An example of an advanced key metric is the prediction of the probability of student success in the learning plan. This key metric could help teachers identify underperforming students in time to offer assistance. The probability is calculated using predictive machine learning techniques, which apply predictive modeling to historical performance data of students who preceded the current students in comparable situations [13, 26, 43]. Decision Tree Analysis [25] is the most prevalent technique used for this purpose [13]. Other techniques include the Naive Bayes Classifier, Support Vector Machines (SVM), and Neural Networks [13]. Simple key metrics could potentially be used as input for the calculation of advanced key metrics. Which predictive modeling technique will perform best in a particular educational context is beyond the scope of this study; this can only be determined when the learning plan model is implemented and historical student performance data become available.
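The sketch below illustrates how such a predictive key metric could be produced with a decision tree, assuming scikit-learn is available and that historical learning plans have already been reduced to a small table of simple key metrics. The feature set and the data are fictitious and only serve to show the shape of the calculation.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Fictitious historical data: one row per completed learning plan.
# Features: [goals achieved in the last week, overdue goals, average normalized grade 0-100]
X_history = np.array([
    [3, 0, 82],
    [1, 4, 55],
    [2, 1, 68],
    [0, 6, 40],
    [4, 0, 90],
    [1, 5, 48],
])
y_history = np.array([1, 0, 1, 0, 1, 0])   # 1 = learning plan achieved, 0 = not achieved

model = DecisionTreeClassifier(max_depth=3, random_state=0)
model.fit(X_history, y_history)

# Current student: the same simple key metrics taken from the latest key metric set.
current = np.array([[1, 3, 60]])
p_success = model.predict_proba(current)[0][1]   # probability of the 'achieved' class
print(f"Predicted probability of success: {p_success:.2f}")
```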

In SCL, students are empowered to define their own goals, but defining goals in a learning plan from scratch would be impractical due to the large number of properties that goals have. Therefore, at the learning plan level, a collection of goal templates can be defined that can be used to create new goals. Initially, a newly created goal is an exact copy of the goal template. Structurally, goal templates are equal to goals.

A competency set can be assigned to the learning plan. A competency set is a hierarchically organized collection of competencies, which are defined as 'indicators of successful performance in life-role activities' [5]. Competencies are assigned to goals to indicate which competencies are involved in completing the goal. These relationships are used to explain the purpose of the goal to all stakeholders and in formative assessments.

Finally, communication between stakeholders about the learning process is facilitated in the learning plan. Stakeholders can plan appointments, take meeting minutes, and define to-do items for stakeholders. Incorporating these communicative functions into the model is beneficial to the learning process for two reasons.

First, appointments and to-do items can be assigned to stakeholder groups. Items assigned to a stakeholder group 'Coach' will automatically transfer when a new coach is assigned to the student. Second, these communications can be related to goals and competencies and can refer to previous communications. This creates a narrative that describes how the learning plan was conceived, executed, adjusted, and completed. This narrative could support reflection, sense-making, and decision-making.

4.3 Learning Activity
Learning Activities are defined as digitally performed activities that contribute towards achieving goals within the learning plan. Not all learning is done digitally, but records of offline learning activities (e.g. reports, images, video) will have to be uploaded to a goal as evidence if they are to contribute to the learning plan.

Learning activities are either native or external to the host LMS. Native learning activities represent learning functionalities that are included in the LMS. External learning activities reside outside of the LMS and may be accessed by techniques specifically designed to bridge this gap. Examples of such techniques are SCORM [48] and LTI [27].

4.4 Learning Activity Repository
The Learning Activity Repository forms the bridge between goals and learning activities. It performs the following functions:

(1) It allows stakeholders to search for learning activities.
(2) It connects learning activities to goals.
(3) It records the results the student achieves in learning activities.

Figure 2 is a graphical representation of the Learning Activity Repository model.

In the Learning Activity Repository, learning activities (see Paragraph 4.3) are represented as Learning Activity Profiles. A Learning Activity Profile describes the represented learning activity by title, description, and start and end dates (if applicable). It also defines what type of learning activity it concerns and how it can be connected to technically. Finally, the profile includes search parameters that facilitate searching for learning activities by stakeholders. Learning activities are published to stakeholders for enrollment using offerings. Offerings define:

(1) The period within which a learning activity can be selected.
(2) How many students can be enrolled.
(3) Whether a waiting list must be maintained when the maximum number of enrollments is reached.

Students enroll in an offering via an enrollment. When enrollment is completed, the learning activity repository receives, normalizes, and stores:

(1) Progress information.
(2) Grades.
(3) A passed/not passed indication.

Learning activities are black boxes to the learning activity repository. The repository relies on the learning activity to provide this information, but not all learning activities will do so uniformly or even at all. The learning activity repository normalizes and, if necessary, synthesizes this information based on the data returned by the learning activity. Progress is normalized to a percentage (0-100). Passing information is normalized to a boolean. A historical log of changes to these data is retained for analytical purposes.
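As an illustration, a minimal sketch of this normalization step is given below. The field names and the raw result formats are hypothetical, since what a learning activity actually returns depends on the connected system.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class NormalizedResult:
    """Result record stored by the learning activity repository (hypothetical fields)."""
    progress: int             # normalized to a percentage, 0-100
    grade: Optional[float]    # grade as returned by the activity, if any
    passed: Optional[bool]    # normalized to a boolean, if determinable

def normalize_result(raw: dict) -> NormalizedResult:
    # Progress may arrive as a fraction (0-1) or as a percentage (0-100); clamp to 0-100.
    progress = raw.get("progress") or 0
    if progress <= 1:
        progress *= 100
    progress = int(max(0, min(100, progress)))

    # Passing information may be missing; synthesize it from the grade when a
    # pass mark is known for this activity (a hypothetical rule).
    passed = raw.get("passed")
    grade = raw.get("grade")
    if passed is None and grade is not None and "pass_mark" in raw:
        passed = grade >= raw["pass_mark"]

    return NormalizedResult(progress=progress, grade=grade, passed=passed)

# Example: an external activity reporting fractional progress and a grade.
print(normalize_result({"progress": 0.8, "grade": 72, "pass_mark": 55}))
# NormalizedResult(progress=80, grade=72, passed=True)
```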

For specific native types of learning activity that do not return uniform passing information, rules can be defined that determine which assessments within the learning activity must be passed for the learning activity to be considered passed.

4.5 Assessments
The learning plan model supports formative assessment as a way of measuring the student's progress in the learning plan. In these assessments, the student, the teacher(s), and potentially also other stakeholders contribute their perspective on how the student is progressing. These assessments can be based on the competencies that are attached to the goals the student is currently working on, but other scenarios are possible. Generally speaking, the outcomes of the assessment will inform a dialogue between the student and the teachers. In this dialogue, the student and the teachers together determine how the student is doing with regard to achieving the goals he set. This dialogue also helps the student and the teachers to adjust the learning plan so these goals may be achieved.


Figure 2: Learning activity repository

The assessment method described for formative assessment can also be used for summative assessment. Teachers may define to which level a student must master certain competencies for them to be considered passed. Other, less supervised learning scenarios will require the learning plan to be completed automatically based on rules. Currently, the learning plan can only be completed automatically when all goals in the learning plan are achieved.

4.6 Widgets
Learning Analytics Widgets (widgets) are predefined representations of key metrics that can be used in learning analytics dashboards. Learning analytics widgets are included in the learning plan so stakeholders can select and use them in learning dashboards. Section 7 describes widgets in detail.

5 LIFE CYCLE OF A GOAL
The model proposed in Section 4 empowers educational designers to design goals to fit specific educational contexts. Stakeholders are granted goal permissions that fit the pedagogical objectives that the goal supports. Other variables in the goal model include:

(1) Whether the stakeholder can select learning activities.
(2) How the goal is achieved.
(3) What, if any, result information is returned from the attached learning activities.

The life cycle that a typical goal may go through is represented in Figure 3. For each of the phases in this life cycle, the diagram shows:

(1) The goal permissions required.
(2) The status of the goal.
(3) The status of learning activities attached to the goal.
(4) When performance data is potentially returned from learning activities.

The following paragraphs describe each of these phases.

5.1 Instantiation
Goals are instantiations of goal templates (see Paragraph 4.1). Instantiation can take place during the creation of the learning plan or after the creation of the learning plan. In the former case, the goal is part of the learning plan template and is created along with the learning plan. In the latter case, stakeholders add the goal to the learning plan as and when needed; goals are then created as instantiations of goal templates that are included in the learning plan template.

5.2 Definition
Once the goal is created, stakeholders further configure it to reflect the real-world goal it represents. Depending on the permissions the stakeholder is granted, the basic properties (title, description, start date, due date, and competencies) and the advanced properties (stakeholder groups and permissions, metadata, competencies, and completion rules) can be configured. Some of these properties will already have been included in the goal template that was used to instantiate the goal. Next, either subgoals or learning activities are added to the goal.

5.3 Adding subgoals
Goals can be subdivided into subgoals. Figure 1 represents a learning plan that contains a goal ('Goal A') that has two subgoals ('Goal A1' and 'Goal A2'). Like all goals, subgoals are instantiations of goal templates (see Paragraph 5.1).

5.4 Adding learning activities
Adding a learning activity to a goal starts with searching for learning activities in the Learning Activity Repository. Stakeholders can search by:

(1) Text search (targets the learning activity title and description).
(2) Type of learning activity.
(3) Period within which the learning activity is offered.
(4) Competencies assigned to the learning activity.


Figure 3: Life cycle of a goal

The search domain is defined by the global search parameters defined for the learning plan (see Paragraph 4.2) and by the goal's search parameters. This search logic allows educational designers to limit the range of offered learning activities to specific subjects, types of learning activities, etc. Teachers can also add a selection of suggested learning activities to a goal for the student to select from.
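A minimal sketch of how the learning plan's global search parameters and the goal's own criteria could be combined into a single search is shown below; the profile fields, parameter names, and matching rules are hypothetical.

```python
from datetime import date
from typing import Iterable, List, Optional, Set

class ActivityProfile:
    """Hypothetical subset of a Learning Activity Profile used for searching."""
    def __init__(self, title: str, description: str, activity_type: str,
                 competencies: Iterable[str], offered_from: date, offered_until: date):
        self.title = title
        self.description = description
        self.activity_type = activity_type
        self.competencies = set(competencies)
        self.offered_from = offered_from
        self.offered_until = offered_until

def search(profiles: List[ActivityProfile], text: Optional[str] = None,
           activity_type: Optional[str] = None, on_date: Optional[date] = None,
           competencies: Optional[Set[str]] = None,
           global_params: Optional[dict] = None) -> List[ActivityProfile]:
    """Apply the goal's search criteria on top of the learning plan's global
    search parameters (here only a global restriction on activity type)."""
    merged_type = (global_params or {}).get("activity_type", activity_type)
    results = []
    for p in profiles:
        if text and text.lower() not in (p.title + " " + p.description).lower():
            continue
        if merged_type and p.activity_type != merged_type:
            continue
        if on_date and not (p.offered_from <= on_date <= p.offered_until):
            continue
        if competencies and not competencies <= p.competencies:
            continue
        results.append(p)
    return results
```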

When a student enrolls in a learning activity, the enrollment is initially set to the status 'pending approval'. This means that the student is enrolled provisionally and is not enrolled in the actual learning activity (e.g. course) yet. The final step in the definition of a goal is to propose it. Proposing a goal means it is presented to a teacher who was granted permission to approve the goal. Proposing the goal also means it cannot be edited until the stakeholder returns its status to 'new'.

Figure 4: Learning activity search parameters

5.5 Approval
Stakeholders who have the goal permission 'Approve goal' can approve or reject a goal that was proposed by a student. Rejecting the proposal returns it to the status 'new'; the student can then adjust the goal and propose it anew. When the goal is approved, any enrollments for learning activities are automatically set to the status 'enrolled' and the student is now actually enrolled in the learning activity. Approving goals one at a time is not efficient; implementations will have to offer the option to approve more than one goal at a time.

5.6 Adjustment
Stakeholders with appropriate permissions can modify the start and due dates for goals to reflect changes to the planning of the goal. Any other changes to the goal require the goal to be reset to the status 'new'. When all changes are effected, the goal must be approved again. Any enrollments in learning activities are not rescinded during the adjustment of the goal. Enrollments can be canceled if required.

5.7 Completion
Goals can be configured to be completed either automatically or manually. Automatic completion requires all attached learning activities to return a passed indication. Manual completion requires a stakeholder to be assigned the 'set goals status' permission for the goal. Ultimately, the goal will have the status 'Achieved' or 'Not achieved'.

6 LIFE CYCLE OF A LEARNING PLAN
This section describes the life cycle of a learning plan.

6.1 Creating learning plans
Learning plans are created by instantiating a Learning Plan Template. Figure 5 represents this process. Learning Plan Templates are created by educational designers in an application management module, which is out of scope for this study. A Learning Plan Template is structurally equal to a Learning Plan (see Section 4.2). Learning Plan Templates can be instantiated for individual users and user groups. Initially, learning plans are exact copies of the learning plan templates they were instantiated from.


Figure 5: Instantiation of a learning plan template

6.2 Learning plan completion
The model supports both automatic and manual completion of the learning plan. Automatic completion requires all goals in the learning plan to be achieved. Each achieved goal will contribute to completing its parent goal; in this scenario, eventually all goals are achieved and the learning plan is completed. In a manual completion scenario, the summative assessment is performed (see Paragraph 4.5) to determine whether the student passes the learning plan. The status of the learning plan is then changed accordingly. Ultimately, the learning plan will have either the status 'Achieved' or 'Not achieved'.

6.3 Learning plans as part of larger structures
In many cases, learning plans will represent parts of a larger educational program. In these cases, a master learning plan can be designed that represents an overarching learning program. Learning plans can be connected a posteriori to goals in the master learning plan. Figure 6 represents this structure. Being able to assemble master learning plans a posteriori affords students and schools flexibility in creating personalized curricula.

Figure 6: Learning plans as parts of a master learning plan

7 LEARNING ANALYTICS

7.1 Requirements
The SCL approach proposed in this study empowers students to define their own learning. Learning analytics can provide students with actionable insights into the learning process that are otherwise not apparent [61]. The design of student-facing learning analytics must be guided by four principles [7]. Learning analytics must:

(1) Be customizable for the student.
(2) Foreground student sense-making.
(3) Enable the student to identify actionable insights.
(4) Be embedded into educational processes.

In SCL, the student will create an essentially unique learning plan. Determining which students need attention, recognizing which students are at risk of falling behind or even failing, and providing the appropriate interventions is a challenge for teachers [53, 60]. Learning analytics can assist teachers with sense-making and implementing appropriate interventions [60].

Learning analytics are often expressed in visualizations [62] because visualizations are an efficient interface for the transmission of information from the computer to the human brain [50]. Visualizations offer a larger information bandwidth than other channels of information such as text or sound. Also, the human brain is good at recognizing patterns in visual representations [50]. Five key characteristics of good visualizations [58] are that they:

(1) Clearly illustrate a relevant point.
(2) Are tailored to their audience.
(3) Are tailored to the presentation medium.
(4) Are memorable.
(5) Increase the understanding of the subject matter.

These qualities are inseparable from the educational context in which the visualization will be used. Therefore, it is not desirable to define one-size-fits-all visualizations. Rather, the flexibility afforded to stakeholders in designing learning plans must be extended to the design of the learning analytics visualizations used with these plans.

This section introduces the concept of the 'learning analytics widget', or 'widget', to meet this requirement.

7.2 The widget model
A learning analytics widget is an element in the graphical user interface (GUI) that renders visualizations of the state of the learning plan and rule-based notifications. Widgets also include an explanation that helps stakeholders to use the widget correctly.

Widgets are designed to be used in learning analytics dashboards. A learning analytics dashboard is "a single display that aggregates different indicators about learner(s), learning process(es) and/or learning context(s) into one or multiple visualisations" [51]. In the proposed learning management model, students and teachers can select and arrange widgets in personal dashboards. Research [7] suggests that students perceive dashboards more positively if they are allowed to customize the content and presentation of the dashboards. Key features of dashboards will therefore be to allow students to select and arrange widgets and to customize the presentation and functionality of widgets. The design of the dashboards is beyond the scope of this study.

The visualizations and notifications in the widget model are generated from learning plan key metrics (see Paragraph 4.2). While this is only a subset of all learning data available in the model, it represents a collection of clearly defined parameters (e.g. 'total number of goals in the learning plan') that are commonly understood regardless of the educational context the learning plan is used in. Limiting the widget model to learning plan key metrics is a conscious trade-off that allows educational designers to pre-configure widget templates that can be instantiated as widgets in any learning plan (see Paragraph 7.5).

7.3 Visualization
Widgets render specific visualizations of learning plan key metrics and relationships between learning plan key metrics. Depending on the purpose of the visualization, specific functionality will be implemented. Two examples of proposed visualizations are described in Paragraphs 7.6 and 7.7.

7.4 Notifications
The notification canvas is used to render messages to stakeholders based on evaluations of one or more key metrics. See Table 7 in Appendix C for a list of key metrics that are calculated for learning plans.

Notifications are designed to inform, encourage, and warn stakeholders. Multiple notifications may be defined for one widget. Each notification evaluates a statement that can include basic arithmetic operations (addition, subtraction, multiplication, and division) and logical operators (e.g. AND and OR). Notifications render messages in case the evaluation returns true or false. Listing 1 defines the notification syntax. This syntax was adopted because it is also used in Microsoft Excel and is thus assumed to be relatively familiar to stakeholders. Figure 13 in Appendix D shows how notifications are defined in the widget interface.

Listing 1: IF(STATEMENT; message if true; message if not true)

Messages are defined by a notification string and a notification type. The notification type defines how the message is rendered. It can have the values 'info', 'success', 'warning', and 'danger'. Figure 8 shows how notifications could be rendered, although this is ultimately an implementation choice.

Statement 2 renders an encouraging notification if the student achieved at least 3 goals during the past week ('LGAW >= 3'), but not within the first 10 days of the learning plan ('TELD > 10'):

Statement 2: IF(TELD > 10 AND LGAW >= 3; MSG('Doing great!', 'success'))
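To make the semantics of such statements concrete, the sketch below shows a minimal, illustrative evaluator for this notification syntax. It assumes statements contain only key metric codes, numbers, comparison operators, and AND/OR, and that exactly one MSG(...) is given for the true branch; a real implementation would need a proper parser.

```python
import re

def evaluate_notification(statement: str, key_metrics: dict):
    """Return (message, type) if the statement's condition holds, else None.
    Example: "IF(TELD > 10 AND LGAW >= 3; MSG('Doing great!', 'success'))"."""
    match = re.fullmatch(r"IF\((.+?);\s*MSG\('(.+?)',\s*'(\w+)'\)\)", statement.strip())
    if not match:
        raise ValueError("Unrecognized notification statement")
    condition, message, msg_type = match.groups()

    # Translate the Excel-like condition into a Python boolean expression.
    condition = condition.replace(" AND ", " and ").replace(" OR ", " or ")
    for code, value in key_metrics.items():
        condition = re.sub(rf"\b{code}\b", str(value), condition)

    # Evaluate with an emptied namespace so only the substituted literals are used.
    if eval(condition, {"__builtins__": {}}, {}):
        return message, msg_type
    return None

# Fictitious key metric values: 12 days elapsed, 4 goals achieved in the past week.
print(evaluate_notification(
    "IF(TELD > 10 AND LGAW >= 3; MSG('Doing great!', 'success'))",
    {"TELD": 12, "LGAW": 4},
))   # ('Doing great!', 'success')
```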

7.5 Life cycle of a widget
As described in Paragraph 7.2, widgets are designed to be configurable to the requirements of the end-user. End-users should be able to select, use, and configure widgets for their own use. However, it would be redundant to let every end-user configure widgets from scratch. Also, educational designers and teachers should be enabled to prepare widgets for use by stakeholders in specific educational settings. This paragraph describes the life cycle of a widget. Figure 7 is a representation of this life cycle. It defines three domains: the Global Widget Gallery, the Learning Plan Template, and the Learning Plan (see Paragraph 4.2).

Figure 7: Life cycle of a widget

The purpose of the Global Widget Gallery is to facilitate the design and dissemination of widget templates. In the Global Widget Gallery, educational designers create and configure widget templates for specific educational purposes. Separate widget galleries may be defined and published to different user groups. Creating a new widget template in the Global Widget Gallery involves the following steps:

(1) Create a widget template, either by creating a new widget template or by copying an existing widget template.
(2) Select the visualization type (see Paragraph 7.3).
(3) Select key metric statements to be used in the visualization (optional).
(4) Enter explanation text.
(5) Define notifications (optional) (see Paragraph 7.4).
(6) Publish the widget template.

Once a widget template is published, stakeholders can search the Global Widget Gallery for suitable widget templates. Suitable widget templates can then be added to a Learning Plan Template (see Paragraph 4.2) by the educational designer. At this stage, the educational designer may adjust the settings of the widget template to the educational purpose it will be used in. These are the steps involved:

(1) Search the Global Widget Gallery for suitable widget templates.
(2) Select a widget template.
(3) Add the widget template to the learning plan template.
(4) Adjust the widget template (optional).
(5) Publish the widget template for use in the learning plan template and its instances.

Stakeholders with the appropriate learning plan permissions (see Table 5 in Appendix C) can add widgets to their own learning analytics dashboards or other users' dashboards.

Like learning plans and goals, widget templates include stakeholder profiles. If the stakeholder profiles in the widget template match the stakeholder profiles of the learning plan template it is added to, the permissions set for the stakeholder profiles in the widget template are copied automatically to the widget template in the learning plan template. If the stakeholder profiles in the widget template and the learning plan template do not match, the educational designer will first have to resolve the differences between the two sets of stakeholder profiles.


7.6 Data Layer widget
The Data Layer Widget is a student-facing representation of one or more metrics over time. Metrics are rendered in separate overlapping layers, which can be arranged on the z-axis of the graph. Layers can represent:

(1) A learning plan key metric value.
(2) A calculation of key metric values.
(3) A fixed value.

Figure 8 and Figures 11, 12, and 13 in Appendix D are mockup designs of the Data Layer Widget. Figure 8 shows the student's primary view of the widget. In the configuration that is shown, three data layers are presented. Below that, a short explanation of the visualization is given. Finally, four notifications of different types are rendered. Figure 11 in Appendix D shows that the student can switch data layers on and off. This gives the student agency over the visualization.

Learning analytics should enable the student to compare his own performance on learning plan key metrics to three perspectives [18]:

(1) His own prior performance.
(2) The performance of his or her peers.
(3) A reference frame set by the teacher.

The example widget in Figure 8 combines these three perspectives in one visualization. The example is set in a fictitious learning context in which students' progress is monitored by the number of goals achieved over the past week. This metric is represented by the key metric 'LGAW' (see Table 7 in Appendix C). Fictitious data were used in these examples.

In Figure 8, the blue histogram in the visualization represents the number of goals achieved by the student during the learning plan ('his own performance'). The red line represents the average number of goals achieved by the student's peers for the same key metric over the same period ('the performance of his or her peers'). Finally, the green line represents a threshold set by the teacher below which the number of achieved goals should not fall ('a frame set by the teacher').

Figure 12 in Appendix D shows the 'Settings' tab for this widget. This tab is only available to stakeholders who are granted the widget permission 'Manage settings'. In this tab, the stakeholder can change the title, the widget explanation, and the definitions of the data layers presented in the widget. The layers can be rendered as histograms or as line graphs (see Figure 12 in Appendix D). For every layer in the visualization, the scope of the data that is represented can be defined; a sketch of such a layer configuration follows the list below. This scope limits the data used in the rendering to:

(1) The student's own data ('self').
(2) The data of the peers with whom the student works in groups ('peer groups').
(3) All peers who share the same learning plan template ('all peers').

Figure 13 in Appendix D shows how students and teachers can have different permissions for the widget. This figure also demonstrates how notifications are defined in the widget.

Figure 8: Data Layer widget

7.7 Correlation widget
The Correlation widget represents Pearson's correlation [10] between two learning plan key metrics. The widget can render the correlation for a specific date, and it can render the correlation over the course of the learning plan. The widget can also be used to show how the correlation differs between separate groups who work on the same learning plan. Figure 9 and Figures 14, 15, and 16 in Appendix E are mock-up designs for this widget.

The widget is designed for use by teachers and researchers. Researchers may use the widget in experiments to explore differences between experimental and control groups. Teachers may use the widget to quickly explore relationships between key metrics for the body of students who work on the learning plan and to spot differences between student groups.

For a fictitious educational context, Figure 9 shows how the widget can render the correlation between x) the average normalized grade (scaled to 1-10) as returned by learning activities and y) the number of overdue goals in the student's learning plan at a particular date. Fictitious data were used in these examples.

The user can select the date that is represented with the slider placed directly below the graph area. The widget can calculate correlations for all students who work on a learning plan or differentiate between the student groups ('X1A' and 'X1B') involved. Finally, the user can select which groups to include in the graph.

Figure 9: Correlation widget

Figure 14 in Appendix E represents the distribution of Pearson's correlation coefficient as a function of time. The widget can graph the correlation coefficient separately for the student groups ('X1A' and 'X1B') involved and for the total body of students who work on the learning plan.

Figure 15 in Appendix E shows how users with the permission 'Manage settings' can select and label the key metrics that the widget will correlate. The user may enter the key metric code (see Table 7 in Appendix C) for the key metrics to be included, or enter a statement in which a calculation over key metrics is performed.

The key metric Average Normalized Grade ('LANG') is stored as a value on a scale from 0-100. In the definition of the x-axis, this value is divided by 10 ('LANG/10') because Dutch students are accustomed to this scale for grading. For the y-axis, the key metric Number of Overdue Goals ('LTGO') is entered.
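To make the calculation behind this widget concrete, a minimal sketch is given below. It assumes NumPy is available and uses fictitious per-student values for the two key metrics 'LANG' and 'LTGO'.

```python
import numpy as np

# Fictitious key metric values, one entry per student working on the learning plan.
lang = np.array([82, 55, 68, 40, 75, 60])   # Average Normalized Grade, scale 0-100
ltgo = np.array([0, 4, 1, 6, 1, 3])         # Number of Overdue Goals

# The x-axis is rescaled to the Dutch 1-10 grading scale, as in the widget settings.
x = lang / 10
y = ltgo

# Pearson's correlation coefficient between the two key metrics.
r = np.corrcoef(x, y)[0, 1]
print(f"Pearson's r between grade and overdue goals: {r:.2f}")
```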

8 CONCLUSION
The goal of this study was to explore how Learning Management Systems (LMS) can support Student-Centered Learning (SCL). A survey of educational stakeholders confirmed that a need for a student-centered LMS exists. This survey showed that personalization and learning analytics dashboards are important characteristics of such a system.

To address this requirement, a learning management model was developed that organizes learning differently from existing LMSs. It organizes the learning process around the goals that students set for themselves instead of around a teacher-defined curriculum. The model allows teachers to gradually give students more autonomy over their learning as the students' meta-cognitive skills develop.

An important characteristic of a student-centered LMS is that it can be tailored to the educational setting it is used in. In the model proposed in this study, educational designers use freely configurable stakeholder groups and permissions to configure learning plans that offer an appropriate balance between student autonomy and teacher support.

The proposed learning management model was designed with learning analytics in mind. Clearly defined and meaningful learning data is calculated and stored throughout the model. The model includes historical records of changes to the learning plan, the underlying goals, and the student's performance in learning activities. Also, a set of key metrics that capture the state of the learning plan is calculated periodically.

In SCL, students require tools that provide feedback on their learning process. As students pursue highly individualized learning plans, teachers also require new tools to track the students' progress and to implement timely interventions. This study shows how educational designers can prepare learning analytics functionality for students and teachers and how students and teachers can personalize these learning analytics to fit their information requirements.

9 DISCUSSION
The requirements survey conducted in this study involved a relatively small set of stakeholders. This limits the generalizability of the results. Future research among a larger group should be conducted to improve the reliability of this study. In addition, introducing multiple reviewers and determining intercoder reliability could increase the generalizability of the results.

No students were involved in this study. Now that a model for a student-centered LMS is proposed, future research should be conducted among students to investigate how students perceive this approach to SCL.

Research shows that learning analytics can support students and teachers in the learning process [12, 36, 61]. However, successfully introducing learning analytics in practice is challenging [6, 28, 35]. Future experiments should be conducted to validate the Learning Analytics Widget concept proposed in this study.

The introduction of any learning analytics implementation raises ethical issues [42, 47, 57]. In particular, privacy [11, 24] is a central concern. Future research should determine whether the model as it is now proposed allows stakeholders to incorporate adequate data protection. It is advisable to perform a Data Protection Impact Assessment (DPIA) [59] to determine whether the stakeholders' privacy is addressed adequately.

To ensure the full potential of the proposed solution is realized, it is recommended that educational stakeholders, researchers, and software producers invest in a longitudinal cooperation to design, produce, and incrementally improve a student-centered LMS as proposed in this study.

REFERENCES
[1] Agile (2001). Manifesto for Agile Software Development. https://agilemanifesto.org/ (last accessed 2021-06-21).
[2] Alamri, R. and Alharbi, B. (2021). Explainable Student Performance Prediction Models: A Systematic Review. IEEE Access, 9:33132–33143.
[3] Aldowah, H., Al-Samarraie, H., and Fauzy, W. M. (2019). Educational data mining and learning analytics for 21st century higher education: A review and synthesis. Telematics and Informatics, 37(January):13–49.
[4] Alyahyan, E. and Düştegör, D. (2020). Predicting academic success in higher education: literature review and best practices. International Journal of Educational Technology in Higher Education, 17(1).

[5] Ashworth, P. D. and Saxton, J. (1990). On ‘competence’. Journal of further andhigher education, 14(2):3–25.

[6] Avella, J. T., Kebritchi, M., Nunn, S. G., and Kanai, T. (2016). Learning analyticsmethods, benefits, and challenges in higher education: A systematic literature review.

[7] Bennett, L. and Folley, S. (2020). Four design principles for learner dashboards thatsupport student agency and empowerment. Journal of Applied Research in HigherEducation, 12(1):15–26.

[8] Bodily, R. and Verbert, K. (2017). Review of research on student-facing learninganalytics dashboards and educational recommender systems. IEEE Transactions onLearning Technologies, 10(4):405–418.

[9] Broadbent, J. and Poon, W. L. (2015). Self-regulated learning strategies & academicachievement in online higher education learning environments: A systematic review.Internet and Higher Education, 27:1–13.

[10] Burns, R. P. and Burns, R. (2008). Business research methods and statistics usingSPSS. Sage.

[11] Cavoukian, A. and others (2009). Privacy by design: The 7 foundational principles.Information and privacy commissioner of Ontario, Canada, 5:12.

[12] Clow, D. (2013). Teaching in Higher Education An overview of learning analytics.Teaching in Higher Education, 18(6):683–695.

[13] Cui, Y., Chen, F., Shiri, A., and Fan, Y. (2019). Predictive analytic models of studentsuccess in higher education: A review of methodology. Information and LearningScience, 120(3-4):208–227.

[14] DBpedia Association (2021). DBpedia Association. https://www.dbpedia.org/(last accessed 2021-06-21).

[15] Deci, E. L. and Ryan, R. M. (1980). Self-determination theory: When mind mediates behavior. The Journal of Mind and Behavior, 1(1):33–43.

[16] Deci, E. L. and Ryan, R. M. (1985). Self-determination and intrinsic motivation in human behavior. Plenum Press.

[17] Fitzgerald, E., Jones, A., Kucirkova, N., and Scanlon, E. (2018). A literaturesynthesis of personalised technology-enhanced learning: What works and why.Research in Learning Technology, 26(1063519):1–16.

[18] Friend Wise, A. (2014). Designing pedagogical interventions to support studentuse of learning analytics. ACM International Conference Proceeding Series, pages203–211.

[19] Gedrimiene, E., Silvola, A., Pursiainen, J., Rusanen, J., and Muukkonen, H. (2020).Learning Analytics in Education: Literature Review and Case Examples From Voca-tional Education. Scandinavian Journal of Educational Research, 64(7):1105–1119.

[20] Glaser, B. G. (2002). Conceptualization: On Theory and Theorizing UsingGrounded Theory. International Journal of Qualitative Methods, 1(2):23–38.

[21] Glaser, B. G. and Holton, J. (1967). Discovery of grounded theory. Citeseer, 1edition.

[22] Green, K. R. and Chewning, H. L. (2020). The Fault in our Systems: LMS as aVehicle for Critical Pedagogy. TechTrends, 64(3):423–431.

[23] Greller, W. and Drachsler, H. (2012). Translating learning into numbers: A genericframework for learning analytics. Educational Technology and Society, 15(3):42–57.

[24] Gursoy, M. E., Inan, A., Nergiz, M. E., and Saygin, Y. (2017). Privacy-PreservingLearning Analytics: Challenges and Techniques. IEEE Transactions on LearningTechnologies, 10(1):68–81.

[25] Hamoud, A. K., Hashim, A. S., and Awadh,W. A. (2018). Predicting Student Perfor-mance in Higher Education Institutions Using Decision Tree Analysis. InternationalJournal of Interactive Multimedia and Artificial Intelligence, 5(2):26.

[26] Hellas, A., Ihantola, P., Petersen, A., Ajanovski, V. V., Gutica, M., Hynninen, T.,Knutas, A., Leinonen, J., Messom, C., and Liao, S. N. (2018). Predicting academicperformance: A systematic literature review. Annual Conference on Innovation andTechnology in Computer Science Education, ITiCSE, pages 175–199.

[27] IMS Global Learning Consortium (2019). Learning Tools Interoperability CoreSpecification 1.3. http://www.imsglobal.org/spec/lti/v1p3/ (last accessed 2021-06-21).

[28] Jivet, I., Scheffel, M., Drachsler, H., and Specht, M. (2017). Awareness Is NotEnough: Pitfalls of Learning Analytics Dashboards in the Educational Practice. InLavoue, E., Drachsler, H., Verbert, K., Broisin, J., and Pérez-Sanagustín, M., editors,Data Driven Approaches in Digital Education, pages 82–96, Cham. Springer Interna-tional Publishing.

[29] Jonassen, D. H. (1991). Objectivism versus Constructivism: Do We Need a New Philosophical Paradigm? Educational Technology Research and Development, 39(3):5–14.

[30] Kasim, N. N. and Khalid, F. (2016). Choosing the right learning management system (LMS) for the higher education institution context: A systematic review. International Journal of Emerging Technologies in Learning, 11(6):55–61.

[31] Klipfel, K. M. and Cook, D. B. (2017). Learner-Centered Pedagogy: Principles and Practice. ALA Editions.

[32] Kivunja, C. (2014). Do You Want Your Students to Be Job-Ready with 21st Century Skills? Change Pedagogies: A Pedagogical Paradigm Shift from Vygotskyian Social Constructivism to Critical Thinking, Problem Solving and Siemens' Digital Connectivism. International Journal of Higher Education, 3(3):81–91.

[33] Kunz, P. (2004). The Next Generation of Learning Management System (LMS): Requirements from a Constructivist Perspective. World Conference on Educational Multimedia, Hypermedia and Telecommunications, 2004(1):300–307.

[34] Lang, C. (2017). Handbook of Learning Analytics. Society for Learning Analytics Research, 1 edition.

[35] Larrabee Sønderlund, A., Hughes, E., and Smith, J. (2019). The efficacy of learning analytics interventions in higher education: A systematic review. British Journal of Educational Technology, 50(5):2594–2618.

[36] Lee, E. and Hannafin, M. J. (2016). A design framework for enhancing engagement in student-centered learning: own it, learn it, and share it. Educational Technology Research and Development, 64(4):707–734.

[37] McCombs, B. (1993). Learner-Centered Psychological Principles: Guidelines for School Redesign and Reform.

[38] Melero, J., Hernández-Leo, D., and Blat, J. (2012). A review of constructivist learning methods with supporting tooling in ICT higher education: Defining different types of scaffolding. Journal of Universal Computer Science, 18(16):2334–2360.

[39] Mills, A., Durepos, G., and Wiebe, E. (2010). Encyclopedia of Case Study Research.

[40] O'Donnell, E., Lawless, S., Sharp, M., and Wade, V. P. (2015). A review of personalised e-learning: Towards supporting learner diversity. International Journal of Distance Education Technologies, 13(1):22–47.

[41] Overby, K. (2011). Student-Centered Learning. Essai, 9(1):32.

[42] Pardo, A. and Siemens, G. (2014). Ethical and privacy principles for learning analytics. British Journal of Educational Technology, 45(3):438–450.

[43] Ranjeeth, S., Latchoumi, T. P., and Paul, P. V. (2020). A Survey on Predictive Models of Learning Analytics. Procedia Computer Science, 167:37–46.

[44] Reeve, J. and others (2002). Self-determination theory applied to educationalsettings. Handbook of self-determination research, 2:183–204.

[45] Reshad, A. (2018). Learning management systems. The TESOL Encyclopedia of English Language Teaching, pages 1–5.

[46] Roberts, L. D., Howell, J. A., and Seaman, K. (2017). Give Me a Customizable Dashboard: Personalized Learning Analytics Dashboards in Higher Education. Technology, Knowledge and Learning, 22(3):317–333.

[47] Rubel, A. and Jones, K. M. (2016). Student privacy in learning analytics: An information ethics perspective. Information Society, 32(2):143–159.

[48] Rustici Software LLC (2021). Technical Overview of SCORM Specification/Standard. https://scorm.com/scorm-explained/technical-scorm/ (last accessed 2021-06-21).

[49] Sansone, N. and Cesareni, D. (2019). Which learning analytics for a socio-constructivist teaching and learning blended experience. Journal of E-Learning and Knowledge Society, 15(3):319–329.

[50] Santoso, H. B., Batuparan, A. K., Isal, R. Y. K., and Goodridge, W. H. (2018). The development of a learning dashboard for lecturers: A case study on a student-centered e-learning environment. Journal of Educators Online, 15(1).

[51] Schwendimann, B. A., Rodriguez-Triana, M. J., Vozniuk, A., Prieto, L. P., Boroujeni, M. S., Holzer, A., Gillet, D., and Dillenbourg, P. (2017). Perceiving learning at a glance: A systematic literature review of learning dashboard research. IEEE Transactions on Learning Technologies, 10(1):30–41.

[52] Seidman, I. (2013). Interviewing As Qualitative Research: A Guide for Researchers in Education and the Social Sciences, 4th edition. Teachers College Press, New York.

[53] Seufert, S., Meier, C., Soellner, M., and Rietsche, R. (2019). A Pedagogical Perspective on Big Data and Learning Analytics: A Conceptual Model for Digital Learning Support. Technology, Knowledge and Learning, 24(4):599–619.

[54] Shmueli, G. (2010). To explain or to predict? Statistical Science, 25(3):289–310.

[55] Siemens, G. (2011). Penetrating the Fog: Analytics in Learning and Education. Educause Review, 46(5):30–40.

[56] Siemens, G. (2013). Learning Analytics: The Emergence of a Discipline. AmericanBehavioral Scientist, 57(10):1380–1400.

[57] Slade, S. and Prinsloo, P. (2013). Learning Analytics: Ethical Issues and Dilemmas.American Behavioral Scientist, 57(10):1510–1529.

[58] Stoltzman, S. (2018). What Type of Data Visualization Do You Choose (if any)?https://opendatascience.com/data-visualization-part-3/ (last accessed 2021-06-23).

[59] SURF (2020). Learning analytics in 5 stappen: Een handreiking voor de AVG [Learning analytics in 5 steps: A guide to the GDPR].

[60] van Leeuwen, A., van Wermeskerken, M., Erkens, G., and Rummel, N. (2017). Measuring teacher sense making strategies of learning analytics: a case study. Learning: Research and Practice, 3(1):42–58.

[61] Verbert, K., Duval, E., Klerkx, J., Govaerts, S., and Santos, J. L. (2013). Learning Analytics Dashboard Applications. American Behavioral Scientist, 57(10):1500–1509.

[62] Verbert, K., Govaerts, S., Duval, E., Santos, J. L., Van Assche, F., Parra, G., and Klerkx, J. (2014). Learning dashboards: An overview and future research opportunities. Personal and Ubiquitous Computing, 18(6):1499–1514.

[63] Verbert, K., Ochoa, X., De Croon, R., Dourado, R. A., and De Laet, T. (2020). Learning analytics dashboards: The past, the present and the future. ACM International Conference Proceeding Series, pages 35–40.

[64] Viberg, O., Hatakka, M., Bälter, O., and Mavroudi, A. (2018). The current landscape of learning analytics in higher education. Computers in Human Behavior, 89:98–110.


[65] Williams, M. and Burden, R. (1997). Motivation in language learning: a social constructivist approach. Recherche et pratiques pédagogiques en langues de spécialité - Cahiers de l'APLIUT, 16(3):19–27.

[66] Yilmaz, K. (2013). Comparison of quantitative and qualitative research traditions: Epistemological, theoretical, and methodological differences. European Journal of Education, 48(2):311–325.

[67] You, J. W. (2016). Identifying significant indicators using LMS data to predict course achievement in online learning. Internet and Higher Education, 29:23–30.


A REQUIREMENTS SURVEY

This questionnaire was used in the requirements survey to explore topics of interest (see Section 2.1).

• Describe the analysis
– What is the goal of the analysis?
– Who is the analysis about?
– Who will use the analysis?
– What information/insight will the analysis provide?
– What kind of learning analytics does it concern? (descriptive, explanatory, etc.)

• Execution of the analysis
– Which data are needed to perform the analysis? In which system are these data located?
– Which analysis methods are applied in this analysis?

• Application of the analysis
– Which competences must the users of the analysis have in order to use it properly?

• Other remarks


B DIAGRAMS

Figure 10 groups the tables of the data model as follows.

Users and Groups: user, usergroup, usergroup_user.

Learning Plans: learningplan, learningplanstatus, learningplanpropertyhistory, learningplanstakeholdergroup, learningplanstakeholdergroup2user, learningplanstakeholdergroup2usergroup, learningplan2templategoal, learningplan_keymetrics.

Goals: goal, goalstatus, goalpropertyhistory, goalstakeholdergroup, goalstakeholdergroup_user, goalstakeholdergroup_usergroup, goalstakeholderpermissi…, goal_keymetrics.

Learning Activity Repository: learningactivityprofile, learningactivitytype, learningactivitycompletionrules, offering, offeringstatus, offering2usersuggestion, offering2usergroupsuggestion, learningactivityenrollment, learningactivityenrollmentstatus, learningactivityperformance.

Metadata: metadatamodel, metadataschema, metadataschemaproperty, metadataset, metadataproperty.

Competencies: competencyset, competency, competencylevel, competencyassessment, goal_competency.

Coaching: note, note_learningplanstakeholdergroup, note2todoitem, todoitem, todoitem2goal, todoitem2competence, todoitem2learningplan, todoitem2userid, todoitem2stakeholdergroup.

Widgets: widget, widgetstatus, widgettype, widgetlayer, widgetlayervisualizationtype, widgetnotification, widgetgallery, widgetgallery2widget, widgetgallery2usergroup, widget2learningplan, widgettemplate2learningplan.

Dashboard: dashboard, dashboard2widget, dashboard2user, dashboard2usergroup.

Figure 10: Enhanced Entity Relationship diagram
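To make the core of the data model concrete, the sketch below creates two of its central tables, learningplan and goal, in an in-memory SQLite database. The column names follow the diagram, but the reduced column set, the constraints, and the sample rows are illustrative assumptions rather than part of the original model.

import sqlite3

# Minimal sketch of two central tables from the EER diagram in Figure 10.
# Column names follow the diagram; the reduced column set, the constraints,
# and the sample rows are illustrative assumptions.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE learningplan (
    iLearningPlanID             INTEGER PRIMARY KEY,
    iFKOwnerUserID              INTEGER NOT NULL,
    iFKLearningPlanTemplateID   INTEGER,
    dCreated                    TEXT,     -- DATETIME in the original model
    bIsTemplate                 INTEGER,  -- TINYINT used as a boolean flag
    iLearningPlanCompletionRule INTEGER
);
CREATE TABLE goal (
    iGoalID                 INTEGER PRIMARY KEY,
    iFKParentGoalID         INTEGER REFERENCES goal(iGoalID),
    iFKParentLearningPlanID INTEGER REFERENCES learningplan(iLearningPlanID),
    bIsTemplate             INTEGER,
    iFKMetadataSetID        INTEGER,
    iGoalCompletionRule     INTEGER
);
""")
# A goal tree hangs off a learning plan via iFKParentLearningPlanID and
# nests through iFKParentGoalID.
conn.execute("INSERT INTO learningplan (iLearningPlanID, iFKOwnerUserID) VALUES (1, 42)")
conn.execute("INSERT INTO goal (iGoalID, iFKParentLearningPlanID) VALUES (10, 1)")
conn.execute("INSERT INTO goal (iGoalID, iFKParentGoalID) VALUES (11, 10)")
conn.commit()

Because a goal references both its parent learning plan and, for subgoals, its parent goal, a learning plan can contain an arbitrarily deep goal tree.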


C TABLES

View goal: Allows a user to view the goal.
Edit goal: Allows a user to edit the following properties of the goal if the goal has the status 'new': General Properties, Metadata, Competences (see Figure 1).
Edit advanced goal properties: Allows a user to edit the following properties of the goal if the goal has the status 'new': Inherited Stakeholders, Local Stakeholders, Completion Rules (see Figure 1).
Select learning activities: Allows a user to select learning activities from the Learning Activity Repository (see Paragraph 4.4) and attach them to the goal. Only applies to goals that do not have subgoals.
Propose goal: Allows a user to change the status of the goal to 'proposed'. After this status change, the goal cannot be edited unless its status is reset to 'new'.
Approve goal: Allows a user to approve the goal for use. The status of the goal will then change to 'approved'.
Reschedule goal: Allows a user to adjust the start and due dates of an approved goal.
Set goal status: Allows a user to set the status of the goal to any status.
Add subgoals: Allows a user to create goals as subgoals of the goal.

Table 4: Goal permissions
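The data model stores a stakeholder group's goal permissions in a single integer column (iPermissionsBinary). The sketch below shows one plausible way to encode the permissions of Table 4 as bit flags; the flag names, the bit positions, and the example student group are assumptions made for illustration only.

from enum import IntFlag

# One plausible encoding of the Table 4 permissions as bit flags stored in a
# single iPermissionsBinary value; flag names and bit positions are assumptions.
class GoalPermission(IntFlag):
    VIEW_GOAL           = 1 << 0
    EDIT_GOAL           = 1 << 1
    EDIT_ADVANCED_PROPS = 1 << 2
    SELECT_ACTIVITIES   = 1 << 3
    PROPOSE_GOAL        = 1 << 4
    APPROVE_GOAL        = 1 << 5
    RESCHEDULE_GOAL     = 1 << 6
    SET_GOAL_STATUS     = 1 << 7
    ADD_SUBGOALS        = 1 << 8

def has_permission(stored_mask: int, required: GoalPermission) -> bool:
    """Check a stored iPermissionsBinary value against a required permission."""
    return bool(GoalPermission(stored_mask) & required)

# Example: a student stakeholder group that may view, edit, and propose goals.
student_mask = int(GoalPermission.VIEW_GOAL | GoalPermission.EDIT_GOAL | GoalPermission.PROPOSE_GOAL)
print(has_permission(student_mask, GoalPermission.PROPOSE_GOAL))  # True
print(has_permission(student_mask, GoalPermission.APPROVE_GOAL))  # False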

View learning plan: Allows a stakeholder to view the learning plan.
Edit learning plan: Allows a stakeholder to edit the learning plan's general properties.
Add widget template to learning plan: Allows a stakeholder to copy widget templates from the Global Widget Gallery to a learning plan template.
Set learning plan status: Allows a stakeholder to set the status of the learning plan.

Table 5: Learning plan permissions

GDO: The number of days the goal is overdue; 0 if the goal is not overdue or if no due date was specified for the goal.
GSTR: Number of days since the stated start date of the goal (if specified).
GEND: Number of days until the stated due date of the goal (if specified).
GTLA: Number of learning activities attached to the goal.
GTSG: Number of subgoals attached to the goal.
GSGR: Number of running subgoals attached to the goal.
GSGA: Number of achieved subgoals attached to the goal.
GSGNA: Number of not achieved subgoals attached to the goal.
GTLA: Number of running learning activities attached to the goal.
GTLAP: Number of passed learning activities attached to the goal.
GTLANP: Number of not passed learning activities attached to the goal.
GANG: Average normalized grade for all learning activities (0-100).
GANGH: Highest normalized grade for all learning activities (0-100).
GANGL: Lowest normalized grade for all learning activities (0-100).
GANGP: Average normalized grade for all passed learning activities (0-100).
GANGNP: Average normalized grade for all not passed learning activities (0-100).

Table 6: Goal key metrics
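The goal_keymetrics table stores periodic snapshots of these values. As a minimal sketch, assuming a simplified in-memory representation of learning activity results rather than the system's actual tables, the code below derives a few of the Table 6 metrics for one goal.

from dataclasses import dataclass
from statistics import mean

# Simplified stand-in for a learning activity result (cf. learningactivityenrollment
# and learningactivityperformance); the field selection is an assumption.
@dataclass
class ActivityResult:
    goal_id: int
    normalized_grade: float  # 0-100
    passed: bool

def goal_key_metrics(goal_id: int, results: list) -> dict:
    """Derive a few of the Table 6 metrics for a single goal."""
    attached = [r for r in results if r.goal_id == goal_id]
    grades = [r.normalized_grade for r in attached]
    return {
        "GTLA": len(attached),                               # attached learning activities
        "GTLAP": sum(1 for r in attached if r.passed),       # passed learning activities
        "GTLANP": sum(1 for r in attached if not r.passed),  # not passed learning activities
        "GANG": mean(grades) if grades else None,            # average normalized grade
        "GANGH": max(grades, default=None),                  # highest normalized grade
        "GANGL": min(grades, default=None),                  # lowest normalized grade
    }

results = [ActivityResult(1, 78.0, True), ActivityResult(1, 55.0, False)]
print(goal_key_metrics(1, results))  # {'GTLA': 2, 'GTLAP': 1, 'GTLANP': 1, 'GANG': 66.5, ...}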


LTG: The number of goals in the learning plan.
LTGL: The number of goals with learning activities in the learning plan.
LTGS: The number of goals with subgoals in the learning plan.
LTGNW: Total number of goals with status 'new'.
LTGPR: Total number of goals with status 'proposed'.
LTGAP: Total number of goals with status 'approved'.
LTGAC: Total number of goals with status 'achieved'.
LTGNA: Total number of goals with status 'not achieved'.
LTGO: Total number of overdue goals.
LTGOL: Total number of overdue leaf goals.
LTLV: Number of goal levels in the learning plan.
LALA: Average number of learning activities (for goals with learning activities).
LASG: Average number of subgoals (for goals with subgoals).
LGAW: Number of goals set to status 'achieved' during the past week.
LGAD: Number of goals set to status 'achieved' during the past 24 hours.
LANG: Average normalized grade for all learning activities in the learning plan.
LANGP: Average normalized grade for all passed learning activities in the learning plan.
LANGNP: Average normalized grade for all not passed learning activities in the learning plan.
TELD: Number of days since the start date of the learning plan (if a start date is defined).
TRMD: Number of days until the end date of the learning plan (if an end date is defined).

Table 7: Learning plan key metrics
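Most learning plan metrics are plain aggregations over the goals in the plan. The sketch below computes LTG and the per-status goal counts of Table 7; representing each goal by its current status label only is a simplifying assumption.

from collections import Counter

# Minimal sketch: a goal is represented here by its current status label only,
# which is a simplifying assumption.
STATUS_TO_CODE = {
    "new": "LTGNW",
    "proposed": "LTGPR",
    "approved": "LTGAP",
    "achieved": "LTGAC",
    "not achieved": "LTGNA",
}

def learning_plan_status_metrics(goal_statuses):
    """Compute LTG and the per-status goal counts of Table 7 for one learning plan."""
    counts = Counter(goal_statuses)
    metrics = {"LTG": len(goal_statuses)}
    for status, code in STATUS_TO_CODE.items():
        metrics[code] = counts[status]
    return metrics

print(learning_plan_status_metrics(["new", "approved", "achieved", "achieved"]))
# {'LTG': 4, 'LTGNW': 1, 'LTGPR': 0, 'LTGAP': 1, 'LTGAC': 2, 'LTGNA': 0}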


D DATA LAYER WIDGET

Figure 11: Customizing the visualization of the Data Layer widget

Figure 12: Settings for the Data Layer widget


Figure 13: Widget permissions and definition of a notification for the Data Layer widget
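In the data model, a notification such as the one defined in Figure 13 is stored as text in widgetnotification (sKeyMetricFormula). The sketch below evaluates a simple threshold-style rule against the latest key metric snapshot; the 'metric operator value' rule format is an assumption and not the widget's actual formula syntax.

import operator

# Threshold-style notification rule such as 'LTGO > 0'; the rule format is an
# assumption, the actual sKeyMetricFormula syntax is defined by the widget designer.
OPERATORS = {">": operator.gt, ">=": operator.ge, "<": operator.lt,
             "<=": operator.le, "==": operator.eq}

def notification_fires(rule: str, metrics: dict) -> bool:
    """Evaluate a 'metric operator value' rule against the latest key metric snapshot."""
    metric, op, value = rule.split()
    return OPERATORS[op](metrics[metric], float(value))

latest = {"LTGO": 2, "LANG": 61.5}  # hypothetical learningplan_keymetrics snapshot
print(notification_fires("LTGO > 0", latest))  # True: the learning plan has overdue goals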


E CORRELATION WIDGET

Figure 14: Pearson's correlation coefficient over time in the Correlation widget

Figure 15: Settings for the Correlation widget

Figure 16: Widget permissions and definition of notifications for the Correlation widget
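The Correlation widget plots Pearson's correlation coefficient r = cov(x, y) / (sigma_x * sigma_y) between two key metric series over time (Figure 14). As a worked example, the sketch below computes r for two hypothetical daily series of learning plan metrics.

from math import sqrt

def pearson_r(xs, ys):
    """Pearson's correlation coefficient between two equally long metric series."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / sqrt(var_x * var_y)

# Hypothetical daily snapshots of two learning plan key metrics.
achieved_goals = [1, 2, 2, 3, 5]      # e.g. LTGAC over five days
average_grade = [55, 58, 57, 63, 70]  # e.g. LANG over the same days
print(round(pearson_r(achieved_goals, average_grade), 2))  # 0.99 for this illustrative series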
