Institutional Brief

The 8th International Conference on Learning Analytics and Knowledge – From K12 to Policy

Is Learning Analytics Achieving Sustainable Impact?

www.solaresearch.org

Number 4, May 2018


The 8th International Conference on Learning Analytics and Knowledge

INSTITUTIONAL BRIEF

The 8th edition of the International Conference on Learning Analytics and Knowledge (LAK) took place from 5-9 March 2018 in Sydney, Australia. As the flagship event of the Society for Learning Analytics Research (SoLAR), LAK is always a perfect time to assess where the area is heading and what has been achieved so far. This year's event had several highlights worth remembering.

First event in the K-12 space

The first day of the event brought together close to 60 attendees participating in learning analytics initiatives within the K-12 space. The full-day event was conceived with three objectives in mind: 1) to raise awareness of how initiatives in the learning analytics space are already being deployed in schools; 2) to offer delegates the opportunity to interact with those responsible for on-the-ground operations and to share knowledge and expertise; and 3) to explore the challenges faced by teachers and leaders when deploying learning analytics at the institutional level. The conclusion of the event was that learning analytics has a very fertile context in the K-12 space, but its deployment requires careful consideration of the boundary conditions posed by the institution, government agencies, and other stakeholder groups.


Leadership summit

This edition of the conference also convened a Leadership Summit. The mission underlying the learning analytics community is to explore how data and analytics methods can be used to increase understanding and improve learning. Despite the future-gazers' hype around learning analytics, everything we know about technology adoption reminds us that it is very human factors such as staff skills, work processes, and organisational incentives that determine whether digital innovations deliver real change and improvement. The Leadership Summit was convened to clarify the key role that university leadership plays, not only in fostering learning analytics innovation, but also in achieving sustainable impact.

Six issues were identified as playing a relevant role when aiming at sustainable impact in learning analytics:

» Focus on the right questions
» Scalability
» Clear value proposition
» Local vs. central approaches
» Cultural issues
» Alignment with existing bodies

The first issue is to focus on the right questions: the field may be driven by the variety of data available, when in fact it should be driven initially by the right questions. Scalability was also highlighted as a barrier to consider. When communicating the vision to stakeholders, there is a need for a clear value proposition: what are the advantages? This vision needs to be nurtured through both local (bottom-up) and central (top-down) approaches. Embracing initiatives in this space requires addressing several cultural issues such as access to data, innovation, institutional support, etc. Finally, analytics initiatives need to be aligned with existing requirements coming from bodies such as TEQSA (the Tertiary Education Quality and Standards Agency, the agency responsible for the quality of higher education in Australia). The leaders also identified the need for organisations such as SoLAR to lead the conversation at a national level and provide expertise and guidance when addressing issues about quality, accountability, measurement, etc. Failing to participate in this conversation may translate into undesirable legislation in this space that does not reflect the nuances of the area.


Institutional Members Meeting

The conference also hosted the traditional meeting of the representatives of the SoLAR institutional members. The topic selected for this edition of the meeting was the presentation of on-going initiatives of institutional deployment of learning analytics at several institutions.

Pablo Munguia, Director of Learning Analytics at RMIT, presented the institutional structure and procedures that were put in place as a consequence of the migration to the LMS Canvas and RMIT's learning analytics strategy. Learning analytics at RMIT behaves as an ecosystem engineer, modifying the environment to increase diversity and productivity. This ecosystem engineer is best exemplified as a bandicoot: we dig into problems to provide analysis, we construct tunnels to connect data to the right people, and we disperse these data quickly to aid in decision making. RMIT has conceived a set of processes to align data capture, quality assurance, and sustainable improvement of the teaching and learning experience (Fig 1). We work in a sandbox environment where we can quickly imagine solutions and test them; once ready, these solutions are scaled to the enterprise.

Fig 1 - RMIT University Analytics Process ("How are analytics going to engage and generate tools?"): above-ground and underground phases running from Test and Pilot through Scale up and Roll out to Success; we develop an idea, it gets tested and piloted quickly, an RMIT-wide solution is identified, the tool reaches all stakeholders, and we measure the tool to identify future improvements.


Joshua Lee, Analytics Specialist from Institutional Analytics and Planning at The University of Sydney, presented their work connecting existing data sources derived from the LMS Canvas with student information services, and the computational and IT requirements needed to manage such a large quantity of data. The presentation discussed issues related to maintaining a single source of truth and the much-needed relationship with the LMS vendor.

The meeting also included a contribution from a representative of the University of South Australia, who shared with the audience their structures and processes integrated with an open-source LMS (Moodle). This provided the meeting with a comprehensive view of how institutions are currently addressing the challenges of bringing analytics solutions to the organisational level.

Fig 2 - University of Sydney LMS Logs


The Conference at a Glance

The 2018 edition had a record 355 submissions (115 of them full research papers and 88 short research papers). The acceptance rate was 30% for both of these submission types. We offered an extensive program including a Research track (35 full papers, 26 short papers, four extended abstracts, and 17 posters and demonstrations) and a Practitioner track (11 full papers, two short papers, and nine posters and demonstrations). The program opened with four full-day workshops, six half-day workshops, and our fourth hackathon, which ran over two days. Our sixth doctoral consortium received a record-breaking 35 submissions.

These numbers reinforce the trajectory observed over the eight editions of this event and ensure the ideal context for selecting high-quality and relevant publications for the conference program.

As has been the case since the first edition, full and short research papers were published in the ACM Digital Library. The new addition this year was the publication of the Companion Proceedings, containing practitioner papers, posters, demonstrations, the doctoral consortium, and all papers used in the pre-conference workshops, topping more than 700 pages of material. This year we could also see the consolidation of a community of researchers who work collaboratively and are spread all over the world. SoLAR is committed to increasing participation in the geographical areas with the lowest participation rates. In fact, we recently held an election for the Society's first Inclusion Chair, who will lead a working group towards these goals of greater geographic representation, ensuring that SoLAR increases diversity and remains an inclusive community.

Fig 3 - LAK Submissions by Category

Fig 4 - LAK Participants by Country


Is Learning Analytics Achieving Sustainable Impact?

The Leadership Summit, held in parallel with the 8th International Conference on Learning Analytics and Knowledge, brought to the forefront an aspect of the discipline that is still not properly addressed: are learning analytics initiatives truly providing clear value to institutions? At the start of this decade there were numerous publications arguing that aspects such as student retention (or detecting students at risk of abandoning the institution) were a problem with enough ROI to justify adoption at the institutional level. But learning analytics has evolved to widen its scope to a much richer set of areas. As in the case of student retention, there is a need to clearly articulate the benefit of adopting these initiatives at the institutional level.

A quick glance at the research initiatives presented at the flagship conference of SoLAR shows a reduced number of initiatives that can be considered mature at the institutional level and clearly providing sustainable impact. Part of the reason for this situation may be that the area is still at an early stage (after all, a decade is not that long in terms of institutional adoption), but it is also worth exploring where the area is heading when considering sustainable impact and, perhaps more importantly, what the ideal conditions are within an institution that promote or foster a consistent approach towards sustainable impact.

Dawson et al.¹ (https://dl.acm.org/citation.cfm?id=3170375&dl=ACM&coll=DL) presented a paper at LAK '18 that reviewed how institutions are approaching the deployment of learning analytics initiatives. The data analysed suggest that institutions need to broaden their learning analytics adoption models to transition from small-scale initiatives to a more holistic organisational level. It is only at this level that sustainable impact becomes a realistic goal.

1 Dawson, S., Poquet, O., Colvin, C., Rogers, T., Pardo, A., & Gasevic, D. (2018). Rethinking learning analytics adoption through complexity leadership theory. Paper presented at the International Conference on Learning Analytics and Knowledge - LAK '18, Sydney, Australia. doi:10.1145/3170358.3170375

Institutional architectures for sustained impact

Recently, two of the most relevant researchers in the area, Tim McKay and Simon Buckingham Shum, published an article in EDUCAUSE Review titled Architecting for Learning Analytics: Innovating for Sustainable Impact² (https://er.educause.edu/articles/2018/3/architecting-for-learning-analytics-innovating-for-sustainable-impact) exploring these organisational architectures. They posit that there are three types of structures currently emerging when deploying learning analytics in higher education institutions: 1) IT or Service Centric, 2) Academic Centric, and 3) a hybrid innovation centre.

2 Buckingham Shum, S., & McKay, T. A. (2018). Architecting for Learning Analytics: Innovating for Sustainable Impact. EDUCAUSE Review, March/April 2018, 25-37.

The IT or Service Centric architecture is the best positioned from the point of view of providing robust 24/7 support for IT infrastructure. It is likely to provide a set of pre-defined affordances in the form of reports, dashboards, or data availability for academics to access. The main disadvantage of this model is that innovation is unlikely to happen beyond the limits of pre-defined commercial products. Innovation at the intersection of academic, technological, and human-computer interaction concerns is highly unlikely to occur.

The Academic Centric model is the best positioned to foster radical innovation. If it is led by academics with a strong sense of innovation and research, the initiatives emerging from this model will be truly innovative, conducted with scientific rigour, and provide solid evidence of their effect. However, according to McKay and Buckingham Shum, these centres will struggle to take their innovations out of the bleeding edge and into mainstream adoption, and will thus be reduced to low-scale impact.

The solution proposed to remove these hurdles while preserving the advantages is the hybrid innovation centre. In parallel to these challenges of deploying analytics at the institutional level lies another dimension that was also highlighted at LAK: the need to embrace a user-centric design methodology for analytics. Learning occurs in highly situated scenarios. This makes the deployment of tools and methods very sensitive to structures such as learning designs, pedagogical approaches, and other contextual aspects.

It is for this reason that, even with the right structures in place and with the right balance between innovators and service infrastructure, at the end of the day learning analytics solutions must tend to the needs of the main stakeholders. Designers and instructors need solutions that cater to their needs within an educational context. Students also need to be an essential part of this conversation, provide their views, and guide the deployment of techniques and support initiatives that are aligned with their needs.

The three architectures at a glance:

• IT Centric: ideal capacity to provide robust services; innovation restricted by pre-defined products
• Academic Centric: ideal for radical innovation; significant challenges to scale at the institutional level
• Hybrid Centre: provision of robust services to guarantee scaling; fostering innovation driven by academic needs

Overall, as pointed out by the leaders gathered during LAK '18, all these elements need to be combined to focus on the right questions and provide clearly articulated value propositions to all stakeholders.


www.solaresearch.org [email protected]
