UKSG Conference 2017 Breakout - User Engagement Analytics: measuring and driving meaningful use of e-resources


Helen Adey, Resource Acquisitions and Supply Team Manager, Nottingham Trent University
Andrea Eastman-Mullins, COO, Alexander Street

User Engagement Analytics: measuring and driving meaningful use of e-resources.


Abstract

Nottingham Trent University (NTU) and Alexander Street have partnered to pilot an in-depth view of analytics, demonstrating user engagement and the impact of use. This presentation shares findings on how e-resources were used and how these analytics can go beyond simple cost-per-use evaluation: supporting effective decision making on the marketing and promotion of resources, and improving our understanding of how library users engage with the resources we provide.


Content
• Background - Why User Engagement Analytics? Examples of when usage reports and conventional data aren't enough
• User Engagement Analytics - the Publisher view
– What is possible
– Issues to consider
• User Engagement Analytics - the Library view
– What do they tell us and how might we use them
– Issues to consider
• Conclusions - what have we learned?
• Next steps…


Background - Why do we need User Engagement Analytics?
• Scenarios when conventional usage data alone is not enough:
– Evidence Based Acquisitions plans - what constitutes "good" usage?
– eBook short-term loan rentals - free use for less than 5 minutes
• How would User Engagement Analytics help, and what might engaged usage look like?
• How long is a resource used for, rather than how many times has it been used?
• Active engagement usage types: notes, cut and paste, citations… (one possible reading of "engaged use" is sketched below)
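As a thought experiment, the idea of "engaged usage" can be made concrete in code. The following is a minimal sketch only: the five-minute threshold echoes the short-term loan example above, and the event names are assumptions, not any vendor's actual schema.

```python
# A sketch of one possible "engaged use" classifier. It assumes a
# session log with a duration and a set of active-engagement events;
# the event names and the 5-minute threshold are illustrative only.
from dataclasses import dataclass, field

ACTIVE_EVENTS = {"note", "copy_paste", "citation", "annotation"}

@dataclass
class Session:
    duration_seconds: int
    events: set = field(default_factory=set)

def is_engaged(session: Session) -> bool:
    # Engaged if used beyond a free 5-minute window, or if the user
    # actively worked with the content (notes, cut and paste, citations).
    return session.duration_seconds > 300 or bool(session.events & ACTIVE_EVENTS)

print(is_engaged(Session(120, {"note"})))  # True: short view, but active engagement
print(is_engaged(Session(90)))             # False: brief passive view
```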

User Engagement Analytics: The Publisher View
Andrea Eastman-Mullins, COO, Alexander Street
UKSG, April 2017

COUNTER 4 – Media Report 1: Playbacks
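In COUNTER Release 4, Multimedia Report 1 counts successful multimedia full content unit requests ("playbacks") by month and collection. A minimal sketch of that style of aggregation, using hypothetical playback records:

```python
# Aggregate playbacks by (collection, month), in the spirit of
# COUNTER 4 Multimedia Report 1. The event records are hypothetical.
from collections import Counter

events = [
    # (collection, ISO date of a successful playback)
    ("Opera in Video", "2017-01-12"),
    ("Opera in Video", "2017-01-30"),
    ("Dance Online",   "2017-02-03"),
]

mr1 = Counter((collection, date[:7]) for collection, date in events)

for (collection, month), playbacks in sorted(mr1.items()):
    print(f"{collection}\t{month}\t{playbacks}")
```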

There is more to “ROI” than cost-per-use

• New discovery or dead-end search?

• Shown in class to 50+ students or one mobile view?

• Preview or 100% viewed?

• Found via organic search or “curated” link?

• Met learning outcome?

Was it viewed? How was it viewed? What difference did it make?

Measuring Impact – Annotation

Measuring Impact – Playlists

How valuable was this video? (1–5 stars)

How were videos used?
• Assigned for class
• Shown in class
• For research project
• Entertainment
• Training/Education
• Other (open ended)

Measuring Impact – Feedback

Measuring Impact – Video Interaction

Impact Stats

Publisher View: Challenges

• Balance the need for transparency and consistency in stats.

• Librarians and publishers need to invest time to determine what is useful.

• Definitions are evolving:
– On/off-campus
– What is a playback in an opera with multiple movements?
– Easy summaries vs. tools to dig deeper


User Engagement Analytics - the Library view: 1 - Acquisitions
• A much richer view than we get from usage and cost-per-use alone - many metrics showing engagement.
• The most viewed titles are obvious EBA candidates for final acquisition decisions, but what about "watched for longest" or "% played"? (A hypothetical ranking is sketched below.)
• Breadth of content used may support a decision to re-subscribe rather than purchase specific content.
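To illustrate how "watched for longest" or "% played" might change an EBA shortlist, here is a hypothetical ranking that blends breadth (views) with depth (average % played, total minutes watched). The weights and field names are assumptions, not an established EBA formula.

```python
# Rank EBA candidates on engagement depth as well as raw views.
# Weights are arbitrary illustrative choices.
titles = [
    {"title": "Title A", "views": 120, "avg_pct_played": 15, "minutes": 300},
    {"title": "Title B", "views": 40,  "avg_pct_played": 80, "minutes": 900},
]

def engagement_score(t: dict) -> float:
    return t["views"] * 0.2 + t["avg_pct_played"] * 1.0 + t["minutes"] * 0.05

for t in sorted(titles, key=engagement_score, reverse=True):
    print(t["title"], round(engagement_score(t), 1))
# Title B outranks Title A despite a third of the views: fewer but
# deeper sessions can signal a stronger acquisition candidate.
```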

Engagement metrics summary screen


User Engagement Analytics - the Library view: 2 - Technical
• Reassuring top referring URL data - the library's Resource Discovery system is way out in front, which supports the practice of loading MARC records whenever possible.


Devices, Browsers and Operating Systems Data

Some surprising levels of use from unsupported or less well supported devices


Curated Views
• Through what paths are our users finding and accessing content, and what does this tell us about usage and engagement?
• On- and off-campus use - it is less clear what the data is telling us; the data may be less reliable because of proxy servers etc.


User Engagement Analytics - the Library view: 3 - Strategic / Policy
• Evidence strongly supports the policy of loading records into our Discovery system
• Evidence contradicts received wisdom regarding Google as a starting point
• Evidence suggests support for Apple devices is increasingly important


Marketing and Promotion Strategies
• Targeted use of different promotion routes could be monitored for impact and effectiveness. (One tagging approach is sketched below.)
• Could inform marketing policies, e.g. what is the most effective route for driving usage and engagement - Twitter, Facebook, librarian recommendation?
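One way such monitoring could work in practice is to tag each promoted link with the widely used utm_source query parameter and count arrivals per route. A sketch with illustrative URLs and route names:

```python
# Tag promotion links per route, then count arrivals by route.
from collections import Counter
from urllib.parse import urlencode, urlparse, parse_qs

def tagged_link(base: str, source: str) -> str:
    return f"{base}?{urlencode({'utm_source': source})}"

print(tagged_link("https://platform.example.org/video", "twitter"))

# Hypothetical referred arrivals collected from access logs:
arrivals = [
    "https://platform.example.org/video?utm_source=twitter",
    "https://platform.example.org/video?utm_source=librarian",
    "https://platform.example.org/video?utm_source=twitter",
]
by_route = Counter(
    parse_qs(urlparse(url).query).get("utm_source", ["unknown"])[0]
    for url in arrivals
)
print(by_route)  # Counter({'twitter': 2, 'librarian': 1})
```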



Subject use - Potentially useful for course accreditation and departmental review purposes


Conclusions & Questions - what have we learned? (1)
1. There is a possible tension between seamless, non-intrusive ease of access and the desire to know and understand more about what our customers are doing by asking them questions.
2. This needs to be an addition to COUNTER usage reports, not a replacement.
3. If there is no standard, how might consistency in reporting be reached?
4. Can we trust users to be honest with the self-declared use data? None of the use was for entertainment reasons?
5. How should we interpret low "% used" data? We don't expect users to start reading every eBook at page 1, so what would be the markers of good targeted usage of resources as opposed to random dipping in and out? (One candidate marker is sketched below.)
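One candidate marker for question 5, sketched under the assumption that page-level view sequences are available: targeted reading should show a long contiguous run of pages, while random dipping shows scattered single pages, even at the same % used.

```python
# Longest contiguous run of pages viewed, as a rough proxy for
# targeted reading vs. dipping in and out. Page data is hypothetical.
def longest_run(pages: list[int]) -> int:
    ordered = sorted(set(pages))
    best = run = 1
    for prev, cur in zip(ordered, ordered[1:]):
        run = run + 1 if cur == prev + 1 else 1
        best = max(best, run)
    return best

targeted = [41, 42, 43, 44, 45, 46]  # 12% of a 50-page book, read in sequence
dipping = [3, 8, 17, 29, 44, 50]     # same 12% used, scattered pages
print(longest_run(targeted), longest_run(dipping))  # 6 vs 1
```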


Areas requiring more investigation / thought
• Can we rely on off-campus data? Do proxy servers and the lack of WAYFless URL deep links make it meaningless?
• Very clear Resource List usage but comparatively few clips, playlists or embeds - are there opportunities for engagement with academics to improve user experience and learning outcomes?
• We need to understand more about the "% used" data - what does it tell us, and do we know what "good" looks like?
• Is more information and data always a good thing, or do we run the risk of information overload preventing the decision-making processes we are trying to improve?
• Would benchmarking of usage with other institutions be possible, and what would it tell us?


Where next – Options for other Engagement Analytics
How many different people are using the resource? e.g. anonymised unique user numbers. Are there conclusions to be drawn between content being used a lot by a few users and content being used a little by a lot of users? (A sketch follows.)
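A minimal sketch of how anonymised unique user counts might be produced and interpreted, assuming raw user IDs can be salted and hashed before counting; the salt handling and ID format are illustrative only.

```python
# Count unique users without storing identities: hash each user ID
# with an institution-held salt before counting. Illustrative only.
import hashlib

SALT = b"rotate-me-each-reporting-period"  # hypothetical secret

def anonymise(user_id: str) -> str:
    return hashlib.sha256(SALT + user_id.encode()).hexdigest()

plays = ["u1", "u1", "u2", "u3", "u1"]  # hypothetical user ID per playback
unique_users = len({anonymise(u) for u in plays})
total_plays = len(plays)

# Distinguish "a lot by a few" from "a little by many":
print(f"{total_plays} plays by {unique_users} unique users "
      f"({total_plays / unique_users:.1f} plays per user)")
```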

What might user engagement analytics look like for other types of text-based content?
• Notes
• Cutting and pasting
• Highlights
• Citations
• Pages turned / viewed
• URL referrals

Opportunities: Evolving Standards

• Definitions where there are no agreed standards.

• Evolve standards to include impact.

• Learning analytics models can guide us.

Opportunities: Expand Referring Data

Emphasis on "curated view" and referring URLs:
• It's not mostly Google.
• Do library efforts in discovery work?
• Does more meaningful use come from library referrals vs. Google - longer on site, richer experience?
• Influence of social networks, word of mouth.

Opportunities: Example Channel Report

[Diagram: an example channel report showing referral channels - Discovery, Catalog, Guide, Menu System, LMS - routed via the proxy.]
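A channel report like this could be derived by classifying referring URLs into channels. The sketch below is illustrative only: the hostname hints and channel names are assumptions and would need to match each institution's actual systems.

```python
# Map referring URLs to report channels via substring hints.
from urllib.parse import urlparse

CHANNEL_HINTS = {
    "summon": "Discovery",
    "primo": "Discovery",
    "catalog": "Catalog",
    "libguides": "Guide",
    "moodle": "LMS",
}

def classify(referrer: str) -> str:
    parsed = urlparse(referrer)
    target = (parsed.netloc + parsed.path).lower()
    for hint, channel in CHANNEL_HINTS.items():
        if hint in target:
            return channel
    return "Other"

print(classify("https://ntu.summon.example.org/search"))  # Discovery (hypothetical URL)
```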

Opportunities

• Expand beyond video - music, archives, scores, text.

• Partner with libraries willing to experiment to find most useful data.

• How can measuring engagement drive it? Expose stats to the user community:
– User uploads stats
– Heat map of video (sketched below)
– Ratings
– Most viewed
– Etc.
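The "heat map of video" idea could be computed by aggregating watched segments into per-second view counts; the segment data below is hypothetical and the video is kept tiny for illustration.

```python
# Per-second view counts from watched (start, end) segments.
video_length = 10                      # seconds
segments = [(0, 4), (2, 7), (3, 10)]   # one (start, end) per viewer

heat = [0] * video_length
for start, end in segments:
    for second in range(start, min(end, video_length)):
        heat[second] += 1

print(heat)  # [1, 1, 2, 3, 2, 2, 2, 1, 1, 1] - peaks mark the most-watched moments
```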


Any Questions?

Helen Adey - helen.adey@ntu.ac.uk

Andrea Eastman-Mullins - AEastmanMullins@alexanderstreet.com