Crowd Sourcing of Reference and User
Services
John G. Dove, Former CEO, Credo
Scott Johnson, CEO, ChiliFresh
Tim Spalding, Founder, LibraryThing
Ilana Barnes Stonebraker, Assistant Professor of Library Science, Purdue University
Saturday November 8, 2014 9:45-10:45 AM
Colonial Ballroom, Francis Marion Hotel
Crowdsourcing
A venerable history in libraries, e.g., the National Union Catalog
National Union Catalog photo by Boatwright Memorial Library https://www.flickr.com/photos/boatwrightlibrary/6899675938/
Wikipedia
Oxford English Dictionary
Central to Crowdsource Strategies
• Three major challenges:
– Distinguishing between good and mediocre contributions, i.e., curation
– Systematic biases
– The update problem
Scalable Curation
How to get wisdom rather than madness?
To Get Wisdom
James Surowiecki: Three Main Factors
• Independence of contributing groups
• A fair way to sum up the contributions
• No systematic biases
Condorcet Jury Theorem
Cass Sunstein, Decision Theory
• If each member is right more than 50% of the time, the group’s majority vote converges on 100% accuracy as the group grows.
• The dark side? If each member is right only 49% of the time, the group’s majority vote converges on zero!
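The theorem is easy to check numerically. A minimal simulation sketch (the function name and the chosen accuracies and group size are illustrative assumptions, not from the talk):

```python
import random

def majority_accuracy(p, n_voters, trials=20000, seed=42):
    """Estimate the probability that a majority vote of n_voters,
    each independently correct with probability p, is correct."""
    rng = random.Random(seed)
    correct = 0
    for _ in range(trials):
        votes = sum(rng.random() < p for _ in range(n_voters))
        if votes > n_voters / 2:  # strict majority got it right
            correct += 1
    return correct / trials

# Individuals slightly better than chance: the group beats any individual.
print(majority_accuracy(0.55, 101))  # well above 0.55, approaching 1 as n grows
# Individuals slightly worse than chance: the group is worse than any individual.
print(majority_accuracy(0.45, 101))  # well below 0.45, approaching 0 as n grows
```

The symmetry of the two runs is the point of the slide: aggregation amplifies whatever competence, or incompetence, the individual contributors bring.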
Wisdom vs. Madness
Wikipedia: how does it measure up?
Two examples of curated collaboration
Birds of North America
Encyclopedia of Life
The Update Problem
• Or: how to put the “crowd” into “crowdsourcing”?
• What percentage of users needs to contribute meaningfully in order to create the necessary value?
• E.g., Zagat, Yelp
• Otherwise you have an electronic ghost town
For each of these panelists
• Have they solved the update problem?
• Have they found an effective way to scale up the curation of the content?
• Have they avoided the problem of systematic biases?
LibraryThing and the
Ladder of Engagement
Tim Spalding
Charleston Conference 2014
tim@librarything.com
@LibraryThingTim
What is LibraryThing?
• Personal cataloging
• Social cataloging
• Social networking
• (Also makes products for libraries.)
The Ladder of Engagement
• How people “climb” the site
• Shows qualitative differences in types of crowdsourcing
• Suggests a model for “adding” crowdsourcing
The Ladder of Engagement
• Personal cataloging
• Exhibitionism, voyeurism
• Self-expression
• Social cataloging (cataloging on shared data)
• Policing and helping
• Collaborative cataloging
• 112m tags added
• 393 covers for Harry Potter and the Sorcerer’s Stone
• 4 million covers across the site
• 2.3 million reviews
• 830k “unique” works reviewed
• 6.4m edits
All “Authority Control”:
• Work-edition control (more than 4m acts of disambiguation)
• Author combination/separation
• Homonymous author division
• Tag disambiguation
Does it work? Like all get-out.
The Ladder of Engagement
• Personal -> social
• Love of the thing -> love of others
• Love of self -> altruistic
• Done by low-interest people -> done by high-interest people
• Done by many -> done by few
– This is a good thing.
Lessons from the Ladder
• Secure the bottom of the ladder
• Build it rung by rung?
• Consider it rung by rung
• Crowdsourcing is not a “feature”

Above all:
• It’s not about what you GET
• It’s about what you GIVE
ChiliFresh
A Global Community At Your Local Library
Community
• Before the Internet
– Community was a geographic space
– You could ride your bicycle around your community
– Your workplace was also part of your community
• Because of the Internet
– Community is not defined by geography
– Vocation
– Hobbies
– Literary interests
Information
• The Power of Opinion
• Collaborative Data
• The Voice of the Crowd
• A Valuable Message OR Noise?
Interactivity where your Users “live”
• Mobile Apps (iOS & Android)
– Full catalog search
– Notifications
– Review and social aggregation
• Facebook App
– Full catalog search
– Like/share
– Review and social aggregation
Your Library’s Ecosphere
• LMS-agnostic
• Single sign-on
• Local Connection with Global Reach
Crowdsourcing, Libraries And Reference:
CrowdAsk
Ilana Stonebraker
Assistant Professor of Library Science
Business Information Specialist
November 8, 2014
How is your library like 54 jokes in 4 minutes?
Your library supports (online) community
Some (online) communities your library may support
• Your city
• Your school or organization
• Your alumni or retirement base
• Your “fans”
• Your collection strengths
4 Not Shocking Facts
1. Everyone uses the internet more now than before, for a variety of purposes and reasons.
2. Our patrons are part of the communities we serve, and they participate in those communities.
3. Students don’t read the manual.
4. The majority of reference questions are lower level.
What is supposed to happen
Students develop questions about resources, which they ask a librarian about. The librarian answers the question dazzlingly well. The student rocks the project, gets an amazing job, and loves the library for life.
What actually happens
Challenges
• Reference service model flawed
[Diagram: students take their questions to other students, friends, and professors as well as librarians]
Challenges
• Questions are all treated alike
• The majority of reference questions are lower level
• Questions are context-based
• The reference process decontextualizes questions, then has to add the context back during the reference interview - inefficient
• Lack of utilization of other information sources, such as graduate students and instructors
We live in an information ecosystem
Crowdsourcing
Crowdsourcing is an “online distributed problem-solving and production model that leverages the collective intelligence of online communities to serve specific organizational goals.”
Crowdsourcing
In crowdsourcing, the “locus of control regarding the creative production of goods and ideas exists between the organization and public, a shared process of bottom-up, open creation by the crowd and top-down management by those charged with serving an organization’s strategic interests.”
Community: Students, Staff, and Faculty of Purdue University
Specific Goal: Provide contextual answers for student and alumni questions, teaching students how to ask better questions and strengthening alumni networks.
CrowdAsk on Purdue Website
• System statistics
– 129 users posted questions, 184 answered questions
– 257 voted
– 316 questions, 700 answers
– Most views on a question: 182
– Most answers to a question: 16
– Most votes on a question: 48
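CrowdAsk’s actual implementation is in the GitHub repository linked at the end of this talk; as a purely hypothetical sketch of the core mechanic those statistics describe (community votes determining which answers surface first), the model might look like this. All class and method names here are illustrative assumptions, not CrowdAsk’s code:

```python
from dataclasses import dataclass, field

@dataclass
class Answer:
    text: str
    votes: int = 0  # net community votes on this answer

@dataclass
class Question:
    title: str
    answers: list = field(default_factory=list)

    def add_answer(self, text):
        answer = Answer(text)
        self.answers.append(answer)
        return answer

    def ranked_answers(self):
        # Surface the most community-endorsed answers first.
        return sorted(self.answers, key=lambda a: a.votes, reverse=True)

q = Question("How do I find peer-reviewed articles?")
a1 = q.add_answer("Use the library's databases page.")
a2 = q.add_answer("Ask at the reference desk.")
a1.votes = 2
a2.votes = 5
print(q.ranked_answers()[0].text)  # the higher-voted answer comes first
```

The design point is that curation is delegated to the crowd: the ranking is recomputed from votes rather than fixed by whoever answered first.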
Quantitative Assessment- Students
• Google Analytics (January 5, 2014 to April 2, 2014)
– 1,150 visits from 474 unique visitors
– 14,715 page views (an average of 12.8 pages per visit)
– 6 minutes and 7 seconds average visit duration
• Usability Test of 4 students (2 novice and 2 expert)
• Motivation for expert users: reciprocity, not points
CrowdAsk- Usability Tests
Launch on Purdue Libraries Website
Goal: develop sustainable user engagement and community involvement as part of the Purdue University Libraries website.
Crowds are your communities.
Keys for Crowdsourcing Reference
• The stronger the online community, the stronger the answer base
• Crowdsourcing can also strengthen online community by bringing people together for common goals
• Reciprocity is important to these communities. They need to give as well as take.
How could you use CrowdAsk?
Where are the opportunities?
Where are the possible threats or weaknesses?
Where are the possible benefits?
Thank you!

CrowdAsk code: https://github.com/crowdask0/crowdask
Short video on CrowdAsk: http://youtu.be/-kaNIPJ82yA

Ilana Stonebraker
Business Information Specialist
Purdue University Libraries
stonebraker@purdue.edu

Tao Zhang, PhD
Digital User Experience Specialist
Purdue University Libraries
zhan1022@purdue.edu
Works cited and Image Credit
• Bishop, B. W., & Bartlett, J. A. (2013). Where Do We Go from Here? Informing Academic Library Staffing through Reference Transaction Analysis. College & Research Libraries, 74(5), 489-500.
• Brabham, Daren C. (2013). Crowdsourcing. MIT Press. p. 1.
• http://static.guim.co.uk/sys-images/Money/Consumer/financialservicesbrochures/2014/2/28/1393599196845/Angry-man-about-to-throw--011.jpg