Individual vs. Collaborative Methods of Crowdsourced Transcription

    Author(s):
    Samantha Blickhan, Amy Boyer, Daniel Hanson, Coleman Krawczyk, Andrea Simenstad, Victoria Van Hyning
    Date:
    2019
    Subject(s):
    Crowdsourcing, Digital humanities, Research, Methodology, Research--Methodology, Transcription
    Item Type:
    Article
    Tag(s):
    DH tools, Research and Development, Digital humanities research and methodology, Research methods, Text transcription
    Permanent URL:
    http://dx.doi.org/10.17613/aanb-3109
    Abstract:
    While online crowdsourced text transcription projects have proliferated in the last decade, there is a need within the broader field to understand differences in project outcomes as they relate to task design, as well as to experiment with models of online crowdsourced transcription that have not yet been explored. The experiment discussed in this paper involves the evaluation of newly built tools on the Zooniverse.org crowdsourcing platform, and attempts to answer the research questions: “Does the current Zooniverse methodology of multiple independent transcribers and aggregation of results render higher-quality outcomes than allowing volunteers to see previous transcriptions and/or markings by other users? How does each methodology impact the quality and depth of analysis and participation?” To answer these questions, the Zooniverse team ran an A/B experiment on the project Anti-Slavery Manuscripts at the Boston Public Library. This paper will share the results of this study and describe the process of designing the experiment, as well as the metrics used to evaluate each transcription method: comparison of aggregated transcription results against ground-truth data; evaluation of annotation methods; the time volunteers took to transcribe each dataset; and the level of engagement with other project elements, such as posting on the message board or reading supporting documentation. Particular focus will be given to the (at times) competing goals of data quality, efficiency, volunteer engagement, and user retention, all of which are of high importance for projects that work with data from galleries, libraries, archives, and museums. Ultimately, this paper aims to provide a model for impactful, intentional design and study of online crowdsourced transcription methods, as well as to shed light on the associations between project design, methodology, and outcomes.
    Metadata:
    Status:
    Published
    License:
    All Rights Reserved

    Downloads

    Item Name: submitted_blickhan_individualvcollab.docx
    Downloads: 366