Evaluation was an important component of the CCC grant. Evaluative efforts were led by the grant's Evaluation Working Group. In year one, the Evaluation Working Group defined the CCC project's quantitative assessment needs.
In year two, the group added new members with expertise in qualitative assessment and drafted a comprehensive plan for qualitative evaluation. Over the course of years two and three, several qualitative evaluation efforts were undertaken, and all data that was gathered and tracked from the beginning of the project was analyzed.
In 2013, Joyce Chapman and Samantha Leonard published an article entitled "Cost and benefit of quality control visual checks in large-scale digitization of archival manuscripts." The article is a case study that uses CCC production data to determine the optimal balance between production and quality control visual checks. It appeared in Library Hi Tech, Vol. 31, No. 3, pp. 405-418.
Participating institutions tracked the time they spent on various grant activities for evaluative purposes. These activities included:
An important and time-consuming part of the large-scale digitization process is preparing the materials for digitization. Student workers timed themselves as they reviewed the materials in the following areas:
Condition and conservation review
Privacy and IP review
This review included identifying materials in need of conservation work, removing fasteners such as staples or paper clips from pages, and identifying materials with privacy or copyright concerns.
Transportation of materials
Transportation time was tracked by the Digital Production Manager, who was responsible for all materials transport during the grant.
Data tracked for each material transport included:
Time driving (in minutes)
Other time: loading, unloading, and moving (in minutes)
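The two tracked quantities above can be combined into a per-trip total for reporting. The following is a minimal sketch of such a record; the class and field names are illustrative, not the project's actual tracking schema.

```python
from dataclasses import dataclass

# Hypothetical record for one material transport; field names are
# illustrative, not the grant's actual data-collection schema.
@dataclass
class TransportRecord:
    driving_min: int  # time driving, in minutes
    other_min: int    # loading/unloading/moving, in minutes

    @property
    def total_min(self) -> int:
        # Total transport time for this trip.
        return self.driving_min + self.other_min

# Example: summing total transport time across several trips.
trips = [TransportRecord(driving_min=45, other_min=20),
         TransportRecord(driving_min=30, other_min=15)]
total = sum(t.total_min for t in trips)
```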
In years two and three of the grant, the Evaluation Working Group will plan and conduct qualitative assessments of the project; the plans for these assessments were developed broadly in the second half of year one. Rather than testing delivery interfaces, the group will focus on analyzing the user experience as defined by large-scale digitization at TRLN. Delivery interfaces may be tested tangentially through this process, but they are not the focus of our assessment work.
One-on-one interviews will be held with selected scholars and faculty. The target population is people who have previously used highly curated digital content, such as Documenting the American South, so that interviewees can be asked to compare the highly curated and large-scale approaches.
Three members of the CCC steering committee are teaching undergraduate courses in fall 2012 that can serve as testbeds for project evaluation: one taught at NCSU, one taught at UNC, and one jointly taught between NCCU and Duke. The group hopes to conduct at least one project evaluation activity in each of the three courses. These activities may include a task that requires students to find interesting documents within one of the delivery interfaces and write a brief reaction paper evaluating the search and discovery experience, or a website evaluation exercise.
Evaluation of K-12 educators' perspectives on TRLN's approach to online digital delivery (i.e., no metadata beyond what exists in the finding aid is applied to each digital image, and materials are discoverable through the context of the finding aid rather than through specialized web portals with advanced searching capabilities) will have two parts. In summer 2012, focus groups and one-on-one interviews were held with local K-12 educators. In addition, an online survey of approximately 2,000 North Carolina middle and high school social sciences teachers will be conducted in August 2012.
Google Analytics will be used to track all usage statistics for the grant. It was decided that Google Analytics would be set up on each institution's finding aids, as well as on digital objects where applicable.
Baseline use metrics that we will track at each institution and report in aggregate for the entire grant include:
Collection guide views
Clicks on linked folder titles from collection guides
Clicks on links to "all digital content for this collection" from collection guides
Unique page views for scans (this data is available by collection for all institutions except NCSU)
Usage statistics are collected from the Google Analytics accounts at the various libraries by the Project Librarian twice a year for aggregation and reporting.
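The aggregation step might look like the following sketch, assuming each institution's Google Analytics export has first been reduced to a simple dictionary of metric counts. The institution names and metric keys here are illustrative placeholders, not a real Google Analytics schema or actual grant data.

```python
from collections import Counter

# Hypothetical per-institution metric exports; keys mirror the baseline
# metrics listed above, and all numbers are invented for illustration.
exports = {
    "Duke": {"collection_guide_views": 1200, "folder_title_clicks": 300},
    "UNC":  {"collection_guide_views": 900,  "folder_title_clicks": 250},
    "NCSU": {"collection_guide_views": 700,  "folder_title_clicks": 180},
}

# Sum each metric across institutions to produce grant-wide totals.
aggregate = Counter()
for metrics in exports.values():
    aggregate.update(metrics)
```

Because `Counter.update` adds counts rather than replacing them, each metric is summed across all institutions in a single pass.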