
Article Type

Full-Length Paper

Publication Date

2019-12-23

DOI

10.7191/jeslib.2019.1166

Abstract

Objective: Best practices such as the FAIR Principles (Findability, Accessibility, Interoperability, Reusability) were developed to ensure that published datasets are reusable. While we employ best practices in the curation of datasets, we want to learn how domain experts view the reusability of datasets in our institutional repository, ScholarsArchive@OSU. Curation workflows are designed by data curators based on their own recommendations, but research data is extremely specialized, and such workflows are rarely evaluated by researchers. In this project we used peer review by domain experts to evaluate the reusability of the datasets in our institutional repository, with the goal of informing our curation methods and ensuring that the limited resources of our library maximize the reusability of research data.

Methods: We asked all researchers who have submitted datasets to Oregon State University’s repository to refer us to domain experts who could review the reusability of their datasets. Two data curators, who are non-experts in the research domains, also reviewed the same datasets. We gave both groups review guidelines based on those of several journals. Eleven domain experts and two data curators reviewed eight datasets. The reviews covered the quality of the repository record, the quality of the documentation, and the quality of the data. We then compared the comments given by the two groups.

Results: Domain experts and non-expert data curators largely converged on similar scores for the reviewed datasets, but the two groups focused their critiques on somewhat different issues. A few broad issues were common across reviews: insufficient documentation, the use of links to journal articles in place of documentation, and concerns about duplication of effort in creating documentation and metadata. Reviews also reflected the background and skills of the reviewer: domain experts noted their lack of expertise in data curation practices, and data curators noted their lack of expertise in the research domain.

Conclusions: The results of this investigation could help guide future research data curation activities and align domain expert and data curator expectations for the reusability of datasets. We recommend further exploration of these common issues and additional domain expert peer-review projects to further refine and align expectations for research data reusability.

The substance of this article is based upon a panel presentation at RDAP Summit 2019.

Keywords

data curation, data peer-review, data publication

Data Availability

The data for this article are the reviews provided by 11 reviewers. Because it would be impossible to anonymize these reviews, we are not sharing the data. The article is accompanied by additional materials that include the questionnaire we used.

Acknowledgments

Thank you to the anonymous reviewers who gave their time to this article and to the datasets discussed therein, as well as to the data depositors who agreed to participate in this study. Thank you also to the colleagues at Oregon State University who reviewed the text to make this article stronger.

Corresponding Author

Clara Llebot, Oregon State University, 121 The Valley Library, Corvallis, OR 97331-4501, USA; clara.llebot@oregonstate.edu

Rights and Permissions

Copyright © 2019 Llebot and Van Tuyl

Creative Commons License

This work is licensed under a Creative Commons Attribution 4.0 License.

Additional Materials

Questionnaire_1166.pdf (492 kB)
