
Designing Next-generation Platforms for Evaluating Scientific Output: What Scientists Can Learn from the Social Web

Overview
Specialty Biology
Date 2012 Oct 13
PMID 23060783
Citations 8
Abstract

Traditional pre-publication peer review of scientific output is a slow, inefficient, and unreliable process. Efforts to replace or supplement traditional evaluation models with open evaluation platforms that leverage advances in information technology are slowly gaining traction, but remain in the early stages of design and implementation. Here I discuss a number of considerations relevant to the development of such platforms. I focus particular attention on three core elements that next-generation evaluation platforms should strive to emphasize, including (1) open and transparent access to accumulated evaluation data, (2) personalized and highly customizable performance metrics, and (3) appropriate short-term incentivization of the userbase. Because all of these elements have already been successfully implemented on a large scale in hundreds of existing social web applications, I argue that development of new scientific evaluation platforms should proceed largely by adapting existing techniques rather than engineering entirely new evaluation mechanisms. Successful implementation of open evaluation platforms has the potential to substantially advance both the pace and the quality of scientific publication and evaluation, and the scientific community has a vested interest in shifting toward such models as soon as possible.

Citing Articles

Beyond advertising: New infrastructures for publishing integrated research objects.

DuPre E, Holdgraf C, Karakuzu A, Tetrel L, Bellec P, Stikov N. PLoS Comput Biol. 2022; 18(1):e1009651.

PMID: 34990466 PMC: 8735620. DOI: 10.1371/journal.pcbi.1009651.


Peer Review in Law Journals.

Stojanovski J, Sanz-Casado E, Agnoloni T, Peruginelli G. Front Res Metr Anal. 2021; 6:787768.

PMID: 34957369 PMC: 8692876. DOI: 10.3389/frma.2021.787768.


A multi-disciplinary perspective on emergent and future innovations in peer review.

Tennant J, Dugan J, Graziotin D, Jacques D, Waldner F, Mietchen D. F1000Res. 2017; 6:1151.

PMID: 29188015 PMC: 5686505. DOI: 10.12688/f1000research.12037.3.


Evidence appraisal: a scoping review, conceptual framework, and research agenda.

Goldstein A, Venker E, Weng C. J Am Med Inform Assoc. 2017; 24(6):1192-1203.

PMID: 28541552 PMC: 6259661. DOI: 10.1093/jamia/ocx050.


Using science and psychology to improve the dissemination and evaluation of scientific work.

Buttliere B. Front Comput Neurosci. 2014; 8:82.

PMID: 25191261 PMC: 4137661. DOI: 10.3389/fncom.2014.00082.

