We are happy to announce that our paper titled “Enhancing Human-in-the-Loop Ontology Curation Results through Task Design” [1] has just been accepted for publication in the Special Issue on Human in the Loop Data Curation of the Journal of Data and Information Quality.

In the paper, we describe two large-scale controlled experiments testing two design aspects that are important for human-centric ontology evaluation campaigns: (1) qualification testing of the crowd and (2) the ontology representation formalism.

In the campaign, human contributors judged whether ontology axioms correctly model existential and universal restrictions. We describe the experiments using our previously proposed HERO methodology, and we make all artefacts created in the course of the experimental investigation publicly available as a Zenodo resource [2].

The work described in the paper is funded by the FWF HOnEst project. 

[1] S. Tsaneva and M. Sabou (2023). Enhancing Human-in-the-Loop Ontology Curation Results through Task Design. Journal of Data and Information Quality.

[2] S. Tsaneva, K. Käsznar, and M. Sabou (2023). HERO: a Human-centric ontology Evaluation pROcess. Zenodo.