
The feasibility of using citizens to segment anatomy from medical images: Accuracy and motivation


Authors: Judith R. Meakin aff001;  Ryan M. Ames aff002;  J. Charles G. Jeynes aff003;  Jo Welsman aff003;  Michael Gundry aff004;  Karen Knapp aff004;  Richard Everson aff005
Author affiliations: Biomedical Physics Group, College of Engineering, Mathematics and Physical Sciences, University of Exeter, Exeter, United Kingdom aff001;  Biosciences, College of Life and Environmental Sciences, University of Exeter, Exeter, United Kingdom aff002;  Centre for Biomedical Modelling and Analysis, University of Exeter, Exeter, United Kingdom aff003;  Medical Imaging, University of Exeter Medical School, University of Exeter, Exeter, United Kingdom aff004;  Computer Science, College of Engineering, Mathematics and Physical Sciences, University of Exeter, Exeter, United Kingdom aff005
Published in: PLoS ONE 14(10)
Category: Research Article
doi: https://doi.org/10.1371/journal.pone.0222523

Abstract

The development of automatic methods for segmenting anatomy from medical images is an important goal for many areas of medical and healthcare research. The datasets needed to train and test such algorithms, however, are often small because of the difficulty of obtaining enough expert-segmented examples. Citizen science offers a potential solution, but the feasibility of asking members of the public to identify and segment anatomy in medical images has not been investigated. Our study therefore aimed to explore the feasibility, in terms of performance and motivation, of using citizens for this purpose. Public involvement was woven into the study design and evaluation. Twenty-nine citizens were recruited and, after brief training, asked to segment the spine in a dataset of 150 magnetic resonance images. Participants segmented as many images as they could within three one-hour sessions. Their accuracy was evaluated by comparing their segmentations, both individually and as a combined consensus, with those of three experts. Questionnaires and a focus group were used to determine the citizens’ motivations for taking part and their experience of the study. Segmentation accuracy, measured as agreement with the expert consensus, varied considerably between individual citizens. The citizen consensus, however, was close to the expert consensus, indicating that pooled citizen segmentations may be able to replace or supplement expert ones in generating large image datasets. Personal interest and a desire to help were the two most common reasons for taking part in the study.
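The pooling and agreement evaluation described above can be sketched in code. The snippet below is an illustrative reconstruction, not the authors' actual pipeline: the function names and the majority-vote rule are assumptions (consensus segmentations are often built with more sophisticated methods such as STAPLE), and the toy masks stand in for the real spine segmentations.

```python
import numpy as np

def consensus_mask(masks, threshold=0.5):
    """Combine binary segmentation masks by per-pixel majority vote.

    A pixel is foreground in the consensus if at least `threshold`
    (by default half) of the annotators marked it as foreground.
    """
    stack = np.stack(masks).astype(float)          # (n_annotators, H, W)
    return (stack.mean(axis=0) >= threshold).astype(np.uint8)

def dice(a, b):
    """Dice similarity coefficient between two binary masks (1.0 = identical)."""
    a, b = a.astype(bool), b.astype(bool)
    inter = np.logical_and(a, b).sum()
    total = a.sum() + b.sum()
    return 1.0 if total == 0 else 2.0 * inter / total

# Toy example: five noisy "citizen" masks of a small square target.
target = np.zeros((4, 4), dtype=np.uint8)
target[1:3, 1:3] = 1
rng = np.random.default_rng(0)
citizens = [np.clip(target + (rng.random(target.shape) < 0.15), 0, 1)
            for _ in range(5)]

pooled = consensus_mask(citizens)
# Agreement of the pooled mask with the toy ground truth
# (typically close to 1.0, since random noise rarely wins the vote).
print(dice(pooled, target))
```

Majority voting suppresses the uncorrelated errors of individual annotators, which is one plausible reading of why the citizen consensus in the study tracked the expert consensus so much more closely than individual citizens did.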

Keywords:

Algorithms – Citizen science – Image analysis – Imaging techniques – Magnetic resonance imaging – Medicine and health sciences – Software tools – Vertebrae



Published in PLOS One, 2019, Issue 10