Insights into the quantification and reporting of model-related uncertainty across different disciplines

Simmonds, Emily G. and Adjei, Kwaku Peprah and Andersen, Christoffer Wold and Aspheim, Janne Cathrin Hetle and Battistin, Claudia and Bulso, Nicola and Christensen, Hannah and Cretois, Benjamin and Cubero, Ryan and Davidovich, Iván A. and Dickel, Lisa and Dunn, Benjamin and Dunn-Sigouin, Etienne and Dyrstad, Karin and Einum, Sigurd and Giglio, Donata and Gjerløw, Haakon and Godefroidt, Amélie and González-Gil, Ricardo and Cogno, Soledad Gonzalo and Große, Fabian and Halloran, Paul and Jensen, Mari F. and Kennedy, John James and Langsæther, Peter Egge and Laverick, Jack and Lederberger, Debora and Li, Camille and Mandeville, Caitlin and Mandeville, Elizabeth and Moe, Espen and Schröder, Tobias Navarro and Nunan, David and Parada, Jorge Sicacha and Simpson, Melanie Rae and Skarstein, Emma Sofie and Spensberger, Clemens and Stevens, Richard and Subramanian, Aneesh and Svendsen, Lea and Theisen, Ole Magnus and Watret, Connor and O’Hara, Robert B. (2022) Insights into the quantification and reporting of model-related uncertainty across different disciplines. iScience, 25 (12). 105512. ISSN 2589-0042 (https://doi.org/10.1016/j.isci.2022.105512)

Final Published Version. License: Creative Commons Attribution 4.0

Abstract

Quantifying uncertainty associated with our models is the only way we can express how much we know about any phenomenon. Incomplete consideration of model-based uncertainties can lead to overstated conclusions with real-world impacts in diverse spheres, including conservation, epidemiology, climate science, and policy. Despite these potentially damaging consequences, we still know little about how different fields quantify and report uncertainty. We introduce the "sources of uncertainty" framework, using it to conduct a systematic audit of model-related uncertainty quantification from seven scientific fields, spanning the biological, physical, and political sciences. Our interdisciplinary audit shows no field fully considers all possible sources of uncertainty, but each has its own best practices alongside shared outstanding challenges. We make ten easy-to-implement recommendations to improve the consistency, completeness, and clarity of reporting on model-related uncertainty. These recommendations serve as a guide to best practices across scientific fields and expand our toolbox for high-quality research.

ORCID iDs

Laverick, Jack: https://orcid.org/0000-0001-8829-2084