References

[1]

Stefan Chmiela, Alexandre Tkatchenko, Huziel E. Sauceda, Igor Poltavsky, Kristof T. Schütt, and Klaus-Robert Müller. Machine learning of accurate energy-conserving molecular force fields. Science Advances, 3(5):e1603015, May 2017. doi:10.1126/sciadv.1603015.

[2]

Alexandre René, André Longtin, and Jakob H. Macke. Inference of a Mesoscopic Population Model from Population Spike Trains. Neural Computation, 32(8):1448–1498, June 2020. doi:10.1162/neco_a_01292.

[3]

Mengyang Gu, Xinyi Fang, and Yimin Luo. Data-Driven Model Construction for Anisotropic Dynamics of Active Matter. PRX Life, 1(1):013009, August 2023. doi:10.1103/PRXLife.1.013009.

[4]

Keith Beven and Jim Freer. Equifinality, data assimilation, and uncertainty estimation in mechanistic modelling of complex environmental systems using the GLUE methodology. Journal of Hydrology, 249(1):11–29, August 2001. doi:10.1016/S0022-1694(01)00421-8.

[5]

Albert Tarantola. Popper, Bayes and the inverse problem. Nature Physics, 2(8):492–494, August 2006. doi:10.1038/nphys375.

[6]

Astrid A Prinz, Dirk Bucher, and Eve Marder. Similar network activity from disparate circuit parameters. Nature Neuroscience, 7(12):1345–1352, December 2004. doi:10.1038/nn1352.

[7]

R. M. Golden. Discrepancy Risk Model Selection Test Theory for Comparing Possibly Misspecified or Nonnested Models. Psychometrika, 68(2):229–249, June 2003. doi:10.1007/BF02294799.

[8]

Jinchi Lv and Jun S. Liu. Model Selection Principles in Misspecified Models. Journal of the Royal Statistical Society Series B: Statistical Methodology, 76(1):141–167, January 2014. doi:10.1111/rssb.12023.

[9]

Hsiang-Ling Hsu, Ching-Kang Ing, and Howell Tong. On model selection from a finite family of possibly misspecified time series models. The Annals of Statistics, April 2019. doi:10.1214/18-AOS1706.

[10]

Sébastien Van Bellegem and Rainer Dahlhaus. Semiparametric Estimation by Model Selection for Locally Stationary Processes. Journal of the Royal Statistical Society Series B: Statistical Methodology, 68(5):721–746, November 2006. doi:10.1111/j.1467-9868.2006.00564.x.

[11]

Brian M. de Silva, David M. Higdon, Steven L. Brunton, and J. Nathan Kutz. Discovery of Physics From Data: Universal Laws and Discrepancies. Frontiers in Artificial Intelligence, 2020. URL: https://www.frontiersin.org/articles/10.3389/frai.2020.00025 (visited on 2023-12-22).

[12]

Bin Yu. Stability. Bernoulli, September 2013. doi:10.3150/13-BEJSP14.

[13]

Armen Der Kiureghian and Ove Ditlevsen. Aleatory or epistemic? Does it matter? Structural Safety, 31(2):105–112, March 2009. doi:10.1016/j.strusafe.2008.06.020.

[14]

Eyke Hüllermeier and Willem Waegeman. Aleatoric and epistemic uncertainty in machine learning: an introduction to concepts and methods. Machine Learning, 110(3):457–506, March 2021. doi:10.1007/s10994-021-05946-3.

[15]

Keith Beven and Andrew Binley. The future of distributed models: Model calibration and uncertainty prediction. Hydrological Processes, 6(3):279–298, 1992. doi:10.1002/hyp.3360060305.

[16]

Jery R. Stedinger, Richard M. Vogel, Seung Uk Lee, and Rebecca Batchelder. Appraisal of the generalized likelihood uncertainty estimation (GLUE) method. Water Resources Research, 2008. doi:10.1029/2008WR006822.

[17]

Keith Beven and Paul Smith. Concepts of Information Content and Likelihood in Parameter Calibration for Hydrological Simulation Models. Journal of Hydrologic Engineering, 20(1):A4014010, January 2015. doi:10.1061/(ASCE)HE.1943-5584.0000991.

[18]

Marc C. Kennedy and Anthony O'Hagan. Bayesian calibration of computer models. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 63(3):425–464, 2001. doi:10.1111/1467-9868.00294.

[19]

Yarin Gal, Jiri Hron, and Alex Kendall. Concrete Dropout. In Advances in Neural Information Processing Systems, volume 30. Curran Associates, Inc., 2017. URL: https://papers.nips.cc/paper_files/paper/2017/hash/84ddfb34126fc3a48ee38d7044e87276-Abstract.html (visited on 2023-07-12).

[20]

Florian List, Nicholas L. Rodd, Geraint F. Lewis, and Ishaan Bhat. Galactic Center Excess in a New Light: Disentangling the γ-Ray Sky with Bayesian Graph Convolutional Neural Networks. Physical Review Letters, 125(24):241102, December 2020. doi:10.1103/PhysRevLett.125.241102.

[21]

Leonid Kahle and Federico Zipoli. Quality of uncertainty estimates from neural network potential ensembles. Physical Review E, 105(1):015311, January 2022. doi:10.1103/PhysRevE.105.015311.

[22]

Andrew Gelman and Yuling Yao. Holes in Bayesian Statistics. Journal of Physics G: Nuclear and Particle Physics, 48(1):014002, January 2021. arXiv:2002.06467, doi:10.1088/1361-6471/abc3a5.

[23]

Pedro J Gonçalves, Jan-Matthis Lueckmann, Michael Deistler, Marcel Nonnenmacher, Kaan Öcal, Giacomo Bassetto, Chaitanya Chintaluri, William F Podlaski, Sara A Haddad, Tim P Vogels, David S Greenberg, and Jakob H Macke. Training deep neural density estimators to identify mechanistic models of neural dynamics. eLife, 9:e56261, September 2020. doi:10.7554/eLife.56261.

[24]

Alexandre René. EMD model comparison library. Zenodo, March 2025. doi:10.5281/zenodo.15033190.

[25]

V. Vapnik. Principles of Risk Minimization for Learning Theory. In Advances in Neural Information Processing Systems, volume 4. Morgan-Kaufmann, 1992. URL: https://proceedings.neurips.cc/paper/1991/hash/ff4d5fbbafdf976cfdc032e3bde78de5-Abstract.html (visited on 2021-12-08).

[26]

Vladimir N Vapnik. The Nature of Statistical Learning Theory. Springer New York, New York, NY, 2000. ISBN 978-1-4757-3264-1.

[27]

David F. Findley. Counterexamples to parsimony and BIC. Annals of the Institute of Statistical Mathematics, 43(3):505–514, September 1991. doi:10.1007/BF00053369.

[28]

Peter Grünwald and Thijs van Ommen. Inconsistency of Bayesian Inference for Misspecified Linear Models, and a Proposal for Repairing It. Bayesian Analysis, 12(4):1069–1103, December 2017. doi:10.1214/17-BA1085.

[29]

Tilmann Gneiting and Adrian E. Raftery. Strictly Proper Scoring Rules, Prediction, and Estimation. Journal of the American Statistical Association, 102(477):359–378, 2007. URL: https://www.jstor.org/stable/27639845 (visited on 2024-11-20).

[30]

Andrew Gelman, John B Carlin, Hal Steven Stern, David B Dunson, Aki Vehtari, and Donald B Rubin. Bayesian Data Analysis. CRC Press, Boca Raton, 2014. ISBN 978-1-4398-9822-2.

[31]

Aki Vehtari, Andrew Gelman, and Jonah Gabry. Practical Bayesian model evaluation using leave-one-out cross-validation and WAIC. Statistics and Computing, 27(5):1413–1432, September 2017. doi:10.1007/s11222-016-9696-4.

[32]

Hirotugu Akaike and J. de Leeuw. Information Theory and an Extension of the Maximum Likelihood Principle. In Samuel Kotz and Norman L. Johnson, editors, Breakthroughs in Statistics: Foundations and Basic Theory, volume 1 of Springer Series in Statistics, Perspectives in Statistics, pages 610–624. Springer, New York, NY, 1992. doi:10.1007/978-1-4612-0919-5.

[33]

H. Akaike. Information theory and an extension of the maximum likelihood principle. In 2nd International Symposium on Information Theory, 267–281. Akadémiai Kiadó, Budapest, Hungary, 1973.

[34]

Gideon Schwarz. Estimating the Dimension of a Model. The Annals of Statistics, 6(2):461–464, 1978. URL: https://www.jstor.org/stable/2958889 (visited on 2024-11-25).

[35]

David J. Spiegelhalter, Nicola G. Best, Bradley P. Carlin, and Angelika Van Der Linde. The Deviance Information Criterion: 12 Years on. Journal of the Royal Statistical Society Series B: Statistical Methodology, 76(3):485–493, June 2014. doi:10.1111/rssb.12062.

[36]

David J. Spiegelhalter, Nicola G. Best, Bradley P. Carlin, and Angelika Van Der Linde. Bayesian Measures of Model Complexity and Fit. Journal of the Royal Statistical Society Series B: Statistical Methodology, 64(4):583–639, October 2002. doi:10.1111/1467-9868.00353.

[37]

David Luengo, Luca Martino, Mónica Bugallo, Víctor Elvira, and Simo Särkkä. A survey of Monte Carlo methods for parameter estimation. EURASIP Journal on Advances in Signal Processing, 2020(1):25, May 2020. doi:10.1186/s13634-020-00675-6.

[38]

B. De Schuymer, H. De Meyer, and B. De Baets. Cycle-transitive comparison of independent random variables. Journal of Multivariate Analysis, 96(2):352–373, October 2005. doi:10.1016/j.jmva.2004.10.011.

[39]

Bernard De Baets and Hans De Meyer. Toward Graded and Nongraded Variants of Stochastic Dominance. In Ildar Batyrshin, Janusz Kacprzyk, Leonid Sheremetov, and Lotfi A. Zadeh, editors, Perception-Based Data Mining and Decision Making in Economics and Finance, pages 261–274. Springer, Berlin, Heidelberg, 2007. doi:10.1007/978-3-540-36247-0_10.

[40]

Daniel T. Gillespie. The mathematics of Brownian motion and Johnson noise. American Journal of Physics, 64(3):225–240, March 1996. doi:10.1119/1.18210.

[41]

Gloria Mateu-Figueras, Gianna S. Monti, and J. J. Egozcue. Distributions on the Simplex Revisited. In Peter Filzmoser, Karel Hron, Josep Antoni Martín-Fernández, and Javier Palarea-Albaladejo, editors, Advances in Compositional Data Analysis: Festschrift in Honour of Vera Pawlowsky-Glahn, pages 61–82. Springer International Publishing, Cham, 2021. doi:10.1007/978-3-030-71175-7_4.

[42]

V. Pawlowsky-Glahn and J. J. Egozcue. Geometric approach to statistical analysis on the simplex. Stochastic Environmental Research and Risk Assessment, 15(5):384–398, October 2001. doi:10.1007/s004770100077.

[43]

G Mateu-Figueras and V Pawlowsky-Glahn. The Dirichlet distribution with respect to the Aitchison measure on the simplex - a first approach. In Proceeding of the 2nd International Workshop on Compositional Data Analysis. Girona, October 2005. URL: http://ima.udg.edu/Activitats/CoDaWork05/ (visited on 2024-03-31).

[44]

David J. C. MacKay. Information Theory, Inference, and Learning Algorithms. Cambridge University Press, Cambridge, UK; New York, 2003. ISBN 978-0-521-64298-9. URL: http://www.inference.phy.cam.ac.uk/itila/book.html.

[45]

Roberto Trotta. Bayes in the sky: Bayesian inference and model selection in cosmology. Contemporary Physics, 49(2):71–104, March 2008. doi:10.1080/00107510802066753.

[46]

John Skilling. Nested sampling for general Bayesian computation. Bayesian Analysis, 1(4):833–859, 2006.

[47]

Sergey Koposov, Josh Speagle, Kyle Barbary, Gregory Ashton, Ed Bennett, Johannes Buchner, Carl Scheffler, Ben Cook, Colm Talbot, James Guillochon, Patricio Cubillos, Andrés Asensio Ramos, Matthieu Dartiailh, Ilya, Erik Tollerud, Dustin Lang, Ben Johnson, jtmendel, Edward Higson, Thomas Vandal, Tansu Daylan, Ruth Angus, patelR, Phillip Cargile, Patrick Sheehan, Matt Pitkin, Matthew Kirk, Joel Leja, joezuntz, and Danny Goldstein. Joshspeagle/dynesty: v2.1.4. Zenodo, June 2024. doi:10.5281/zenodo.12537467.

[48]

Peter Grünwald and Teemu Roos. Minimum description length revisited. International Journal of Mathematics for Industry, March 2020. doi:10.1142/S2661335219300018.

[49]

Sumio Watanabe. A widely applicable Bayesian information criterion. Journal of Machine Learning Research, 14(27):867–897, 2013. URL: http://jmlr.org/papers/v14/watanabe13a.html.

[50]

Ravin Kumar, Colin Carroll, Ari Hartikainen, and Osvaldo Martin. ArviZ a unified library for exploratory analysis of Bayesian models in Python. Journal of Open Source Software, 4(33):1143, January 2019. doi:10.21105/joss.01143.

[51]

Astrid A. Prinz, Cyrus P. Billimoria, and Eve Marder. Alternative to Hand-Tuning Conductance-Based Models: Construction and Analysis of Databases of Model Neurons. Journal of Neurophysiology, 90(6):3998–4015, December 2003. doi:10.1152/jn.00641.2003.

[52]

Thomas Nowotny, Attila Szücs, Rafael Levi, and Allen I. Selverston. Models Wagging the Dog: Are Circuits Constructed with Disparate Parameters? Neural Computation, 19(8):1985–2003, August 2007. doi:10.1162/neco.2007.19.8.1985.

[53]

Kailai Xu and Eric Darve. Physics constrained learning for data-driven inverse modeling from sparse observations. Journal of Computational Physics, 453:110938, March 2022. doi:10.1016/j.jcp.2021.110938.

[54]

Tina Toni, David Welch, Natalja Strelkowa, Andreas Ipsen, and Michael P. H. Stumpf. Approximate Bayesian computation scheme for parameter inference and model selection in dynamical systems. Journal of The Royal Society Interface, 6(31):187–202, July 2008. doi:10.1098/rsif.2008.0172.

[55]

Sixing Chen, Antonietta Mira, and Jukka-Pekka Onnela. Flexible model selection for mechanistic network models. Journal of Complex Networks, 8(2):cnz024, April 2020. doi:10.1093/comnet/cnz024.

[56]

Keith Beven and Andrew Binley. GLUE: 20 years on. Hydrological Processes, 28(24):5897–5918, November 2014. doi:10.1002/hyp.10082.

[57]

Paul D. Arendt, Daniel W. Apley, and Wei Chen. Quantification of Model Uncertainty: Calibration, Model Discrepancy, and Identifiability. Journal of Mechanical Design, September 2012. doi:10.1115/1.4007390.

[58]

Andrew Gelman, Xiao-Li Meng, and Hal Stern. Posterior Predictive Assessment of Model Fitness Via Realized Discrepancies. Statistica Sinica, 6(4):733–760, 1996. URL: https://www.semanticscholar.org/paper/POSTERIOR-PREDICTIVE-ASSESSMENT-OF-MODEL-FITNESS-Gelman-Meng/377339cd55087d503b855ae89d2126495cf104ee (visited on 2025-03-11).

[59]

David Swigon, Shelby R. Stanhope, Sven Zenker, and Jonathan E. Rubin. On the Importance of the Jacobian Determinant in Parameter Inference for Random Parameter and Random Measurement Error Models. SIAM/ASA Journal on Uncertainty Quantification, 7(3):975–1006, January 2019. doi:10.1137/17M1114405.

[60]

Sean Talts, Michael Betancourt, Daniel Simpson, Aki Vehtari, and Andrew Gelman. Validating Bayesian Inference Algorithms with Simulation-Based Calibration. arXiv preprint arXiv:1804.06788 [stat], April 2018. doi:10.48550/arXiv.1804.06788.

[61]

Martin Modrák, Angie H. Moon, Shinyoung Kim, Paul Bürkner, Niko Huurre, Kateřina Faltejsková, Andrew Gelman, and Aki Vehtari. Simulation-based calibration checking for Bayesian computation: The choice of test quantities shapes sensitivity. Bayesian Analysis, advance publication, 2023. doi:10.1214/23-BA1404.

[62]

Tuomas Sivula, Måns Magnusson, Asael Alonzo Matamoros, and Aki Vehtari. Uncertainty in Bayesian Leave-One-Out Cross-Validation Based Model Comparison. arXiv preprint arXiv:2008.10296, October 2023. doi:10.48550/arXiv.2008.10296.

[63]

Alexandre René. Efficient and flexible simulator of the lobster pyloric circuit. Zenodo, July 2024. doi:10.5281/zenodo.12797200.

[64]

Alexandre René. Solid colored noise using sparse convolutions. Zenodo, July 2024. doi:10.5281/zenodo.12805191.

[65]

J. P. Lewis. Algorithms for solid noise synthesis. In Proceedings of the 16th Annual Conference on Computer Graphics and Interactive Techniques, SIGGRAPH '89, 263–270. New York, NY, USA, July 1989. Association for Computing Machinery. doi:10.1145/74333.74360.

[66]

Philipp Rudiger, Jean-Luc Stevens, Simon Høxbro Hansen, James A. Bednar, Maxime Liquet, Andrew, Bas Nijholt, Jon Mease, Chris B, Achim Randelhoff, Vasco Tenner, maxalbert, Markus Kaiser, ea42gh, stonebig, Jordan Samuels, Douglas Raillard, Ian Thomas, Kim Pevey, Florian LB, Marc Skov Madsen, Peter Roelants, Andrew Tolmie, Daniel Stephan, Demetris Roumis, Justin Bois, Scott Lowe, Stan West, Stas, and John Bampton. Holoviz/holoviews: Version 1.18.1. Zenodo, November 2023. doi:10.5281/zenodo.10089811.

[67]

Crispin W. Gardiner. Handbook of Stochastic Methods: For Physics, Chemistry and the Natural Sciences. Number 13 in Springer Series in Synergetics. Springer, Berlin, 1983. ISBN 978-3-540-11357-7.

[68]

H. Risken. The Fokker-Planck Equation: Methods of Solution and Applications. Number 18 in Springer Series in Synergetics. Springer-Verlag, Berlin; New York, 2nd edition, 1989. ISBN 978-0-387-50498-8.

[69]

Werner Horsthemke and René Lefever. Noise-Induced Transitions: Theory and Applications in Physics, Chemistry, and Biology. Number 15 in Springer Series in Synergetics. Springer, Berlin, 2nd printing, 2006. ISBN 978-3-540-11359-1.

[70]

Alexandre René. Code for the paper "Selecting fitted models under epistemic uncertainty using a stochastic process on quantile functions". Zenodo, March 2025. doi:10.5281/zenodo.15032858.

[71]

Brian Conrey, James Gabbard, Katie Grant, Andrew Liu, and Kent E. Morrison. Intransitive Dice. Mathematics Magazine, 89(2):133–143, April 2016. doi:10.4169/math.mag.89.2.133.

[72]

Marco Taboga. Jeffreys' scale | Grades or categories of evidence for the Bayes factor. In Lectures on Probability Theory and Mathematical Statistics, online appendix. Kindle Direct Publishing, 2021. URL: https://www.statlect.com/fundamentals-of-statistics/Jeffreys-scale (visited on 2023-10-19).

[73]

A. C. Davison and D. V. Hinkley. Bootstrap Methods and Their Application. Cambridge Series in Statistical and Probabilistic Mathematics. Cambridge University Press, Cambridge, 1997. ISBN 978-0-521-57471-6. doi:10.1017/CBO9780511802843.