Christoph Molnar
Independent researcher and book author
Cited by
Interpretable Machine Learning - A Guide for Making Black Box Models Explainable
C Molnar, 2018
iml: An R package for Interpretable Machine Learning
C Molnar, G Casalicchio, B Bischl
Journal of Open Source Software 3, 786, 2018
Interpretable Machine Learning - A Brief History, State-of-the-Art and Challenges
C Molnar, G Casalicchio, B Bischl
arXiv preprint arXiv:2010.09337, 2020
TNF blockers inhibit spinal radiographic progression in ankylosing spondylitis by reducing disease activity: results from the Swiss Clinical Quality Management cohort
C Molnar, A Scherer, X Baraliakos, M de Hooge, R Micheroli, P Exer, ...
Annals of the rheumatic diseases 77 (1), 63-69, 2018
Interpretable Machine Learning: A Guide for Making Black Box Models Explainable
C Molnar
URL https://christophm.github.io/interpretable-ml-book, 2019
Visualizing the feature importance for black box models
G Casalicchio, C Molnar, B Bischl
Machine Learning and Knowledge Discovery in Databases: European Conference …, 2019
Multi-objective counterfactual explanations
S Dandl, C Molnar, M Binder, B Bischl
Bäck T. et al. (eds) Parallel Problem Solving from Nature – PPSN XVI. PPSN …, 2020
General pitfalls of model-agnostic interpretation methods for machine learning models
C Molnar, G König, J Herbinger, T Freiesleben, S Dandl, CA Scholbeck, ...
xxAI-Beyond Explainable AI: International Workshop, Held in Conjunction with …, 2022
Quantifying Model Complexity via Functional Decomposition for Better Post-Hoc Interpretability
C Molnar, G Casalicchio, B Bischl
Joint European Conference on Machine Learning and Knowledge Discovery in …, 2019
Explainable AI methods-a brief overview
A Holzinger, A Saranti, C Molnar, P Biecek, W Samek
xxAI-Beyond Explainable AI: International Workshop, Held in Conjunction with …, 2022
Model-agnostic feature importance and effects with dependent features: a conditional subgroup approach
C Molnar, G König, B Bischl, G Casalicchio
Data Mining and Knowledge Discovery, 1-39, 2023
Estimation of voter transitions based on ecological inference: An empirical assessment of different approaches
A Klima, PW Thurner, C Molnar, T Schlesinger, H Küchenhoff
AStA Advances in Statistical Analysis 100, 133-159, 2016
Errors in palliative care: kinds, causes, and consequences: a pilot survey of experiences and attitudes of palliative care professionals
I Dietz, GD Borasio, C Molnar, C Müller-Busch, A Plog, G Schneider, ...
Journal of palliative medicine 16 (1), 74-81, 2013
Relative Feature Importance
G König, C Molnar, B Bischl, M Grosse-Wentrup
2020 25th International Conference on Pattern Recognition (ICPR), 9318-9325, 2021
Sampling, intervention, prediction, aggregation: a generalized framework for model-agnostic interpretations
CA Scholbeck, C Molnar, C Heumann, B Bischl, G Casalicchio
Machine Learning and Knowledge Discovery in Databases: International …, 2020
Relating the partial dependence plot and permutation feature importance to the data generating process
C Molnar, T Freiesleben, G König, G Casalicchio, MN Wright, B Bischl
arXiv preprint arXiv:2109.01433, 2021
Beyond prediction: methods for interpreting complex models of soil variation
AMJC Wadoux, C Molnar
Geoderma 422, 115953, 2022
Recursive partitioning by conditional inference
C Molnar
Department of Statistics, University of Munich: Munich, Germany, 2013
Package ‘iml’
C Molnar, P Schratz
R CRAN, 2020
Scientific inference with interpretable machine learning: Analyzing models to learn about real-world phenomena
T Freiesleben, G König, C Molnar, A Tejero-Cantero
arXiv preprint arXiv:2206.05487, 2022