How Shapley Values Work
In this article, we will explore how Shapley values work, not with cryptic formulae but through code and simplified explanations.
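As a taste of that code-first approach, here is a minimal sketch of an exact Shapley value computation for a toy coalitional game (the game definition and function names are illustrative, not from the article itself):

```python
from itertools import combinations
from math import factorial

def shapley_values(players, value):
    """Exact Shapley values for a coalitional game.

    players: list of player identifiers
    value:   function mapping a frozenset of players to a payoff
    """
    n = len(players)
    phi = {}
    for p in players:
        others = [q for q in players if q != p]
        total = 0.0
        # Average the marginal contribution of p over all coalitions of the others
        for r in range(n):
            for coalition in combinations(others, r):
                s = frozenset(coalition)
                weight = factorial(len(s)) * factorial(n - len(s) - 1) / factorial(n)
                total += weight * (value(s | {p}) - value(s))
        phi[p] = total
    return phi

# Toy game: the coalition earns 10 only if both A and B are present
v = lambda s: 10.0 if {"A", "B"} <= s else 0.0
print(shapley_values(["A", "B", "C"], v))  # A and B each get 5.0, C gets 0.0
```

This brute-force version enumerates every coalition, so it is only practical for a handful of players; libraries such as SHAP use approximations for real models.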
Industry Perspective: Tree-Based Models vs Deep Learning for Tabular Data
Tree-based models aren't just highly performant; they offer a host of other advantages.
Supervised Clustering: How to Use SHAP Values for Better Cluster Analysis
Cluster analysis is a popular method for identifying subgroups within a population, but the results are often challenging to interpret.
Utility vs Understanding: the State of Machine Learning Entering 2022
The empirical utility of some fields of machine learning has rapidly outpaced our understanding of the underlying theory.
Explaining Machine Learning Models: A Non-Technical Guide to Interpreting SHAP Analyses
With interpretability becoming an increasingly important requirement for machine learning projects, there's a growing need for the complex outputs of techniques such as SHAP to be communicated to non-technical stakeholders.