Revision


Feature Evaluation

  1. Feature importance
  2. Feature generalization


Feature importance

Model-specific

Some models, such as tree-based models, can output feature importance scores directly. For example, with XGBoost:
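A minimal sketch, assuming the XGBoost scikit-learn wrapper and a toy dataset (the dataset and parameters here are illustrative only):

```python
import xgboost as xgb
from sklearn.datasets import make_classification

# Toy dataset as a stand-in for real features (illustrative only)
X, y = make_classification(n_samples=500, n_features=10, random_state=0)

model = xgb.XGBClassifier(n_estimators=50, random_state=0)
model.fit(X, y)

# "gain": the average loss reduction obtained when splitting on a feature
importance = model.get_booster().get_score(importance_type="gain")
print(sorted(importance.items(), key=lambda kv: kv[1], reverse=True))
```

Note that XGBoost supports several importance types (e.g. "weight", "gain", "cover"), and they can rank features differently, so it is worth stating which one is used.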


Model-agnostic

SHAP (SHapley Additive exPlanations) measures each feature's contribution using Shapley values from cooperative game theory.

SHAP can be applied to measure a feature’s contribution to a single prediction:
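A sketch of per-prediction attribution, assuming the shap package and reusing the model and X from the XGBoost example above (output shapes can vary slightly across shap versions):

```python
import shap

# TreeExplainer is SHAP's fast explainer for tree ensembles
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)

# Per-feature contributions pushing a single prediction (row 0)
# away from the base value (the average model output)
shap.force_plot(
    explainer.expected_value, shap_values[0], X[0], matplotlib=True
)
```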

Or to measure a feature’s contribution to the entire model:
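A sketch of global importance, continuing the same assumed setup: averaging the magnitude of SHAP values over the whole dataset gives a model-wide importance score per feature.

```python
import numpy as np

# Mean absolute SHAP value per feature: a global importance score
global_importance = np.abs(shap_values).mean(axis=0)
print(global_importance)

# Or visualize the same aggregation across the dataset
shap.summary_plot(shap_values, X)
```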

