Some models, such as tree-based models, can output feature importances. For example, XGBoost:
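A minimal sketch, assuming XGBoost's scikit-learn-style API and a small synthetic dataset (the data, hyperparameters, and importance types below are illustrative, not a fixed recipe):

```python
import numpy as np
import xgboost as xgb

# Synthetic binary-classification data for illustration only
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)

model = xgb.XGBClassifier(n_estimators=50, max_depth=3)
model.fit(X, y)

# Importance scores per feature (scikit-learn-style attribute)
print(model.feature_importances_)

# The underlying booster can also report importance by a chosen metric
print(model.get_booster().get_score(importance_type="gain"))
```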
SHAP can be applied to measure a feature’s contribution to a single prediction:
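A sketch using the `shap` package's generic `Explainer` interface, continuing with the `model` and `X` fitted above (the choice of plot is just one option):

```python
import shap

# Build an explainer for the trained model and compute SHAP values
explainer = shap.Explainer(model)
shap_values = explainer(X)

# Per-feature contributions to the first prediction
print(shap_values[0].values)

# Waterfall plot visualizing that single prediction's breakdown
shap.plots.waterfall(shap_values[0])
```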
Or to measure each feature's overall contribution across the entire model (global importance):
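Continuing with the `shap_values` computed above, aggregating SHAP values over the whole dataset gives a global view; the bar and beeswarm plots are two common summaries:

```python
# Mean absolute SHAP value per feature, shown as a bar chart
shap.plots.bar(shap_values)

# Beeswarm plot showing the distribution of contributions per feature
shap.plots.beeswarm(shap_values)
```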
See: