ML Engineering Tools
Feature Importance Reference
Search feature importance methods. Covers SHAP, TreeSHAP, permutation importance, LIME, GradCAM, and correlated feature pitfalls.
No data is transmitted; everything runs locally.
About this tool
Feature Importance Reference
The Feature Importance Reference covers SHAP, TreeSHAP, permutation importance, LIME, and GradCAM with correlated feature and stability pitfalls.
• Choose SHAP vs permutation for a tree model
• Understand LIME instability before relying on it
• Reference GradCAM for CNN explanation
• Identify correlated feature splitting before presenting rankings
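The permutation-importance recipe referenced above (shuffle a feature, measure the accuracy drop) can be sketched in plain NumPy. This is an illustrative toy, not any library's implementation: the "model" is a stand-in whose true decision rule is known, and all names are made up for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
X = rng.normal(size=(n, 3))
# The label depends on features 0 and 1 only; feature 2 is pure noise.
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)

def model_predict(X):
    # Stand-in for a trained model: here, the true decision rule itself.
    return (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)

def accuracy(y_true, y_pred):
    return float(np.mean(y_true == y_pred))

baseline = accuracy(y, model_predict(X))

importances = []
for j in range(X.shape[1]):
    Xp = X.copy()
    Xp[:, j] = rng.permutation(Xp[:, j])  # break the feature/label link
    importances.append(baseline - accuracy(y, model_predict(Xp)))

print(importances)  # feature 0 largest, noise feature 2 at zero
```

Because the stand-in model ignores feature 2 entirely, its permutation drop is exactly zero; a real trained model would typically show small nonzero noise there, which is why repeated shuffles and confidence intervals are recommended in practice.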
Next step
Batch Size Memory Calculator — Calculate GPU memory for training from model size, precision, and optimizer.
Open Batch Size Memory Calculator →
FAQ
What does this tool tell you?
It is a quick reference for five explanation methods: SHAP, TreeSHAP, permutation importance, LIME, and GradCAM. It also flags the pitfalls that most often undermine importance rankings, namely correlated features and explanation instability.
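The correlated-feature pitfall is easy to reproduce: when two near-duplicate features both carry the signal, permuting either one alone understates its importance, because the model can still read the signal from the intact copy. A minimal sketch under toy assumptions (hand-written models, synthetic data, illustrative names):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000
s = rng.normal(size=n)                                   # latent signal
X = np.column_stack([s, s + 0.01 * rng.normal(size=n)])  # near-duplicate features
y = (s > 0).astype(int)

def predict_both(X):
    # Model that hedges across both correlated copies.
    return ((X[:, 0] + X[:, 1]) / 2 > 0).astype(int)

def predict_solo(X):
    # Model that relies on feature 0 alone.
    return (X[:, 0] > 0).astype(int)

def acc(y_pred):
    return float(np.mean(y_pred == y))

# Permutation importance of feature 0 under each model.
Xp = X.copy()
Xp[:, 0] = rng.permutation(Xp[:, 0])
split_drop = acc(predict_both(X)) - acc(predict_both(Xp))
solo_drop = acc(predict_solo(X)) - acc(predict_solo(Xp))

# The same feature looks far less important when a correlated
# copy is available to the model: split_drop << solo_drop.
print(split_drop, solo_drop)
```

This is why the reference recommends identifying correlated feature splitting before presenting a ranking: a low permutation score may mean "redundant", not "unimportant".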
What affects the result most?
• Permutation importance: shuffle a feature, then measure the accuracy drop. Model-agnostic but slow.
• SHAP: Shapley values, a game-theoretic attribution. Consistent, but expensive for large models.
• TreeSHAP: exact SHAP for tree models in polynomial time. Fast and widely adopted.
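The game-theoretic attribution behind SHAP can be made concrete by computing exact Shapley values by brute force over all feature coalitions — the quantity SHAP libraries approximate (or, for TreeSHAP, compute efficiently). This sketch simulates a "missing" feature by substituting a background value, one common convention; the function names are illustrative, not from any library:

```python
import numpy as np
from itertools import combinations
from math import factorial

def shapley_values(f, x, background, n_features):
    """Exact Shapley values by enumerating all coalitions (O(2^n))."""
    phi = np.zeros(n_features)
    others = set(range(n_features))

    def value(subset):
        # "Present" features take their real values; "absent" ones
        # fall back to the background point.
        z = background.copy()
        idx = list(subset)
        z[idx] = x[idx]
        return f(z)

    for i in range(n_features):
        for size in range(n_features):
            for S in combinations(others - {i}, size):
                # Classic Shapley weight: |S|! (n - |S| - 1)! / n!
                w = (factorial(len(S)) * factorial(n_features - len(S) - 1)
                     / factorial(n_features))
                phi[i] += w * (value(set(S) | {i}) - value(S))
    return phi

# Toy linear model: for linear f with this convention, the Shapley
# value of feature i reduces to coef[i] * (x[i] - background[i]).
coef = np.array([2.0, -1.0, 0.5])
f = lambda z: float(coef @ z)
x = np.array([1.0, 1.0, 1.0])
bg = np.zeros(3)

phi = shapley_values(f, x, bg, 3)
print(phi)
# Efficiency property: attributions sum to f(x) - f(background).
print(phi.sum(), f(x) - f(bg))
```

The exponential coalition loop is exactly why TreeSHAP matters: for tree ensembles it reaches the same values in polynomial time instead of O(2^n) model evaluations.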
How should I use the result?
Use this tool to orient quickly to the concepts, field names, or values you are about to look up in a full specification or vendor documentation. It summarizes the common cases; the authoritative source remains whichever standard or vendor doc defines the values themselves.