ML Engineering Tools

Feature Importance Reference

Search feature importance methods. Covers SHAP, TreeSHAP, permutation importance, LIME, GradCAM, and correlated feature pitfalls.

No data is transmitted — everything runs locally

The Feature Importance Reference covers SHAP, TreeSHAP, permutation importance, LIME, and GradCAM with correlated feature and stability pitfalls.

• Choose SHAP vs permutation for a tree model

• Understand LIME instability before relying on it

• Reference GradCAM for CNN explanation

• Identify correlated feature splitting before presenting rankings
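The SHAP-vs-permutation choice above hinges on what permutation importance actually measures. A minimal NumPy sketch, using toy data and a stand-in "model" (both illustrative assumptions, not part of the reference):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: the target depends strongly on x0, weakly on x1, not at all on x2.
X = rng.normal(size=(500, 3))
y = 3.0 * X[:, 0] + 0.5 * X[:, 1]

def model(X):
    # Stand-in for a trained model's predict(); here simply the true function.
    return 3.0 * X[:, 0] + 0.5 * X[:, 1]

def mse(y_true, y_pred):
    return float(np.mean((y_true - y_pred) ** 2))

baseline = mse(y, model(X))  # 0.0 here, since the "model" is exact

importances = []
for j in range(X.shape[1]):
    Xp = X.copy()
    Xp[:, j] = rng.permutation(Xp[:, j])  # break the feature-target link
    importances.append(mse(y, model(Xp)) - baseline)
# importances ranks x0 far above x1, and x2 near zero
```

The importance of a feature is simply how much the error grows when that feature is scrambled, which is why the method is model-agnostic but requires a full re-evaluation per feature.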

What does this tool tell you?
It tells you which importance method fits your setting: SHAP and TreeSHAP for consistent attribution, permutation importance for a model-agnostic check, LIME for local surrogate explanations, and GradCAM for CNNs. It also flags the correlated-feature and stability pitfalls that can distort a ranking.
What affects the result most?
• Permutation importance: shuffle a feature and measure the drop in accuracy. Model-agnostic but slow, and misleading when features are correlated.
• SHAP: game-theoretic attribution via Shapley values. Consistent, but expensive for large models.
• TreeSHAP: exact SHAP values for tree models in polynomial time. Fast and widely adopted.
• LIME: fits a local surrogate model around a single prediction. Intuitive, but explanations can be unstable across runs.
• GradCAM: gradient-based class activation maps for CNNs. Highlights the image regions that drove a prediction.
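The Shapley attribution that SHAP computes can be seen in miniature with a brute-force average over feature coalitions. The three-feature linear model and reference input below are illustrative assumptions; real SHAP libraries approximate this sum, or compute it exactly for trees via TreeSHAP:

```python
import itertools
import math
import numpy as np

w = np.array([3.0, 0.5, 0.0])  # hypothetical linear model f(z) = w . z
x = np.array([1.0, 2.0, 5.0])  # instance to explain
ref = np.zeros(3)              # reference ("background") input

def f(z):
    return float(w @ z)

def shapley(i, n=3):
    # Average f's marginal gain from adding feature i, over all coalitions S
    # of the other features, with the standard Shapley weights.
    others = [j for j in range(n) if j != i]
    total = 0.0
    for k in range(n):
        for S in itertools.combinations(others, k):
            z_with, z_without = ref.copy(), ref.copy()
            for j in S:
                z_with[j] = x[j]
                z_without[j] = x[j]
            z_with[i] = x[i]
            weight = math.factorial(k) * math.factorial(n - 1 - k) / math.factorial(n)
            total += weight * (f(z_with) - f(z_without))
    return total

phi = [shapley(i) for i in range(3)]
# Efficiency property: the attributions sum to f(x) - f(ref).
```

For a linear model each attribution reduces to w_i * (x_i - ref_i), but the loop above makes the cost visible: the number of coalitions grows exponentially in the feature count, which is why exact SHAP is expensive in general and TreeSHAP's polynomial-time shortcut for trees matters.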
How should I use the result?
Use it to orient quickly before relying on an importance ranking: pick the method that matches your model class, then check the listed pitfalls (correlated features, LIME instability) before presenting results. It summarizes the common cases; the authoritative sources remain the original method papers and the documentation of the library you use.
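The correlated-feature pitfall flagged above can be demonstrated without any ML library. Two hand-written "models" (illustrative assumptions) make near-identical predictions, yet permutation importance ranks the same feature very differently once a near-duplicate twin exists:

```python
import numpy as np

rng = np.random.default_rng(1)
x0 = rng.normal(size=2000)
x1 = x0 + rng.normal(scale=0.01, size=2000)  # near-duplicate of x0
X = np.column_stack([x0, x1])
y = 3.0 * x0

def model_a(X):
    return 3.0 * X[:, 0]              # uses x0 only

def model_b(X):
    return 1.5 * (X[:, 0] + X[:, 1])  # splits the same signal across the twins

def perm_importance(model, X, y):
    # MSE increase after shuffling each column, relative to the unshuffled fit.
    base = np.mean((y - model(X)) ** 2)
    out = []
    for j in range(X.shape[1]):
        Xp = X.copy()
        Xp[:, j] = rng.permutation(Xp[:, j])
        out.append(float(np.mean((y - model(Xp)) ** 2) - base))
    return out

imp_a = perm_importance(model_a, X, y)  # x0 dominant, x1 exactly zero
imp_b = perm_importance(model_b, X, y)  # each twin looks roughly 4x weaker
```

Both models fit the data almost equally well, so nothing in the predictions tells you which ranking is "right". This is the splitting effect to identify before presenting rankings: importance spread across correlated twins understates how much signal the group carries.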