# Feature importance

Once you have trained some models in your experiment, you can perform feature importance analyses to assess which features were most influential in the models' decision making. You can get to the Feature Importance page by clicking **Feature Importance** on the left-hand side of the page.

![Feature importance page](../_static/feature-importance-page.png)

To begin explaining your models, you can click the **"Explain all models"** toggle and have all your models evaluated...

![Select all models](../_static/explain-all-models.png)

...or you can use the dropdown menu to select specific models to evaluate.

![Select specific models](../_static/select-specific-models.png)

## Global feature importance methods

These methods evaluate the overall influence of individual features on a model's decisions. There are two methods available; illustrative code sketches for both are included at the end of this page.

- [Permutation importance](https://scikit-learn.org/1.5/modules/permutation_importance.html)
- [SHAP (SHapley Additive exPlanations)](https://www.datacamp.com/tutorial/introduction-to-shap-values-machine-learning-interpretability)

![Global feature importance](../_static/global-fi.png)

## Ensemble feature importance methods

Ensemble methods combine the results from multiple feature importance techniques, which makes the overall ranking more robust. To use ensemble methods, you must configure at least one global importance method. There are two methods available; a sketch of both ideas is included at the end of this page.

- **Mean**: use the mean of the importance estimates from the selected global methods.
- **Majority vote**: take the majority vote of the importance estimates from the selected global methods.

![Ensemble feature importance](../_static/ensemble-fi.png)

## Local feature importance methods

These methods interpret feature importance on a *per-prediction* basis: you can see which features had the most influence - and in which direction - on each individual prediction. There are two methods available; a LIME sketch is included at the end of this page.

- [LIME (Local Interpretable Model-agnostic Explanations)](https://towardsdatascience.com/understanding-model-predictions-with-lime-a582fdff3a3b)
- [SHAP (SHapley Additive exPlanations)](https://www.datacamp.com/tutorial/introduction-to-shap-values-machine-learning-interpretability)

![Local feature importance](../_static/local-fi.png)

## Additional configuration options

- **Number of most important features to plot**: change how many top features are plotted.
- **Scoring function for permutation importance**: the metric used to measure how much model performance drops when a feature is permuted.
- **Number of repetitions for permutation importance**: the number of times each feature is permuted when estimating its importance.

![Additional feature importance configuration](../_static/additional-fi-config.png)

## Select outputs to save

- **Save feature importance options**
- **Save feature importance results**

![Select outputs to save](../_static/save-outputs.png)

## Run the analysis

Press the **"Run Feature Importance"** button to run your analysis. Be patient, as this can take a little longer than the model training.
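
## Code sketches of the methods

The app runs all of these analyses for you; the sketches below are only meant to illustrate what each method computes. The models, datasets, and parameters in them are placeholders, not what the app uses internally.

The permutation importance reported on the Feature Importance page follows the same idea as scikit-learn's `permutation_importance`: shuffle one feature at a time and measure how much the chosen score drops. A minimal sketch, assuming a generic scikit-learn regressor and dataset, is shown here; `scoring` and `n_repeats` mirror the scoring-function and number-of-repetitions options described above.

```python
# Minimal permutation importance sketch with scikit-learn.
# The dataset and model are placeholders for whatever you trained in the app.
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

X, y = load_diabetes(return_X_y=True, as_frame=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestRegressor(random_state=0).fit(X_train, y_train)

# scoring and n_repeats correspond to the "scoring function" and
# "number of repetitions" options described above.
result = permutation_importance(
    model, X_test, y_test, scoring="r2", n_repeats=10, random_state=0
)

# Rank features by the mean drop in score when they are shuffled.
for idx in result.importances_mean.argsort()[::-1]:
    print(
        f"{X.columns[idx]}: "
        f"{result.importances_mean[idx]:.3f} +/- {result.importances_std[idx]:.3f}"
    )
```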
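
SHAP serves both the global and the local view: averaging the absolute SHAP values across rows gives a global ranking, while the values for a single row explain one prediction. A rough sketch using the `shap` package follows; the explainer type and plots are assumptions about a typical workflow, not the app's exact configuration.

```python
# Illustrative SHAP sketch: a global ranking plus one local explanation.
# The dataset and model are placeholders.
import shap
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor

X, y = load_diabetes(return_X_y=True, as_frame=True)
model = RandomForestRegressor(random_state=0).fit(X, y)

# Model-agnostic explainer built around the model's predict function,
# with the training data as the background distribution.
explainer = shap.Explainer(model.predict, X)
shap_values = explainer(X.iloc[:100])

shap.plots.bar(shap_values)           # global: mean |SHAP value| per feature
shap.plots.waterfall(shap_values[0])  # local: one prediction's breakdown
```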
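
How the mean and majority-vote ensembles combine the global methods is easiest to see on toy numbers. The sketch below only illustrates the two ideas (normalise-and-average versus voting on top-k membership) with made-up scores; it is not the app's actual implementation.

```python
# Toy illustration of the two ensemble ideas with made-up importance scores.
import numpy as np

n_features = 5
scores = {
    "permutation": np.array([0.30, 0.05, 0.20, 0.01, 0.10]),
    "shap":        np.array([0.25, 0.10, 0.22, 0.02, 0.05]),
}

# Mean: normalise each method's scores so they are comparable, then average.
normalised = [s / s.sum() for s in scores.values()]
mean_importance = np.mean(normalised, axis=0)
print("mean importance:", np.round(mean_importance, 3))

# Majority vote: a feature is kept if most methods rank it in their top k.
k = 2
top_k_sets = [set(np.argsort(s)[::-1][:k]) for s in scores.values()]
votes = np.array([sum(i in top for top in top_k_sets) for i in range(n_features)])
print("majority-vote features:", np.flatnonzero(votes > len(scores) / 2))
```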
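
Finally, LIME explains one prediction at a time by fitting a simple local surrogate model around that sample. A minimal sketch with the `lime` package is below; again, the dataset, model, and `num_features` value are placeholders.

```python
# Illustrative LIME sketch for a single prediction.
from lime.lime_tabular import LimeTabularExplainer
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor

data = load_diabetes()
model = RandomForestRegressor(random_state=0).fit(data.data, data.target)

explainer = LimeTabularExplainer(
    data.data,
    feature_names=data.feature_names,
    mode="regression",
)

# Which features pushed the first sample's prediction up or down?
explanation = explainer.explain_instance(
    data.data[0], model.predict, num_features=5
)
print(explanation.as_list())
```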