There are at least three notions of feature importance:
1. How much each feature contributes to the prediction, e.g. the coefficients of a linear regression model. SHAP also belongs to this category. Note that this notion of feature importance has nothing to do with the performance of the model.
2. How much a feature contributes to predictive accuracy. E.g. for decision trees, some form of split-gain summary per feature belongs to this category.
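The two notions above can be illustrated with a minimal sketch, using NumPy only: the first via linear-regression coefficients, the second via the variance reduction (split gain) a single split on each feature achieves, as a stand-in for a tree's split-gain summary. The toy data and the `best_split_gain` helper are illustrative assumptions, not part of any library API.

```python
import numpy as np

# Toy data: y depends strongly on feature 0 and weakly on feature 1.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = 3.0 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.1, size=200)

# Notion 1: size of each feature's contribution to the prediction.
# For linear regression, the coefficients play this role.
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(coef)  # feature 0's coefficient is much larger than feature 1's

# Notion 2: contribution to predictive accuracy, approximated here by
# the best single-split variance reduction (split gain) per feature.
def best_split_gain(x, y):
    """Largest total-variance reduction over candidate thresholds on x."""
    order = np.argsort(x)
    y_sorted = y[order]
    total = y.var() * len(y)
    best = 0.0
    for i in range(1, len(y)):
        left, right = y_sorted[:i], y_sorted[i:]
        gain = total - (left.var() * len(left) + right.var() * len(right))
        best = max(best, gain)
    return best

gains = [best_split_gain(X[:, j], y) for j in range(X.shape[1])]
print(gains)  # feature 0 yields a far larger gain than feature 1
```

Both rankings agree on this toy data, but they measure different things: coefficients describe the fitted prediction function, while split gain describes how useful a feature was for reducing error during training.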