We can significantly reduce the number of features in our dataset by leveraging DataRobot's ability to train hundreds of high-quality ML models in a matter of minutes. 

Feature Importance Rank Ensembling (FIRE) aggregates per-feature rankings derived from Feature Impact across several blueprints on the Leaderboard. This approach can provide greater accuracy and robustness than other feature reduction methods.
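The core aggregation step can be illustrated with a minimal sketch. The feature names and impact scores below are made up for demonstration; they stand in for the normalized Feature Impact scores DataRobot computes per model.

```python
# Minimal sketch of rank ensembling: rank features within each model
# by impact, then order features by their median rank across models.
# All names and numbers here are illustrative, not real DataRobot output.

from statistics import median

# Hypothetical normalized Feature Impact scores from three models
model_impacts = [
    {"age": 0.9, "income": 0.7, "zip": 0.1},
    {"age": 0.8, "income": 0.9, "zip": 0.2},
    {"age": 0.7, "income": 0.6, "zip": 0.3},
]

def median_ranks(impacts_per_model):
    """Return features sorted by median rank (1 = most important)."""
    ranks = {}
    for impacts in impacts_per_model:
        ordered = sorted(impacts, key=impacts.get, reverse=True)
        for rank, feature in enumerate(ordered, start=1):
            ranks.setdefault(feature, []).append(rank)
    return sorted(ranks, key=lambda f: median(ranks[f]))

print(median_ranks(model_impacts))  # → ['age', 'income', 'zip']
```

Using the median (rather than the mean) keeps the aggregate ranking robust to a single model that scores a feature unusually high or low.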


About this Accelerator

This accelerator shows how to apply FIRE to your dataset and dramatically reduce the number of features without impacting the performance of the final model.


What you will learn  

  1. Calculate the permutation feature importance (Feature Impact) for the top five models on the Leaderboard, as measured by the selected optimization metric.
  2. For each model with computed feature importance, get the ranking of the features.
  3. Compute the median rank of each feature by aggregating the ranks of the features across all models.
  4. Sort the aggregated list by the computed median rank.
  5. Define the threshold number of features to select. In this case, use the number of features that account for 95% of the cumulative feature impact.
  6. Create a feature list based on the newly selected features.
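The steps above can be sketched end to end in plain Python. The feature-impact scores below are illustrative stand-ins for what the DataRobot client would return; in a real project you would fetch them per model (e.g., via the client's Feature Impact methods) and create the final feature list through the client rather than printing it.

```python
# A runnable sketch of the six steps above, using made-up Feature
# Impact scores in place of live DataRobot API results.

from statistics import median

# Step 1 (stand-in): normalized Feature Impact for the top models.
leaderboard_impacts = [
    {"age": 0.90, "income": 0.70, "tenure": 0.30, "zip": 0.05},
    {"age": 0.80, "income": 0.85, "tenure": 0.25, "zip": 0.10},
    {"age": 0.95, "income": 0.60, "tenure": 0.40, "zip": 0.02},
]

# Steps 2-3: rank features within each model, then take the median rank.
ranks = {}
for impacts in leaderboard_impacts:
    ordered = sorted(impacts, key=impacts.get, reverse=True)
    for rank, feature in enumerate(ordered, start=1):
        ranks.setdefault(feature, []).append(rank)

# Step 4: sort features by median rank (best first).
sorted_features = sorted(ranks, key=lambda f: median(ranks[f]))

# Step 5: keep the top-ranked features that account for 95% of the
# total impact (using each feature's mean impact across models).
mean_impact = {
    f: sum(m[f] for m in leaderboard_impacts) / len(leaderboard_impacts)
    for f in sorted_features
}
total = sum(mean_impact.values())
selected, running = [], 0.0
for feature in sorted_features:
    if running / total >= 0.95:
        break
    selected.append(feature)
    running += mean_impact[feature]

# Step 6: in DataRobot this is where you would create a feature list
# from `selected` (e.g., with the client's create-featurelist call).
print(selected)  # → ['age', 'income', 'tenure']
```

Note that the 95% threshold is a tunable choice: lowering it yields a more aggressive reduction, while raising it keeps more marginal features.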


Additional Resources

Using Feature Importance Rank Ensembling (FIRE) for Advanced Feature Selection

Last update: 09-05-2023 10:14 PM