[FEA] Add support for computing feature_importances in RF #3531

Open
@teju85

Description

Is your feature request related to a problem? Please describe.
The RF implementation should support computing the feature_importances_ property, just as it is exposed in sklearn.

Describe the solution you'd like

  1. By default, we should compute normalized feature_importances_ (i.e. the importances across all features sum to 1.0).
  2. The sklearn implementation is here. We have all of this information in our Node. While building the tree, we just need to keep accumulating each feature's importance as we add more nodes.
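The accumulation described above can be sketched as follows. This is a minimal illustration, not cuML's actual implementation: the `Node` fields (`feature`, `impurity`, `n_samples`, `left`, `right`) and the helper `feature_importances` are hypothetical names chosen for the example. Each split contributes its weighted impurity decrease to its split feature, and the result is normalized to sum to 1.0, matching sklearn's convention.

```python
# Sketch of impurity-based feature importances; field names are assumptions.
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Node:
    feature: int = -1              # split feature index (-1 marks a leaf)
    impurity: float = 0.0          # node impurity (e.g. Gini or MSE)
    n_samples: int = 0             # number of samples reaching this node
    left: Optional["Node"] = None
    right: Optional["Node"] = None


def feature_importances(root: Node, n_features: int,
                        total_samples: int) -> List[float]:
    """Accumulate each split's weighted impurity decrease, then normalize."""
    imp = [0.0] * n_features
    stack = [root]
    while stack:
        node = stack.pop()
        if node.feature < 0:       # leaf: contributes nothing
            continue
        left, right = node.left, node.right
        # Weighted impurity decrease contributed by this split
        decrease = (node.n_samples * node.impurity
                    - left.n_samples * left.impurity
                    - right.n_samples * right.impurity)
        imp[node.feature] += decrease / total_samples
        stack.extend([left, right])
    total = sum(imp)
    # Normalize so that all importances sum to 1.0 (when any split exists)
    return [v / total for v in imp] if total > 0 else imp
```

For a forest, the same accumulation would run per tree with the per-tree importances averaged, which is what sklearn does.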

Metadata

    Labels

    Algorithm API Change (for tracking changes to algorithms that might affect the API), CUDA / C++ (CUDA issue), Cython / Python (Cython or Python issue), doc (Documentation), feature request (New feature or request), improvement (improvement / enhancement to an existing function), inactive-30d
