[FEA] Add support for weighted feature subsampling in RF #3525

Open

Description

@teju85

Is your feature request related to a problem? Please describe.
The current RF implementation only supports uniform subsampling of features (as of 0.18). We need to extend this to also support weighted subsampling.

Describe the solution you'd like
Ideally, we should expose a feature_weights option in the constructors of both the classifier and the regressor. Its default value is None (i.e., uniform subsampling). If it is not None, it must be a list of weights, one for each feature in the dataset. Then, when max_features is less than 1 (meaning subsampling is enabled), we perform uniform subsampling if feature_weights is None and weighted subsampling otherwise.
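To make the proposed semantics concrete, here is a minimal NumPy sketch of the two subsampling modes, not actual cuML code. The feature_weights name is the parameter proposed in this issue (it does not exist in cuML today), and sample_features is a hypothetical helper used only for illustration.

```python
import numpy as np

def sample_features(n_features, max_features, feature_weights=None, rng=None):
    """Pick the subset of features considered at a split.

    feature_weights=None reproduces the current behaviour (uniform subsampling);
    otherwise each feature is drawn with probability proportional to its weight.
    """
    rng = np.random.default_rng() if rng is None else rng
    # max_features < 1 means subsampling is enabled; keep that fraction of features.
    n_sampled = max(1, int(max_features * n_features))

    if feature_weights is None:
        # Current behaviour: every feature is equally likely to be picked.
        return rng.choice(n_features, size=n_sampled, replace=False)

    w = np.asarray(feature_weights, dtype=np.float64)
    if w.shape != (n_features,):
        raise ValueError("feature_weights must contain one weight per feature")
    # Proposed behaviour: normalize the weights and draw features accordingly.
    return rng.choice(n_features, size=n_sampled, replace=False, p=w / w.sum())


# Uniform vs. weighted draw over 10 features, keeping half of them per split.
print(sample_features(10, 0.5))
print(sample_features(10, 0.5, feature_weights=[5, 5, 1, 1, 1, 1, 1, 1, 1, 1]))
```

With feature_weights=None as the default, existing code keeps today's uniform behaviour unchanged; only users who pass an explicit weight list opt into weighted subsampling.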

Additional context
JFYI, sklearn does NOT support such an option.


Labels

Algorithm API Change, CUDA / C++, Cython / Python, Experimental, doc, feature request, improvement, inactive-30d
