
Support init arguments in MNMG LogisticRegression #5519

Merged
merged 3 commits into rapidsai:branch-23.08 on Aug 1, 2023

Conversation

lijinf2 (Contributor) commented Jul 25, 2023

The init arguments are for L-BFGS, the only solver currently available in the MNMG LogisticRegression.

The key code changes amount to only a few lines once PR 5516 (which adds predict) is merged. The key code changes can be reviewed from here.

@github-actions github-actions bot added the Cython / Python Cython or Python issue label Jul 25, 2023
@lijinf2 lijinf2 added improvement Improvement / enhancement to an existing function non-breaking Non-breaking change and removed Cython / Python Cython or Python issue labels Jul 25, 2023
@lijinf2 lijinf2 self-assigned this Jul 25, 2023
@lijinf2 lijinf2 marked this pull request as ready for review July 25, 2023 00:30
@lijinf2 lijinf2 requested a review from a team as a code owner July 25, 2023 00:30
cjnolet (Member) left a comment
Looks good @lijinf2, just a few minor things!

    def __init__(self, *, handle=None):
        super().__init__(handle=handle)

    def __init__(
        self,
cjnolet (Member)

In other dask estimators, we pass **kwargs through to the super class so that we don't need to specify the defaults twice in the cython layer. Is there a reason we can't do that here? As an example.
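The pass-through pattern the reviewer describes can be sketched as follows. This is an illustrative stand-in, not the actual cuML code: the class names and parameters (`SingleGPULogisticRegression`, `penalty`, `tol`, `C`, `max_iter`) are assumptions chosen to mirror common logistic-regression arguments.

```python
class SingleGPULogisticRegression:
    # Stand-in for the Cython-layer estimator, which owns the defaults.
    def __init__(self, *, penalty="l2", tol=1e-4, C=1.0, max_iter=1000):
        self.penalty = penalty
        self.tol = tol
        self.C = C
        self.max_iter = max_iter


class DaskLogisticRegression:
    # Dask-level wrapper: keeps only its own arguments explicit and
    # forwards everything else via **kwargs, so the defaults are
    # declared in exactly one place (the underlying estimator).
    def __init__(self, *, client=None, verbose=False, **kwargs):
        self.client = client
        self.verbose = verbose
        self.kwargs = kwargs
        self._impl = SingleGPULogisticRegression(**kwargs)


model = DaskLogisticRegression(C=0.5, max_iter=200)
print(model._impl.C, model._impl.max_iter, model._impl.penalty)
# -> 0.5 200 l2
```

Because the wrapper never restates `penalty`, `tol`, etc., changing a default in the lower layer cannot drift out of sync with the Dask layer.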

lijinf2 (Contributor, author)

The intent was to restrict the set of arguments to those supported by L-BFGS. The added class-level documentation now serves that purpose, so I revised the constructor to use **kwargs and avoid the duplication. Thanks for the suggestion!

@@ -31,7 +32,8 @@
 np = cpu_only_import("numpy")


-class LogisticRegression(BaseEstimator, SyncFitMixinLinearModel):
+# class LogisticRegression(BaseEstimator, SyncFitMixinLinearModel):
+class LogisticRegression(LinearRegression):
+    def __init__(self, *, client=None, verbose=False, **kwargs):
cjnolet (Member)

I do think there's a case to be made for specifying defaults in the constructor at this layer; they should end up getting propagated to the Cython class. Another thing we missed in the review of the initial Dask estimator is that the arguments should be documented at the class level. Can you add those docs in this PR too? Most of them can likely be copied from the single-GPU LogisticRegression.

lijinf2 (Contributor, author)

Good suggestion. I have revised the init function and added class-level documentation.
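The revised shape, with defaults declared in the Dask-layer constructor and a class-level docstring, might look roughly like this. The specific parameters and docstring text here are illustrative assumptions, not the merged code:

```python
class LogisticRegression:
    """
    Distributed (multi-node multi-GPU) logistic regression, fit with L-BFGS.

    Parameters
    ----------
    penalty : str (default = "l2")
        Regularization type.
    tol : float (default = 1e-4)
        Tolerance for the stopping criterion.
    C : float (default = 1.0)
        Inverse of regularization strength.
    max_iter : int (default = 1000)
        Maximum number of solver iterations.
    """

    def __init__(self, *, client=None, verbose=False,
                 penalty="l2", tol=1e-4, C=1.0, max_iter=1000):
        # Defaults are declared once here; in the real estimator they
        # would be forwarded to the underlying Cython-layer class.
        self.client = client
        self.verbose = verbose
        self.penalty = penalty
        self.tol = tol
        self.C = C
        self.max_iter = max_iter
```

Declaring the defaults here (rather than only in the lower layer) makes them visible in the class signature and documentation, which is what the reviewer asked for.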

cjnolet (Member) left a comment

LGTM! Thanks again @lijinf2!

cjnolet (Member) commented Aug 1, 2023

/merge

@rapids-bot rapids-bot bot merged commit 6fb5bf9 into rapidsai:branch-23.08 Aug 1, 2023
49 checks passed
@lijinf2 lijinf2 deleted the lr_init branch June 26, 2024 21:59
Labels: Cython / Python (Cython or Python issue), improvement (Improvement / enhancement to an existing function), non-breaking (Non-breaking change)
2 participants