
Remove optimizer in stfpm recipe #3743

Merged
merged 1 commit into from
Jul 16, 2024

Conversation

eunwoosh
Contributor

Summary

Currently, the STFPM model doesn't take an optimizer as an argument and initializes its optimizer in its own way, as shown below.

class Stfpm(OTXAnomaly, OTXModel, AnomalibStfpm):
    def __init__(
        self,
        layers: Sequence[str] = ["layer1", "layer2", "layer3"],
        backbone: str = "resnet18",
        task: Literal[
            OTXTaskType.ANOMALY_CLASSIFICATION,
            OTXTaskType.ANOMALY_DETECTION,
            OTXTaskType.ANOMALY_SEGMENTATION,
        ] = OTXTaskType.ANOMALY_CLASSIFICATION,
        **kwargs,
    ) -> None:
        ...

    def configure_optimizers(self) -> tuple[list[Optimizer], list[Optimizer]] | None:
        """STFPM does not follow OTX optimizer configuration."""
        return AnomalibStfpm.configure_optimizers(self)

Because of this, executing otx train --config .../stfpm.yaml --print_config raises an error. Changing optimizer values in the recipe also has no effect on training. For these reasons, this PR removes the optimizer from the recipe.
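To illustrate why the recipe's optimizer entry was a no-op, here is a minimal standalone sketch (hypothetical class names, no anomalib or Lightning dependency): a model that overrides configure_optimizers with a hard-coded choice never consults the optimizer injected from the config, which is the behavior the Stfpm override above exhibits.

```python
class BaseModel:
    """Stands in for the base model: normally builds the optimizer from the recipe."""

    def __init__(self, optimizer_factory=None):
        # optimizer_factory is what a recipe's optimizer section would inject.
        self.optimizer_factory = optimizer_factory

    def configure_optimizers(self):
        return self.optimizer_factory() if self.optimizer_factory else None


class StfpmLike(BaseModel):
    """Stands in for Stfpm: hard-codes its optimizer and bypasses the recipe."""

    def configure_optimizers(self):
        # The injected optimizer_factory is never consulted here.
        return "SGD(lr=0.4, momentum=0.9)"  # fixed, model-internal choice


model = StfpmLike(optimizer_factory=lambda: "AdamW(lr=1e-3)")
print(model.configure_optimizers())  # the recipe's AdamW is silently ignored
```

Since the override ignores the injected value, keeping an optimizer section in the recipe only misleads users (and breaks --print_config), so removing it is the consistent fix.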

How to test

Checklist

  • I have added unit tests to cover my changes.
  • I have added integration tests to cover my changes.
  • I have run e2e tests and there are no issues.
  • I have added the description of my changes into CHANGELOG in my target branch (e.g., CHANGELOG in develop).
  • I have updated the documentation in my target branch accordingly (e.g., documentation in develop).
  • I have linked related issues.

License

  • I submit my code changes under the same Apache License that covers the project.
    Feel free to contact the maintainers if that's a concern.
  • I have updated the license header for each file (see an example below).
# Copyright (C) 2024 Intel Corporation
# SPDX-License-Identifier: Apache-2.0


codecov bot commented Jul 16, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 80.02%. Comparing base (2b00009) to head (04ad481).
Report is 1 commit behind head on develop.

Additional details and impacted files
@@             Coverage Diff             @@
##           develop    #3743      +/-   ##
===========================================
+ Coverage    79.97%   80.02%   +0.05%     
===========================================
  Files          252      252              
  Lines        25567    25567              
===========================================
+ Hits         20448    20461      +13     
+ Misses        5119     5106      -13     
Flag Coverage Δ
py310 80.02% <ø> (+0.24%) ⬆️
py311 ?

Flags with carried forward coverage won't be shown.

@sovrasov
Contributor

cc @ashwinvaidya17

@eunwoosh eunwoosh merged commit 749d089 into openvinotoolkit:develop Jul 16, 2024
20 checks passed