Adding Pipeline methods [WIP] #114

Open: khizirsiddiqui wants to merge 5 commits into master

Conversation

khizirsiddiqui (Collaborator)

No description provided.

Quoted from the Pipeline class docstring:

    :param steps (list): list of KD_Lib.KD or KD_Lib.Pruning or KD_Lib.Quantization methods
    :param epochs (int): number of iterations through the whole batch for each method in the pipeline
Member

We can remove this parameter, right? Since each individual method will have epochs in its initialization.

khizirsiddiqui (Collaborator, Author)

Distillation specifies these parameters in the train_student method rather than in the class constructor.
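For context, a minimal, self-contained sketch of the two calling conventions being contrasted here. ToyDistiller and ToyPruner are hypothetical stand-ins, not real KD_Lib classes; only the pattern, not the exact signatures, is the point.

    # Hypothetical illustration of the two conventions discussed above.

    class ToyDistiller:
        """Takes training settings at call time, the way train_student is used in the diff below."""
        def train_student(self, epochs, plot_losses=False, save_model=False,
                          save_model_path="./models/student.pt"):
            print(f"distilling for {epochs} epochs")

    class ToyPruner:
        """Takes everything it needs at construction time; prune() needs no epochs argument."""
        def __init__(self, epochs=3):
            self.epochs = epochs
        def prune(self):
            print(f"pruning with {self.epochs} finetuning epochs")

    # Only steps that expose train_student() need an epochs value from the pipeline;
    # prune()/quantize() steps carry their own settings.
    ToyDistiller().train_student(epochs=5)
    ToyPruner(epochs=3).prune()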

Comment on lines 74 to 81
    if hasattr(process, 'train_student'):
        process.train_student(self.epochs, self.plot_losses, self.save_model, self.save_model_path)
    elif hasattr(process, 'prune'):
        process.prune()
    elif hasattr(process, 'quantize'):
        process.quantize()
    else:
        raise TypeError("{} is not supported by the pipeline yet.".format(process))
Member

Save the final model after all the steps in the process
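A rough sketch of how the loop quoted above could act on this suggestion. The hasattr dispatch mirrors the diff; everything else (the constructor arguments, the student_model/model attribute names, and the torch.save call at the end) is an assumption for illustration, not the PR's actual implementation.

    import torch

    class Pipeline:
        """Hypothetical pipeline runner; only the dispatch pattern comes from the diff above."""

        def __init__(self, steps, epochs=5, plot_losses=False,
                     save_model=True, save_model_path="./models/pipeline_final.pt"):
            self.steps = steps
            self.epochs = epochs
            self.plot_losses = plot_losses
            self.save_model = save_model
            self.save_model_path = save_model_path

        def train(self):
            final_model = None
            for process in self.steps:
                # Dispatch on the interface each step exposes, as in the quoted diff.
                if hasattr(process, "train_student"):
                    process.train_student(self.epochs, self.plot_losses,
                                          self.save_model, self.save_model_path)
                    # Attribute name is assumed; the distiller is expected to hold the trained student.
                    final_model = getattr(process, "student_model", final_model)
                elif hasattr(process, "prune"):
                    process.prune()
                    final_model = getattr(process, "model", final_model)
                elif hasattr(process, "quantize"):
                    # Assumes quantize() returns the quantized model.
                    final_model = process.quantize()
                else:
                    raise TypeError(
                        "{} is not supported by the pipeline yet.".format(process)
                    )
            # Reviewer's suggestion: persist the model produced by the last step.
            if self.save_model and final_model is not None:
                torch.save(final_model.state_dict(), self.save_model_path)
            return final_model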


codecov-commenter commented Jun 11, 2021

Codecov Report

Merging #114 (8f8d42b) into master (0ecccec) will decrease coverage by 55.04%.
The diff coverage is 8.92%.

@@            Coverage Diff             @@
##           master    #114       +/-   ##
==========================================
- Coverage   55.83%   0.79%   -55.05%     
==========================================
  Files          60      61        +1     
  Lines        2115    2012      -103     
==========================================
- Hits         1181      16     -1165     
- Misses        934    1996     +1062     
Impacted Files                        Coverage Δ
KD_Lib/utils/pipeline.py              7.27% <7.27%> (ø)
KD_Lib/utils/__init__.py              100.00% <100.00%> (ø)
KD_Lib/models/nin.py                  0.00% <0.00%> (-100.00%) ⬇️
KD_Lib/models/lstm.py                 0.00% <0.00%> (-100.00%) ⬇️
KD_Lib/models/lenet.py                0.00% <0.00%> (-100.00%) ⬇️
KD_Lib/models/shallow.py              0.00% <0.00%> (-100.00%) ⬇️
KD_Lib/models/__init__.py             0.00% <0.00%> (-100.00%) ⬇️
KD_Lib/KD/text/__init__.py            0.00% <0.00%> (-100.00%) ⬇️
KD_Lib/Pruning/__init__.py            0.00% <0.00%> (-100.00%) ⬇️
KD_Lib/KD/vision/KA/__init__.py       0.00% <0.00%> (-100.00%) ⬇️
... and 50 more

Successfully merging this pull request may close these issues.

Making a pipeline for Pruning, Quantization and Knowledge Distillation
3 participants