feat: add `ArgillaSpaCyTransformersTrainer` & improve `ArgillaSpaCyTrainer` #3256
Conversation
LGTM, and thanks for the fix. Perhaps it makes more sense to pass the variable via `update_config`, to align with the usage of the rest of the frameworks. Also, I don't see the new config options being tested, right?
Otherwise the configuration for `spacy-transformers` generated via `init_config` doesn't support it.
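For reference, a minimal sketch of that suggestion, assuming the `update_transformer` flag from the PR description and the `update_config` pattern that `ArgillaTrainer` already uses for the other frameworks (dataset name is hypothetical):

```python
from argilla.training import ArgillaTrainer

# Hypothetical dataset name, for illustration only.
trainer = ArgillaTrainer(
    name="my-dataset",
    framework="spacy-transformers",
)

# Reviewer's suggestion: pass the flag through `update_config`,
# as with the other frameworks, instead of a dedicated constructor arg.
trainer.update_config(update_transformer=False)
```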
Hi @alvarobartt, could you also include the implementation and tests for the FeedbackDataset?
Not sure I'll have time for this release; I need to focus now on the …
I can have a look tomorrow afternoon. I implemented most of the stuff, so it should only require some small changes.
Hi @davidberenstein1957, can you have a look into the unit test that is failing? Is there anything that you changed w.r.t. …?
@alvarobartt, I added the integration and some tests. Can you have a final look tomorrow before merging?
Codecov Report
Patch coverage: …
Additional details and impacted files

@@ Coverage Diff @@
## develop #3256 +/- ##
===========================================
- Coverage 90.91% 90.14% -0.77%
===========================================
Files 215 233 +18
Lines 11304 12493 +1189
===========================================
+ Hits 10277 11262 +985
- Misses 1027 1231 +204
Description
This PR adds support for `spacy-transformers` via the new `ArgillaSpaCyTransformersTrainer` class, allowing the user to lock the `transformer` model so that it is not updated, and the `ArgillaSpaCyTrainer` is improved to allow re-using the `tok2vec` or freezing it, if available.

Besides that, this PR also includes a new arg in `ArgillaTrainer` named `framework_kwargs`, which is a Python dict containing the framework-specific kwargs, in this case intended to be created as `{"update_transformer": False}` and `{"freeze_tok2vec": False}` for `ArgillaSpaCyTransformersTrainer` and `ArgillaSpaCyTrainer`, respectively. Ideally, we should also move the `spaCy`-specific args already included as `ArgillaTrainer` args. A minimal usage sketch is shown below.
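A minimal sketch of the intended `framework_kwargs` usage, based on the description above; the dataset name and `output_dir` values are hypothetical, and the exact constructor signature may differ from the final API:

```python
from argilla.training import ArgillaTrainer

# spacy-transformers: lock the underlying transformer weights
# (`update_transformer` is the flag introduced by this PR).
trainer = ArgillaTrainer(
    name="my-dataset",  # hypothetical dataset name
    framework="spacy-transformers",
    framework_kwargs={"update_transformer": False},
)
trainer.train(output_dir="my-spacy-transformers-model")

# plain spaCy: re-use the `tok2vec` without freezing it, if available
# (`freeze_tok2vec` is the flag introduced by this PR).
trainer = ArgillaTrainer(
    name="my-dataset",
    framework="spacy",
    framework_kwargs={"freeze_tok2vec": False},
)
trainer.train(output_dir="my-spacy-model")
```

Under the hood, freezing a pipeline component maps naturally onto spaCy's own `frozen_components` training setting, though how the trainer rewrites the generated config is an implementation detail of this PR.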
Type of change

How Has This Been Tested

- `ArgillaSpaCyTransformersTrainer`
Checklist