Thank you Charles for providing bert_sklearn.
This is more a general question than a specific issue.
In the SciBert, BioBert tutorial for biomedical NER, tokenisation of the dataset is done manually.
I was wondering why a BERT tokenizer is not used, and how using a manual tokenizer affects performance?
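For context: BERT's own tokenizer applies WordPiece subword splitting on top of whitespace/punctuation pre-tokenization, so word-level NER labels then need realigning to subword pieces. A minimal pure-Python sketch of greedy WordPiece splitting (the toy vocabulary here is hypothetical; real BERT models ship a learned vocabulary of ~30k pieces):

```python
# Toy greedy longest-match WordPiece-style splitter (illustrative only;
# the vocabulary below is made up for this example).
VOCAB = {"bio", "##medical", "tok", "##eni", "##sation", "[UNK]"}

def wordpiece(word, vocab=VOCAB):
    """Greedily split one whitespace token into subword pieces."""
    pieces, start = [], 0
    while start < len(word):
        end = len(word)
        piece = None
        while start < end:
            cand = word[start:end]
            if start > 0:
                cand = "##" + cand  # continuation pieces are ##-prefixed
            if cand in vocab:
                piece = cand  # longest match wins
                break
            end -= 1
        if piece is None:
            return ["[UNK]"]  # no piece matched: whole word is unknown
        pieces.append(piece)
        start = end
    return pieces

print(wordpiece("biomedical"))    # ['bio', '##medical']
print(wordpiece("tokenisation"))  # ['tok', '##eni', '##sation']
```

This is why a pre-tokenized (word-per-line, CoNLL-style) NER dataset still works with BERT: each word is split further into pieces, and the word-level tag is typically propagated to (or predicted only on) the first piece.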
Thanks