
🔃 Multitasking Transformers 🔃

Training NLP models that can perform multiple tasks with the same set of representations.

Pre-trained models are currently available that multitask across eight clinical note tasks.

This codebase can be used to replicate the results in our paper; see the Replication section for details.

Installation

Install with

pip install https://s3-us-west-2.amazonaws.com/ai2-s2-scispacy/releases/v0.2.0/en_core_sci_sm-0.2.0.tar.gz
pip install git+https://github.com/AndriyMulyar/multitasking_transformers
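To confirm the two installs above succeeded, you can check that both packages are importable. This is a minimal sketch; the module names below are assumptions based on the package names, so adjust them if the installed distributions differ.

```python
# Sanity-check the installation by looking up each module's import spec.
# Module names are assumed from the package names above (not verified here).
import importlib.util

modules = ("multitasking_transformers", "en_core_sci_sm")
installed = {m: importlib.util.find_spec(m) is not None for m in modules}
for name, ok in installed.items():
    print(f"{name}: {'found' if ok else 'missing'}")
```

If either module prints `missing`, re-run the corresponding pip command.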

Use

Examples are available for training, evaluation, and text prediction.

Running the script predict_ner.py will automatically download a pre-trained clinical note multitasking model, run it over a de-identified clinical note snippet, and display the entity tags in your browser with displacy.
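For reference, the display step works the way any spaCy entity visualization does. The sketch below shows displacy rendering pre-computed entity spans in "manual" mode; the note text, span offsets, and labels are illustrative stand-ins, not actual output of the multitasking model.

```python
# Illustrative sketch: render hand-written entity spans with spaCy's displacy.
# The text, offsets, and labels below are made up for demonstration only.
from spacy import displacy

doc = {
    "text": "Patient was started on metformin for type 2 diabetes.",
    "ents": [
        {"start": 23, "end": 32, "label": "DRUG"},      # "metformin"
        {"start": 37, "end": 52, "label": "PROBLEM"},   # "type 2 diabetes"
    ],
    "title": "De-identified note snippet",
}

# manual=True renders the pre-computed spans instead of running a pipeline;
# page=True wraps the markup in a full HTML page suitable for a browser.
html = displacy.render(doc, style="ent", manual=True, page=True)
```

In predict_ner.py the spans come from the multitasking model rather than being hand-written; `displacy.serve` can be used instead of `render` to open the result in the browser directly.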

Replication

See the directory /examples/experiment_replication.

Preprint

https://arxiv.org/abs/2004.10220

Acknowledgement

Implementation, development and training in this project were supported by funding from the McInnes NLP Lab at Virginia Commonwealth University.