# HierarchicalAttention

This repository combines a Hierarchical Attention Network (HAN) with d3-based visualisations to give the user insight into what the network is detecting.

The HAN applies attention at both the word level and the sentence level; by extracting the attention weights, we can visualise which words and sentences the model finds most informative.
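The word-level attention described above can be sketched as follows. This is a minimal illustration of the additive attention mechanism from the HAN paper, not the repository's actual module; the names `WordAttention`, `proj`, and `context`, and the dimensions, are assumptions for the example.

```python
import torch
import torch.nn as nn

class WordAttention(nn.Module):
    """Additive attention over word states (HAN-style).
    Returns the attended sentence vector and the per-word weights,
    which are what get visualised."""
    def __init__(self, hidden_dim: int, attn_dim: int = 64):
        super().__init__()
        self.proj = nn.Linear(hidden_dim, attn_dim)        # u_it = tanh(W h_it + b)
        self.context = nn.Linear(attn_dim, 1, bias=False)  # word context vector u_w

    def forward(self, word_states):
        # word_states: (batch, words, hidden_dim), e.g. encoder outputs
        u = torch.tanh(self.proj(word_states))
        scores = self.context(u).squeeze(-1)       # (batch, words)
        weights = torch.softmax(scores, dim=-1)    # attention weights to extract
        sentence = (weights.unsqueeze(-1) * word_states).sum(dim=1)
        return sentence, weights

# toy usage: 2 sentences of 5 words with 16-dim word states
attn = WordAttention(hidden_dim=16)
sent_vec, weights = attn(torch.randn(2, 5, 16))
```

Sentence-level attention is the same mechanism applied one level up, over the sentence vectors produced here.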

Below are the results on the Yahoo Answers dataset, using HuggingFace's DistilRoBERTa for tokenization with frozen word embeddings, and word- and sentence-level attention layers added on top.

*(Figure: HANYH — attention visualisation results on Yahoo Answers)*

Sources:

  1. HAN paper: https://www.cs.cmu.edu/~./hovy/papers/16HLT-hierarchical-attention-networks.pdf
  2. HAN implementation I referred to when getting started (but have since changed significantly): https://github.com/sgrvinod/a-PyTorch-Tutorial-to-Text-Classification
  3. The d3 visualisation was inspired by ecco: https://jalammar.github.io/explaining-transformers/
