In this article, we will introduce three methods to incorporate external knowledge in recurrent neural networks (RNNs) for text classification.
Why use external knowledge in text classification?
External knowledge, such as sentiment lexicons, can provide useful signals for text classification. We can incorporate it into our model to improve performance.
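As a toy illustration of what lexicon-based external knowledge can look like, the sketch below maps each token to a sentiment score that can be fed to the model as an extra per-token feature. The lexicon and its scores are made up for this example, not taken from any real resource.

```python
# Hypothetical toy sentiment lexicon (scores are invented for illustration).
LEXICON = {"great": 0.9, "terrible": -0.8, "fine": 0.3}

def lexicon_features(tokens):
    """Return one scalar lexicon feature per token (0.0 for unknown words)."""
    return [LEXICON.get(t.lower(), 0.0) for t in tokens]

feats = lexicon_features("The movie was great".split())
print(feats)  # -> [0.0, 0.0, 0.0, 0.9]
```

In practice each token could carry a vector of such features (e.g. scores from several lexicons), which is what the conditioning methods below consume.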
How to incorporate?
Three methods are described in the paper Attention-based Conditioning Methods for External Knowledge Integration.
They are:
Attentional Concatenation
Attentional Feature-based Gating
Attentional Affine Transformation
The architecture of each method is illustrated in the paper.
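As a rough, framework-agnostic sketch of the three methods, the code below conditions the attention energies over RNN hidden states `h` on external features `c`: concatenation appends `c` before the attention projection, gating multiplies the projected states by a sigmoid gate computed from `c`, and affine applies a feature-wise scale and shift computed from `c`. This is written in NumPy with randomly initialised matrices standing in for learned parameters; it is a sketch of the general idea, not the paper's exact implementation.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def conditioned_attention(h, c, params, method="gating"):
    """Attention over RNN states h (seq, hidden) conditioned on external
    features c (seq, feat). `params` holds weight matrices that a real
    model would learn; biases are omitted to keep the sketch short."""
    if method == "concat":
        # Attentional concatenation: project [h_i ; c_i] jointly.
        u = np.tanh(np.concatenate([h, c], axis=-1) @ params["W_cat"])
    elif method == "gating":
        # Attentional feature-based gating: sigmoid gate from c_i.
        u = sigmoid(c @ params["W_g"]) * np.tanh(h @ params["W"])
    elif method == "affine":
        # Attentional affine transformation: scale and shift from c_i.
        u = (c @ params["W_gamma"]) * np.tanh(h @ params["W"]) + c @ params["W_beta"]
    else:
        # Plain self-attention baseline (no external knowledge).
        u = np.tanh(h @ params["W"])
    a = softmax(u @ params["v"])   # (seq,) attention weights
    return a @ h, a                # context vector, attention weights

# Tiny usage example with random inputs and parameters.
rng = np.random.default_rng(0)
hidden, feat, seq = 8, 3, 5
params = {
    "W": rng.normal(size=(hidden, hidden)),
    "W_cat": rng.normal(size=(hidden + feat, hidden)),
    "W_g": rng.normal(size=(feat, hidden)),
    "W_gamma": rng.normal(size=(feat, hidden)),
    "W_beta": rng.normal(size=(feat, hidden)),
    "v": rng.normal(size=hidden),
}
h = rng.normal(size=(seq, hidden))
c = rng.normal(size=(seq, feat))
for m in ("concat", "gating", "affine"):
    ctx, a = conditioned_attention(h, c, params, m)
```

All three variants keep the same attention interface (weights summing to 1 over the sequence), differing only in how the external features enter the energy computation.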
Which method is best?
According to the paper's experiments, Attentional Feature-based Gating performs best overall.
How to implement these three methods?
We can find an implementation here.