For this project, we built an ensemble of RNN, DPCNN, and GBM models to achieve the necessary diversity. The most interesting part of our solution was the neural networks, for which we:
trained on several sets of pre-trained embeddings (FastText, GloVe Twitter, BPEmb, Word2Vec, LexVec, seq2seq).
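A common way to use such pre-trained embeddings is to build an embedding matrix aligned with the tokenizer's vocabulary, which the network's embedding layer is then initialized from. A minimal sketch, with a hypothetical helper name and toy 4-dimensional vectors standing in for a real embedding file:

```python
import numpy as np

def build_embedding_matrix(word_index, vectors, dim):
    """Map each word in the tokenizer's index to its pre-trained vector.

    Words missing from the embedding file are left as zero rows
    (random initialization is another common choice).
    """
    matrix = np.zeros((len(word_index) + 1, dim), dtype=np.float32)
    for word, idx in word_index.items():
        vec = vectors.get(word)
        if vec is not None:
            matrix[idx] = vec
    return matrix

# Toy stand-in for a loaded FastText/GloVe file (word -> vector dict)
vectors = {"cat": np.ones(4), "dog": np.full(4, 2.0)}
word_index = {"cat": 1, "dog": 2, "zyzzyva": 3}  # last word is out-of-vocabulary

emb = build_embedding_matrix(word_index, vectors, dim=4)
print(emb.shape)     # (4, 4)
print(emb[3].sum())  # 0.0 -- the OOV row stays zero
```

Repeating this with each embedding file yields one initialization per model variant, which is one source of the ensemble's diversity.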
Train-time and test-time augmentations, together with the varied embeddings, gave the model a boost. I should also mention that for this competition we used Amazon AWS and trained with 5-fold CV.
Ensemble:
In the end we had about 30 different models, whose predictions we averaged.
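Averaging the per-model predictions is a one-liner; the three arrays below are hypothetical probability outputs standing in for the ~30 real models:

```python
import numpy as np

# Hypothetical test-set probabilities from three of the models
preds = [
    np.array([0.90, 0.10, 0.60]),  # e.g. an RNN
    np.array([0.80, 0.20, 0.70]),  # e.g. a DPCNN
    np.array([0.85, 0.30, 0.50]),  # e.g. a GBM
]

# Simple unweighted mean over models, per test sample
blend = np.mean(preds, axis=0)
print(blend)  # -> approximately [0.85, 0.2, 0.6]
```

A simple mean is robust when the models are diverse; weighted averages or rank averaging are common refinements.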