AI Bibliography

Shin, J., Lee, Y., Yoon, S., & Jung, K. (2020). Fast and accurate deep bidirectional language representations for unsupervised learning. arXiv preprint arXiv:2004.08097. 
Resource type: Preprint
BibTeX citation key: Shin2020
Categories: Artificial Intelligence, Computer Science, Data Sciences, Decision Theory, General, Mathematics
Subcategories: Big data, Deep learning, Forecasting, Informatics, Machine learning, Markov models, Q-learning
Creators: Jung, Lee, Shin, Yoon
Collection: arXiv preprint arXiv:2004.08097
Abstract
Although BERT achieves strong performance improvements on various supervised learning tasks, applying BERT to unsupervised tasks is limited by the need for repeated inference to compute contextual language representations. To resolve this limitation, we propose a novel deep bidirectional language model called the Transformer-based Text Autoencoder (T-TA). The T-TA computes contextual language representations without repetition while retaining the benefits of a deep bidirectional architecture like BERT's. In run-time experiments in CPU environments, the proposed T-TA is over six times faster than the BERT-based model on the reranking task and twelve times faster on the semantic similarity task. Furthermore, the T-TA achieves accuracies competitive with, or even better than, those of BERT on these tasks.
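
To make the "repeated inference" bottleneck concrete: using BERT bidirectionally for unsupervised scoring typically means masking each token in turn and running one forward pass per token, so an n-token sentence costs n passes; T-TA's contribution is producing all contextual representations in a single pass. The sketch below illustrates only the BERT-side cost the abstract refers to, not the T-TA model itself. It assumes the Hugging Face transformers library; the bert-base-uncased checkpoint and the pseudo_log_likelihood function name are illustrative choices, not the paper's exact setup.

import torch
from transformers import BertTokenizer, BertForMaskedLM

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")
model.eval()

def pseudo_log_likelihood(sentence: str) -> float:
    """Sum of log P(token_i | all other tokens): one BERT pass per token."""
    ids = tokenizer(sentence, return_tensors="pt")["input_ids"][0]
    total = 0.0
    # Skip the special [CLS] (first) and [SEP] (last) tokens.
    for i in range(1, ids.size(0) - 1):
        masked = ids.clone()
        masked[i] = tokenizer.mask_token_id
        with torch.no_grad():
            # One full forward pass for each masked position.
            logits = model(masked.unsqueeze(0)).logits
        log_probs = torch.log_softmax(logits[0, i], dim=-1)
        total += log_probs[ids[i]].item()
    return total

print(pseudo_log_likelihood("the cat sat on the mat"))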