Innovative Bert-based Reranking Language Models for Speech Recognition

By Shih-Hsuan Chiu and Berlin Chen
More recently, Bidirectional Encoder Representations from Transformers (BERT) was proposed and has achieved impressive success on many natural language processing (NLP) tasks such as question answering and language understanding, due mainly to its effective pre-training then fine-tuning paradigm as well as strong local contextual modeling ability. In view of the...
April 11, 2021
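The abstract describes reranking the output of a speech recognizer with a BERT-based language model. The general idea of N-best reranking can be sketched as below; this is a minimal, self-contained illustration, not the paper's method. The `Hypothesis` class, the interpolation `weight`, and the `toy_lm` scorer are all assumptions introduced for illustration (in practice the LM score would come from a fine-tuned BERT model).

```python
# Minimal sketch of N-best hypothesis reranking with a pluggable
# language-model scorer. All names here are illustrative.
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class Hypothesis:
    text: str
    asr_score: float  # log-domain score from the first-pass recognizer


def rerank(hyps: List[Hypothesis],
           lm_score: Callable[[str], float],
           weight: float = 0.5) -> List[Hypothesis]:
    """Sort hypotheses by a linear interpolation of ASR and LM scores."""
    def combined(h: Hypothesis) -> float:
        return (1 - weight) * h.asr_score + weight * lm_score(h.text)
    return sorted(hyps, key=combined, reverse=True)


# Toy LM stand-in: rewards a plausible word sequence. A BERT-based
# reranker would instead score each hypothesis with the fine-tuned model.
def toy_lm(text: str) -> float:
    return 1.0 if "recognition" in text else 0.0


nbest = [Hypothesis("speech wreck ignition", -1.0),
         Hypothesis("speech recognition", -1.2)]
best = rerank(nbest, toy_lm)[0]
print(best.text)  # the LM score lifts the correct transcript to the top
```

Here the acoustically stronger but erroneous hypothesis is outranked once the language-model score is interpolated in, which is the effect second-pass reranking aims for.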