
Exploring Self-Attention Mechanisms for Speech Separation

By Cem Subakan et al.
Transformers have enabled impressive improvements in deep learning, often outperforming recurrent and convolutional models on many tasks while taking advantage of parallel processing. Recently, we proposed the SepFormer, which obtains state-of-the-art performance in speech separation on the WSJ0-2/3Mix datasets. This paper studies Transformers in depth for speech separation. In...
May 27, 2023
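
For context, the mechanism the paper examines is scaled dot-product self-attention, in which every time frame attends to every other frame via matrix multiplies rather than a sequential scan, which is what allows the parallel processing mentioned in the abstract. Below is a minimal PyTorch sketch of this mechanism, not the paper's SepFormer implementation; all names and shapes are illustrative.

    import math
    import torch

    def self_attention(x, w_q, w_k, w_v):
        """Scaled dot-product self-attention over a sequence x of shape (T, d)."""
        q = x @ w_q  # queries, (T, d)
        k = x @ w_k  # keys,    (T, d)
        v = x @ w_v  # values,  (T, d)
        scores = q @ k.T / math.sqrt(q.shape[-1])  # pairwise similarities, (T, T)
        weights = torch.softmax(scores, dim=-1)    # each frame attends to all frames
        return weights @ v                         # weighted sum of values, (T, d)

    # Toy usage: 100 frames of 64-dim features (sizes chosen only for illustration).
    T, d = 100, 64
    x = torch.randn(T, d)
    w_q, w_k, w_v = (torch.randn(d, d) for _ in range(3))
    out = self_attention(x, w_q, w_k, w_v)  # (100, 64)

Note that the (T, T) score matrix is produced in one batched operation, so all frames are processed at once; a recurrent model would instead have to step through the T frames sequentially.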