Abstract: Large pretrained models, such as BERT, GPT, and Wav2Vec, have demonstrated their ability to learn transferable representations for various downstream tasks. However, obtaining a substantial ...