Bio-xLSTM: Generative modeling, representation and in-context learning of biological and chemical sequences

By Niklas Schmidinger and others
Language models for biological and chemical sequences enable crucial applications such as drug discovery, protein engineering, and precision medicine. Currently, these language models are predominantly based on Transformer architectures. While Transformers have yielded impressive results, their quadratic runtime dependency on the sequence length complicates their use for long genomic sequences...
November 6, 2024
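The abstract's motivating claim — that self-attention's cost grows quadratically with sequence length, while recurrent (xLSTM-style) updates keep a fixed-size state — can be illustrated with a minimal sketch. This is not code from the paper; the function names and the toy linear recurrence are illustrative assumptions only.

```python
import numpy as np

def attention_scores(q, k):
    # Pairwise query-key scores form an L x L matrix, so memory and
    # compute scale as O(L^2) in the sequence length L.
    return q @ k.T / np.sqrt(q.shape[-1])

def recurrent_scan(x, w):
    # Toy linear recurrence (stand-in for an xLSTM-style update):
    # the state h has a fixed size d, so total cost is O(L).
    h = np.zeros(x.shape[-1])
    for t in range(x.shape[0]):
        h = 0.9 * h + x[t] @ w
    return h

L, d = 1024, 16
rng = np.random.default_rng(0)
q = rng.standard_normal((L, d))
k = rng.standard_normal((L, d))
x = rng.standard_normal((L, d))
w = rng.standard_normal((d, d))

print(attention_scores(q, k).shape)  # (1024, 1024): quadratic in L
print(recurrent_scan(x, w).shape)    # (16,): constant-size state
```

Doubling L quadruples the attention score matrix but leaves the recurrent state unchanged, which is the scaling argument behind using recurrent architectures for long genomic sequences.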