
Generative Pretraining at Scale: Transformer-Based Encoding of Transactional Behavior for Fraud Detection

By Ze Zhao et al.
In this work, we introduce an autoregressive model leveraging Generative Pretrained Transformer (GPT) architectures, tailored for fraud detection in payment systems. Our approach confronts token explosion and reconstructs behavioral sequences, providing a nuanced understanding of transactional behavior through temporal and contextual analysis. Utilizing unsupervised pretraining, our model excels…
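As a rough illustration of the ideas in the abstract (not the authors' actual method), the sketch below shows one common way to encode transactions for an autoregressive model: bucketing continuous fields into a small discrete vocabulary to limit token explosion, then forming next-token-prediction pairs over the resulting behavioral sequence. All field names (`amount`, `mcc`, `hour`) and bucket choices are hypothetical.

```python
# Hedged sketch: discretize transaction fields into a compact vocabulary
# (one way to limit token explosion), then build (context, target) pairs
# for an autoregressive next-token-prediction objective.

def amount_bucket(amount, edges=(1, 10, 100, 1000)):
    """Map a continuous amount to a coarse bucket id (0..len(edges))."""
    for i, edge in enumerate(edges):
        if amount < edge:
            return i
    return len(edges)

def tokenize(txn):
    """Encode one transaction as a few discrete tokens.
    Field names here are illustrative, not from the paper."""
    return [
        f"AMT_{amount_bucket(txn['amount'])}",
        f"MCC_{txn['mcc']}",
        f"HOUR_{txn['hour'] // 6}",  # four coarse time-of-day buckets
    ]

def to_sequence(txns):
    """Flatten a customer's transaction history into one token sequence."""
    tokens = ["<bos>"]
    for txn in txns:
        tokens.extend(tokenize(txn))
    return tokens

def next_token_pairs(tokens):
    """(context, target) pairs for the autoregressive objective."""
    return [(tokens[:i], tokens[i]) for i in range(1, len(tokens))]

# Example: two transactions become one behavioral sequence.
history = [
    {"amount": 5.0, "mcc": 5411, "hour": 13},
    {"amount": 250.0, "mcc": 5812, "hour": 23},
]
sequence = to_sequence(history)
pairs = next_token_pairs(sequence)
```

A transformer pretrained on such pairs can then score how surprising a new transaction is given the customer's history, which is one plausible route from unsupervised pretraining to a fraud signal.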
December 22, 2023