Affinitention Nets: Kernel Perspective on Attention Architectures for Set Classification with Applications to Medical Text and Images

David Dov, Serge Assaad, Shijing Si, and Rui Wang (Duke University), Hongteng Xu (Renmin University of China), Shahar Ziv Kovalsky (UNC at Chapel Hill), Jonathan Bell and Danielle Elliott Range (Duke University Hospital), Jonathan Cohen (Kaplan Medical Center), Ricardo Henao and Lawrence Carin (Duke University)


Abstract: Set classification is the task of predicting a single label from a set comprising multiple instances. The examples we consider are pathology slides represented by sets of patches and medical text represented by sets of word embeddings. State-of-the-art methods, such as transformers, typically use attention mechanisms to learn representations of set data by modeling interactions between the instances of the set. These methods, however, have complex heuristic architectures comprising multiple heads and layers. The complexity of attention architectures hampers their training when only a small number of labeled sets is available, as is often the case in medical applications. To address this problem, we present a kernel-based representation learning framework that relates the learning of affinity kernels to the learning of representations in attention architectures. We show that learning a combination of the sum and the product of kernels is equivalent to learning representations from multi-head, multi-layer attention architectures. From our framework, we devise a simplified attention architecture, which we term affinitention (affinity-attention) nets. We demonstrate the application of affinitention nets to the classification of the Set-Cifar10 dataset, thyroid malignancy prediction from pathology slides, and patient text-message triage. We show that affinitention nets provide competitive results compared to heuristic attention architectures and outperform other competing methods.
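To make the stated correspondence concrete, below is a minimal, illustrative sketch (not the authors' implementation) of the kernel view of attention: a row-normalized affinity kernel over the instances of a set plays the role of an attention matrix, a sum of such kernels mirrors multiple heads, and a product of such kernels mirrors stacked layers. The function and parameter names (affinity_kernel, W_q, W_k) are our own assumptions for illustration.

```python
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def affinity_kernel(X, W_q, W_k):
    # Row-normalized affinity matrix over the n instances of a set X (n x d):
    # K[i, j] = softmax_j(<W_q x_i, W_k x_j> / sqrt(d)), analogous to one attention head.
    Q, Kmat = X @ W_q, X @ W_k
    return softmax(Q @ Kmat.T / np.sqrt(Kmat.shape[1]), axis=-1)

rng = np.random.default_rng(0)
n, d = 5, 8                      # a set of n instances, each a d-dimensional embedding
X = rng.normal(size=(n, d))

# "Multi-head" analogue: an average (convex sum) of affinity kernels, one per head.
heads = [affinity_kernel(X, rng.normal(size=(d, d)), rng.normal(size=(d, d)))
         for _ in range(3)]
K_sum = sum(heads) / len(heads)  # still row-stochastic

# "Multi-layer" analogue: a product of affinity kernels, one per layer,
# i.e., instance representations are mixed repeatedly: X -> K2 @ (K1 @ X).
K_prod = heads[1] @ heads[0]     # also row-stochastic

print(K_sum.shape, K_prod.shape)             # both (n, n)
print(K_sum.sum(axis=1), K_prod.sum(axis=1)) # rows sum to ~1
```

Both combined kernels remain row-stochastic, so either can be applied to the instance embeddings to produce a mixed set representation, which is the sense in which the abstract equates kernel combinations with multi-head, multi-layer attention.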
