Set Transformer

Submitted on Feb 15, 2021
By: Juho Lee, Yoonho Lee, Jungtaek Kim, Adam Kosiorek, Seungjin Choi, Yee Whye Teh
Resource Type:
Code
License:
MIT License
Language:
Python
Data Format:

Description

A framework for attention-based permutation-invariant neural networks. The Set Transformer is designed to model interactions among the elements of an input set. The model consists of an encoder and a decoder, both of which rely on attention mechanisms. To reduce computational complexity, it introduces an attention scheme inspired by inducing-point methods from the sparse Gaussian process literature, which reduces the computation time of self-attention from quadratic to linear in the number of elements in the set.
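To illustrate the inducing-point idea, here is a minimal sketch of an Induced Set Attention Block (ISAB) in plain NumPy. It is a simplification, not the repository's implementation: a single attention head, no layer normalization or feed-forward sublayers, and a single hypothetical projection matrix `W` standing in for the learned query projections. The m learnable inducing points first attend to the n set elements, and the set elements then attend back to those m summaries, so each step costs O(nm) instead of the O(n²) of full self-attention.

```python
import numpy as np

def attention(Q, K, V):
    # Scaled dot-product attention with a numerically stable softmax.
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

def isab(X, I, W):
    # Simplified Induced Set Attention Block:
    #   H = Att(I, X): m inducing points summarize the n set elements, O(n*m)
    #   out = Att(X, H): set elements attend back to the m summaries, O(n*m)
    # W is a hypothetical shared query projection, for illustration only.
    H = attention(I @ W, X, X)
    return attention(X @ W, H, H)

rng = np.random.default_rng(0)
n, m, d = 100, 16, 8                 # n set elements, m << n inducing points
X = rng.standard_normal((n, d))      # the input set
I = rng.standard_normal((m, d))      # inducing points (learned in practice)
W = rng.standard_normal((d, d))
out = isab(X, I, W)
print(out.shape)                     # (100, 8): one output per set element
```

Because both attention steps treat the set elements symmetrically, the block is permutation-equivariant: permuting the rows of `X` permutes the rows of the output in the same way, which is the property the Set Transformer's encoder is built around.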
Categorized in: Machine Learning | Neural Networks