Theoretical Machine Learning Seminar

Graph Nets: The Next Generation

In this talk I will introduce our next generation of graph neural networks. GNNs have the property that they are invariant to permutations of the nodes in the graph and to rotations of the graph as a whole. We claim this is unnecessarily restrictive, and in this talk we will explore extensions of these GNNs to more flexible equivariant constructions. In particular, Natural Graph Networks for general graphs are globally equivariant under permutations of the nodes but can still be executed through local message-passing protocols. Our mesh-CNNs on manifolds are equivariant under SO(2) gauge transformations and, as such, unlike regular GNNs, admit non-isotropic kernels. Finally, our SE(3)-Transformers are local message-passing GNNs, invariant to permutations but equivariant to global SE(3) transformations. These developments clearly emphasize the importance of geometry and symmetries as design principles for graph (and other) neural networks.
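
For concreteness, below is a minimal, illustrative sketch (not code from any of the papers above) of the permutation equivariance property that ordinary message-passing GNNs satisfy: permuting the nodes of the input graph and then running one message-passing layer with shared weights gives the same result as running the layer first and permuting its output. The layer and the names used (mp_layer, W_self, W_msg) are assumptions made purely for this example.

    # Illustrative sketch of permutation equivariance in message passing
    # (hypothetical example; not code from the talk or the papers).
    import numpy as np

    rng = np.random.default_rng(0)

    n, d = 5, 3                                    # nodes, feature dimension
    X = rng.normal(size=(n, d))                    # node features
    A = (rng.random((n, n)) < 0.4).astype(float)   # adjacency matrix
    np.fill_diagonal(A, 0.0)                       # no self-loops

    W_self = rng.normal(size=(d, d))               # shared (node-independent) weights
    W_msg = rng.normal(size=(d, d))

    def mp_layer(X, A):
        """One message-passing step: sum neighbour messages, then update each node."""
        messages = A @ (X @ W_msg)                 # aggregate neighbour features
        return np.tanh(X @ W_self + messages)

    # Apply a random node permutation P to both features and adjacency.
    perm = rng.permutation(n)
    P = np.eye(n)[perm]

    out_then_perm = P @ mp_layer(X, A)
    perm_then_out = mp_layer(P @ X, P @ A @ P.T)

    # Equivariance: permuting the input graph and permuting the output agree.
    print(np.allclose(out_then_perm, perm_then_out))  # True

Because the same weights are applied to every node and aggregation is a symmetric sum, relabeling the nodes simply relabels the outputs; the extensions discussed in the talk keep this permutation behaviour while adding equivariance to further transformations such as SO(2) gauge changes or global SE(3) motions.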

Joint with: Pim de Haan and Taco Cohen (Natural Graph Networks); Pim de Haan, Maurice Weiler, and Taco Cohen (Mesh-CNNs); Fabian Fuchs and Daniel Worrall (SE(3)-Transformers)

Date & Time

July 21, 2020 | 12:30pm – 1:45pm

Location

Remote Access Only - see link below

Speakers

Max Welling

Affiliation

University of Amsterdam

Notes

We welcome broad participation in our seminar series. To receive login details, interested participants will need to fill out a registration form accessible from the link below. Upcoming seminars in this series can be found here.

Register Here