
Talks

Pack your subgraphs: A journey into subgraphs for powerful Graph Neural Networks

Yves-Alexandre de Montjoye's Group Talks (Imperial College London)

tl;dr
Our latest two works on subgraphs for more expressive GNNs lined up to tell a coherent story. In the first part of the talk we show the design of a novel framework (ESAN) to process generic bags of subgraphs in an equivariant manner. We show that this approach effectively increases the expressive power of MPNNs. Then, we notice a surge in concurrent approaches which – sometimes unwittingly – make use of subgraphs for powerful graph representations. In the second part, a novel symmetry analysis allows us to unify and better characterise these approaches when they use a node-based selection policy. We prove an upper bound on their expressive power and conceive a framework serving as a design space for equivariant node-based subgraph methods.

Based on: https://arxiv.org/abs/2206.11140, https://arxiv.org/abs/2110.02910
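The "bag of subgraphs" idea behind the first part of the talk can be sketched in a few lines. This is an illustrative toy, not the papers' actual code: a node-deletion selection policy maps a graph to the bag of subgraphs obtained by removing one node at a time, and the resulting bag is what an equivariant architecture like ESAN would then process.

```python
# Toy sketch of a node-deletion subgraph selection policy.
# A graph is a dict mapping node -> set of neighbours (undirected).

def node_deletion_bag(adj):
    """Return the bag of subgraphs obtained by deleting each node once."""
    bag = []
    for v in adj:
        # Drop node v and remove it from every remaining neighbourhood.
        sub = {u: nbrs - {v} for u, nbrs in adj.items() if u != v}
        bag.append(sub)
    return bag

# Triangle on nodes 0, 1, 2: each deletion leaves a single edge.
triangle = {0: {1, 2}, 1: {0, 2}, 2: {0, 1}}
bag = node_deletion_bag(triangle)
```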


Pack your subgraphs: A journey into subgraphs for powerful Graph Neural Networks

Course on Geometric Deep Learning at Oxford University – Lecture

(slides)

tl;dr
Our latest two works on subgraphs for more expressive GNNs lined up to tell a coherent story. In the first part of the talk we show the design of a novel framework (ESAN) to process generic bags of subgraphs in an equivariant manner. We show that this approach effectively increases the expressive power of MPNNs. Then, we notice a surge in concurrent approaches which – sometimes unwittingly – make use of subgraphs for powerful graph representations. In the second part, a novel symmetry analysis allows us to unify and better characterise these approaches when they use a node-based selection policy. We prove an upper bound on their expressive power and conceive a framework serving as a design space for equivariant node-based subgraph methods.

Based on: https://arxiv.org/abs/2206.11140, https://arxiv.org/abs/2110.02910


Exploring the practical and theoretical landscape of expressive Graph Neural Networks

Learning on Graphs Conference 2022

(recording) (slides)

tl;dr
The tutorial reviews the most prominent expressive GNNs, categorises them into different families, and draws interesting connections between them. This is accomplished through a series of practical coding sessions and an organic overview of the literature landscape. We aim to convey the importance of studying the expressive power of GNNs and make this field more accessible to our community, especially practitioners and newcomers.

Co-presented with: Beatrice Bevilacqua, Haggai Maron
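Since the tutorial is built around practical coding sessions on message passing, a minimal sketch of the basic operation all these models share may help. This is an assumption-laden toy, not the tutorial's actual material: a single layer updating each node from its own feature and the sum of its neighbours'.

```python
# Minimal message-passing layer: new node feature = weighted combination
# of the node's own feature and the sum of its neighbours' features.

def mp_layer(adj, feats, w_self=1.0, w_nbr=1.0):
    """adj: node -> set of neighbours; feats: node -> scalar feature."""
    return {v: w_self * feats[v] + w_nbr * sum(feats[u] for u in adj[v])
            for v in adj}

# A path 0-1-2 with a unit feature on node 0: one layer propagates it
# one hop, to node 1.
path = {0: {1}, 1: {0, 2}, 2: {1}}
x = {0: 1.0, 1: 0.0, 2: 0.0}
y = mp_layer(path, x)
```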


Understanding and Extending Subgraph GNNs by Rethinking their Symmetries

Learning on Graphs and Geometry Reading Group

(recording)

tl;dr
Many concurrent works seem to share a similar underlying intuition: to represent a graph by processing subgraphs generated via some simple selection policy. Inspired by a novel symmetry analysis for the most prominent of these methods – those where subgraphs are in a 1:1 correspondence with nodes in the original graph – we prove an upper bound on their expressive power and conceive a framework serving as a design space for new equivariant node-based subgraph architectures.

Co-presented with: Beatrice Bevilacqua

Based on: https://arxiv.org/abs/2206.11140


Pack your subgraphs: A journey into subgraphs for powerful Graph Neural Networks

Simone Scardapane's Group Talks (Sapienza)

tl;dr
We line up our latest two works on subgraphs for more expressive GNNs to tell a coherent story. In the first part of the talk we show the design of a novel framework (ESAN) to process generic bags of subgraphs in an equivariant manner. We show that this approach effectively increases the expressive power of MPNNs. Then, we notice a surge in concurrent approaches which – sometimes unwittingly – make use of subgraphs for powerful graph representations. In the second part, a novel symmetry analysis allows us to unify and better characterise these approaches when they use a node-based selection policy. We prove an upper bound on their expressive power and conceive a framework serving as a design space for equivariant node-based subgraph methods.

Co-presented with: Beatrice Bevilacqua

Based on: https://arxiv.org/abs/2206.11140, https://arxiv.org/abs/2110.02910


Pack your subgraphs: A journey into subgraphs for powerful Graph Neural Networks

Meta AI orgs Reading Meeting

tl;dr
We line up our latest two works on subgraphs for more expressive GNNs to tell a coherent story. In the first part of the talk we show the design of a novel framework (ESAN) to process generic bags of subgraphs in an equivariant manner. We show that this approach effectively increases the expressive power of MPNNs. Then, we notice a surge in concurrent approaches which – sometimes unwittingly – make use of subgraphs for powerful graph representations. In the second part, a novel symmetry analysis allows us to unify and better characterise these approaches when they use a node-based selection policy. We prove an upper bound on their expressive power and conceive a framework serving as a design space for equivariant node-based subgraph methods.

Co-presented with: Beatrice Bevilacqua

Based on: https://arxiv.org/abs/2206.11140, https://arxiv.org/abs/2110.02910


Pack your subgraphs: A journey into subgraphs for powerful Graph Neural Networks

Maks Ovsjanikov's Group Talks (École Polytechnique, France)

tl;dr
We line up our latest two works on subgraphs for more expressive GNNs to tell a coherent story. In the first part of the talk we show the design of a novel framework (ESAN) to process generic bags of subgraphs in an equivariant manner. We show that this approach effectively increases the expressive power of MPNNs. Then, we notice a surge in concurrent approaches which – sometimes unwittingly – make use of subgraphs for powerful graph representations. In the second part, a novel symmetry analysis allows us to unify and better characterise these approaches when they use a node-based selection policy. We prove an upper bound on their expressive power and conceive a framework serving as a design space for equivariant node-based subgraph methods.

Co-presented with: Beatrice Bevilacqua

Based on: https://arxiv.org/abs/2206.11140, https://arxiv.org/abs/2110.02910


Subgraphs for more expressive GNNs

Course on Geometric Deep Learning at the African Master's on Machine Intelligence (GDL100) – Seminar

(recording) (slides)

tl;dr
Many concurrent works seem to share a similar underlying intuition: to represent a graph by processing subgraphs generated via some simple selection policy. We discuss these methods together and try to find a common characterisation, while showing that the architecture proposed in our previous work 'Equivariant Subgraph Aggregation Networks' subsumes most of them.

Based on: https://arxiv.org/abs/2110.02910


Graph Representation Learning on Simplicial and Cellular Complexes

Dagstuhl Seminars – Graph Embeddings: Theory meets Practice

tl;dr
I gather our works on message-passing on Simplicial and Cellular Complexes and present them together, showing how these approaches in turn allow for more expressive graph representations.

Based on: https://arxiv.org/abs/2106.12575, https://arxiv.org/abs/2103.03212
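One way to make the "lifting" step concrete is a clique lift, which these works discuss as a way to turn a graph into a simplicial complex. The sketch below is illustrative only: nodes become 0-simplices, edges 1-simplices, and each triangle is attached as a 2-simplex, giving message passing higher-order cells to operate on.

```python
# Toy clique lift of a graph to a 2-dimensional simplicial complex.
# adj: node -> set of neighbours (undirected graph).

def clique_lift_2(adj):
    """Return (0-simplices, 1-simplices, 2-simplices) of the clique lift."""
    nodes = [frozenset({v}) for v in adj]
    edges = {frozenset({u, v}) for u in adj for v in adj[u]}
    # Every triangle (3-clique) becomes a 2-simplex.
    tris = {frozenset({u, v, w})
            for u in adj for v in adj[u] for w in adj[v]
            if w in adj[u] and len({u, v, w}) == 3}
    return nodes, edges, tris

triangle = {0: {1, 2}, 1: {0, 2}, 2: {0, 1}}
nodes, edges, tris = clique_lift_2(triangle)
```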


Equivariant Subgraph Aggregation Networks

Author Interviews with Zak Jost

(recording)

tl;dr
We explore the idea of modelling a graph as a bag of subgraphs generated by simple, domain-agnostic selection policies, such as node-deletion, edge-deletion, and ego-networks. We formally show that this approach leads to the design of Graph Neural Networks which can be made strictly more powerful than standard Message Passing ones. We prove this by defining a novel WL variant and by constructing layers equivariant to the emerging symmetry group.

Co-presented with: Derek Lim, Beatrice Bevilacqua

Based on: https://arxiv.org/abs/2110.02910


Subgraph Networks

Third Nepal Winter School in AI

(recording)

tl;dr
Many concurrent works seem to share a similar underlying intuition: to represent a graph by processing subgraphs generated via some simple selection policy. We discuss these methods together and try to find a common characterisation, while showing that the architecture proposed in our previous work 'Equivariant Subgraph Aggregation Networks' subsumes most of them.

Based on: https://arxiv.org/abs/2110.02910


Equivariant Subgraph Aggregation Networks

Learning on Graphs and Geometry Reading Group

(recording)

tl;dr
We explore the idea of modelling a graph as a bag of subgraphs generated by simple, domain-agnostic selection policies, such as node-deletion, edge-deletion, and ego-networks. We formally show that this approach leads to the design of Graph Neural Networks which can be made strictly more powerful than standard Message Passing ones. We prove this by defining a novel WL variant and by constructing layers equivariant to the emerging symmetry group.

Co-presented with: Derek Lim, Beatrice Bevilacqua

Based on: https://arxiv.org/abs/2110.02910


Graph Convolutional Network for Disease Prediction with Imbalanced Data – Anees Kazi

London Machine Learning Meetup (Moderator)

(recording)

tl;dr
I moderate Anees Kazi's talk at the London Machine Learning Meetup.

Weisfeiler and Lehman Go Cellular: CW Networks

Learning on Graphs and Geometry Reading Group

(recording)

tl;dr
We extend message-passing from graphs and Simplicial Complexes to Cellular Complexes, combinatorial topological spaces generalising both. By lifting graphs to Cellular Complexes we can naturally model molecules and unlock a series of advantages over standard pairwise message-passing.

Co-presented with: Cristian Bodnar

Based on: https://arxiv.org/abs/2106.12575


Weisfeiler and Lehman Go Topological: Message Passing Simplicial Networks

Course on Geometric Deep Learning at the African Master's on Machine Intelligence (GDL100) – Seminar

(slides)

tl;dr
We re-define colour refinement and message-passing on Simplicial Complexes, a topological generalisation of graphs. This allows us to model higher-order interactions, capture meso-scale structures, and increase the expressive power of Graph Neural Networks.

Based on: https://arxiv.org/abs/2103.03212


The expressive power of GNNs by the WL test

London Geometry and Machine Learning Summer School 2021 – Tutorial

tl;dr
What is the current state of affairs in provably more expressive Graph Neural Networks? Giorgos Bouritsas and I lead and moderate this interactive discussion on approaches to go beyond 1-WL discriminating power. Ultimately, we would like to tackle the following question: 'Is the Weisfeiler-Leman hierarchy suitable to study the representational power of learning models on graphs? Do we necessarily need to go beyond message-passing?'

Co-presented with: Giorgos Bouritsas
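The 1-WL limitation at the heart of this discussion can be demonstrated in a few lines. The sketch below is illustrative (not the tutorial's code): it runs colour refinement and shows the classic failure case, where two disjoint triangles and a hexagon receive identical colour histograms because both graphs are 2-regular.

```python
from collections import Counter

def wl_refine(adj, rounds=3):
    """1-WL colour refinement: repeatedly relabel each node by its own
    colour together with the multiset of its neighbours' colours."""
    colours = {v: 0 for v in adj}  # uniform initial colouring
    for _ in range(rounds):
        sigs = {v: (colours[v], tuple(sorted(colours[u] for u in adj[v])))
                for v in adj}
        # Compress signatures back into compact integer colours.
        palette = {s: i for i, s in enumerate(sorted(set(sigs.values())))}
        colours = {v: palette[sigs[v]] for v in adj}
    return colours

def wl_histogram(adj):
    return Counter(wl_refine(adj).values())

# Two disjoint triangles vs a 6-cycle: both 2-regular on 6 nodes,
# hence indistinguishable by 1-WL (and by standard MPNNs).
two_triangles = {0: {1, 2}, 1: {0, 2}, 2: {0, 1},
                 3: {4, 5}, 4: {3, 5}, 5: {3, 4}}
hexagon = {i: {(i - 1) % 6, (i + 1) % 6} for i in range(6)}
```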


Weisfeiler and Lehman Go Topological: Message Passing Simplicial Networks

TopoNets 2021 – Networks Beyond Pairwise Interactions

tl;dr
We re-define colour refinement and message-passing on Simplicial Complexes, a topological generalisation of graphs. This allows us to model higher-order interactions, capture meso-scale structures, and increase the expressive power of Graph Neural Networks.

Co-presented with: Cristian Bodnar

Based on: https://arxiv.org/abs/2103.03212


Weisfeiler and Lehman Go Topological: Message Passing Simplicial Networks

Artificial Intelligence Research Group Talks (Cambridge University)

tl;dr
We re-define colour refinement and message-passing on Simplicial Complexes, a topological generalisation of graphs. This allows us to model higher-order interactions, capture meso-scale structures, and increase the expressive power of Graph Neural Networks.

Co-presented with: Cristian Bodnar, Yu Guang Wang

Based on: https://arxiv.org/abs/2103.03212


Weisfeiler and Lehman Go Topological: Message Passing Simplicial Networks

Graph Journal Club (Valence AI)

tl;dr
We re-define colour refinement and message-passing on Simplicial Complexes, a topological generalisation of graphs. This allows us to model higher-order interactions, capture meso-scale structures, and increase the expressive power of Graph Neural Networks.

Co-presented with: Cristian Bodnar, Yu Guang Wang

Based on: https://arxiv.org/abs/2103.03212


Weisfeiler and Lehman Go Topological: Message Passing Simplicial Networks

Math Machine Learning seminar MPI MIS + UCLA

(recording) (slides)

tl;dr
We re-define colour refinement and message-passing on Simplicial Complexes, a topological generalisation of graphs. This allows us to model higher-order interactions, capture meso-scale structures, and increase the expressive power of Graph Neural Networks.

Co-presented with: Cristian Bodnar, Yu Guang Wang

Based on: https://arxiv.org/abs/2103.03212


Improving Graph Neural Network Expressivity via Subgraph Isomorphism Counting

Graph Journal Club (Valence AI)

tl;dr
We study how isomorphism counting of small substructures can improve the expressive power of Message Passing Neural Networks, while retaining equivariance, a good inductive bias, and their tractable forward-pass computational complexity.

Co-presented with: Giorgos Bouritsas

Based on: https://arxiv.org/abs/2006.09252
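The substructure-counting idea can be illustrated with the simplest case, triangles. The sketch below is a toy, not the paper's implementation: it counts, for each node, the triangles it participates in, a quantity that can be appended to node features before message passing.

```python
# Toy per-node triangle counting, as an example of a structural feature.
# adj: node -> set of neighbours (undirected graph).

def triangle_counts(adj):
    """For each node, count the triangles it belongs to."""
    counts = {}
    for v, nbrs in adj.items():
        # A triangle through v is an edge between two of v's neighbours.
        counts[v] = sum(1 for u in nbrs for w in nbrs
                        if u < w and w in adj[u])
    return counts

# 4-cycle 0-1-2-3 with chord 0-2: triangles {0,1,2} and {0,2,3}.
g = {0: {1, 2, 3}, 1: {0, 2}, 2: {0, 1, 3}, 3: {0, 2}}
tc = triangle_counts(g)
```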





My personal website is built with Jekyll and GitHub Pages. The theme, slightly customised, is by orderedlist. Drop me a message if you'd fancy a chat!