Abstract by Andrew Carr
Graph Neural Processes: Towards Bayesian Graph Neural Networks
We introduce graph neural processes (GNPs), inspired by recent work on conditional and latent neural processes. We demonstrate GNPs on edge imputation and discuss the benefits and drawbacks of the method for other application areas. One major benefit of GNPs is the ability to quantify uncertainty in deep learning on graph structures. An additional benefit is the ability to extend graph neural networks to dynamically sized graph inputs.