Representation Equivalent Neural Operators

Emmanuel de Bézenac, Postdoctoral Research Fellow at ETH Zurich, will talk about discretization in neural operators, the novel framework of Representation Equivalent Neural Operators, and Convolutional Neural Operators.

Abstract

Recently, operator learning, i.e. learning mappings between infinite-dimensional function spaces, has garnered significant attention, notably in connection with learning partial differential equations from data. While conceptually clear on paper, neural operators must be discretized when implemented on a computer. This step can compromise their integrity, often causing them to deviate from the underlying operators, with practical consequences.

This talk introduces Representation Equivalent Neural Operators, a novel framework designed to address this issue. At its core is the concept of operator aliasing, which measures the inconsistency between a neural operator and its discrete representation. These concepts will be introduced and their practical implications discussed, leading to a novel convolution-based neural operator.
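As a rough intuition for the kind of inconsistency the abstract refers to (not the talk's own formalism), the sketch below compares a continuous operator, differentiation, with a discrete representation of it, central finite differences, on sampled data. The gap between applying the operator in function space and applying its discretization is the flavor of discrepancy that "operator aliasing" quantifies; the function `aliasing_error` and the choice of operator here are illustrative assumptions.

```python
import numpy as np

# Continuous operator: O(u) = u', applied to u(x) = sin(kx).
def u(x, k=5.0):
    return np.sin(k * x)

def du(x, k=5.0):
    return k * np.cos(k * x)

def aliasing_error(n, k=5.0):
    """Illustrative discrepancy between the sampled continuous derivative
    and a central finite difference applied to samples on a periodic grid."""
    x = np.linspace(0.0, 2 * np.pi, n, endpoint=False)
    h = x[1] - x[0]
    samples = u(x, k)
    # Discrete representation of the operator: central differences.
    discrete = (np.roll(samples, -1) - np.roll(samples, 1)) / (2 * h)
    exact = du(x, k)
    return float(np.max(np.abs(discrete - exact)))

# The discrepancy depends on the resolution: at a fixed grid size the
# discrete representation does not coincide with the underlying operator.
for n in (16, 64, 256):
    print(n, aliasing_error(n))
```

On coarse grids the mismatch is large; it only vanishes in the refinement limit, which is why a fixed discretization can silently deviate from the operator it is meant to represent.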
