Influence Diagnostics Under Self-Concordance

In our sixth seminar, we have the pleasure of hosting Jillian Fisher from the Department of Statistics at the University of Washington, who will present her recent work, accepted at AISTATS 2023, "Influence Diagnostics Under Self-Concordance".

Abstract: Influence diagnostics such as influence functions and approximate maximum influence perturbations are popular in machine learning and in AI domain applications. Influence diagnostics are powerful statistical tools to identify influential datapoints or subsets of datapoints. We establish finite-sample statistical bounds, as well as computational complexity bounds, for influence functions and approximate maximum influence perturbations using efficient inverse-Hessian-vector product implementations. We illustrate our results with generalized linear models and large attention-based models on synthetic and real data.
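To give a flavor of the kind of diagnostic the talk covers, here is a minimal sketch of empirical influence scores for an L2-regularized logistic regression (a generalized linear model, as in the abstract). This is a hypothetical illustration, not the authors' implementation: the function name, the regularization constant, and the use of a direct linear solve in place of an efficient inverse-Hessian-vector product routine are all assumptions for the sketch.

```python
import numpy as np

def influence_scores(X, y, theta, lam=1e-2):
    """Self-influence of each datapoint for L2-regularized logistic
    regression (hypothetical sketch, not the paper's implementation)."""
    n, d = X.shape
    p = 1.0 / (1.0 + np.exp(-X @ theta))   # predicted probabilities
    # Hessian of the regularized empirical loss:
    #   H = X^T diag(p(1-p)) X / n + lam * I
    W = p * (1.0 - p)
    H = (X.T * W) @ X / n + lam * np.eye(d)
    # Per-example gradients of the loss at theta, shape (n, d)
    grads = X * (p - y)[:, None]
    # Influence function direction: -H^{-1} grad_i.  Here we solve the
    # linear systems directly; the paper uses efficient
    # inverse-Hessian-vector product implementations instead.
    ihvps = np.linalg.solve(H, grads.T).T   # shape (n, d)
    # Self-influence: grad_i^T H^{-1} grad_i for each datapoint
    return np.einsum('ij,ij->i', grads, ihvps)

# Synthetic data for illustration only
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
theta_true = rng.normal(size=5)
y = (rng.random(200) < 1.0 / (1.0 + np.exp(-X @ theta_true))).astype(float)
theta = np.zeros(5)   # placeholder parameter estimate
scores = influence_scores(X, y, theta)
print(scores.shape)
```

Datapoints with the largest scores are the natural candidates when searching for a maximum influence perturbation, i.e. a small subset whose removal most changes the fit.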
