Portfolio item number 1
Published:
Short description of portfolio item number 1
Published:
Short description of portfolio item number 2
Published in arXiv, 2021
We investigate the connections between sparse approximation methods for making kernel methods and Gaussian processes (GPs) scalable to massive data, focusing on the Nyström method and Sparse Variational Gaussian Processes (SVGP).
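The Nyström method builds a low-rank approximation of a kernel matrix from a small set of inducing (landmark) points, which is the computational core shared with SVGP. A minimal numpy sketch, assuming an RBF kernel; the function names and hyperparameters here are illustrative, not taken from the paper:

```python
import numpy as np

def rbf_kernel(X, Y, lengthscale=1.0):
    # Squared-exponential kernel matrix between rows of X and rows of Y.
    sq_dists = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * sq_dists / lengthscale**2)

def nystrom_approximation(X, Z, lengthscale=1.0, jitter=1e-8):
    # Rank-m Nystrom approximation K ~= K_nm K_mm^{-1} K_mn,
    # where Z holds the m inducing points.
    K_nm = rbf_kernel(X, Z, lengthscale)
    K_mm = rbf_kernel(Z, Z, lengthscale) + jitter * np.eye(len(Z))
    return K_nm @ np.linalg.solve(K_mm, K_nm.T)

# Usage: approximate a 200x200 kernel matrix with 20 inducing points.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
Z = X[rng.choice(200, size=20, replace=False)]  # subsample inducing points
K_approx = nystrom_approximation(X, Z)
K_exact = rbf_kernel(X, X)
err = np.linalg.norm(K_exact - K_approx) / np.linalg.norm(K_exact)
```

Storing only `K_nm` and `K_mm` costs O(nm) memory instead of O(n²), which is what makes both the Nyström method and SVGP scale to large n.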
Recommended citation: Veit Wild, Motonobu Kanagawa and Dino Sejdinovic (2021). "Connections and Equivalences between the Nyström Method and Sparse Variational Gaussian Processes." arXiv preprint arXiv:2106.01121.
Published in Journal of Computational and Graphical Statistics, 2022
In modern data analysis, nonparametric measures of discrepancy between random variables are particularly important. The subject is well studied in the frequentist literature, while development in the Bayesian setting is limited, with applications often restricted to univariate cases…
Recommended citation: Qinyi Zhang, Veit Wild, Sarah Filippi, Seth Flaxman and Dino Sejdinovic (2022). "Bayesian Kernel Two-Sample Testing." Journal of Computational and Graphical Statistics.
Published in International Conference on Artificial Intelligence and Statistics, 2022
Variational Gaussian process (GP) approximations have become a standard tool for fast GP inference. This technique requires the user to select variational features to increase efficiency.
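In the standard inducing-point formulation, once the variational features are chosen, the approximate posterior mean at a test input is a linear combination of kernel evaluations against those features. A minimal numpy sketch of that predictive mean, assuming an RBF kernel and using illustrative names (`Z` for inducing inputs, `m_q` for the variational mean) that are not taken from the paper:

```python
import numpy as np

def rbf(X, Y, lengthscale=1.0):
    # Squared-exponential kernel matrix between rows of X and rows of Y.
    sq_dists = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * sq_dists / lengthscale**2)

def svgp_posterior_mean(X_test, Z, m_q, lengthscale=1.0, jitter=1e-8):
    # Posterior mean of a sparse variational GP at X_test, given
    # inducing inputs Z and variational mean m_q over f(Z):
    #   mu(x) = k(x, Z) K_zz^{-1} m_q
    K_zz = rbf(Z, Z, lengthscale) + jitter * np.eye(len(Z))
    K_xz = rbf(X_test, Z, lengthscale)
    return K_xz @ np.linalg.solve(K_zz, m_q)

# Usage: at the inducing inputs themselves, the mean interpolates m_q.
Z = np.linspace(-2.0, 2.0, 8).reshape(-1, 1)
m_q = np.sin(Z).ravel()
mu_at_Z = svgp_posterior_mean(Z, Z, m_q, lengthscale=0.5)
```

The cost of a prediction is governed by the number of inducing features m rather than the full training set size, which is the efficiency gain the abstract refers to.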
Recommended citation: Veit D. Wild and George Wynne (2022). "Variational Gaussian Processes: A Functional Analysis View." International Conference on Artificial Intelligence and Statistics.
Published in arXiv, 2022
We develop a framework for generalized variational inference in infinite-dimensional function spaces and use it to construct a method termed Gaussian Wasserstein inference (GWI)…
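GWI measures the distance between Gaussian measures with the 2-Wasserstein metric, which has a closed form for Gaussians: W₂² = ‖m₁ − m₂‖² + tr(C₁ + C₂ − 2(C₁^{1/2} C₂ C₁^{1/2})^{1/2}). A minimal finite-dimensional numpy/scipy sketch of that formula; the function name is illustrative and this is not the paper's implementation:

```python
import numpy as np
from scipy.linalg import sqrtm

def gaussian_w2_squared(m1, C1, m2, C2):
    # Squared 2-Wasserstein distance between N(m1, C1) and N(m2, C2):
    #   ||m1 - m2||^2 + tr(C1 + C2 - 2 (C1^{1/2} C2 C1^{1/2})^{1/2})
    root_C1 = sqrtm(C1)
    cross = np.real(sqrtm(root_C1 @ C2 @ root_C1))  # discard tiny imaginary noise
    return float(np.sum((m1 - m2) ** 2) + np.trace(C1 + C2 - 2.0 * cross))

# Usage: identical Gaussians are at distance zero.
same = gaussian_w2_squared(np.zeros(3), np.eye(3), np.zeros(3), np.eye(3))
```

For commuting covariances (e.g. both diagonal) the trace term reduces to tr(C₁) + tr(C₂) − 2 tr((C₁C₂)^{1/2}), which makes hand-checking easy.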
Recommended citation: Veit D. Wild, Robert Hu and Dino Sejdinovic (2022). "Generalized Variational Inference in Function Spaces: Gaussian Measures meet Bayesian Deep Learning." arXiv preprint arXiv:2205.06342.
Published:
The slides for a talk I gave about how Gaussian measures on the space of square-integrable functions can be used to construct a highly flexible inference framework. We obtain state-of-the-art results on benchmark data sets by combining deep neural networks with Gaussian measures in a novel way. The slides can be found here.
Published:
A presentation I gave at NeurIPS 2023 in New Orleans. The slides can be found here.
Undergraduate course, University 1, Department, 2014
This is a description of a teaching experience. You can use markdown like any other post.
Workshop, University 1, Department, 2015
This is a description of a teaching experience. You can use markdown like any other post.