
Past Meetings 2021-2022

Monday July 18, 2022

Chris Hickey on “Everything You Wanted to Know About Verification of Solvers”.

Monday June 20, 2022

Mantas Mikaitis on “Anymatrix: An Extensible MATLAB Matrix Collection”.

Thursday June 16, 2022 (G.207)

Niel Van Buggenhout from Charles University (Prague) on “Solving non-autonomous ODEs in the *-framework”.

Abstract:

The equation underlying nuclear magnetic resonance (NMR) spectroscopy is the Schrödinger equation U'(t) = A(t) U(t), where A(t) is a given smooth matrix-valued function.
We propose a procedure designed to approximate the solution U(t) for large systems. It relies on representing the discontinuous matrix-valued function A(t)H(t-s), with H(t-s) the Heaviside step function, as a finite (double) Legendre series. The coefficients are grouped into a matrix; because of the Gibbs phenomenon we cannot expect all of them to decay. However, the coefficients away from the diagonal decay quickly, which means that the coefficient matrices are banded. The solution U(t)H(t-s) can then be approximated by the resolvent of this banded coefficient matrix.
Relying on the bandedness of the matrices, we can recover, very cheaply, some of the coefficients of the Legendre series for the smooth solution U(t) from the discontinuous solution U(t)H(t-s).
The effectiveness of the procedure is illustrated on some small examples from NMR.
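
To make the expansion above concrete, here is a sketch of the kind of (double) Legendre representation the abstract refers to; the interval and normalization are assumptions for illustration, not taken from the talk. On $[-1,1]^2$,

\[
  A(t)\,H(t-s) \;\approx\; \sum_{i=0}^{N}\sum_{j=0}^{N} c_{ij}\, P_i(t)\, P_j(s),
  \qquad
  c_{ij} = \frac{(2i+1)(2j+1)}{4} \int_{-1}^{1}\!\int_{-1}^{1} A(t)\,H(t-s)\, P_i(t)\, P_j(s)\, \mathrm{d}s\, \mathrm{d}t,
\]

where the $P_i$ are the Legendre polynomials. For matrix-valued $A$ each coefficient $c_{ij}$ is itself a matrix, and it is the decay of these coefficients away from the diagonal $i = j$ that gives the banded structure exploited by the method.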

Monday June 13, 2022

Xiaobo Liu on “Computing the Square Root of a Low-Rank Perturbation of the Scaled Identity Matrix”.

Abstract:

We consider the problem of computing the square root of a perturbation of the scaled identity matrix, $A = \alpha I_n + UV^*$, where $U$ and $V$ are $n \times k$ matrices with $k \le n$. This problem arises in various applications, including computer vision and optimization methods for machine learning. We derive a new formula for the $p$th root of $A$ that involves a weighted sum of powers of the $p$th root of the $k\times k$ matrix $\alpha I_k + V^*U$. This formula is particularly attractive for the square root, since the sum has just one term when $p = 2$. We also derive a new class of Newton iterations for computing the square root that exploit the low-rank structure. We test these new methods on random matrices and on positive definite matrices arising in applications. Numerical experiments show that the new approaches can yield much smaller residual than existing alternatives and can be significantly faster when the perturbation $UV^*$ has low rank.
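
As a rough illustration of the structure being exploited, the following NumPy sketch implements one closed-form square root of this low-rank-update type for $p = 2$. The particular expression used below is written from the general identity $f(\alpha I_n + UV^*) = f(\alpha) I_n + U\,g(V^*U)\,V^*$ and is an assumption for illustration, not a quotation of the paper's formula or code.

import numpy as np
from scipy.linalg import sqrtm

def sqrt_low_rank_update(alpha, U, V):
    """Square root of A = alpha*I_n + U V^* via a k x k computation.

    Sketch only: uses
        A^{1/2} = sqrt(alpha) I_n + U (sqrt(alpha) I_k + (alpha I_k + V^* U)^{1/2})^{-1} V^*,
    which assumes the relevant square roots and inverse exist (e.g. alpha > 0
    and A has no eigenvalues on the closed negative real axis).
    """
    n, k = U.shape
    I_k = np.eye(k)
    S = sqrtm(alpha * I_k + V.conj().T @ U)                     # k x k principal square root
    X = np.linalg.solve(np.sqrt(alpha) * I_k + S, V.conj().T)   # k x n
    return np.sqrt(alpha) * np.eye(n) + U @ X

# Quick check on a symmetric positive definite example (V = U).
rng = np.random.default_rng(0)
n, k, alpha = 200, 5, 2.0
U = rng.standard_normal((n, k))
A = alpha * np.eye(n) + U @ U.T
Asqrt = sqrt_low_rank_update(alpha, U, U)
print(np.linalg.norm(Asqrt @ Asqrt - A) / np.linalg.norm(A))    # relative residual

Apart from forming the final n x n result, only k x k and n x k operations are needed, which is what makes this kind of formula attractive when k is much smaller than n.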

Monday May 23, 2022

Bastien Vieuble on “Mixed precision strategies for preconditioned GMRES”.

Abstract:

The new promise of accessible and efficient hardware support for very low precision arithmetic is a potential source of major performance improvements in scientific computing. However, exploiting such low precisions while keeping a satisfactory accuracy in the solution requires rethinking our algorithms in a mixed precision setting. In this talk we focus on the use of mixed precision inside preconditioned GMRES for the solution of linear systems. We will cover the state of the art on the topic and develop new strategies in which the matrix-vector product with the original matrix A and the application of the preconditioner are carried out in two different precisions. In particular, in some cases the preconditioner can be applied in a lower precision than the matrix-vector product with A, leading to possible performance improvements when the application of the preconditioner is the dominant operation in a GMRES iteration. We will demonstrate why and when this strategy makes sense by carrying out a rounding error analysis of the algorithm and by providing numerical experiments with different preconditioners in Julia.
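
A toy illustration of this precision split (not the algorithms or analysis from the talk): a right-preconditioned GMRES in which the product with A is carried out in double precision while a hypothetical Jacobi preconditioner is applied in single precision. All names and parameter choices below are assumptions.

import numpy as np

def gmres_mixed(A, M_inv, b, maxit=100, tol=1e-8):
    """Toy right-preconditioned GMRES on A M^{-1} y = b, with x = M^{-1} y.

    Sketch only: the matrix-vector product with A is done in float64, while
    the preconditioner M_inv is applied in float32.
    """
    n = b.size
    beta = np.linalg.norm(b)
    Q = np.zeros((n, maxit + 1))
    H = np.zeros((maxit + 1, maxit))
    Q[:, 0] = b / beta
    for j in range(maxit):
        z = M_inv(Q[:, j].astype(np.float32)).astype(np.float64)  # preconditioner in single
        w = A @ z                                                  # matvec with A in double
        for i in range(j + 1):                                     # modified Gram-Schmidt
            H[i, j] = Q[:, i] @ w
            w = w - H[i, j] * Q[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        if H[j + 1, j] > 0:
            Q[:, j + 1] = w / H[j + 1, j]
        e1 = np.zeros(j + 2)
        e1[0] = beta
        y, *_ = np.linalg.lstsq(H[:j + 2, :j + 1], e1, rcond=None)  # small least squares
        est = np.linalg.norm(e1 - H[:j + 2, :j + 1] @ y) / beta     # estimated relative residual
        if est < tol or H[j + 1, j] == 0:
            break
    x = M_inv((Q[:, :j + 1] @ y).astype(np.float32)).astype(np.float64)
    return x, est

# Hypothetical example: Jacobi preconditioner stored and applied in float32.
rng = np.random.default_rng(1)
n = 500
A = np.diag(np.linspace(1.0, 100.0, n)) + 0.01 * rng.standard_normal((n, n))
b = rng.standard_normal(n)
d32 = np.diag(A).astype(np.float32)
x, est = gmres_mixed(A, lambda v: v.astype(np.float32) / d32, b)
print(est, np.linalg.norm(b - A @ x) / np.linalg.norm(b))

Printing both the Arnoldi residual estimate and the true residual makes the effect of the single precision preconditioner application visible; the rounding error analysis described in the talk is what determines when such a choice does or does not harm the attainable accuracy.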

Monday May 16, 2022

Nick Higham on “Creativity and Iterative Refinement”.

Monday May 9, 2022

Marcus Webb on “Randomized Preconditioning”.

Monday April 25, 2022

Alban Bloor Riley on “The inverse eigenvalue problem in spin spectroscopy”.

Monday March 28, 2022

Nick Higham on “Probabilistic Rounding Error Analysis of Householder QR Factorization”.

Monday March 14, 2022

Bastien Vieuble on “Combining sparse approximate factorizations with mixed precision iterative refinement”.

Abstract:

Iterative refinement is seeing its popularity grow again with the new promise of accessible and efficient hardware support for half precision arithmetic. Novel variants of this method were recently proposed that rely on either an LU factorization or an LU-preconditioned GMRES method for the solution of the correction equation and can employ up to three precisions. The effectiveness of these methods has been extensively demonstrated on dense linear systems, but the case of sparse problems has not been studied as much. Our work aims to fill this gap. First, we have extended the theoretical ground and proposed novel variants that can employ up to five precisions concurrently. Furthermore, we have studied the use of approximation techniques that are commonly used to improve the performance and scalability of sparse direct methods. In all cases we derive theoretical bounds for the convergence conditions and the associated solution accuracy. Second, we have implemented these variants on top of a parallel sparse direct solver. We will present the performance of the algorithms on large, sparse problems coming from a variety of real-life and industrial applications, showing that the proposed approach can lead to considerable reductions in both time and memory consumption.
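
The talk deals with sparse approximate factorizations and variants using up to five precisions; purely as background, here is a minimal dense sketch of the underlying LU-based iterative refinement mechanism, with float32 standing in for the low factorization precision (the setup and parameters below are illustrative assumptions, not the solver used in this work).

import numpy as np
from scipy.linalg import lu_factor, lu_solve

def lu_ir(A, b, maxit=10, tol=1e-12):
    """Toy LU-based iterative refinement with the factorization in low precision.

    Sketch only: the factorization is computed in float32 (standing in for a
    lower precision), while residuals and updates are kept in float64.
    """
    lu, piv = lu_factor(A.astype(np.float32))                    # low-precision factorization
    x = lu_solve((lu, piv), b.astype(np.float32)).astype(np.float64)
    for _ in range(maxit):
        r = b - A @ x                                            # residual in working precision
        d = lu_solve((lu, piv), r.astype(np.float32))            # correction via the low-precision LU
        x = x + d.astype(np.float64)
        if np.linalg.norm(r) <= tol * (np.linalg.norm(A) * np.linalg.norm(x) + np.linalg.norm(b)):
            break
    return x

rng = np.random.default_rng(2)
n = 300
A = rng.standard_normal((n, n)) + n * np.eye(n)                  # well-conditioned test matrix
b = rng.standard_normal(n)
x = lu_ir(A, b)
print(np.linalg.norm(b - A @ x) / (np.linalg.norm(A) * np.linalg.norm(x) + np.linalg.norm(b)))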

Monday February 21, 2022

Mantas Mikaitis on “Numerical Behavior of GPU Matrix Multiply-Accumulate Hardware”.

Monday February 14, 2022

Ian McInerney on “Numerical Methods for Model Predictive Control”.

Monday December 20, 2021

Ayana Mussabayeva on “Classification of EEG features in Brain-Computer Interface Speller”.

Monday December 13, 2021

Nick Higham on “Logarithmic Norms”.

Monday November 22, 2021

Stefan Güttel on “CLASSIX: fast and explainable clustering based on sorting”.

Monday November 8, 2021

Xinye Chen on “An efficient aggregation method for the symbolic representation of temporal data”.

Monday November 1, 2021

Nick Higham on “Matrix Norms”.