Matthew Gibson-Lopez

Associate Professor, Department of Computer Science

The University of Texas at San Antonio

Algorithms Seminar - Spring 2026

The Algorithms Seminar features roughly bi-weekly talks on topics related to algorithms and computational methods; any topic related to algorithms is welcome. We currently have in-house expertise in neuromorphic algorithms, quantum algorithms, computational geometry, optimization algorithms, code-based cryptography, symbolic computation, randomized numerical methods, reinforcement learning, and parallel algorithms.

Time: Fridays, 10:00am - 11:00am
Location: Large Conference Room, NPB 3.108A, Main Campus

Friday, January 30, 2026
Matthew Gibson-Lopez - UT San Antonio
Building Intuition for Quantum Advantage: From Deutsch-Jozsa to Amplitude Estimation
Completed

Abstract

Quantum computing promises computational speedups, but understanding when and why quantum algorithms outperform classical ones requires building the right intuition. Last semester, we explored Deutsch's algorithm—a simple quantum algorithm that solves a particular function evaluation problem with just one query, whereas classical approaches require two. This talk begins by examining the natural generalization commonly taught in textbooks: the Deutsch-Jozsa algorithm, which extends the same quantum technique to functions on n-bit inputs. We'll then push further by relaxing the strict constant-versus-balanced restriction to see what happens when we estimate the majority class size through repeated quantum measurements. Surprisingly, this provides no advantage over classical random sampling—a result that helps clarify the boundaries of quantum speedup. However, this raises a compelling question: when does repeated quantum sampling help? We'll introduce amplitude estimation, a powerful generalization of Grover's algorithm that achieves a quadratic speedup for certain estimation problems, demonstrating where quantum coherence can reduce sampling variance in ways classical approaches cannot. Throughout, we'll maintain our discrete math perspective. No background in quantum mechanics assumed.
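The constant-versus-balanced distinction in the abstract can be previewed with a small classical simulation (a hypothetical sketch, not code from the talk): after the Hadamard-oracle-Hadamard circuit, the probability of measuring all zeros equals the squared average of (-1)^f(x), which is 1 for a constant f and 0 for a balanced f.

```python
# Hypothetical sketch: the Deutsch-Jozsa measurement statistic, computed
# classically. For f: {0,1}^n -> {0,1}, the probability of observing
# |0...0> after H^n . U_f . H^n |0...0> is |(1/2^n) * sum_x (-1)^f(x)|^2.
def dj_zero_probability(f, n):
    amp = sum((-1) ** f(x) for x in range(2 ** n)) / 2 ** n
    return amp ** 2

n = 3
constant = lambda x: 0                      # constant function
balanced = lambda x: bin(x).count("1") % 2  # parity is balanced
print(dj_zero_probability(constant, n), dj_zero_probability(balanced, n))
```

The quantum algorithm gets this statistic in a single oracle query; the sketch above needs all 2^n evaluations, which is exactly the gap the talk examines.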
Friday, February 6, 2026
No Talk
Friday, February 13, 2026
Jingbo Liu, Texas A&M University - San Antonio
Lattice Basis Reduction and Its Algorithms
Completed

Abstract

A lattice L is an additive discrete subgroup of ℝⁿ. A set of linearly independent vectors B = {b₁, ..., bₙ} ⊆ ℝⁿ is called a basis of L if L = ℤb₁ + ⋯ + ℤbₙ. When n ≥ 2, L has infinitely many possible bases. Bases consisting of relatively short and nearly orthogonal vectors are considered good bases, while others are regarded as bad bases. Lattice reduction theory studies how to obtain a good basis from a given one. In this talk, we will explore several lattice basis reduction methods and algorithms, along with their applications in cryptography and number theory.
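In dimension two, the "good basis from a bad basis" problem has a complete classical answer: Lagrange-Gauss reduction, the ancestor of the LLL algorithm the talk will generalize. A minimal sketch (my illustration, not the speaker's code):

```python
# Hypothetical sketch: Lagrange-Gauss reduction of a planar lattice basis.
# Repeatedly subtract the nearest-integer multiple of the shorter vector
# from the longer one; on termination the basis is as short as possible.
def norm2(w):
    return w[0] * w[0] + w[1] * w[1]

def gauss_reduce(u, v):
    if norm2(u) > norm2(v):
        u, v = v, u
    while True:
        # size-reduce v against u (nearest integer to the projection)
        m = round((u[0] * v[0] + u[1] * v[1]) / norm2(u))
        v = (v[0] - m * u[0], v[1] - m * u[1])
        if norm2(v) >= norm2(u):
            return u, v
        u, v = v, u

# the "bad" basis (5,2),(7,3) of Z^2 reduces to unit vectors
print(gauss_reduce((5, 2), (7, 3)))
```

The integer translations and swaps never change the lattice, only the basis quality, which is the invariant LLL preserves in higher dimensions.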
Friday, February 20, 2026
No Talk
Friday, February 27, 2026
Henry Chimal-Dzul - UT San Antonio
Exploring Linear Codes for Code-Based Cryptography
Completed

Abstract

Code-based cryptography is one of the most promising candidates for Post-Quantum Cryptography. Indeed, NIST has selected HQC as one of the standards for Post-Quantum Cryptography. In this talk, we explore some linear code constructions that have been used in this field, with emphasis on quasi-cyclic codes, which are the ones implemented in HQC. The presentation will be informal and suitable for a wide range of students in STEM fields.
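One reason quasi-cyclic codes are attractive in practice is that their circulant blocks act on vectors like polynomial multiplication modulo x^n - 1 over F₂, so the generator matrix never has to be stored explicitly. A hedged sketch of that identification (my illustration, independent of the HQC specification):

```python
# Hypothetical sketch: applying a binary circulant block to a vector is
# multiplication in F_2[x]/(x^n - 1). Bit i of `a` is the coefficient
# of x^i; addition in F_2 is XOR.
def circ_mul(a, b):
    n = len(a)
    out = [0] * n
    for i, ai in enumerate(a):
        if ai:                      # a has an x^i term
            for j, bj in enumerate(b):
                out[(i + j) % n] ^= bj   # x^i * x^j = x^{(i+j) mod n}
    return out

# x^2 * x^2 = x^4 = x (mod x^3 - 1)
print(circ_mul([0, 0, 1], [0, 0, 1]))
```

Storing one row of each circulant block, as this representation allows, is what keeps quasi-cyclic key sizes small compared with generic linear codes.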
Friday, March 6, 2026
No Talk
Friday, March 13, 2026
Spring Break
Friday, March 20, 2026
No Talk
Friday, March 27, 2026
Alperen Ergur - UT San Antonio
Learning Fast Monomial Orders for Groebner Basis Computations
Completed

Abstract

Solving multivariate non-linear systems of polynomial equations is a ubiquitous task in computational science, essential to fields ranging from discrete optimization and statistical inference to computer vision, systems biology, and cryptography. The standard algorithmic engine for solving these systems is Gröbner basis computation. However, the efficiency of this method is highly sensitive to the choice of monomial ordering. Despite a near continuum of possible orders, current implementations overwhelmingly rely on static, expert-guided heuristics such as GrevLex. We take a data-driven approach by casting the selection of monomial orderings as a Reinforcement Learning (RL) problem over the space of admissible orderings. The trained agents outperform human expert heuristics across the test cases, sometimes reducing the computational cost by up to 70%. Related paper: https://arxiv.org/pdf/2602.02972. The talk will assume no background in polynomials or related algorithms; necessary concepts will be reviewed. As much as time permits, concepts from RL will also be explained.
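The monomial orderings the abstract refers to can be made concrete as sort keys on exponent vectors. A minimal sketch of lex versus grevlex (my illustration, assuming variables ordered x > y):

```python
# Hypothetical sketch: two monomial orders as Python sort keys on
# exponent vectors. Lex compares exponents left to right; grevlex
# compares total degree first, breaking ties by the reversed, negated
# exponent vector (so the monomial with the smaller last exponent wins).
def lex_key(e):
    return tuple(e)

def grevlex_key(e):
    return (sum(e), tuple(-ei for ei in reversed(e)))

# the orders disagree: under lex, x > y^2; under grevlex, y^2 > x
print(lex_key((1, 0)) > lex_key((0, 2)))          # x wins under lex
print(grevlex_key((0, 2)) > grevlex_key((1, 0)))  # y^2 wins under grevlex
```

Since Buchberger-style algorithms reduce against leading terms, flipping which monomial "leads" can change the whole trajectory of the computation, which is the sensitivity the RL agent exploits.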
Friday, April 3, 2026
No Talk
Friday, April 10, 2026
Bradley Theilman - Sandia National Laboratories
Mental Math: Spiking Neuromorphic Circuits for Numerical Scientific Computing
Upcoming

Abstract

Brains are not commonly viewed as “good at arithmetic”, but counterintuitively, brain-like dynamics share deep analogies with iterative numerical algorithms. This talk will present a brain-inspired, spiking, neuromorphic algorithm for solving large, sparse linear systems (Ax = b) such as those arising in Finite Element Methods for solving partial differential equations (PDEs), one of the most important techniques in modern numerical science and engineering. The algorithm embeds the sparse matrix into the synaptic connections between subpopulations of neurons directly without training or learning. Neural dynamics are defined so that the collective spiking activity of the whole network flows to an efficient spiking representation of the solution vector x, with comparable numerical accuracy to traditional algorithms. I’ll demonstrate this algorithm on real neuromorphic hardware (Intel’s Loihi 2) and show close to ideal strong and weak scaling. I’ll also demonstrate the generality of the algorithm through several examples of PDEs in 2 and 3 dimensions, with nontrivial mesh topologies, and different boundary conditions. This work establishes a direct connection between established numerical methods for PDEs and brain-like spiking neural networks, demonstrating the value of brain inspiration, and expanding the neuromorphic footprint in scientific computing. More details are available in the paper: https://www.nature.com/articles/s42256-025-01143-2
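The "collective activity flows to the solution" picture has a classical analogue: stationary iterative methods, where each unknown is updated using only its sparse neighborhood, the same locality that maps matrix entries onto synaptic connections. A hedged sketch of that skeleton (a Jacobi iteration, my illustration, not the paper's spiking algorithm):

```python
# Hypothetical sketch: Jacobi iteration for sparse Ax = b. Each unknown
# is refreshed from its sparse neighborhood only, mirroring how the
# talk's network embeds matrix entries in local synaptic connections.
def jacobi(A, b, iters=200):
    # A: sparse matrix as {row: {col: value}}; assumes nonzero diagonal
    # and convergence (e.g. strict diagonal dominance)
    n = len(b)
    x = [0.0] * n
    for _ in range(iters):
        x = [(b[i] - sum(v * x[j] for j, v in A[i].items() if j != i)) / A[i][i]
             for i in range(n)]
    return x

A = {0: {0: 4.0, 1: 1.0}, 1: {0: 1.0, 1: 3.0}}
print(jacobi(A, [1.0, 2.0]))  # converges toward [1/11, 7/11]
```

Because every update touches only the nonzeros of one row, work scales with the sparsity pattern rather than n², which is why such iterations (and their neuromorphic counterparts) suit large FEM systems.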
Friday, April 17, 2026
Johannes Bauer - UT San Antonio
Title TBD
Upcoming

Abstract

Abstract to be announced.
Friday, April 24, 2026
No Talk
Friday, May 1, 2026
Dávid Papp - North Carolina State University
Dual certificates of set membership: rigorous proofs from inexact numerical computation
Upcoming

Abstract

Numerical methods for continuous optimization problems usually return approximately feasible and approximately optimal solutions, and conventional wisdom suggests that this is the best we can do when the computations are done using inexact arithmetic. This is a problem in some applications such as the design and verification of mission-critical systems or computer-assisted theorem proving, where we need rigorous proofs that a solution attaining a certain (near-optimal) value exists, but solving the problems in exact arithmetic is prohibitively expensive. Because of this, numerical computation in these areas is followed up by expensive, often manual, post-processing that explicitly produces a rational vector that is exactly feasible or optimal. In this talk, we present a different approach, which allows us to compute numerical solutions of the dual optimization problem and then interpret them as rigorous (easily and independently verifiable) certificates. This circumvents the need for any problem-specific, manual post-processing. Counterintuitively, proofs of the existence of the desired solutions do not even involve the approximate computation of these solutions.
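The simplest instance of the certificate idea is weak linear-programming duality: any dual-feasible point, however it was computed, yields a bound that can be checked independently in exact rational arithmetic. A minimal sketch under that LP assumption (my illustration; the talk treats more general convex problems):

```python
# Hypothetical sketch: weak LP duality as a verifiable certificate.
# Primal: min c.x subject to A x >= b, x >= 0. Any y >= 0 with
# A^T y <= c proves the lower bound b.y on the optimum; the check
# below uses exact rationals, so it is rigorous even if y came from
# an inexact numerical solver and was then rounded.
from fractions import Fraction

def certified_lower_bound(A, b, c, y):
    assert all(yi >= 0 for yi in y), "dual sign condition"
    for j in range(len(c)):
        col = sum(A[i][j] * y[i] for i in range(len(A)))
        assert col <= c[j], "dual feasibility"
    return sum(bi * yi for bi, yi in zip(b, y))

# min x1 + x2 s.t. x1 + 2*x2 >= 1, 2*x1 + x2 >= 1, x >= 0
A = [[Fraction(1), Fraction(2)], [Fraction(2), Fraction(1)]]
b = [Fraction(1), Fraction(1)]
c = [Fraction(1), Fraction(1)]
y = [Fraction(1, 3), Fraction(1, 3)]
print(certified_lower_bound(A, b, c, y))  # certifies optimum >= 2/3
```

The verification never reconstructs a primal solution, which echoes the abstract's point that the certificate can bypass computing the desired solution at all.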