I’m a third-year PhD student (graduating in 2024) at University College London, where I’m supervised by Alexey Pokrovskiy. My area of research is extremal and probabilistic combinatorics.
When can we find perfect matchings in hypergraphs whose vertices represent group elements and whose edges represent solutions to systems of linear equations? A prototypical problem of this type is the Hall-Paige conjecture, which asks for a characterisation of the groups whose multiplication table (viewed as a Latin square) contains a transversal. Many problems in the area have a similar flavour, yet until recently they have been approached in completely different ways, using mostly algebraic tools ranging from the combinatorial Nullstellensatz to Fourier analysis. In the first paper, we give a unified approach to attack these problems, using tools from probabilistic combinatorics. In particular, we derive that a suitably randomised version of the Hall-Paige conjecture can be used as a black box to settle many old problems in the area for sufficiently large groups. As a by-product, we obtain the first combinatorial proof of the Hall-Paige conjecture. The second paper refines these tools further to solve a problem concerning the existence of transversals with a prescribed cycle type.
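To make the central object concrete: a transversal of the multiplication table of a group G corresponds to a complete mapping, a permutation σ of G such that x ↦ x·σ(x) is again a permutation. The following is a toy brute-force check for cyclic groups Z_n, written by me as an illustration (the function name is mine; this is of course not the method of the papers, which handle large general groups):

```python
from itertools import permutations

def has_complete_mapping(n):
    """Brute-force check: does the cyclic group Z_n admit a complete
    mapping, i.e. a permutation s of Z_n such that x -> x + s(x) (mod n)
    is also a permutation?  Equivalently: does the addition table of Z_n,
    viewed as a Latin square, contain a transversal?"""
    elements = range(n)
    for s in permutations(elements):
        if len({(x + s[x]) % n for x in elements}) == n:
            return True
    return False

# Hall-Paige predicts a transversal exactly when the Sylow 2-subgroup is
# trivial or non-cyclic; for Z_n this means n odd.
print([n for n in range(1, 8) if has_complete_mapping(n)])  # → [1, 3, 5, 7]
```

For n even, a one-line obstruction already rules out a complete mapping: summing x + σ(x) over all x gives twice the sum of all elements, which cannot equal the sum of all elements of Z_n when n is even.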
Bulletin of the London Mathematical Society, 2023
These papers investigate when the minimum degree threshold and the rainbow minimum degree threshold for the containment of a spanning structure are asymptotically the same. The second paper, building on tools from the first, gives a sufficient condition for when this happens. The condition (roughly speaking) is the existence of an absorption-based proof for the uncoloured version of the problem. Such absorption-based proofs are ubiquitous in the literature, hence this condition is easy to check for a large family of spanning structures. This yields rainbow versions of many classical results in extremal graph theory.
How pseudorandom does a graph need to be before it contains a certain spanning subgraph? This problem is not very well understood, especially in comparison to its purely random analogue. For example, the best possible condition that forces an (n,d,λ)-graph to contain a Hamilton cycle is not known, where an (n,d,λ)-graph is an n-vertex, d-regular graph such that the second largest eigenvalue (in absolute value) of the adjacency matrix is bounded above by λ. The smaller λ is, the more control we have over the edge distribution, and the easier it becomes to embed spanning subgraphs. Alon, Krivelevich, and Sudakov asked in 2007 if the mild condition λ = o(d) is sufficient to guarantee that an (n,d,λ)-graph contains all bounded-degree spanning trees. Improving significantly on previous results, we show that this condition is sufficient up to polylogarithmic factors. Contrary to other results in the area, our proof does not rely on any absorption technique. Instead, we take a new approach based on the existence of sorting networks of logarithmic depth, as given by a celebrated construction of Ajtai, Komlós, and Szemerédi. Using this construction, we show that the classical vertex-disjoint paths problem can be solved for a set of vertices fixed in advance, which might be of independent interest.
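As a small sanity check on the (n,d,λ) definition, one can compute the parameters of a concrete pseudorandom graph. The sketch below (my own illustration, not taken from the paper) builds the Petersen graph as the Kneser graph K(5,2) and reads off d and λ from its adjacency spectrum:

```python
from itertools import combinations
import numpy as np

# Petersen graph as the Kneser graph K(5,2):
# vertices are the 2-subsets of {0,...,4}, adjacent iff disjoint.
verts = list(combinations(range(5), 2))
n = len(verts)  # 10 vertices
A = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        if set(verts[i]).isdisjoint(verts[j]):
            A[i, j] = 1

eigs = np.linalg.eigvalsh(A)            # eigenvalues in ascending order
d = eigs[-1]                            # for a d-regular graph, the top eigenvalue is d
lam = max(abs(eigs[0]), abs(eigs[-2]))  # second largest eigenvalue in absolute value

# The Petersen spectrum is {3, 1 (x5), -2 (x4)}, so this is a (10, 3, 2)-graph.
print(round(d), round(lam))
```

Here λ/d = 2/3, far from the λ = o(d) regime; the point of the question of Alon, Krivelevich, and Sudakov is what happens as n grows and λ/d tends to zero.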
Let G and H be graphs on n vertices, and suppose G has large enough minimum degree to necessarily contain H as a subgraph. We show that H in fact embeds into G with good "spread". This is a common generalisation of several streams of research surrounding the classical Dirac-type problem. For example, this gives an asymptotically tight lower bound on the number of embeddings of H into G. Moreover, using the recent developments surrounding the fractional Kahn-Kalai conjecture, this result implies that many Dirac-type results hold robustly, meaning H still embeds into random sparsifications of the edge set of G. This recovers recent results of Kang-Kelly-Kühn-Osthus-Pfenninger and Pham-Sah-Sawhney-Simkin for perfect matchings, and gives novel results for Hamilton cycles and graph factors. Our randomised embedding algorithm is simple and self-contained. Notably, the algorithm does not rely on iterative absorption or the regularity lemma, unlike similar results in the area.
alp.muyesser.21@ucl.ac.uk
PhD Candidate
Department of Mathematics
University College London