In my work on linear system solvers for graph Laplacians, I have pursued two threads of research on constructing preconditioners. The first thread, on subgraph preconditioners, led to an asymptotically optimal algorithm for planar graphs and a near-optimal algorithm for the general case, co-authored with Gary Miller and Richard Peng.
Our work improved dramatically upon the earlier celebrated work of Spielman and Teng. It was hailed as a "Breakthrough in Algorithm Design" and a "new breed of ultrafast algorithms" for network solutions. It also led to an invited article in the prestigious Research Highlights section of the Communications of the ACM.
It was, however, the second thread, on 'super-graph' preconditioners, that delivered on the promise of speed in practice. My implementation of a Combinatorial Multigrid Solver (CMG) has proved very effective for difficult linear systems arising in optimization problems and remains one of the fastest available solvers for graph Laplacians. It has enabled a number of applied works, including algorithms for hypergraph partitioning, a fundamental problem in Electronic Design Automation. This line of applied work received a best paper award at ACM/IEEE ICCAD 2022.
On a different front, I introduced the general "Algebraic Fingerprints" method, which has led to the design of faster exact and parameterized algorithms for a multitude of computationally hard problems, including Andreas Björklund's major breakthrough on the famous Hamiltonian Cycle problem after more than 50 years of stagnation. About 8 years after my original ICALP 2008 paper, algebraic fingerprints were featured by the ACM. A subsequent collaboration with Andreas Björklund and Petteri Kaski produced more record-breaking algorithms and won the best paper award at ICALP 2017.
While I have not personally worked on practical implementations of these algorithms, I was happy to see that they have been used to push the frontier in subgraph detection problems significantly.
Teaching
At NJIT, I have been teaching Algorithms, Complexity, Machine Learning, and Deep Learning. I teach both at the Newark campus and at the magnificent NJIT facility in Jersey City.
In 2020, I received the Ying Wu College of Computing Excellence in Teaching Award.