Systems Research

We focus on extending the frontiers of machine learning and AI by developing novel algorithmic, software, and hardware techniques.

Our research areas include programming languages, compilers, low-level optimization, distributed and parallel computing, computer arithmetic, high-performance computing (HPC), GPU/FPGA/ASIC hardware applications, and hardware/software (HW/SW) co-design.

Theory

Artificial intelligence has enjoyed immense practical success in recent years, largely due to advances in machine learning, and especially in deep learning via optimization. A rich mathematical theory that explains these empirical results can help drive further progress, and is in turn refined by feedback from that progress.
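
To make the phrase "deep learning via optimization" concrete, the sketch below runs plain gradient descent on a toy least-squares problem; the same loop of forward pass, gradient computation, and parameter update underlies neural-network training. The data, model, and learning rate are illustrative assumptions only, not a description of any particular system mentioned here.

    # A minimal, illustrative sketch of learning by gradient descent on a
    # mean-squared-error loss. All choices below (data, model size, learning
    # rate) are hypothetical.
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy regression data drawn from a noisy linear rule.
    x = rng.normal(size=(128, 4))
    true_w = np.array([1.0, -2.0, 0.5, 3.0])
    y = x @ true_w + 0.1 * rng.normal(size=128)

    # Parameters of a one-layer linear model, trained by gradient descent.
    w = np.zeros(4)
    lr = 0.1

    for step in range(200):
        pred = x @ w                            # forward pass
        grad = 2.0 * x.T @ (pred - y) / len(y)  # gradient of the MSE loss
        w -= lr * grad                          # gradient descent update

    print("learned weights:", w)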

Recent results connect with celebrated techniques from learning theory, optimization, signal processing, and statistics. This interplay between rigorous theory and engineering advances pushes forward the frontiers of AI.