Guan-Horng Liu
Research Scientist @ FAIR (Meta AI)
Hi, I am Guan-Horng Liu (I go by "Guan"), a Research Scientist at FAIR (Meta AI) in NYC.
I study fundamental algorithms for learning diffusion models with optimality structures. I am actively developing nonlinear diffusion models (mainly Schrödinger Bridge and Mirror Diffusion) and large-scale methods for applications in generative modeling, image restoration, unpaired image translation, watermarked generation, opinion depolarization, and single-cell RNA sequencing. Prior to this, I worked on robust, architecture-aware neural optimizers.
I am broadly interested in integrating optimality and domain structures into diffusion and flow models, with the goals of deepening theoretical understanding and developing large-scale algorithms for novel applications. On the fundamental side, I combine dynamic optimal transport, stochastic optimal control, and statistical physics, and I enjoy applying these tools to a variety of scientific and machine learning problems.
I was extremely fortunate to intern at FAIR, Meta and at Nvidia Research during the summers of 2023 and 2022, working with many talented researchers, including (FAIR) Ricky T. Q. Chen, Yaron Lipman, Maximilian Nickel, and Brian Karrer; (Nvidia) Weili Nie, Arash Vahdat, Anima Anandkumar, and De-An Huang; (Google DeepMind) Valentin De Bortoli; and (Georgia Tech) Molei Tao. At Georgia Tech, I was advised by Evangelos Theodorou.
See my full CV here (updated in Sep 2024).
Contact: ghliu [at] meta [dot] com
Follow: Google Scholar | LinkedIn | ghliu | @guanhorng_liu
Updates
[04/2024] I co-organize the ICML workshop on Structured Probabilistic Inference & Generative Modeling (SPIGM).
[02/2024] Generalized Schrödinger Bridge Matching (with FAIR, Meta AI) accepted to ICLR.
[09/2023] Two papers, Mirror Diffusion and Momentum Schrödinger Bridge, accepted to NeurIPS.
[05/2023] I joined FAIR, Meta AI as a research scientist intern for the summer.
[04/2023] I co-organize the ICML workshop on New Frontiers in Learning, Control, and Dynamical Systems.
[03/2023] Image-to-Image Schrödinger Bridge (with Nvidia Research) accepted to ICML.
[09/2022] Deep Generalized Schrödinger Bridge accepted to NeurIPS (Oral, 1.9%).
[02/2022] Likelihood training of Schrödinger Bridge accepted to ICLR.
[09/2021] Three papers on architecture-aware neural optimizers accepted to NeurIPS (Spotlight, 3.0%), ICML (Oral, 3.0%), and ICLR (Spotlight, 3.8%).
Selected Publications
- React-OT: Optimal Transport for Generating Transition State in Chemical Reactions. Preprint, 2024
- Dynamic Game Theoretic Neural Optimizer. International Conference on Machine Learning (ICML), 2021 [Long talk, 3.0%]
- DDPNOpt: Differential Dynamic Programming Neural Optimizer. International Conference on Learning Representations (ICLR), 2021 [Spotlight, 3.8%]