Peng Wang
About Me
I am currently a postdoctoral research fellow advised by Professors Laura Balzano and Qing Qu at the University of Michigan. Before that, I received my Ph.D. degree in Systems Engineering and Engineering Management at The Chinese University of Hong Kong, advised by Professor Anthony Man-Cho So.
Research Interests
Broadly speaking, my research interests lie at the intersection of optimization, machine learning, and data science. Currently, I am devoted to understanding the mathematical foundations of deep learning models, including supervised learning models, diffusion models, and large language models. I mainly study how low-complexity structures (e.g., low-rankness, sparsity, over-parameterization) in practical problems lead to favorable optimization properties, and I use these structures to mitigate the challenges posed by worst-case scenarios, enable efficient optimization, and improve our understanding of learning phenomena.
Feel free to email me if you are interested in my research. Remote collaboration is also welcome!
What's New
[January 2025] I will give a presentation at the IMS Young Mathematical Scientists Forum at the National University of Singapore!
[October 2024] I will chair a session and give a lecture presentation on diffusion models at Asilomar 2024 in Pacific Grove, CA!
[October 2024] Two papers on diffusion models were accepted by the NeurIPS Workshop on Mathematics of Modern Machine Learning!
[September 2024] I will give a talk on diffusion models at the ECE Communications and Signal Processing Seminar at the University of Michigan!
[September 2024] One paper on diffusion models was accepted by NeurIPS 2024!
[September 2024] Two papers [paper 1, paper 2] on diffusion models are released!
[May 2024] Five papers [paper 1, paper 2, paper 3, paper 4, paper 5] were accepted by ICML 2024!
[March 2024] One paper studying the reproducibility of diffusion models in the memorization and generalization regimes is posted!
[November 2023] One paper on understanding hierarchical representation learning via low-dimensional modeling is posted!
Preprints ("*" denotes equal contribution, "†" denotes corresponding author.)
Peng Wang*, Huijie Zhang*, Zekai Zhang, Siyi Chen, Yi Ma, Qing Qu. Diffusion Models Learn Low-Dimensional Distributions via Subspace Clustering, 2024. Under review at ICLR 2025. [paper]
Peng Wang*, Xiao Li*, Can Yaras, Zhihui Zhu, Laura Balzano, Wei Hu, Qing Qu. Understanding Deep Representation Learning via Layerwise Feature Compression and Discrimination. Under review at the Journal of Machine Learning Research, 2024. [paper]
Peng Wang, Rujun Jiang, Qingyuan Kong, Laura Balzano. A Proximal DC Algorithm for Sample Average Approximation of Chance Constrained Programming. Under minor revision in INFORMS Journal on Computing, 2024. [paper]
Can Yaras*, Peng Wang*, Wei Hu, Zhihui Zhu, Laura Balzano, Qing Qu. The Law of Parsimony in Gradient Descent for Learning Deep Linear Networks, 2023. To be submitted. [paper]
Taoli Zheng, Peng Wang, Anthony Man-Cho So. A Linearly Convergent Algorithm for Rotationally Invariant L1-Norm Principal Component Analysis, 2022. [paper]
Journal Papers
Peng Wang, Huikang Liu, Anthony Man-Cho So. Linear Convergence of Proximal Alternating Minimization Method with Extrapolation for L1-Norm Principal Component Analysis. SIAM Journal on Optimization (2023) 33(2):684-712. [paper]
Peng Wang, Zirui Zhou, Anthony Man-Cho So. Non-Convex Exact Community Recovery in Stochastic Block Model. Mathematical Programming, Series A (2022) 195(1-2):793-829. [paper]
Conference Papers
Siyi Chen*, Huijie Zhang*, Minzhe Guo, Yifu Lu, Peng Wang, Qing Qu. Exploring Low-Dimensional Subspaces in Diffusion Models for Controllable Image Editing. NeurIPS 2024. [paper]
Peng Wang, Huikang Liu, Druv Pai, Yaodong Yu, Zhihui Zhu, Qing Qu, Yi Ma. A Global Geometric Analysis of Maximal Coding Rate Reduction. ICML 2024. [paper]
Can Yaras, Peng Wang, Laura Balzano, Qing Qu. Compressible Dynamics in Deep Overparameterized Low-Rank Learning & Adaptation. ICML 2024 (Oral, acceptance rate: 1.52%). [paper]
Huikang Liu*, Peng Wang*, Longxiu Huang, Qing Qu, Laura Balzano. Matrix Completion with ReLU Sampling. ICML 2024. [paper]
Jiachen Jiang, Jinxin Zhou, Peng Wang, Qing Qu, Dustin Mixon, Chong You, Zhihui Zhu. Generalized Neural Collapse for a Large Number of Classes. ICML 2024. [paper]
Huijie Zhang, Jinfan Zhou, Yifu Lu, Minzhe Guo, Peng Wang, Liyue Shen, Qing Qu. The Emergence of Reproducibility and Consistency in Diffusion Models. ICML 2024. [paper]
Can Yaras*, Peng Wang*, Wei Hu, Zhihui Zhu, Laura Balzano, Qing Qu. Invariant Low-Dimensional Subspaces in Gradient Descent for Learning Deep Matrix Factorizations. NeurIPS M3L 2023. [paper]
Jinxin Wang, Yuen-Man Pun, Xiaolu Wang, Peng Wang, Anthony Man-Cho So. Projected Tensor Power Method for Hypergraph Community Recovery. ICML 2023. [paper]
Peng Wang*, Huikang Liu*, Can Yaras*, Laura Balzano, Qing Qu. Linear Convergence Analysis of Neural Collapse with Unconstrained Features. NeurIPS Workshop on Optimization for Machine Learning (OPT 2022). [paper]
Can Yaras*, Peng Wang*, Zhihui Zhu, Laura Balzano, Qing Qu. Neural Collapse with Normalized Features: A Geometric Analysis over the Riemannian Manifold. NeurIPS 2022. [paper]
Peng Wang, Huikang Liu, Anthony Man-Cho So, Laura Balzano. Convergence and Recovery Guarantees of the K-Subspaces Method for Subspace Clustering. ICML 2022. [paper]
Xiaolu Wang, Peng Wang, Anthony Man-Cho So. Exact Community Recovery over Signed Graphs. AISTATS 2022. [paper]
Peng Wang, Huikang Liu, Zirui Zhou, Anthony Man-Cho So. Optimal Non-Convex Exact Recovery in Stochastic Block Model via Projected Power Method. ICML 2021. [paper]
Peng Wang*, Zirui Zhou*, Anthony Man-Cho So. A Nearly-Linear Time Algorithm for Exact Community Recovery in Stochastic Block Model. ICML 2020. [paper]
Peng Wang, Huikang Liu, Anthony Man-Cho So. Globally Convergent Accelerated Proximal Alternating Maximization Method for L1-Principal Component Analysis. ICASSP 2019 (IEEE SPS Student Travel Award). [paper]
Huikang Liu, Peng Wang, Anthony Man-Cho So. Fast First-Order Methods for the Massive Robust Multicast Beamforming Problem with Interference Temperature Constraints. ICASSP 2019. [paper]