Peng Wang
About Me
I am currently a postdoctoral research fellow advised by Professors Laura Balzano and Qing Qu at the University of Michigan. Before that, I received my Ph.D. degree in Systems Engineering and Engineering Management at The Chinese University of Hong Kong, advised by Professor Anthony Man-Cho So.
Research Interests
My research interests mainly lie in non-convex optimization for low-complexity representation learning and its applications in machine learning, signal processing, and data science, such as subspace recovery, community detection, constrained PCA, and neural collapse. Currently, I am devoted to developing mathematical insights into deep learning via low-complexity modeling and non-convex optimization.
Feel free to email me if you are interested in my research. Remote collaboration is also welcome!
What's New
[November, 2023] One paper on understanding hierarchical representation learning via low-dimensional modeling is posted!
[October, 2023] I was selected to receive the Rising Star Award at CPAL 2024!
[October, 2023] One paper on the generalized neural collapse is posted!
[June, 2023] One paper on the low-dimensional updates of gradient descent for deep networks is posted!
[April, 2023] One paper on chance constrained programming is posted!
[November, 2022] One paper on L1-PCA has been accepted by SIOPT!
[September, 2022] One paper on neural collapse has been accepted by NeurIPS 2022!
Preprint Papers
Peng Wang*, Xiao Li*, Can Yaras, Zhihui Zhu, Laura Balzano, Wei Hu, Qing Qu. Understanding Deep Representation Learning via Layerwise Feature Compression and Discrimination, 2023. In submission. [paper]
Can Yaras*, Peng Wang*, Wei Hu, Zhihui Zhu, Laura Balzano, Qing Qu. The Law of Parsimony in Gradient Descent for Learning Deep Linear Networks, 2023. In submission. [paper]
Peng Wang, Rujun Jiang, Qingyuan Kong, Laura Balzano. Proximal DC Algorithm for Sample Average Approximation of Chance Constrained Programming: Convergence and Numerical Results. Submitted to SIAM Journal on Optimization, 2023. [paper]
Jiachen Jiang, Jinxin Zhou, Peng Wang, Qing Qu, Dustin Mixon, Chong You, Zhihui Zhu. Generalized Neural Collapse for a Large Number of Classes. Submitted to ICLR 2024. [paper]
Taoli Zheng, Peng Wang, Anthony Man-Cho So. A Linearly Convergent Algorithm for Rotationally Invariant L1-Norm Principal Component Analysis, 2022. [paper]
Journal Papers
Peng Wang, Huikang Liu, Anthony Man-Cho So. Linear Convergence of Proximal Alternating Minimization Method with Extrapolation for L1-Norm Principal Component Analysis. SIAM Journal on Optimization (2023) 33(2):684-712. [paper]
Peng Wang, Zirui Zhou, Anthony Man-Cho So. Non-Convex Exact Community Recovery in Stochastic Block Model. Mathematical Programming, Series A (2022) 195(1-2):793-829. [paper]
Conference Papers ("*" denotes equal contribution.)
Can Yaras*, Peng Wang*, Wei Hu, Zhihui Zhu, Laura Balzano, Qing Qu. Invariant Low-Dimensional Subspaces in Gradient Descent for Learning Deep Matrix Factorizations. To appear in NeurIPS M3L 2023.
Jinxin Wang, Yuen-Man Pun, Xiaolu Wang, Peng Wang, Anthony Man-Cho So. Projected Tensor Power Method for Hypergraph Community Recovery. ICML 2023. [paper]
Peng Wang*, Huikang Liu*, Can Yaras*, Laura Balzano, Qing Qu. Linear Convergence Analysis of Neural Collapse with Unconstrained Features. NeurIPS Workshop on Optimization for Machine Learning, NeurIPS OPT 2022. [paper]
Can Yaras*, Peng Wang*, Zhihui Zhu, Laura Balzano, Qing Qu. Neural Collapse with Normalized Features: A Geometric Analysis over the Riemannian Manifold. NeurIPS 2022. [paper]
Peng Wang, Huikang Liu, Anthony Man-Cho So, Laura Balzano. Convergence and Recovery Guarantees of the K-Subspaces Method for Subspace Clustering. ICML 2022. [paper]
Xiaolu Wang, Peng Wang, Anthony Man-Cho So. Exact Community Recovery over Signed Graphs. AISTATS 2022. [paper]
Peng Wang, Huikang Liu, Zirui Zhou, Anthony Man-Cho So. Optimal Non-Convex Exact Recovery in Stochastic Block Model via Projected Power Method. ICML 2021. [paper]
Peng Wang*, Zirui Zhou*, Anthony Man-Cho So. A Nearly-Linear Time Algorithm for Exact Community Recovery in Stochastic Block Model. ICML 2020. [paper]
Peng Wang, Huikang Liu, Anthony Man-Cho So. Globally Convergent Accelerated Proximal Alternating Maximization Method for L1-Principal Component Analysis. ICASSP 2019 (IEEE SPS Student Travel Award). [paper]
Huikang Liu, Peng Wang, Anthony Man-Cho So. Fast First-Order Methods for the Massive Robust Multicast Beamforming Problem with Interference Temperature Constraints. ICASSP 2019. [paper]