Peng Wang

Postdoctoral Research Fellow

Department of Electrical Engineering and Computer Science
University of Michigan, Ann Arbor

Office: Room 3218, EECS Building, 1301 Beal Avenue, Ann Arbor, MI 48109-2122
Email: pengwa@umich.edu
Google Scholar

About Me

I am currently a postdoctoral research fellow at the University of Michigan, advised by Professors Laura Balzano and Qing Qu. Before that, I received my Ph.D. in Systems Engineering and Engineering Management from The Chinese University of Hong Kong, advised by Professors Anthony Man-Cho So and Shiqian Ma.

Research Interests

My research interests lie mainly in optimization theory and methods for non-convex optimization problems and their applications in machine learning, signal processing, and data science. Recently, I have been especially interested in the theoretical understanding of simple and scalable methods for non-convex problems with low-dimensional structure, such as subspace recovery, community detection, constrained PCA, and neural collapse.

Feel free to email me if you are interested in my research. Remote collaboration is also welcome!

What's New

  • [November 7, 2022] One paper on L1-PCA has been accepted by SIOPT!

  • [September 15, 2022] One paper on neural collapse has been accepted by NeurIPS 2022!

  • [May 15, 2022] One paper on subspace clustering has been accepted by ICML 2022!

  • [January 18, 2022] One paper on community recovery has been accepted by AISTATS 2022!

Preprint and Working Papers

  • Peng Wang, Rujun Jiang, Qingyuan Kong, Laura Balzano. Proximal DC Algorithm for Sample Average Approximation of Chance Constrained Programming: Convergence and Numerical Results. Submitted to SIAM Journal on Optimization, 2023. [paper]

  • Taoli Zheng, Peng Wang, Anthony Man-Cho So. A Linearly Convergent Algorithm for Rotationally Invariant L1-Norm Principal Component Analysis, 2022. [paper]

Journal Papers

  • Peng Wang, Huikang Liu, Anthony Man-Cho So. Linear Convergence of Proximal Alternating Minimization Method with Extrapolation for L1-Norm Principal Component Analysis. Accepted for publication in SIAM Journal on Optimization, 2022. [paper]

  • Peng Wang, Zirui Zhou, Anthony Man-Cho So. Non-Convex Exact Community Recovery in Stochastic Block Model. Mathematical Programming, Series A (2021): 1-37. [paper]

Conference Papers (* denotes equal contribution)

  • Peng Wang*, Huikang Liu*, Can Yaras*, Laura Balzano, Qing Qu. Linear Convergence Analysis of Neural Collapse with Unconstrained Features. Accepted by NeurIPS Workshop on Optimization for Machine Learning, OPT 2022.

  • Can Yaras*, Peng Wang*, Zhihui Zhu, Laura Balzano, Qing Qu. Neural Collapse with Normalized Features: A Geometric Analysis over the Riemannian Manifold. Accepted by NeurIPS 2022. [paper]

  • Peng Wang, Huikang Liu, Anthony Man-Cho So, Laura Balzano. Convergence and Recovery Guarantees of the K-Subspaces Method for Subspace Clustering. Accepted by ICML 2022. [paper]

  • Xiaolu Wang, Peng Wang, Anthony Man-Cho So. Exact Community Recovery over Signed Graphs. AISTATS 2022. [paper]

  • Peng Wang, Huikang Liu, Zirui Zhou, Anthony Man-Cho So. Optimal Non-Convex Exact Recovery in Stochastic Block Model via Projected Power Method. ICML 2021. [paper]

  • Peng Wang*, Zirui Zhou*, Anthony Man-Cho So. A Nearly-Linear Time Algorithm for Exact Community Recovery in Stochastic Block Model. ICML 2020. [paper]

  • Peng Wang, Huikang Liu, Anthony Man-Cho So. Globally Convergent Accelerated Proximal Alternating Maximization Method for L1-Principal Component Analysis. ICASSP 2019 (IEEE SPS Student Travel Award). [paper]

  • Huikang Liu, Peng Wang, Anthony Man-Cho So. Fast First-Order Methods for the Massive Robust Multicast Beamforming Problem with Interference Temperature Constraints. ICASSP 2019. [paper]