# About Me

I am a postdoctoral researcher at [École Normale Supérieure, Paris](https://www.ens.psl.eu/), mentored by Prof. [Stéphane Mallat](https://www.di.ens.fr/~mallat/mallat.html). My research lies at the intersection of **machine learning**, **high-dimensional statistics**, and **statistical physics**. More specifically, I aim to understand the theoretical foundations of modern machine learning through statistical and physical models. Representative examples of my work can be found in the [publications](publication) and [news](#news) sections.

I obtained my PhD in 2025 from the University of Basel, supervised by Prof. [Ivan Dokmanić](https://dmi.unibas.ch/de/personen/ivan-dokmanic/). **Thesis**: *From Spins to Springs: Understanding and Leveraging Interactions in Artificial Neural Networks*. I received my Bachelor’s degree in 2019 from the University of Electronic Science and Technology of China (UESTC). I have also worked as a research assistant or intern at several institutions, including Harvard University, EPFL, ETH Zurich, and the Institute of Theoretical Physics, Chinese Academy of Sciences. My full CV and PhD thesis are available upon request ([contact](#contact)).

Beyond research, I am a happily well-fed 🫄🏻 food enthusiast who enjoys exploring cuisines from around the world, from Michelin-starred restaurants to local street food. Some of my food adventures can be found on my [food blog](Food%20map.md).

## News

1. **[2025]** Our paper on feature learning in deep neural networks was accepted by _Physical Review Letters (PRL)_ as an **Editor’s Suggestion**. ([paper](https://journals.aps.org/prl/abstract/10.1103/ys4n-2tj3); see also the [news report](https://phys.org/news/2025-08-geometry-physics-feature-deep-neural.html))
    * Feature-learning deep nets progressively collapse data to a regular low-dimensional geometry. How this emerges from the collective action of nonlinearity, noise, learning rate, and other factors has eluded first-principles theories built from microscopic neuronal dynamics. We exhibit a noise–nonlinearity phase diagram that identifies regimes where shallow or deep layers learn more effectively, and we propose a macroscopic mechanical theory that reproduces the diagram and links feature learning across layers to generalization. (A toy diagnostic for this collapse is sketched after this list.)
2. **[2025]** Our paper on data preprocessing in graph neural networks was accepted at _ICLR_ as an **Oral Presentation** ([paper](https://openreview.net/forum?id=zBbZ2vdLzH)).
    * Learning from graph data is challenging because both the graph structure and the node features provide noisy label information. We propose JDR, a joint feature denoising and graph rewiring method that improves downstream GNN-based node classification. JDR aligns the leading spectral subspaces of the graph and feature matrices, approximately solving a nonconvex optimization problem that accommodates multiple classes and varying degrees of homophily or heterophily. We provide theoretical justification in a stylized setting and demonstrate that JDR consistently outperforms existing rewiring methods on diverse synthetic and real-world benchmarks. (A conceptual sketch of the alignment idea follows this list.)
3. **[2024]** Our paper on double descent in graph neural networks was accepted by _PNAS_ ([paper](https://www.pnas.org/doi/full/10.1073/pnas.2309504121)).
    * Using tools from statistical physics and random matrix theory, we characterize generalization in graph convolutional networks on stochastic block models, explaining differences between homophilic and heterophilic graphs and predicting double descent in GNNs. We show that risk is governed by the interplay of graph noise, feature noise, and label availability, with implications that extend to real-world datasets and improve performance on heterophilic graphs. (A minimal generator for this setting is sketched below.)
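For readers curious what "collapse to a low-dimensional geometry" looks like in practice, here is a minimal, self-contained sketch that tracks the effective dimension of hidden representations layer by layer, measured as the participation ratio of the covariance spectrum. It only illustrates the kind of diagnostic one could run on a trained network; the random toy network and all sizes below are stand-ins, not the paper's setup or its mechanical theory.

```python
# Toy diagnostic: effective dimension of hidden representations per layer,
# measured as the participation ratio of the covariance eigenvalues.
# The random network, width, and depth are illustrative stand-ins only.
import torch
import torch.nn as nn

def participation_ratio(h: torch.Tensor) -> float:
    """(sum_i lam_i)^2 / sum_i lam_i^2 for the covariance spectrum of h."""
    h = h - h.mean(dim=0, keepdim=True)
    cov = h.T @ h / (h.shape[0] - 1)
    lam = torch.linalg.eigvalsh(cov).clamp(min=0.0)
    return float(lam.sum() ** 2 / (lam ** 2).sum())

torch.manual_seed(0)
width, depth, n_samples = 128, 6, 1024
layers = nn.ModuleList(nn.Linear(width, width) for _ in range(depth))
h = torch.randn(n_samples, width)  # toy inputs

with torch.no_grad():
    for i, layer in enumerate(layers, start=1):
        h = torch.relu(layer(h))
        print(f"layer {i}: effective dimension ≈ {participation_ratio(h):.1f}")
```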
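As a rough illustration of the "align the leading spectral subspaces" idea behind JDR, the sketch below alternates between projecting the features onto the graph's leading eigenspace and blending the adjacency matrix toward the features' leading singular subspace. The rank `k`, the mixing weights, and the iteration scheme are hypothetical placeholders chosen for readability; the actual algorithm and its guarantees are in the paper linked above.

```python
# Conceptual sketch (not the published algorithm): jointly denoise a graph
# and its node features by nudging their leading spectral subspaces toward
# each other. Rank k, mixing weights, and iteration count are placeholders.
import numpy as np

def align_spectral_subspaces(A, X, k=2, eta_graph=0.1, eta_feat=0.1, iters=10):
    """A: symmetric adjacency (n x n); X: node features (n x d)."""
    for _ in range(iters):
        # Leading eigenvectors of the graph (largest |eigenvalue| first).
        eigvals, V = np.linalg.eigh(A)
        V_k = V[:, np.argsort(-np.abs(eigvals))[:k]]
        # Leading left singular vectors of the features.
        U = np.linalg.svd(X, full_matrices=False)[0][:, :k]
        # Rewire: blend A with its projection onto the feature subspace.
        A = (1 - eta_graph) * A + eta_graph * (U @ U.T @ A @ U @ U.T)
        A = (A + A.T) / 2  # keep the adjacency symmetric
        # Denoise: blend X with its projection onto the graph subspace.
        X = (1 - eta_feat) * X + eta_feat * (V_k @ V_k.T @ X)
    return A, X

rng = np.random.default_rng(0)
A = rng.random((50, 50)); A = (A + A.T) / 2
X = rng.standard_normal((50, 16))
A_hat, X_hat = align_spectral_subspaces(A, X)
```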
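The setting of the PNAS paper can be made concrete with a small generator for a two-class contextual stochastic block model, in which the graph carries label information through intra- and inter-class edge probabilities and the features carry it through a noisy class spike. This is a minimal sketch under assumed conventions (balanced classes, Gaussian features); names like `p_in`, `p_out`, and `snr` are illustrative, not the paper's notation.

```python
# Minimal two-class contextual SBM generator: graph noise is controlled by
# the gap between p_in and p_out (homophilic if p_in > p_out, heterophilic
# otherwise), feature noise by snr. Names are illustrative, not the paper's.
import numpy as np

def contextual_sbm(n=200, d=50, p_in=0.08, p_out=0.02, snr=1.0, seed=0):
    rng = np.random.default_rng(seed)
    y = np.where(rng.random(n) < 0.5, 1, -1)      # balanced +/-1 labels
    P = np.where(np.equal.outer(y, y), p_in, p_out)
    A = (rng.random((n, n)) < P).astype(float)
    A = np.triu(A, 1); A = A + A.T                # undirected, no self-loops
    u = rng.standard_normal(d) / np.sqrt(d)       # hidden class direction
    X = snr * np.outer(y, u) + rng.standard_normal((n, d)) / np.sqrt(d)
    return A, X, y

A, X, y = contextual_sbm()
```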
## Contact

- 🎓 [Google Scholar](https://scholar.google.com/citations?user=MXZ-VPcAAAAJ&hl=en&oi=ao)
- 💻 [GitHub](https://github.com/DaDaCheng)
- ✉️ [Email](mailto:[email protected])