Academic Seminar (Yunwen Lei, 2026.3.16)
Non-vacuous Generalization Bounds for Overparameterized Shallow Neural Networks
Posted by: Lu Yao
Date posted: 2026-03-09
Topic
Non-vacuous Generalization Bounds for Overparameterized Shallow Neural Networks
Time
-
Venue
Room 415, New Mathematics Building
Speaker
Yunwen Lei, Assistant Professor (The University of Hong Kong)
Host
Xiong Wang, Associate Professor
Abstract: Overparameterized neural networks often exhibit benign overfitting, generalizing well despite having far more parameters than training samples. For such networks, traditional generalization analyses typically yield vacuous bounds. In this talk, we establish non-vacuous generalization bounds for overparameterized shallow neural networks (SNNs) by controlling their Rademacher complexity, and we support the theory with empirical studies on highly overparameterized SNNs. Our complexity bounds depend on the distance of the trained weights from their initialization and are expressed in terms of the path-norm of the network.
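As background for the abstract, the following LaTeX sketch records the standard definition of the path-norm for a one-hidden-layer network and the generic Rademacher-complexity generalization bound; the exact norm, constants, and assumptions used in the talk may differ.

% Path-norm of a one-hidden-layer network with hidden weights w_j and output weights a_j:
f(x) = \sum_{j=1}^{m} a_j \, \sigma(w_j^\top x),
\qquad
\|f\|_{\mathrm{path}} = \sum_{j=1}^{m} |a_j| \, \|w_j\|_1 .

% Standard bound: with probability at least 1 - \delta, uniformly over f in the class \mathcal{F},
% for a loss bounded in [0, 1] and expected Rademacher complexity \mathfrak{R}_n(\mathcal{F}):
R(f) \;\le\; \widehat{R}_n(f) + 2\,\mathfrak{R}_n(\mathcal{F}) + \sqrt{\frac{\log(1/\delta)}{2n}} .

A bound of this form is non-vacuous when the right-hand side falls below the trivial value of the loss (e.g. below 1 for the 0-1 loss), which is precisely what overparameterization tends to break in classical norm-based analyses.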

