Two-layer networks with the ReLU^k activation function: Barron spaces and derivative approximation

Published: 2024-04-08

Speaker:  Shuai Lu, Fudan University.


Time:       15:00, Apr. 11th

Location: 1C 101,  SIST

Host:        Qifeng Liao

Abstract:

We investigate the use of two-layer networks with the rectified power unit, also known as the ReLU^k activation function, for function and derivative approximation. By extending and calibrating the corresponding Barron space, we show that two-layer networks with the ReLU^k activation function are well suited to simultaneously approximating an unknown function and its derivatives. When the measurements are noisy, we propose a Tikhonov-type regularization method and provide error bounds when the regularization parameter is chosen appropriately. Several numerical examples support the efficiency of the proposed approach.
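To make the setting concrete, the following is a minimal illustrative sketch (not the speaker's method): a two-layer network f(x) = Σ_i a_i ReLU(w_i x + b_i)^k with fixed random inner weights, fitted to noisy samples by least squares with a Tikhonov penalty on the outer coefficients; the same coefficients then yield a derivative approximation. The target function, parameter values, and random-feature simplification are all assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

k = 3        # power of the ReLU^k activation (assumed)
m = 50       # number of hidden neurons (assumed)
n = 200      # number of noisy samples (assumed)
lam = 1e-3   # Tikhonov regularization parameter (assumed choice)

# Noisy measurements of an unknown target function on [-1, 1]
x = np.linspace(-1.0, 1.0, n)
f_true = np.sin(np.pi * x)
y = f_true + 0.05 * rng.standard_normal(n)

# Random inner weights/biases, kept fixed for simplicity; only the
# outer coefficients are fitted, so the problem is linear.
w = rng.standard_normal(m)
b = rng.uniform(-1.0, 1.0, m)

# Feature matrix: Phi[j, i] = ReLU(w_i * x_j + b_i)^k
Phi = np.maximum(np.outer(x, w) + b, 0.0) ** k

# Tikhonov-regularized least squares for the outer coefficients a
a = np.linalg.solve(Phi.T @ Phi + lam * np.eye(m), Phi.T @ y)

# Fitted network and, since ReLU^k is (k-1)-times differentiable for
# k >= 2, its derivative evaluated on the same grid
f_hat = Phi @ a
dPhi = k * np.maximum(np.outer(x, w) + b, 0.0) ** (k - 1) * w
df_hat = dPhi @ a

print("RMSE (function):  ", np.sqrt(np.mean((f_hat - f_true) ** 2)))
print("RMSE (derivative):", np.sqrt(np.mean((df_hat - np.pi * np.cos(np.pi * x)) ** 2)))
```

In this simplified linear setting the regularization parameter lam plays the role of the Tikhonov parameter mentioned in the abstract: larger values damp the outer coefficients and trade approximation accuracy against stability under measurement noise.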

Bio:

Shuai Lu is a professor at the School of Mathematical Sciences, Fudan University. His research focuses on computational methods and mathematical theory for inverse problems in mathematical physics, in particular the convergence analysis of regularization methods and the stability theory of inverse problems for partial differential equations. He has published more than fifty papers in leading journals such as Inverse Problems, the SIAM journals, Numer. Math., and Math. Comp., and has co-authored one English research monograph. He was awarded the National Science Fund for Distinguished Young Scholars in 2019, serves on the editorial board of Inverse Problems, and received the First Prize of the Shanghai Natural Science Award (as the second contributor).