Title: Hierarchical Kernels in Deep Kernel Learning
Abstract: Classical kernel methods enjoy a solid mathematical foundation but have difficulty handling very complicated learning problems. In contrast, deep learning based on deep neural networks has achieved great success on complicated learning problems including face recognition, speech recognition, game intelligence, natural language processing, and autonomous navigation. However, current deep learning methods are not well understood mathematically, which hinders their interpretability. Recently, there have been efforts to develop deep kernel learning in the hope of combining the advantages of kernel methods and deep learning. Such approaches construct hierarchical kernels via consecutive compositions of widely used reproducing kernels. In this talk, we characterize the reproducing kernel Hilbert spaces of hierarchical kernels, and study conditions ensuring that the reproducing kernel Hilbert space expands as the number of layers of the hierarchical kernel increases. The results yield guidance for the construction of hierarchical kernels in deep kernel learning.
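For readers unfamiliar with the construction mentioned in the abstract, the following is a minimal illustrative sketch (not taken from the talk itself) of one common way to build a hierarchical kernel by repeatedly composing a Gaussian kernel with itself. It uses the feature-space identity ||Φ(x) − Φ(y)||² = k(x,x) − 2k(x,y) + k(y,y); the function names and parameters are illustrative assumptions.

```python
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    # Base Gaussian (RBF) reproducing kernel on R^d.
    return np.exp(-np.sum((x - y) ** 2) / (2 * sigma ** 2))

def compose(k, sigma=1.0):
    # One layer of hierarchical composition: a Gaussian kernel applied
    # in the feature space of k, using the identity
    # ||Phi(x) - Phi(y)||^2 = k(x, x) - 2 k(x, y) + k(y, y).
    def k_next(x, y):
        sq_dist = k(x, x) - 2 * k(x, y) + k(y, y)
        return np.exp(-sq_dist / (2 * sigma ** 2))
    return k_next

# A three-layer hierarchical kernel built from the base Gaussian kernel.
k1 = gaussian_kernel
k2 = compose(k1)
k3 = compose(k2)

x, y = np.array([0.0, 1.0]), np.array([1.0, 0.0])
print(k1(x, y), k2(x, y), k3(x, y))
```

Each layer remains a valid positive-definite kernel, so the construction yields a nested family of reproducing kernel Hilbert spaces; whether these spaces genuinely expand with depth is the kind of question the talk addresses.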
Speaker: 張海樟 (Haizhang Zhang)
Speaker Bio: Professor at the School of Mathematics (Zhuhai), Sun Yat-sen University. His research interests include learning theory, applied harmonic analysis, and function approximation. Representative results include a Weierstrass approximation theorem for reproducing kernels, a convergence theory for deep neural networks, and the theory of reproducing kernel Banach spaces, which he pioneered internationally. A psychological classification method based on reproducing kernel Banach spaces was included in the New Handbook of Mathematical Psychology published by Cambridge University Press. He has published original work in Journal of Machine Learning Research, Applied and Computational Harmonic Analysis, Neural Networks, Neural Computation, Neurocomputing, Journal of Approximation Theory, and the IEEE Transactions journals, with his most-cited single paper receiving over 360 citations. He has led multiple national and provincial/ministerial research grants.
Time: 3:30 PM, Friday, November 15, 2024
All faculty and students are warmly welcome to attend!
太阳集团1088vip
November 7, 2024