Seminar No. 2228: An Introduction to Hyperparameter Optimization

Created: December 13, 2021, 17:28 (posted by Tan Fuping)

Title: An Introduction to Hyperparameter Optimization

Speaker: Associate Professor Fang Hui (Shanghai University of Finance and Economics)

Time: Tuesday, December 14, 2021, 16:00 - 17:00

Place: G507

Inviter: Yu Changjun


Abstract: Machine learning (ML) has been widely adopted in both academia and industry. Building an effective machine learning model is a time-consuming process that involves finding an appropriate model architecture and fine-tuning its hyperparameters. Moreover, recent interest in complex ML models with a relatively large number of hyperparameters (e.g., AutoML and deep learning methods) has led to a growing body of studies on hyperparameter optimization (HPO).

In this talk, I will first formally define the HPO problem and give an overview of existing work in this field of research. I will then elaborate on three types of HPO methods: sampling-based, model-based, and gradient-based. Finally, I will conclude the talk by summarizing open challenges on the topic.
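For concreteness, HPO is commonly posed as a bilevel problem: choose hyperparameters that minimize validation loss of a model whose parameters are trained under those hyperparameters. The sketch below illustrates only the sampling-based family with a plain random search; it assumes scikit-learn and an SVM on a toy digits dataset, and the search space, budget, and model are illustrative choices, not taken from the talk.

# Minimal sketch of sampling-based HPO (random search) on a toy problem.
# Assumes scikit-learn; model, search space, and budget are illustrative.
import random

from sklearn.datasets import load_digits
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = load_digits(return_X_y=True)

def sample_config():
    # Draw one hyperparameter configuration from a log-uniform space.
    return {
        "C": 10 ** random.uniform(-2, 2),      # regularization strength
        "gamma": 10 ** random.uniform(-4, 0),  # RBF kernel width
    }

best_score, best_config = -float("inf"), None
for _ in range(20):  # evaluation budget
    config = sample_config()
    # Validation performance of the model trained under this configuration.
    score = cross_val_score(SVC(**config), X, y, cv=3).mean()
    if score > best_score:
        best_score, best_config = score, config

print(f"best CV accuracy {best_score:.3f} with {best_config}")

Model-based methods (e.g., Bayesian optimization) replace the independent random draws with a surrogate that proposes promising configurations, while gradient-based methods differentiate the validation loss with respect to the hyperparameters themselves.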
