- Organizer
- Akira Matsushima (KU, UTMD)
- Kyohei Atarashi (KU)
- Han Bao (KU)
- Koh Takeuchi (KU, RIKEN AIP)
- [KISS-001] Outlier-Robust Neural Network Training: Efficient Optimization of Transformed Trimmed Loss with Variation Regularization
- Presenter: Akifumi Okuno (Assistant Professor, ISM) [web]
- Date: 10/8 13:30-15:00
- Location: Lecture room 3, Research Bldg. No. 7 (Room 104, 1st floor)
- Abstract:
In this study, we consider outlier-robust predictive modeling with highly expressive neural networks. To this end, we employ (1) a transformed trimmed loss (TTL), a computationally feasible variant of the classical trimmed loss, and (2) a higher-order variation regularization (HOVR) of the prediction model. Note that training the neural network with TTL alone can remain vulnerable to outliers, since the network's high expressive power lets it fit even the outliers perfectly. Introducing HOVR at the same time constrains the effective degrees of freedom and thereby avoids fitting the outliers. We provide a new, efficient stochastic gradient-supergradient descent (SGSD) algorithm for the optimization, together with a theoretical guarantee of its convergence. (Joint work with Shotaro Yagishita (ISM).)
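To make the two ingredients concrete, below is a minimal, self-contained PyTorch sketch, not the authors' implementation: it substitutes hard trimming of the largest squared residuals for the TTL, a first-order gradient penalty for the higher-order HOVR term, and plain SGD for the proposed SGSD algorithm. The model, the 10% trim fraction, and the penalty weight are all hypothetical choices for illustration.

```python
# Illustrative sketch only: hard trimming stands in for the TTL, and a
# first-order gradient penalty stands in for HOVR. All constants are
# hypothetical, not taken from the paper.
import torch

def trimmed_loss(residuals, k_keep):
    # Classical trimmed loss: average only the k smallest squared residuals,
    # so the largest (presumably outlier) residuals are ignored.
    sq = residuals.pow(2)
    kept, _ = torch.topk(sq, k_keep, largest=False)
    return kept.mean()

def variation_penalty(model, x):
    # First-order variation of the prediction (HOVR uses higher orders);
    # penalizing the gradient norm constrains effective degrees of freedom.
    x = x.clone().requires_grad_(True)
    y = model(x).sum()
    (grad_x,) = torch.autograd.grad(y, x, create_graph=True)
    return grad_x.pow(2).mean()

model = torch.nn.Sequential(
    torch.nn.Linear(1, 64), torch.nn.Tanh(), torch.nn.Linear(64, 1)
)
opt = torch.optim.SGD(model.parameters(), lr=1e-2)

# Toy regression data with a few gross outliers injected.
x = torch.linspace(-1, 1, 128).unsqueeze(1)
y = torch.sin(3 * x) + 0.05 * torch.randn_like(x)
y[::16] += 5.0

for step in range(200):
    opt.zero_grad()
    resid = (model(x) - y).squeeze(1)
    k_keep = int(0.9 * len(resid))  # trim the worst 10% of residuals
    loss = trimmed_loss(resid, k_keep) + 1e-3 * variation_penalty(model, x)
    loss.backward()
    opt.step()
```

Note that the hard trimming step makes the objective non-smooth, which is exactly the computational difficulty the transformed trimmed loss and the SGSD algorithm in the talk are designed to address.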
Past schedule: