Last updated: Mar. 30, 2025
[日本語]
I am researching the “optimization” of convex functions that are unbounded below, on Euclidean spaces and Hadamard manifolds. Although minimizing such functions may seem pointless at first sight, it has recently been shown that the asymptotic behavior of optimization algorithms, such as gradient descent and accelerated gradient descent, is essentially connected to the unboundedness structure of the objective function. Interesting applications include minimum-norm point problems, matrix/operator scaling problems, and (semi-)stability determination in geometric invariant theory.
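As a minimal numerical sketch of this phenomenon (the toy function, step size, and constants below are my own illustration, not taken from the publications on this page): for f(x1, x2) = -2·x1 + exp(x2), which is convex and unbounded below, plain gradient descent never converges, yet the scaled iterates x_k/k and values f(x_k)/k settle to limits that encode how f is unbounded.

```python
import numpy as np

# Toy convex function that is unbounded below (illustrative example):
#   f(x1, x2) = -2*x1 + exp(x2),  so inf f = -inf.
def f(x):
    return -2.0 * x[0] + np.exp(x[1])

def grad(x):
    return np.array([-2.0, np.exp(x[1])])

eta = 0.1            # constant step size
x = np.zeros(2)
N = 10_000
for _ in range(N):
    x = x - eta * grad(x)

# The iterates diverge, but with a definite asymptotic direction and rate:
# for this example the gradients approach g = (-2, 0), and one observes
# x_N / N -> -eta * g = (0.2, 0) and f(x_N) / N -> -eta * ||g||^2 = -0.4.
print(x / N)
print(f(x) / N)
```

Here the divergent trajectory itself carries the useful information: its limiting direction and slope identify the minimum-norm limiting gradient, which is the kind of unboundedness certificate that appears in the scaling applications above.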
From Apr. 2025, I will be a PhD candidate at Ruhr University Bochum in Germany, supervised by Michael Walter in the Quantum Information group. [university profile]
Email: ksakabe (at) g.ecc.u-tokyo.ac.jp (soon to be changed)
Mar. 2025: Master of Information Science and Technology. Graduate School of Information Science and Technology, The University of Tokyo, Japan
Mar. 2023: Bachelor of Engineering. Department of Mathematical Engineering and Information Physics, The University of Tokyo, Japan
Mar. 2018: Graduated from Kaiyo Academy, Japan
Master’s Thesis: Convergence Analyses of First-Order Optimization Methods for Unbounded Convex Functions
Supervised by Takayasu Matsuo
Bachelor’s Thesis: 非有界な凸関数に対する最急降下法と行列スケーリングへの応用 (Gradient Descent for Unbounded Convex Functions and Its Application to Matrix Scaling)
Supervised by Hiroshi Hirai
In Japanese
Gradient descent for unbounded convex functions on Hadamard manifolds and its applications to scaling problems.
Hiroshi Hirai, Keiya Sakabe.
[arXiv]
Proceedings of the 2024 IEEE 65th Annual Symposium on Foundations of Computer Science (FOCS 2024), pp. 2387–2402.
Finding Hall blockers by matrix scaling.
Koyo Hayashi, Hiroshi Hirai, Keiya Sakabe.
[arXiv]
Mathematics of Operations Research, to appear.
Gradient descent for unbounded convex functions on Hadamard manifolds and its applications to scaling problems.
Feb. 2025: MFO-RIMS Tandem Workshop: Optimization, Theoretical Computer Science and Algebraic Geometry: Convexity and Beyond, Kyoto, Japan
Gradient descent for unbounded convex functions on Hadamard manifolds and its applications to scaling problems.
Oct. 2024: 65th IEEE Symposium on Foundations of Computer Science (FOCS) 2024, Chicago, IL, USA
Steepest descent algorithm for unbounded convex functions.
Dec. 2023: Quantum Information Colloquium, Bochum, Germany
Oct. 2023 – Mar. 2024: 数学1D (Mathematics 1-D), The University of Tokyo
Instructor: Motohiko Ezawa
Apr. 2023 – Sep. 2024: 数理情報工学実験第二 (Mathematical Informatics Engineering Laboratory 2), The University of Tokyo
Instructor: Shun Sato
優秀発表賞 (Good Presentation Award). 最適化の理論とアルゴリズム:未来を担う若手研究者の集い 2024 (Theory and Algorithms of Optimization: A Meeting of Young Researchers 2024), RAOTA, May 2024, Tsukuba, Japan
優秀発表賞 (Good Presentation Award). 最適化の理論とアルゴリズム:未来を担う若手研究者の集い 2023 (Theory and Algorithms of Optimization: A Meeting of Young Researchers 2023), RAOTA, May 2023, Tsukuba, Japan
工学部長賞 (Dean’s Award). Mar. 2023, Faculty of Engineering, The University of Tokyo, Japan
Internship at Preferred Networks, Inc., Jul. 2023 – Sep. 2023. [Tech Blog in Japanese]
Volunteering at science camps for high school students. [2019] [2021] [2023]