I am a PhD student at Berkeley AI Research (BAIR), advised by Jitendra Malik and Alexei Efros. I am also affiliated with the NVIDIA GEAR group led by Jim Fan and Yuke Zhu.
Previously, I obtained my BSc and MEng from MIT EECS under the supervision of Phillip Isola. During my undergraduate years at MIT, I was also fortunate to be advised by Jiajun Wu and Antonio Torralba. Before transferring to MIT, I was an undergraduate student at The University of Tokyo. I have spent past summers interning at DeepMind (2021, 2019), Facebook (2018), and Google (2017, 2016).
I am currently building real-world robot systems that can achieve dexterous low-level skills through end-to-end learning.
My research applies end-to-end learning methods to enable low-level dexterous manipulation skills on real robots, leveraging data from both the real world and simulators.
Learning Methods: reinforcement learning, imitation learning, sim-to-real, generative models
Hardware Systems: bimanual arms with dexterous hands, humanoids
Learning Visuotactile Skills with Two Multifingered Hands
Toru Lin, Yu Zhang*, Qiyang Li*, Haozhi Qi*, Brent Yi, Sergey Levine, and Jitendra Malik.
arXiv 2024.
Twisting Lids (Off) with Two Hands
Toru Lin*, Zhao-Heng Yin*, Haozhi Qi, Pieter Abbeel, and Jitendra Malik.
CoRL 2024.
MIMEx: Intrinsic Rewards from Masked Input Modeling
Toru Lin and Allan Jabri.
NeurIPS 2023.
Learning to Ground Multi-Agent Communication with Autoencoders
Toru Lin, Minyoung Huh, and Phillip Isola.
NeurIPS 2021.
Geometry-Aware Modeling of Rigid Body Physics
Toru Lin*, Kexin Yi*, and Phillip Isola.
ICML 2020, Workshop on Object-Oriented Learning.
Visual Grounding of Learned Physical Models
Yunzhu Li, Toru Lin*, Kexin Yi*, Daniel Bear, Daniel L. K. Yamins, Jiajun Wu, Joshua B. Tenenbaum, and Antonio Torralba.
ICML 2020.
Model-Based Planning with Energy-Based Models
Yilun Du, Toru Lin, and Igor Mordatch.
CoRL 2019.
UC Berkeley CS 294-277 Robots That Learn (Fall 2024)
MIT 6.819/6.869 Advances in Computer Vision (Spring 2021)
MIT 6.036 Introduction to Machine Learning (Fall 2019)
MIT 6.863/9.611 Natural Language Processing (Spring 2018)
I speak English, Japanese, Mandarin, Cantonese, and Singlish natively.
Outside of research, I enjoy reading philosophy, learning physics, and making art.