Qingtao Liu

PhD Student · Zhejiang University

My work sits at the intersection of dexterous manipulation and multimodal representation learning. I enjoy building systems where tactile and visual signals come together to help robots act with intuition. My long-term goal is to teach robots to learn from humans and serve people as collaborative partners.

Dexterous Manipulation · Multimodal Learning · Reinforcement Learning
Hangzhou · Robotics & AI

About

I am a fifth-year PhD student at the College of Control Science and Engineering, Zhejiang University, advised by Prof. Qi Ye and Prof. Jiming Chen. I develop generalizable hand–object representations (DexRepNet++, T-RO 2025; IJRR 2025), visual–tactile datasets and benchmarks (VTDexManip, ICLR 2025), and multi-task manipulation policies (ICRA/IROS/CoRL), with the aim of closing the gap between robot and human dexterity.

News

Publications

Recent and representative papers.

Journals

Conferences

Projects

Skills