SwingBot: Learning Physical Features from In-Hand Tactile Exploration for Dynamic Swing-up Manipulation


Chen Wang*1,2   Shaoxiong Wang*1   Branden Romero1   Filipe Veiga1   Edward H. Adelson1


1MIT   2Shanghai Jiao Tong University


Several robot manipulation tasks are extremely sensitive to variations in the physical properties of the manipulated objects. One such task is manipulating objects using gravity or arm accelerations, where knowledge of the object's mass, center of mass, and friction becomes especially important.

We present SwingBot, a robot that learns the physical features of a held object through tactile exploration. Two exploration actions (tilting and shaking) provide the tactile information used to create a physical feature embedding space. With this embedding, SwingBot can predict the swing angle achieved by a robot performing dynamic swing-up manipulations on a previously unseen object. Using these predictions, it searches for the optimal control parameters for a desired swing-up angle. We show that with the learned physical features, our end-to-end self-supervised learning pipeline substantially improves the accuracy of swinging up unseen objects. We also show that objects with similar dynamics are closer to each other in the embedding space and that the embedding can be disentangled into values of specific physical properties.
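
To illustrate the overall pipeline, here is a minimal sketch (not the released SwingBot code) of the prediction-and-search step described above: a tactile encoder fuses features from the tilting and shaking actions into an embedding, a learned predictor maps (embedding, control parameters) to a predicted swing angle, and the controller picks the candidate parameters whose prediction is closest to the desired angle. All module names, feature dimensions, and the candidate grid are illustrative assumptions.

```python
# Hypothetical sketch of SwingBot's predict-then-search loop.
# Network sizes, feature dimensions, and control-parameter grid are assumptions.
import torch
import torch.nn as nn

class TactileEncoder(nn.Module):
    """Fuses tactile features from the two exploration actions into one embedding."""
    def __init__(self, tilt_dim=64, shake_dim=64, embed_dim=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(tilt_dim + shake_dim, 128), nn.ReLU(),
            nn.Linear(128, embed_dim),
        )

    def forward(self, tilt_feat, shake_feat):
        return self.net(torch.cat([tilt_feat, shake_feat], dim=-1))

class SwingPredictor(nn.Module):
    """Predicts the achieved swing angle from the embedding and control parameters."""
    def __init__(self, embed_dim=32, ctrl_dim=3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(embed_dim + ctrl_dim, 64), nn.ReLU(),
            nn.Linear(64, 1),
        )

    def forward(self, embedding, ctrl):
        return self.net(torch.cat([embedding, ctrl], dim=-1)).squeeze(-1)

def search_control_params(predictor, embedding, candidates, target_angle):
    """Return the candidate control parameters whose predicted swing angle
    is closest to the desired swing-up angle."""
    with torch.no_grad():
        preds = predictor(embedding.expand(len(candidates), -1), candidates)
        best = torch.argmin((preds - target_angle).abs())
    return candidates[best], preds[best]

# Toy usage with random tactile features and a random candidate grid.
encoder, predictor = TactileEncoder(), SwingPredictor()
embedding = encoder(torch.randn(1, 64), torch.randn(1, 64))
candidates = torch.rand(200, 3)  # e.g. swing torque, release timing, duration (assumed)
best_ctrl, best_pred = search_control_params(predictor, embedding, candidates, target_angle=90.0)
```

In the actual system the encoder and predictor are trained end-to-end in a self-supervised way from the robot's own swing-up trials, so the search over control parameters only requires forward passes through the learned predictor.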


Video (with audio)




If you cannot access the video, please download our video here.


Paper



SwingBot: Learning Physical Features from In-Hand Tactile Exploration for Dynamic Swing-up Manipulation
Chen Wang*, Shaoxiong Wang*, Branden Romero, Filipe Veiga, and Edward H. Adelson
IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) 2020
Best Paper Award
[Paper]
* indicates equal contributions