Motion Teaching
Here we describe the sequence of procedures, from motion teaching to motion generation, using the robot simulator robosuite.
Setup
Clone the robosuite repository
Install the requirements with
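The setup can be sketched as the following shell commands; the repository URL is taken from the upstream robosuite project, and the requirements file name is an assumption:

```shell
# Clone the robosuite repository (URL from the upstream project)
git clone https://github.com/ARISE-Initiative/robosuite.git
cd robosuite

# Install the Python dependencies (file name assumed)
pip install -r requirements.txt
```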
Teleoperation
Teach object grasping motions to the robot using keyboard teleoperation. The following figure shows an overview of this experimental task, in which the robot grasps a red cube. The red labels in the figure indicate the taught positions, the blue labels indicate the untaught positions, and the position of the grasped object was shifted in 5 cm intervals. The robot is moved up, down, left, and right according to the commands in the table. To make it easier to learn the relationship between the object position and the motion, a slight offset is applied to the object position each time. In this experiment, five grasping motions were taught at each position.
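The command table is not reproduced here, but the keyboard-to-motion mapping can be sketched as below; the key bindings and step size are hypothetical illustrations, not the actual robosuite bindings:

```python
import numpy as np

# Hypothetical key bindings: each key maps to a small end-effector
# displacement (dx, dy, dz) in metres; the 5 mm step size is an assumption.
STEP = 0.005
KEY_TO_DELTA = {
    "w": np.array([0.0, 0.0, STEP]),   # up
    "s": np.array([0.0, 0.0, -STEP]),  # down
    "a": np.array([-STEP, 0.0, 0.0]),  # left
    "d": np.array([STEP, 0.0, 0.0]),   # right
}

def apply_key(pos: np.ndarray, key: str) -> np.ndarray:
    """Return the new end-effector position after one key press."""
    return pos + KEY_TO_DELTA.get(key, np.zeros(3))
```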
The following is a description of how to use the motion teaching program (1_teaching.py).
The argument pos is the position of the object, and ep_dir is the save directory.
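A minimal sketch of how 1_teaching.py might parse these arguments; the flag spellings, types, and defaults are assumptions:

```python
import argparse

# Hypothetical argument parser for 1_teaching.py; the actual script
# may use different flag names, types, or defaults.
parser = argparse.ArgumentParser(
    description="Teach grasping motions by keyboard teleoperation.")
parser.add_argument("--pos", type=float, default=0.0,
                    help="position of the object to grasp")
parser.add_argument("--ep_dir", type=str, default="./data/raw_data/",
                    help="directory in which collected episodes are saved")
args = parser.parse_args(["--pos", "0.05", "--ep_dir", "./data/raw_data/"])
```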
We provide a demonstration dataset to eliminate the hassle of data collection; for details, see the Demonstration Dataset section below.
The collected data (model.xml and state.npz) are stored in the ep_dir folder (e.g. ./data/raw_data/).
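The state.npz file can be written and read back with NumPy as sketched below; the array keys ("joints", "images") and shapes are hypothetical, for illustration only:

```python
import os
import tempfile
import numpy as np

# Create a dummy episode and save it in the npz format used for state.npz;
# the key names "joints" and "images" are assumptions for illustration.
ep_dir = tempfile.mkdtemp()
joints = np.zeros((600, 7), dtype=np.float32)        # 600 steps, 7 joints
images = np.zeros((600, 64, 64, 3), dtype=np.uint8)  # 600 RGB frames
np.savez(os.path.join(ep_dir, "state.npz"), joints=joints, images=images)

# Load the episode back and inspect the stored arrays
data = np.load(os.path.join(ep_dir, "state.npz"))
print(data["joints"].shape)  # (600, 7)
```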
Data Re-Saving
The following two processes are performed. The first is the removal of guide lines: during keyboard teleoperation, green guide lines appear in the image data, so the collected data are played back and the images are re-saved without the guide lines. The second is downsampling: during teleoperation the collected sequences are long (600 steps by default) because the robot is controlled at a high frequency to teach fine movements, so the sensor data are resampled every 5 steps, reducing the sequence length to 120 steps.
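The downsampling step can be sketched with simple array slicing; the 600-step sequence length and the every-5-steps rate come from the text above:

```python
import numpy as np

def downsample(seq: np.ndarray, rate: int = 5) -> np.ndarray:
    """Keep every `rate`-th step of a sensor sequence."""
    return seq[::rate]

# A 600-step joint-angle sequence becomes 120 steps at rate 5.
seq = np.zeros((600, 7))
short = downsample(seq)
print(short.shape[0])  # 120
```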
Check whether the motion generated by playback is correct. Tasks rarely fail due to downsampling, but if the playback data fails the task, the following error is displayed: [ERROR] This data set has failed task during playback. If the task succeeds, a gif animation is saved in the output folder.
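The playback check described above might look like the following sketch; the function name and success flag are hypothetical, while the error string is quoted from the documentation:

```python
def check_playback(task_success: bool) -> bool:
    """Report whether a played-back episode still completes the task."""
    if not task_success:
        # Error message as shown in the documentation above
        print("[ERROR] This data set has failed task during playback.")
        return False
    return True
```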
Generate Dataset
If the played-back files were stored in ./data/raw_data/, training/test data will be generated automatically by the following command.
Check that the normalization range of the joint angles is appropriate. The robot's visual image, the raw joint angles, and the normalized joint angles are saved as a gif animation in the output folder.
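Min-max normalization of joint angles into a fixed range can be sketched as below; the target range [0, 1] and the joint bounds are assumptions for illustration:

```python
import numpy as np

def normalize(joints: np.ndarray, lower: float, upper: float,
              out_min: float = 0.0, out_max: float = 1.0) -> np.ndarray:
    """Map joint angles from [lower, upper] to [out_min, out_max]."""
    scaled = (joints - lower) / (upper - lower)
    return scaled * (out_max - out_min) + out_min

# Example: joint limits of roughly +/- pi mapped to [0, 1]
angles = np.array([-3.14, 0.0, 3.14])
print(normalize(angles, -3.14, 3.14))  # [0.  0.5 1. ]
```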
Demonstration Dataset
Download the demonstration dataset from this link and extract it into ./data/raw_data/ to eliminate the need for data collection. Note that after downloading the file, you must still perform Step 2 (Data Re-Saving) and the subsequent steps.
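Extraction into the expected folder can be sketched as below; the archive name is a placeholder (the real file comes from the link above), and a stand-in archive is created here only so the example is self-contained:

```python
import os
import tempfile
import zipfile

# Create a stand-in archive to demonstrate extraction; in practice you would
# use the file downloaded from the link above (the name here is a placeholder).
tmp = tempfile.mkdtemp()
archive = os.path.join(tmp, "demonstration_dataset.zip")
raw_dir = os.path.join(tmp, "data", "raw_data")
os.makedirs(raw_dir)

with zipfile.ZipFile(archive, "w") as zf:
    zf.writestr("episode_0/state.npz", b"dummy")

# Extract into ./data/raw_data/ (here: a temporary stand-in directory)
with zipfile.ZipFile(archive) as zf:
    zf.extractall(raw_dir)

print(os.path.exists(os.path.join(raw_dir, "episode_0", "state.npz")))  # True
```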