Speaker: Klara Brandstätter (193-02 Computer Graphics)
The control of legged robots remains a challenging task. Machine learning approaches already work well in simulation, but the discrepancy between simulation and reality often makes it difficult to transfer simulation results to the real robot. Learning algorithms also require large amounts of training data. The goal of this work is to build a sandbox that provides a detailed comparison between simulated and real-world robots and enables controlled, continuous data collection and exploration.

The sandbox consists of a motion capture component and a simulation component. The motion capture component is responsible for continuous data collection and is realized with an OptiTrack system with six high-precision infrared cameras. The simulation component is realized in Simulink and is responsible for exploration and for comparing simulated data with real-world data. For this work, a small four-legged puppy robot from ROBOTIS is selected. To integrate the robot into the sandbox, its controller is reprogrammed to simplify transferring motion data to the robot and to allow remote control. The robot is programmed with a straight walking gait and equipped with reflective markers so that its movements can be tracked. A 3D model of the puppy robot is constructed and used for simulation in Simulink.

The result is a system that accurately gathers six degrees of freedom (6-DOF) data of a small robot. This data is transferred to the simulation and can be compared with simulated data. Data from the simulation can in turn be tested easily on the real robot and tracked again, so a closed-loop system for iterative robot exploration is provided. Two datasets are compared with the help of the resulting sandbox: a dataset containing ideal joint angles for the robot, and a dataset obtained with the motion capture system, containing tracked joint angles. Both datasets are simulated, and the position and orientation of the simulated robot are compared with the motion capture data. The simulated robot maintained a similar heading and deviated by only a few centimetres from the real robot.
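The kind of comparison described above (simulated vs. motion-captured pose) can be sketched as follows. This is a minimal illustration, not the author's actual pipeline; the trajectories, function names, and numbers are hypothetical:

```python
import math

def position_error(sim_traj, real_traj):
    """Mean Euclidean distance (metres) between matching frames of a
    simulated and a motion-captured 3D position trajectory."""
    dists = [math.dist(s, r) for s, r in zip(sim_traj, real_traj)]
    return sum(dists) / len(dists)

def heading_error(sim_yaw, real_yaw):
    """Mean absolute yaw difference (radians), wrapped to [-pi, pi]."""
    diffs = [abs(math.atan2(math.sin(s - r), math.cos(s - r)))
             for s, r in zip(sim_yaw, real_yaw)]
    return sum(diffs) / len(diffs)

# Hypothetical example: the simulated robot walks straight along x,
# while the "real" (tracked) robot drifts 3 cm sideways.
sim = [(0.1 * t, 0.00, 0.05) for t in range(10)]
real = [(0.1 * t, 0.03, 0.05) for t in range(10)]
print(f"mean position error: {position_error(sim, real):.3f} m")  # 0.030 m
```

A few centimetres of mean error, as reported above, would show up directly in such a per-frame comparison, assuming the two trajectories are time-aligned frame by frame.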