
A Teleoperation Pipeline for Imitation Learning
Training state-of-the-art imitation learning algorithms, like Diffusion Policy and ACT, has become easier with recent open-source libraries like Hugging Face's LeRobot. But building the pipeline to teleoperate an arbitrary pair of robot arms in real time, recording demonstrations for imitation learning in an intuitive way, still requires some expertise to set up. I built a browser app to make this easy: WebXR-based hand tracking connects via WebRTC to your laptop, the inverse kinematics for arbitrary URDFs is solved in WASM in the browser, and the exported trajectories, containing joint movements and webcam vision, are ready for imitation learning. All you need to write is a little Python server forwarding joint angles from localhost to your robot.
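A minimal sketch of what that forwarding server could look like, assuming the browser app POSTs joint angles as a JSON array to `localhost:8000`. The endpoint path, payload shape, and the `send_to_robot` stub are all assumptions for illustration; in practice you would replace the stub with a call into your arm's SDK or serial protocol.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer


def parse_joint_angles(body: bytes) -> list[float]:
    """Decode a JSON array of joint angles (radians) from a request body."""
    return [float(a) for a in json.loads(body)]


def send_to_robot(angles: list[float]) -> None:
    # Hypothetical stub: forward the angles to your robot's controller here
    # (e.g. over serial or a vendor API).
    print("joints:", angles)


class JointHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read the JSON body sent by the browser app and forward it.
        length = int(self.headers.get("Content-Length", 0))
        angles = parse_joint_angles(self.rfile.read(length))
        send_to_robot(angles)
        self.send_response(200)
        self.end_headers()


if __name__ == "__main__":
    HTTPServer(("localhost", 8000), JointHandler).serve_forever()
```

Keeping this layer as a plain HTTP (or WebSocket) loopback server means the browser app stays robot-agnostic: only this small script needs to change per arm.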