Robots & how you control them
6 min read · for SO-101 users
What this means for you: if you record demos with an SO-101 leader arm today, nothing in the upgrade affects you. If you ever want to move to a different teleop style — keyboard, gamepad, phone — there's first-class support already in the box.
Joint-mirror (SO-101, the common case)
you grip ──▶ leader arm ──▶ identical follower arm ──▶ motors move
(every joint angle is copied 1-for-1)
IK-mapped (phone, or any unmatched leader)
phone pose ──▶ 6-DoF target ──▶ inverse kinematics ──▶ joint angles
(where should the hand be?) (solve for joints)
Cartesian-EE-delta (keyboard, gamepad)
key press ──▶ "move 1cm right" ──▶ inverse kinematics ──▶ joint angles
(nudge the end-effector) (solve for joints)
The basics
What is a leader arm?
A smaller, kinematically identical twin of the follower robot. You grip it with your hand and move it through the demo; the follower mirrors every joint. Because the two arms have the same shape, no math is needed — just copy joint angles. SO-101 ships with a leader; so do Koch, OpenManipulator-X, OpenArm, and the bimanual setups.
Three teleop styles
Joint-mirror when your input has the same shape as the robot (SO-101 leader). IK-mapped — IK is inverse kinematics, "given a target gripper pose, solve for joint angles" — for 6-DoF inputs like a phone. Cartesian-EE-delta (EE = end-effector, the gripper tip) for nudge-style inputs like keyboard or gamepad.
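The Cartesian-EE-delta style above can be sketched in a few lines: a key press becomes a small end-effector displacement that a downstream IK solver would turn into joint angles. The key map and step size here are illustrative, not LeRobot's actual bindings.

```python
import numpy as np

# Hypothetical key -> end-effector delta map (1 cm nudges).
# A real setup would hand the updated target to an IK solver.
KEY_TO_DELTA = {
    "right": np.array([0.01, 0.0, 0.0]),   # +1 cm along x
    "left":  np.array([-0.01, 0.0, 0.0]),
    "up":    np.array([0.0, 0.0, 0.01]),   # +1 cm along z
}

def step_target(ee_target: np.ndarray, key: str) -> np.ndarray:
    """Nudge the current end-effector target by the key's delta."""
    return ee_target + KEY_TO_DELTA.get(key, np.zeros(3))

target = np.zeros(3)
for key in ["right", "right", "up"]:
    target = step_target(target, key)
print(target)  # accumulated target after two rights and one up
```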
Hardware LeRobot supports
First-party drivers for SO-100/101, Koch v1.0/v1.1, OpenManipulator-X, OpenArm (single and bimanual), LeKiwi (mobile arm), Hope Jr., Reachy 2, Unitree G1, and the Earthrover Mini Plus rover. Twelve robots and fourteen teleoperators in v0.5.0.
Cameras
Four backends: USB webcams (OpenCV), Intel RealSense (color + depth), ZMQ-streamed remote cameras, and Reachy 2's built-in SDK stream.
How you actually use it
The standard first-time SO-101 flow starts with calibration, which is saved under ~/.cache/huggingface/lerobot/calibration/, so you only do it once per arm. The teleop loop itself is three lines: obs = robot.get_observation(); action = teleop.get_action(); robot.send_action(action).
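The three-line loop can be fleshed out into a runnable sketch. The FakeRobot/FakeTeleop classes and the pacing logic are stand-ins so it runs without hardware; only the get_observation / get_action / send_action shape comes from the library.

```python
import time

# Stand-in classes so the loop runs without an SO-101 attached.
class FakeTeleop:
    def get_action(self):
        return {"shoulder_pan.pos": 12.5}   # leader's current joint target

class FakeRobot:
    def __init__(self):
        self.sent = []
    def get_observation(self):
        return {"shoulder_pan.pos": 12.4}   # joint positions (+ cameras on real HW)
    def send_action(self, action):
        self.sent.append(action)            # follower mirrors the leader

def teleop_loop(robot, teleop, hz=200, steps=10):
    dt = 1.0 / hz
    for _ in range(steps):
        t0 = time.perf_counter()
        obs = robot.get_observation()
        action = teleop.get_action()
        robot.send_action(action)
        time.sleep(max(0.0, dt - (time.perf_counter() - t0)))  # naive pacing

robot, teleop = FakeRobot(), FakeTeleop()
teleop_loop(robot, teleop, steps=5)
print(len(robot.sent))  # one action per loop iteration
```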
Things to know
Of the 14 teleoperators in the tree, every send_feedback()
call raises NotImplementedError except the Unitree G1's,
which only forwards wireless-remote bytes. There is no force, torque,
or vibration channel anywhere in the codebase. If you wanted true
bilateral teleop on SO-101, you'd be writing the loop yourself.
For an arm that doesn't ship with a leader, the natural fits are the phone teleop (6-DoF tracking through ARKit on iOS or WebXR on Android) or the gamepad/keyboard EE-delta routes. Both feed a generic IK solver downstream.
The big v0.5.0 hardware story is the Unitree G1 humanoid getting a real driver split: arm IK in one file, two swappable RL locomotion controllers (NVIDIA GR00T at 50 Hz, Amazon FAR Holosoma at 200 Hz) in their own files, and a much smaller robot module that just glues them together. None of it touches the SO-101 path.
Show the full teleop taxonomy (every concrete teleop class)
Teleoperator ABC at
src/lerobot/teleoperators/teleoperator.py:29
declares: action_features, feedback_features,
is_connected, connect(),
is_calibrated, calibrate(),
configure(), get_action() -> RobotAction,
send_feedback(dict), disconnect().
Calibration is JSON-persisted via draccus to
HF_LEROBOT_CALIBRATION/teleoperators/<name>/<id>.json
(teleoperator.py:47-53).
Robot ABC at
src/lerobot/robots/robot.py:30 mirrors
that, swapping get_action→get_observation
and send_feedback→send_action. Both ABCs
are bytewise identical between 0.4.4 and 0.5.0.
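A custom teleoperator against that ABC surface might look like the sketch below. The class names and the constant-action behavior are mine; the real base class carries config and calibration plumbing this omits, but the method set matches the list above.

```python
from abc import ABC, abstractmethod

# Hedged sketch of the Teleoperator ABC surface described above.
class TeleoperatorSketch(ABC):
    @property
    @abstractmethod
    def action_features(self) -> dict: ...
    @abstractmethod
    def connect(self) -> None: ...
    @abstractmethod
    def get_action(self) -> dict: ...
    def send_feedback(self, feedback: dict) -> None:
        raise NotImplementedError  # matches 13 of the 14 shipped teleops

class ConstantTeleop(TeleoperatorSketch):
    """Always commands the same pose -- useful as a smoke test."""
    @property
    def action_features(self):
        return {"shoulder_pan.pos": float}
    def connect(self):
        self.is_connected = True
    def get_action(self):
        return {"shoulder_pan.pos": 0.0}

t = ConstantTeleop()
t.connect()
print(t.get_action())
```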
| Teleop | Style | DoF | Pairs with | file:line |
|---|---|---|---|---|
| so_leader (SO-100/101) | Joint-mirror | 6× <joint>.pos | so_follower | teleoperators/so_leader/so_leader.py:33; calib :88; get_action :140 |
| koch_leader (Koch v1.0/v1.1) | Joint-mirror (gripper trigger w/ haptic) | 6× .pos | koch_follower | teleoperators/koch_leader/koch_leader.py:34; gripper trick :150 |
| bi_so_leader | Joint-mirror (bimanual) | 12× <side>_<joint>.pos | bi_so_follower | teleoperators/bi_so_leader/bi_so_leader.py:30, :97 |
| omx_leader (OpenManipulator-X) | Joint-mirror | 6× .pos | omx_follower | teleoperators/omx_leader/omx_leader.py:34 |
| openarm_leader (Enactic OpenArm) | Joint-mirror (CAN MIT-mode; torque exposed) | per-motor .pos, .vel, .torque | openarm_follower | teleoperators/openarm_leader/openarm_leader.py:32, :57, :73 |
| bi_openarm_leader | Joint-mirror (bimanual) | 2× pos/vel/torque | bi_openarm_follower | teleoperators/bi_openarm_leader/bi_openarm_leader.py |
| openarm_mini | Joint-mirror (Feetech bimanual) | 14–16× .pos | OpenArm Mini follower | teleoperators/openarm_mini/openarm_mini.py:35 |
| homunculus_arm | Joint-mirror, EMA-filtered (α = 2/(N+1), N=50) | 7× .pos | hope_jr_arm | teleoperators/homunculus/homunculus_arm.py:34, :57-75 |
| homunculus_glove | Joint-translated (per-side flip lists) | ~15+ finger joints | hope_jr_hand | teleoperators/homunculus/homunculus_glove.py:60 |
| phone | 6-DoF SE(3) pose → IK | phone.pos (3,), phone.rot, raw_inputs, enabled | any IK-capable follower | teleoperators/phone/teleop_phone.py:72 iOS, :211 Android, :358 dispatcher |
| keyboard / keyboard_ee | Cartesian EE deltas | per-key axis | any IK-capable follower | teleoperators/keyboard/teleop_keyboard.py:51, :145 |
| gamepad | Cartesian EE deltas (HID/pygame; HID forced on macOS) | delta_x/y/z (+gripper) | any IK-capable follower | teleoperators/gamepad/teleop_gamepad.py:44, :80 |
| reachy2_teleoperator | Joint-mirror via SDK | ~16 joints (neck 3, antennae 2, arms 2×8, base 3 vel) | reachy2 | teleoperators/reachy2_teleoperator/reachy2_teleoperator.py:38-75 |
| unitree_g1 | IK-mapped (exo FK → G1 IK) + voice/buttons | 14 G1 arm .q + 4 axes + 16 buttons | unitree_g1 robot | teleoperators/unitree_g1/unitree_g1.py:45, :159 |
ALOHA leader has no dedicated dir in either version — the
ALOHA-style bimanual loop is realized through bi_so_leader
+ bi_so_follower. There is no vr_teleoperator;
phone covers the head-mount-style pose stream (ARKit on
iOS, WebXR on Android via the dispatcher at
teleop_phone.py:358).
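The table attributes EMA smoothing with α = 2/(N+1), N = 50 to homunculus_arm. The update rule below is the standard exponential moving average that formula implies; the function shape and variable names are mine, not the file's.

```python
# Standard EMA update: alpha = 2 / (N + 1), applied per joint reading.
def ema_update(prev: float, new: float, n: int = 50) -> float:
    alpha = 2 / (n + 1)          # ~0.039 for N = 50
    return alpha * new + (1 - alpha) * prev

# A step input converges slowly toward 1.0 -- that slowness is the
# point: it filters jitter from the wearable's joint sensors.
x = 0.0
for _ in range(100):
    x = ema_update(x, 1.0)
print(round(x, 3))
```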
The follower side (12 robots)
All "arm" followers expose action_features = {<motor>.pos:
float} and observation = <motor>.pos ∪
camera tuples. Control is joint-position goal at the bus
rate — torque is followed implicitly by the motor's PID
inside each servo. so_follower at
robots/so_follower/so_follower.py:37;
koch_follower at
robots/koch_follower/koch_follower.py:37;
openarm_follower (CAN-FD Damiao) at
robots/openarm_follower/openarm_follower.py:39;
lekiwi at
robots/lekiwi/lekiwi.py:41;
reachy2 (gRPC SDK, no local motor bus) at
robots/reachy2/robot_reachy2.py:75;
unitree_g1 (29-DoF, 250 Hz default) at
robots/unitree_g1/unitree_g1.py:110.
No PTP, no GenICam. Each camera timestamps frames at receiver-side capture; read_latest(max_age_ms) is a freshness gate, not cross-camera alignment. For multi-RealSense bin picking, drive both cams off one Pipeline upstream — do not rely on ZMQ stamps or independent RealSenseCamera instances for sub-frame alignment.
Show the v0.5.0 G1 refactor (g1_kinematics, gr00t_locomotion, holosoma_locomotion)
In v0.4.4 the G1 robot was monolithic: joystick byte-parser as inner
UnitreeG1.RemoteController class
(v0.4.4/.../unitree_g1.py:75-91); IK
imported from robot_kinematic_processor.py; no RL
locomotion path. v0.5.0 splits this into three pluggable pieces behind
a LocomotionController Protocol at
unitree_g1.py:69-75
(control_dt, run_step, reset);
the robot picks one via
make_locomotion_controller(config.controller) at
g1_utils.py:66 — a string →
importlib.import_module switch on
"GrootLocomotionController" /
"HolosomaLocomotionController".
Single file, all-in-one
One robot_kinematic_processor.py, 313 lines.
Kinematics and robot wiring in the same module. Joystick parser is
an inner class inside UnitreeG1. No RL locomotion path.
src/lerobot/robots/unitree_g1/
robot_kinematic_processor.py 313 lines
unitree_g1.py RemoteController inline (:75-91)
IK math: G1_29_ArmIK in casadi/pinocchio, IPOPT
max_iter=50, smooth-cost regularizer,
WeightedMovingFilter backed by a list with
pop(0). Convergence-failure handling is two branches
at
v0.4.4/.../robot_kinematic_processor.py:275-294.
Three pluggable modules
src/lerobot/robots/unitree_g1/
g1_kinematics.py 287 lines IK only
gr00t_locomotion.py 205 lines NEW — NVIDIA GR00T-WBC ONNX, 50 Hz
holosoma_locomotion.py 214 lines NEW — Amazon FAR Holosoma ONNX, 200 Hz
unitree_g1.py thin DDS↔dict bridge
- g1_kinematics.py — same casadi/pinocchio IPOPT IK as before. solve_ik at src/lerobot/robots/unitree_g1/g1_kinematics.py:232.
- gr00t_locomotion.py — loads NVIDIA GR00T-WBC ONNX policies. 86-D obs (cmd, height_cmd, orientation_cmd, gyro, gravity, scaled qj/dqj over 29 joints, 15-D previous action), stacks 6 frames → 516-D, runs balance for ‖cmd‖ < 0.05, else walk; returns target_dof_pos[:15]. control_dt = 0.02 s (50 Hz).
- holosoma_locomotion.py — Amazon FAR Holosoma fastsac/ppo ONNX policies. 100-D obs (last_action 29, ang_vel 3, cmd 3, sin/cos phase 4, qj 29, dqj 29, gravity 3); 200 Hz (CONTROL_DT = 0.005).
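The 6-frame stacking (86-D → 516-D) can be sketched with a fixed-length history buffer. The deque mechanism is an assumption about how the history is kept; the dimensions come from the description above.

```python
from collections import deque
import numpy as np

OBS_DIM, HISTORY = 86, 6  # per the gr00t_locomotion description

# Hypothetical history buffer, pre-filled with zeros at reset.
history = deque([np.zeros(OBS_DIM, dtype=np.float32)] * HISTORY, maxlen=HISTORY)

def stacked(obs: np.ndarray) -> np.ndarray:
    """Append the newest 86-D frame and return the 516-D policy input."""
    history.append(obs.astype(np.float32))
    return np.concatenate(history)  # oldest first, newest last

out = stacked(np.ones(OBS_DIM, dtype=np.float32))
print(out.shape)  # 6 * 86 = 516
```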
Cleanups: WeightedMovingFilter uses
deque(maxlen=window) instead of list+pop(0)
(g1_kinematics.py:32); convolve-loop
replaced by data_array.T @ self._weights
(:39). Identical math, fewer lines of code.
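Both cleanups fit in a short sketch: deque(maxlen=window) gives O(1) eviction where list + pop(0) was O(n), and the per-joint convolve loop collapses into one matrix product. The class shape and uniform weights are illustrative.

```python
from collections import deque
import numpy as np

class WeightedMovingFilterSketch:
    """Hedged sketch of the v0.5.0 filter cleanups, not the real class."""
    def __init__(self, weights: np.ndarray):
        self._weights = weights / weights.sum()
        self._buf = deque(maxlen=len(weights))    # O(1) eviction vs pop(0)

    def __call__(self, q: np.ndarray) -> np.ndarray:
        self._buf.append(q)
        if len(self._buf) < self._buf.maxlen:
            return q                              # warm-up: pass through
        data_array = np.stack(self._buf)          # (window, n_joints)
        return data_array.T @ self._weights       # one matmul, no loop

f = WeightedMovingFilterSketch(np.ones(3))        # uniform stand-in weights
for _ in range(3):
    out = f(np.array([3.0, 6.0]))
print(out)  # a constant input passes through the weighted average unchanged
```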
Holosoma's "zero arm obs" trick. Inside
run_step, before policy inference, the controller
overwrites the arm-joint slots in qj / dqj
with DEFAULT_ANGLES and zeros at
src/lerobot/robots/unitree_g1/holosoma_locomotion.py:161-163.
Intentional: hides the teleop's arm motion from the locomotion
policy so the legs don't react when the operator drives the arms.
Whole-body teleop becomes additive — the legs balance the
body, the operator owns the arms.
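The trick itself is a couple of array writes before inference. In the sketch below the 29-joint layout and the arm-slot indices are illustrative, not the real G1 joint ordering; only the overwrite-with-defaults-and-zeros idea comes from the source.

```python
import numpy as np

N_JOINTS = 29
ARM_SLOTS = slice(15, 29)            # hypothetical arm-joint index range
DEFAULT_ANGLES = np.zeros(N_JOINTS)  # stand-in for the policy's defaults

def mask_arm_obs(qj: np.ndarray, dqj: np.ndarray):
    """Hide teleop arm motion from the locomotion policy's observation."""
    qj, dqj = qj.copy(), dqj.copy()
    qj[ARM_SLOTS] = DEFAULT_ANGLES[ARM_SLOTS]  # arms pinned to defaults
    dqj[ARM_SLOTS] = 0.0                       # arms appear motionless
    return qj, dqj

qj_m, dqj_m = mask_arm_obs(np.ones(N_JOINTS), np.ones(N_JOINTS))
print(qj_m[ARM_SLOTS].sum(), dqj_m[ARM_SLOTS].sum())  # arm slots zeroed
```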
Teleop-side complement: v0.4.4 had the joystick parser inside the
robot; v0.5.0 moves it into
teleoperators/unitree_g1/unitree_g1.py:45
— a bigger RemoteController handling both the
official wireless dongle (set_from_wireless parses
24-byte packets via unitree_sdk2py.utils.joystick.Joystick)
AND a 12-bit ADC exoskeleton-mounted joystick
(set_from_exo). Wireless takes priority when non-zero
(:285-290).
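The priority rule reduces to a simple merge. The deadzone and the all-axes check below are my reading of "wireless takes priority when non-zero", not the file's exact logic.

```python
# Hypothetical merge of the two joystick sources: wireless dongle wins
# whenever any of its axes is non-zero, else fall back to the exo stick.
def merge_axes(wireless, exo, deadzone=1e-3):
    if any(abs(w) > deadzone for w in wireless):
        return list(wireless)
    return list(exo)

print(merge_axes([0.0, 0.5], [0.2, 0.0]))  # wireless active: it wins
print(merge_axes([0.0, 0.0], [0.2, 0.0]))  # wireless idle: exo passes through
```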
scripts/lerobot_teleoperate.py:166 adds a
G1-specific teleop.send_feedback(obs) so wireless bytes
flow back to the teleop each loop — the only such backchannel in
the codebase.
Show motor drivers and bus protocols (Dynamixel, Feetech, Damiao, Robstride)
Two layers under src/lerobot/motors/:
MotorsBusBase ABC at
motors/motors_bus.py:47
(connect, read/write,
sync_read/sync_write); SerialMotorsBus
concrete UART at
motors/motors_bus.py:292 (handshake,
sync-read groups, set_half_turn_homings,
record_ranges_of_motion).
| Driver | Inherits | Bus protocol | Used by |
|---|---|---|---|
| FeetechMotorsBus motors/feetech/feetech.py:98 | SerialMotorsBus | UART/RS-485 half-duplex via pyserial + Feetech SCS | so_*, bi_so_*, hope_jr_*, lekiwi, openarm_mini |
| DynamixelMotorsBus motors/dynamixel/dynamixel.py:102 | SerialMotorsBus | Dynamixel 2.0 via dynamixel_sdk | koch_*, omx_* |
| DamiaoMotorsBus motors/damiao/damiao.py:76 | MotorsBusBase directly (no serial) | CAN(-FD) via python-can, MIT mode only | openarm_*, bi_openarm_* |
| RobstrideMotorsBus motors/robstride/robstride.py:67 | MotorsBusBase | CAN MIT-mode | No shipped robot uses it — available for custom integrations |
Calibration GUI motors/calibration_gui.py is used only by Hope Jr. (robots/hope_jr/hope_jr_arm.py:23, RangeFinderGUI); other arms use the text flow.
motors_bus.py is unchanged across versions except
TypeAlias → PEP 695 type at
motors/motors_bus.py:41-42.
Camera backends
Camera ABC at
src/lerobot/cameras/camera.py:26:
read(), async_read(timeout_ms=200),
read_latest(max_age_ms=...),
connect/disconnect. OpenCVCamera
(cameras/opencv/camera_opencv.py:52),
RealSenseCamera
(cameras/realsense/camera_realsense.py:44;
read_depth at :323;
depth+color hardware-aligned by RealSense pipeline),
ZMQCamera
(cameras/zmq/camera_zmq.py:47; receives
base64-JPEG + JSON metadata), Reachy2Camera
(SDK-managed gRPC stream).
At src/lerobot/cameras/zmq/image_server.py:46,
v0.5.0 adds a CameraCaptureThread that reads +
JPEG-encodes in the background so the publish socket isn't blocked
by encode latency. v0.4.4 encoded synchronously. This is the only
architectural camera change between versions.
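The pattern behind CameraCaptureThread is easy to show: a worker reads and encodes in the background, keeping only the newest payload, so the publisher never waits on encode latency. The fake read/encode functions stand in for the camera and JPEG encoder; only the threading shape is the point.

```python
import threading, queue

# A size-1 "mailbox" queue: the worker always replaces the stale frame.
latest = queue.Queue(maxsize=1)

def capture_loop(n_frames: int):
    for i in range(n_frames):
        frame = f"frame-{i}"        # stand-in for a camera read
        payload = frame.encode()    # stand-in for JPEG encoding (the slow part)
        try:
            latest.get_nowait()     # drop the stale frame...
        except queue.Empty:
            pass
        latest.put(payload)         # ...keep only the newest

t = threading.Thread(target=capture_loop, args=(5,))
t.start()
t.join()
newest = latest.get().decode()
print(newest)  # only the most recent frame survives
```

The publisher side just pops from the mailbox, so a slow encode can drop frames but never stalls the socket.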
Force feedback gap (the data path exists, the loop doesn't)
Across all 14 concrete teleops, every send_feedback()
raises NotImplementedError except
unitree_g1.send_feedback at
teleoperators/unitree_g1/unitree_g1.py:295,
which only forwards the wireless-remote raw bytes. There is no force,
torque, or vibration channel anywhere in the codebase.
The OpenArm leader is the closest thing to ready: it exposes
.torque in action_features at
teleoperators/openarm_leader/openarm_leader.py:73,
and the Damiao MIT-mode CAN driver
(motors/damiao/damiao.py:76) reads/writes
torque on the same bus. The data path exists; the bilateral loop does
not.
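For illustration, here is what the missing bilateral loop could look like on that data path. Everything in the sketch is hypothetical: the gain, the method names, and the sign convention; only the fact that torque values already flow through action_features and the Damiao bus comes from the text.

```python
def bilateral_step(leader, follower, kf=0.5):
    """One hypothetical bilateral cycle: forward motion, reflect force."""
    action = leader.get_action()                   # includes "<m>.torque" keys
    follower.send_action(action)
    obs = follower.get_observation()               # follower's sensed torques
    feedback = {k: -kf * v for k, v in obs.items() if k.endswith(".torque")}
    leader.send_feedback(feedback)                 # the call that raises today
    return feedback

class Stub:
    """Plays both roles so the sketch runs without CAN hardware."""
    def get_action(self):      return {"elbow.torque": 1.0}
    def send_action(self, a):  pass
    def get_observation(self): return {"elbow.torque": 0.8, "elbow.pos": 10.0}
    def send_feedback(self, f): self.fb = f

s = Stub()
fb = bilateral_step(s, s)
print(fb)  # reflected torque, scaled and sign-flipped
```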