Juqiao Tactile Glove

Data Collection

Record contact pressure alongside robot joint state and camera video. The glove adds a tactile modality that captures how hard the robot gripper is pressing — information invisible to vision alone.

What the glove adds to your dataset

Standard teleoperation datasets capture joint positions, velocities, and camera images. Adding the Juqiao Glove records a 64-node, 200 Hz pressure map showing the spatial distribution of contact force during each grasp. Policies trained on this data can learn to modulate grip force — crucial for fragile or deformable objects.
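As a rough illustration of the data this adds, one tactile frame is simply a length-64 float vector of normalized pressures (the contact values below are simulated, not real glove output):

```python
import numpy as np

# One tactile frame: 64 normalized pressure nodes,
# 0.0 (no contact) to 1.0 (full scale).
frame = np.zeros(64, dtype=np.float32)
frame[10:14] = [0.35, 0.62, 0.58, 0.20]   # simulated contact on a few nodes

# Simple per-frame statistics a recorder or QC script might compute
peak = float(frame.max())            # strongest contact
active = int((frame > 0.05).sum())   # nodes above a contact threshold
print(peak, active)
```

At 200 Hz this yields roughly 6-7 such frames per 30 fps camera frame, which is why the recorder has to downsample or store a ragged sequence (see Tactile Downsampling below).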
Hardware Setup

System Connections

Connect all devices before starting any software. The glove is strictly USB wired — there is no wireless mode.

Robot Arm + Hand

Orca Hand (recommended) or another end-effector. Connected via a Feetech USB adapter or CAN, depending on the arm.

/dev/ttyUSB0 or CAN0

Juqiao Glove

USB-C to host PC. Appears as CDC-ACM serial. 1.5 m cable; use a USB extension if needed for freedom of movement.

/dev/ttyACM0
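If several CDC-ACM devices are attached, the glove's device number can shift between boots. A udev rule gives it a stable name; this is a sketch — the vendor/product IDs below are placeholders, not the glove's real IDs:

```
# /etc/udev/rules.d/99-juqiao-glove.rules
# idVendor/idProduct are PLACEHOLDERS -- read the real values with:
#   udevadm info -q property -n /dev/ttyACM0 | grep -iE 'vendor_id|model_id'
SUBSYSTEM=="tty", ATTRS{idVendor}=="0483", ATTRS{idProduct}=="5740", SYMLINK+="juqiao_glove", MODE="0666"
```

After `sudo udevadm control --reload-rules && sudo udevadm trigger`, the glove appears as /dev/juqiao_glove regardless of enumeration order.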

Wrist Camera

USB or GigE camera mounted on the robot's wrist, providing an object-centric view for training.

/dev/video0 or GigE IP

Overhead Camera

Fixed workspace camera for scene context. Pair with the wrist camera for multi-view datasets.

/dev/video2 or GigE IP

Teleoperation Device

Leader arm (SO-101 / OpenArm leader), VR controller, or SpaceMouse driving the follower robot.

Leader port / HID

Recording Workflow

Step-by-Step Recording

Step 1: Bring up the robot arm and hand

Start the arm driver (and Orca Hand driver if applicable) in separate terminals. Verify joint_states are publishing before proceeding.

# Terminal 1: Robot arm (example: OpenArm)
ros2 launch openarm_ros2 openarm.launch.py port:=/dev/ttyUSB0

# Terminal 2: Orca Hand (if using Orca Hand end-effector)
ros2 launch orca_ros2 orca_hand.launch.py port:=/dev/ttyUSB1

# Verify
ros2 topic hz /joint_states            # Should be ~100 Hz
ros2 topic hz /orca_hand/joint_states  # Should be ~100 Hz
Step 2: Start the Juqiao Glove driver

Launch the glove ROS2 node. Verify 200 Hz data is flowing before starting any recording session.

# Terminal 3: Juqiao Glove
ros2 launch juqiao_glove_ros2 glove.launch.py port:=/dev/ttyACM0

# Verify pressure data is live
ros2 topic hz /juqiao_glove/tactile_array   # Should be ~200 Hz
ros2 topic echo /juqiao_glove/grasp_region  # Should print region strings
Step 3: Launch cameras

Start camera nodes for wrist and overhead views. Confirm image topics are publishing at target frame rate (typically 30 fps).

# Wrist camera (USB)
ros2 run usb_cam usb_cam_node_exe --ros-args -p video_device:=/dev/video0 -p framerate:=30.0

# Overhead camera (GigE example)
ros2 launch camera_ros2 gige_camera.launch.py ip:=192.168.1.100

# Verify
ros2 topic hz /wrist_camera/image_raw
ros2 topic hz /overhead_camera/image_raw
Step 4: Verify all streams are synchronized

Use ros2 topic list to confirm all required topics are present. Check timestamps are within 20 ms across modalities before recording any episodes.

ros2 topic list | grep -E "joint|tactile|image|grasp"

# Expected topics:
# /joint_states                (arm, ~100 Hz)
# /orca_hand/joint_states      (hand, ~100 Hz)
# /juqiao_glove/tactile_array  (glove, ~200 Hz)
# /juqiao_glove/grasp_region   (glove, ~200 Hz)
# /wrist_camera/image_raw      (camera, 30 Hz)
# /overhead_camera/image_raw   (camera, 30 Hz)
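The 20 ms skew budget can be checked with a short script. This is a minimal sketch: the stamps below are simulated, and in practice you would fill the dict from one message per topic (e.g. msg.header.stamp read via an rclpy subscriber or `ros2 topic echo`):

```python
# Latest header stamp per topic, in seconds (simulated values here).
latest_stamps = {
    "/joint_states":                1712.0000,
    "/juqiao_glove/tactile_array":  1712.0040,
    "/wrist_camera/image_raw":      1712.0120,
    "/overhead_camera/image_raw":   1712.0150,
}

# Spread between the fastest and slowest stream must stay under 20 ms.
skew_s = max(latest_stamps.values()) - min(latest_stamps.values())
print(f"max skew: {skew_s * 1000:.1f} ms")
assert skew_s < 0.020, "streams drifted apart -- check clock sync and USB load"
```

A persistent skew above the budget usually points at an unsynchronized camera clock or a saturated USB hub rather than the recorder itself.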
Step 5: Record a dataset with LeRobot

Use LeRobot's record script. The --tactile-topic flag adds the glove pressure stream as a dataset column alongside joint states and images.

# Record 50 episodes; press Enter to start each, Space to stop
python -m lerobot.scripts.record \
  --robot-path lerobot/configs/robot/orca_hand.yaml \
  --fps 30 \
  --repo-id $HUGGINGFACE_USER/orca_grasp_tactile \
  --tags "juqiao-glove,tactile,manipulation" \
  --warmup-time-s 2 \
  --episode-time-s 30 \
  --reset-time-s 5 \
  --num-episodes 50 \
  --push-to-hub \
  --tactile-topic /juqiao_glove/tactile_array \
  --tactile-key tactile_pressures
No LeRobot --tactile-topic flag yet? Use the SVRC multi-modal recorder wrapper, which handles synchronization and the dataset schema automatically: pip install "roboticscenter[recorder]"
Step 6: Review episodes before pushing

Replay each episode with the heatmap overlay to verify tactile data quality before committing to the HuggingFace Hub.

# Replay and overlay heatmap on video
python -m roboticscenter.scripts.review_episode \
  --dataset-path ./data/orca_grasp_tactile \
  --episode-index 0 \
  --overlay-tactile \
  --tactile-key tactile_pressures

# Discard bad episodes
python -m lerobot.scripts.delete_episodes \
  --dataset-path ./data/orca_grasp_tactile \
  --episodes 3 7 12   # episode indices to discard
Step 7: Push to HuggingFace Hub

Upload the validated dataset. The dataset card is auto-generated with modality descriptions including tactile.

huggingface-cli login   # first time only

python -m lerobot.scripts.push_dataset \
  --dataset-path ./data/orca_grasp_tactile \
  --repo-id $HUGGINGFACE_USER/orca_grasp_tactile

# Verify upload
python -c "
from datasets import load_dataset
ds = load_dataset('$HUGGINGFACE_USER/orca_grasp_tactile', split='train')
print(ds.column_names)
print(ds.features['tactile_pressures'])
"
Reference

Dataset Schema

Each frame in the dataset contains the following columns. Numeric arrays are stored as Parquet columns with the dtypes listed below; images are stored as MP4 video sequences.

| Column | Shape / Type | Description |
| --- | --- | --- |
| observation.state | float32[6] | Robot arm joint positions (radians). If the arm includes a gripper joint, a seventh element holds gripper aperture (shape becomes float32[7], index 6). |
| observation.hand_state | float32[17] | Orca Hand joint positions (radians), 17 DOF. Omitted if no dexterous hand. Labeled per joint name in metadata. |
| observation.tactile_pressures | float32[64] | Juqiao Glove normalized pressure per node, 0.0 (no contact) to 1.0 (full scale). The 200 Hz stream is downsampled to match the robot frame rate. |
| observation.tactile_pressures_raw | uint16[64] | Raw 16-bit ADC values. Preserve for re-normalization after recalibration. Optional; omit to reduce dataset size. |
| observation.grasp_region | str | Active contact region heuristic: "palm", "thumb", "index", "middle", "ring", "pinky", or "" (no contact). |
| observation.images.wrist | uint8[H, W, 3] | Wrist camera RGB frame at 30 fps, stored as an MP4 video sequence. |
| observation.images.overhead | uint8[H, W, 3] | Overhead camera RGB frame at 30 fps. |
| action | float32[6 or 23] | Target joint positions for the arm (6), plus the hand (17) if a dexterous hand is present. |
| language_instruction | str | Task description, e.g. "pick up the egg without breaking it". Enables language-conditioned policy training. |
| episode_index | int | Episode number within the dataset. |
| frame_index | int | Frame number within the episode (0-indexed). |
| timestamp | float64 | Seconds elapsed since episode start. |
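The grasp_region string can be reproduced offline from the 64-node array. The sketch below uses a hypothetical node-to-region index map; the real mapping is defined by the glove's calibration metadata, not by this code:

```python
import numpy as np

# HYPOTHETICAL node-index ranges per region -- illustrative only.
REGIONS = {
    "palm":   range(0, 24),
    "thumb":  range(24, 32),
    "index":  range(32, 40),
    "middle": range(40, 48),
    "ring":   range(48, 56),
    "pinky":  range(56, 64),
}

def grasp_region(frame: np.ndarray, threshold: float = 0.05) -> str:
    """Return the region with the highest mean pressure, or "" if no contact."""
    if frame.max() <= threshold:
        return ""
    means = {name: float(frame[list(idx)].mean()) for name, idx in REGIONS.items()}
    return max(means, key=means.get)

frame = np.zeros(64, dtype=np.float32)
frame[32:40] = 0.5           # contact on the (assumed) index-finger nodes
print(grasp_region(frame))   # prints: index
```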

Tactile Downsampling

The glove streams at 200 Hz; the robot and cameras typically run at 30–100 Hz. The recorder aligns frames using nearest-neighbor timestamp matching. To preserve the full 200 Hz tactile stream as a separate array:

# Store full-rate tactile as a variable-length array per frame
python -m roboticscenter.scripts.record \
  --tactile-mode full_rate   # default is "matched" (one sample per robot frame)

# This produces a ragged column: observation.tactile_sequence
# shape per row: float32[N, 64] where N varies (typically 6-7 samples at 200 Hz / 30 fps)
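For reference, the default "matched" alignment can be sketched with NumPy; timestamps and tactile data here are synthetic, and the recorder's actual implementation may differ in detail:

```python
import numpy as np

# 200 Hz tactile stream and 30 Hz robot-frame timestamps, in seconds.
tactile_t = np.arange(0.0, 1.0, 1 / 200)
tactile   = np.random.rand(len(tactile_t), 64).astype(np.float32)
frame_t   = np.arange(0.0, 1.0, 1 / 30)

# For each robot frame, pick the tactile sample with the closest timestamp.
idx = np.searchsorted(tactile_t, frame_t)
idx = np.clip(idx, 1, len(tactile_t) - 1)
left_closer = (frame_t - tactile_t[idx - 1]) < (tactile_t[idx] - frame_t)
idx = np.where(left_closer, idx - 1, idx)

matched = tactile[idx]   # one tactile row per robot frame
print(matched.shape)     # prints: (30, 64)
```

With a 200 Hz stream, nearest-neighbor matching guarantees every robot frame sits within 2.5 ms of its tactile sample.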
Quality

Episode Quality Checklist

Review each episode against these criteria before including it in your dataset. One bad episode can introduce spurious tactile patterns that degrade policy training.

  • Tactile stream is continuous (no frame gaps). Verify frame.sequence increments without skips; frame drops appear as repeated values in the tactile column.
  • Baseline at rest is near zero (< 0.03). At episode start, before any contact, the max pressure node should read below 0.03. Drift above 0.05 at rest indicates the glove needs recalibration.
  • Contact events align with visible grasps in video. Replay with --overlay-tactile; pressure spikes (max node > 0.4) should coincide with visible gripper closure in the wrist camera.
  • Grasp region is consistent during the contact phase. grasp_region should stabilize on one or two regions during each grasp (e.g. "palm" + "index"); rapidly changing regions indicate glove misalignment or a noisy signal.
  • Glove latency < 20 ms relative to robot state. Compare tactile event onset to the gripper velocity spike. Latency above 20 ms suggests USB hub congestion; connect the glove directly to a host USB port.
  • Language instruction matches the task demonstrated. For language-conditioned datasets, ensure the instruction entered at episode start accurately describes what the operator actually demonstrated.
  • No cable interference with robot motion. The 1.5 m USB cable must not restrict the operator's hand movements or pull the glove off during the episode. Use a cable management clip on the forearm.
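The mechanical checks above need a human in the loop, but the first two items can be automated. A minimal sketch, assuming the tactile column is loaded as a float32[T, 64] array with a per-frame sequence counter:

```python
import numpy as np

def check_episode(pressures: np.ndarray, seq: np.ndarray) -> list[str]:
    """Return a list of problems found; an empty list means the episode passes."""
    problems = []
    # 1. Continuity: sequence counter must increment without skips.
    if np.any(np.diff(seq) != 1):
        problems.append("tactile frame gaps detected")
    # 2. Resting baseline: first 10 frames (pre-contact) should stay below 0.03.
    if pressures[:10].max() >= 0.03:
        problems.append("baseline too high -- recalibrate the glove")
    return problems

T = 100
pressures = np.full((T, 64), 0.01, dtype=np.float32)
pressures[50:60, 5] = 0.6                      # one clean contact event
print(check_episode(pressures, np.arange(T)))  # prints: []
```

Run this over every episode before the review-and-push step so that only visually ambiguous episodes need manual replay.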
Policy Training

Training with Tactile Observations

Policies that consume tactile input typically see 15–30% improvement on contact-sensitive tasks (fragile object handling, peg insertion, cloth folding) compared to vision-only baselines.

# ACT policy with tactile observations (SVRC fork adds tactile encoder)
python -m lerobot.scripts.train \
  --dataset-repo-id $HUGGINGFACE_USER/orca_grasp_tactile \
  --policy act \
  --policy.observation_keys \
    "observation.images.wrist" \
    "observation.images.overhead" \
    "observation.state" \
    "observation.hand_state" \
    "observation.tactile_pressures" \
  --output-dir outputs/act_tactile

# The tactile encoder maps float32[64] → latent via a small MLP
# No special architecture changes are required for ACT or Diffusion Policy
# Diffusion Policy with tactile (identical --policy.observation_keys, different policy flag)
python -m lerobot.scripts.train \
  --dataset-repo-id $HUGGINGFACE_USER/orca_grasp_tactile \
  --policy diffusion \
  --policy.observation_keys \
    "observation.images.wrist" \
    "observation.state" \
    "observation.tactile_pressures" \
  --output-dir outputs/diffusion_tactile
Tip: ablate tactile at eval time. Pass observation.tactile_pressures = zeros(64) during inference to measure how much the policy relies on tactile vs. vision. This identifies whether your task truly requires tactile or whether vision alone is sufficient.
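The ablation is a one-line wrapper around the observation dict handed to the policy; `ablate_tactile` below is a hypothetical helper, not part of LeRobot:

```python
import numpy as np

def ablate_tactile(obs: dict, key: str = "observation.tactile_pressures") -> dict:
    """Return a copy of the observation with the tactile channel zeroed."""
    obs = dict(obs)                    # shallow copy; don't mutate the env's dict
    obs[key] = np.zeros_like(obs[key])
    return obs

obs = {
    "observation.tactile_pressures": np.random.rand(64).astype(np.float32),
    "observation.state": np.zeros(6, dtype=np.float32),
}
blank = ablate_tactile(obs)
print(blank["observation.tactile_pressures"].max())   # prints: 0.0
```

Run the same evaluation episodes with and without the wrapper; a large success-rate drop under ablation confirms the policy actually uses the tactile channel.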