---
license: mit
task_categories:
- robotics
tags:
- tactile
---
# FreeTacMan
## Robot-free Visuo-Tactile Data Collection System for Contact-rich Manipulation [ICRA 2026]
## Overview
This dataset supports the paper **[FreeTacMan: Robot-free Visuo-Tactile Data Collection System for Contact-rich Manipulation](http://arxiv.org/abs/2506.01941)**.
It is a large-scale, high-precision visuo-tactile manipulation dataset containing over 3,000k visuo-tactile image pairs and more than 10k trajectories across 50 tasks.
We provide download scripts via [Hugging Face](https://huggingface.co/datasets/OpenDriveLab/FreeTacMan) and [ModelScope](https://www.modelscope.cn/datasets/OpenDriveLab/FreeTacMan) (recommended for users in China).
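To fetch the full dataset from Hugging Face, a minimal sketch using `huggingface_hub.snapshot_download` looks like the following (the `local_dir` path is only an illustration):

```python
# Minimal sketch: download the FreeTacMan dataset snapshot from Hugging Face.
# The local_dir path below is illustrative; point it wherever you want the data.
from huggingface_hub import snapshot_download

snapshot_download(
    repo_id="OpenDriveLab/FreeTacMan",
    repo_type="dataset",
    local_dir="./FreeTacMan",  # hypothetical target directory
)
```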

Please refer to our [Website](http://opendrivelab.com/freetacman) | [Paper](http://arxiv.org/abs/2506.01941) | [Code](https://github.com/OpenDriveLab/FreeTacMan) | [Hardware Guide](https://docs.google.com/document/d/1Hhi2stn_goXUHdYi7461w10AJbzQDC0fdYaSxMdMVXM/edit?addon_store&tab=t.0#heading=h.rl14j3i7oz0t) | [Video](https://opendrivelab.github.io/FreeTacMan/landing/FreeTacMan_demo_video.mp4) | [X](https://x.com/OpenDriveLab/status/1930234855729836112) for more details.
## Potential Applications
The FreeTacMan dataset enables diverse research directions in visuo-tactile learning and manipulation:
- **System Reproduction**: Researchers interested in the hardware implementation can reproduce FreeTacMan from scratch using our [Hardware Guide](https://docs.google.com/document/d/1Hhi2stn_goXUHdYi7461w10AJbzQDC0fdYaSxMdMVXM/edit?addon_store&tab=t.0#heading=h.rl14j3i7oz0t) and [Code](https://github.com/OpenDriveLab/FreeTacMan).
- **Multimodal Imitation Learning**: Transfer to other LED-based tactile sensors (such as GelSight) for developing robust multimodal imitation learning frameworks.
- **Tactile-aware Grasping**: Utilize the dataset for pre-training tactile representation models and developing tactile-aware reasoning systems.
- **Simulation-to-Real Transfer**: Leverage the dynamic tactile interaction sequences to enhance tactile simulation fidelity, significantly reducing the sim2real gap.
## Dataset Structure
The dataset is organized into 50 task categories, each containing:
- **Video files**: Synchronized video recordings from the wrist-mounted and visuo-tactile cameras for each demonstration
- **Trajectory files**: Detailed tracking data for tool center point pose and gripper distance
## Data Format
### Video Files
- **Format**: MP4
- **Views**: Wrist-mounted camera and visuo-tactile camera perspectives per demonstration
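As a minimal sketch (the file name below is hypothetical), each MP4 recording can be read frame by frame with OpenCV:

```python
# Minimal sketch: iterate over the frames of one demonstration video with OpenCV.
# "wrist_cam.mp4" is a hypothetical file name used for illustration only.
import cv2

cap = cv2.VideoCapture("wrist_cam.mp4")
while True:
    ok, frame = cap.read()  # frame is a BGR uint8 array of shape (H, W, 3)
    if not ok:
        break
    # ... feed the frame into your preprocessing / policy pipeline ...
cap.release()
```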
### Trajectory Files
Each trajectory file contains the following data columns:
#### Timestamp
- `timestamp` - Unix Timestamp
#### Tool Center Point (TCP) Data
- `TCP_pos_x`, `TCP_pos_y`, `TCP_pos_z` - TCP position
- `TCP_euler_x`, `TCP_euler_y`, `TCP_euler_z` - TCP orientation (Euler angles)
- `quat_w`, `quat_x`, `quat_y`, `quat_z` - TCP orientation (quaternion representation)
#### Gripper Data
- `gripper_distance` - Gripper opening distance
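The on-disk format of the trajectory files is not restated here; assuming they are stored as CSV with the columns above, a minimal loading sketch (the file name is hypothetical) could look like this:

```python
# Minimal sketch: load one trajectory and pull out TCP pose and gripper state.
# Assumes trajectories are CSV files with the columns listed above;
# "trajectory_0001.csv" is a hypothetical file name.
import pandas as pd

traj = pd.read_csv("trajectory_0001.csv")

positions = traj[["TCP_pos_x", "TCP_pos_y", "TCP_pos_z"]].to_numpy()
quaternions = traj[["quat_w", "quat_x", "quat_y", "quat_z"]].to_numpy()  # (w, x, y, z)
gripper = traj["gripper_distance"].to_numpy()

print(f"{len(traj)} samples, first timestamp: {traj['timestamp'].iloc[0]}")
```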
## Citation
If you use this dataset in your research, please cite:
```bibtex
@article{wu2025freetacman,
  title={FreeTacMan: Robot-free visuo-tactile data collection system for contact-rich manipulation},
  author={Wu, Longyan and Yu, Checheng and Ren, Jieji and Chen, Li and Jiang, Yufei and Huang, Ran and Gu, Guoying and Li, Hongyang},
  journal={IEEE International Conference on Robotics and Automation},
  year={2026}
}
```
## License
This dataset is released under the MIT License. See the LICENSE file for details.
## Contact
For questions or issues regarding the dataset, please contact Longyan Wu (im.longyanwu@gmail.com).