YCB-Sight is a visuo-tactile dataset containing simulated and real data from a GelSight tactile sensor and an Azure Kinect RGB-D camera interacting with YCB objects.
You can find the full dataset here, or download data for individual objects below.
Simulated tactile and depth data, generated with Taxim and pyrender:
| Object Name | Size (MB) | Link |
|---|---|---|
| 002_master_chef_can | 64.3 | [Link] |
| 003_cracker_box | 63.2 | [Link] |
| 004_sugar_box | 61.2 | [Link] |
| 005_tomato_soup_can | 63.8 | [Link] |
| 006_mustard_bottle | 63.8 | [Link] |
| 007_tuna_fish_can | 63.2 | [Link] |
| 008_pudding_box | 61.5 | [Link] |
| 009_gelatin_box | 60.3 | [Link] |
| 010_potted_meat_can | 62.8 | [Link] |
| 011_banana | 63.7 | [Link] |
| 012_strawberry | 64.2 | [Link] |
| 013_apple | 63.4 | [Link] |
| 014_lemon | 63.4 | [Link] |
| 017_orange | 63.2 | [Link] |
| 019_pitcher_base | 64.5 | [Link] |
| 021_bleach_cleanser | 62.6 | [Link] |
| 024_bowl | 65.1 | [Link] |
| 025_mug | 64.2 | [Link] |
| 029_plate | 66.3 | [Link] |
| 035_power_drill | 64.7 | [Link] |
| 036_wood_block | 60.1 | [Link] |
| 037_scissors | 64.2 | [Link] |
| 042_adjustable_wrench | 64.7 | [Link] |
| 043_phillips_screwdriver | 63.9 | [Link] |
| 048_hammer | 64.1 | [Link] |
| 055_baseball | 63.5 | [Link] |
| 056_tennis_ball | 63.4 | [Link] |
| 072-a_toy_airplane | 65.5 | [Link] |
| 072-b_toy_airplane | 63.8 | [Link] |
| 077_rubiks_cube | 61.3 | [Link] |
Tactile and depth data collected from real-world experiments:
| Object Name | Size (GB) | Link |
|---|---|---|
| 002_master_chef_can | 0.97 | [Link] |
| 004_sugar_box | 1.15 | [Link] |
| 005_tomato_soup_can | 1.09 | [Link] |
| 010_potted_meat_can | 1.09 | [Link] |
| 021_bleach_cleanser | 1.23 | [Link] |
| 036_wood_block | 1.02 | [Link] |
```
YCBSight-Sim
├── obj1
│   ├── gt_contact_mask
│   │   ├── <idx>.npy
│   │   └── ...
│   ├── gt_height_map
│   │   ├── <idx>.npy
│   │   └── ...
│   ├── gelsight
│   │   ├── <idx>.jpg
│   │   └── ...
│   ├── pose.txt
│   ├── depthCam.npy
│   └── depthCam.pdf
├── obj2
└── ...
```

```
YCBSight-Real
├── obj1
│   ├── gelsight
│   │   ├── gelsight_<idx>_<timestamp>.jpg
│   │   └── ...
│   ├── depth
│   │   └── depth_0_<timestamp>.tif
│   ├── pc
│   │   └── pc_0_<timestamp>.npy
│   ├── rgb
│   │   ├── rgb_<idx>_<timestamp>.jpg
│   │   └── ...
│   ├── robot.csv
│   ├── tf.json
│   └── obj1.mp4
├── obj2
└── ...
```

The visualization and data processing scripts are implemented in Python 3 and require numpy, scipy, matplotlib, and opencv-python (cv2).
To install dependencies: `pip install -r requirements.txt`.
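As a quick sketch of working with the simulated data: each touch in YCBSight-Sim has a ground-truth height map and contact mask stored as `.npy` files (paths taken from the layout above; the helper names and the summary statistic are illustrative assumptions, not part of the dataset tooling):

```python
import numpy as np

def load_sim_sample(obj_dir, idx):
    """Load one simulated touch from a YCBSight-Sim object directory:
    the ground-truth height map and contact mask stored as .npy files."""
    height = np.load(f"{obj_dir}/gt_height_map/{idx}.npy")
    mask = np.load(f"{obj_dir}/gt_contact_mask/{idx}.npy")
    return height, mask

def mean_contact_height(height, mask):
    """Mean height-map value over the contact region (a hypothetical helper;
    units depend on how the height maps were generated)."""
    contact = mask.astype(bool)
    return float(height[contact].mean()) if contact.any() else 0.0
```

For example, `load_sim_sample("YCBSight-Sim/002_master_chef_can", 0)` would load the first touch of the master chef can.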
`scripts/lookup_mapping/lookup.py` reconstructs the height maps from the tactile readings. There are several parameters to set:
- `path2model`: the path to the directory containing YCBSight-Real and/or YCBSight-Sim
- `sim`: True/False, whether to visualize the simulated data or the real data
- `obj`: a specific object whose data should be visualized, or None to visualize all objects
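These three parameters map naturally onto a directory walk over the dataset layout shown above. A minimal sketch, assuming the top-level directory names `YCBSight-Sim`/`YCBSight-Real` (the function itself is illustrative, not part of `lookup.py`):

```python
import os

def iter_objects(path2model, sim=True, obj=None):
    """Yield the object directories that the lookup.py-style parameters select:
    path2model is the dataset root, sim chooses YCBSight-Sim vs. YCBSight-Real,
    and obj restricts the walk to a single object (None means all objects)."""
    root = os.path.join(path2model, "YCBSight-Sim" if sim else "YCBSight-Real")
    names = [obj] if obj is not None else sorted(os.listdir(root))
    for name in names:
        yield os.path.join(root, name)
```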
`scripts/data_visualization/data_visualizer.py` visualizes the data in the YCB-Sight dataset.
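The real-data capture filenames encode a modality, a frame index, and a timestamp, which makes it easy to pair frames across modalities. A small parser along these lines could help when processing YCBSight-Real (the exact timestamp format is an assumption based on the layout above):

```python
import re

# Pattern assumed from the YCBSight-Real layout: <modality>_<idx>_<timestamp>.<ext>
CAPTURE_RE = re.compile(
    r"(?P<mod>gelsight|rgb|depth|pc)_(?P<idx>\d+)_(?P<ts>[\d.]+)\.(?:jpg|tif|npy)"
)

def parse_capture_name(fname):
    """Split a capture filename into (modality, frame index, timestamp string)."""
    m = CAPTURE_RE.fullmatch(fname)
    if m is None:
        raise ValueError(f"unrecognized capture filename: {fname}")
    return m["mod"], int(m["idx"]), m["ts"]
```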
Please refer to this repo (PyTorch version) and this repo (TensorFlow version).
This dataset is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License, with the accompanying processing code licensed under the MIT License.
If you use the YCB-Sight dataset in your research, please cite:
```
@article{suresh2021efficient,
  title={Efficient shape mapping through dense touch and vision},
  author={Suresh, Sudharshan and Si, Zilin and Mangelson, Joshua G and Yuan, Wenzhen and Kaess, Michael},
  journal={arXiv preprint arXiv:2109.09884},
  year={2021}
}
```

