hgupt3 committed c4160bf (verified) · Parent: 75857a3 · Update README.md
---
license: mit
language:
- en
---
# SITR Dataset

This repository hosts the dataset for the Sensor-Invariant Tactile Representation (SITR) paper. The dataset supports training and evaluating models for sensor-invariant tactile representations across simulated and real-world settings.
The codebase implementing SITR is available on GitHub: [SITR Codebase](https://github.com/hgupt3/gsrl)

For more details on the underlying methods and experiments, please visit our [project website](https://hgupt3.github.io/sitr/) and read the [arXiv paper](https://arxiv.org/abs/2502.19638).

---

## Dataset Overview

The SITR dataset consists of three main parts:

1. **Simulated Tactile Dataset**
   A large-scale synthetic dataset generated using physics-based rendering (PBR) in Blender. It spans 100 unique simulated sensor configurations with tactile signals, calibration images, and corresponding surface normal maps. It includes 10K unique contact configurations generated from 50 high-resolution 3D meshes of common household objects, yielding a pre-training dataset of 1M samples.

2. **Real-World Tactile Dataset – Classification**
   Data collected from 7 real sensors (including variations of GelSight Mini, GelSight Hex, GelSight Wedge, and DIGIT). For the classification task, 20 objects are pressed against each sensor at various poses and depths, accumulating 1K tactile images per object (140K images in total, 20K per sensor). Only 16 of the objects are used in our classification experiments; the remaining items were deemed unsuitable, a decision made before experimentation. The dataset is provided as separate train (80%) and test (20%) sets.

3. **Real-World Tactile Dataset – Pose Estimation**
   For pose estimation, tactile signals are recorded using a modified Ender-3 Pro 3D printer equipped with 3D-printed indenters. This setup provides accurate ground truth (x, y, z coordinates) for contact points. Data were collected for 6 indenters across 4 sensors, resulting in 1K samples per indenter (24K images in total, 6K per sensor). This dataset is also organized into train and test sets.

---

## Download and Setup

### Simulated Tactile Dataset

The simulated dataset is split into two parts due to its size:

- `renders_part_aa.zip`
- `renders_part_ab.zip`

**To merge and unzip:**

1. **Merge the parts into a single zip file:**

   ```bash
   cat renders_part_aa.zip renders_part_ab.zip > renders.zip
   ```

2. **Unzip the merged file:**

   ```bash
   unzip renders.zip -d your_desired_directory
   ```

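Before extracting, you may want to confirm the concatenated archive is intact. A minimal sketch using Python's standard `zipfile` module (`verify_archive` and the example path are illustrative, not part of the official tooling):

```python
import zipfile

def verify_archive(path: str) -> bool:
    """Return True if the zip archive opens and every member's CRC checks out."""
    try:
        with zipfile.ZipFile(path) as zf:
            # testzip() returns the name of the first corrupt member, or None.
            return zf.testzip() is None
    except zipfile.BadZipFile:
        return False

# Example (substitute the path where you merged the parts):
# verify_archive("renders.zip")
```

If this returns `False`, re-download the parts and repeat the `cat` step before unzipping.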
### Real-World Datasets (Classification & Pose Estimation)

The real-world tactile datasets for classification and pose estimation are provided as separate zip files. Each zip file contains two directories:

- `train_set/`
- `test_set/`

Simply unzip them in your desired directory:

```bash
unzip classification_dataset.zip -d your_desired_directory
unzip pose_dataset.zip -d your_desired_directory
```

---

## File Structure

Below are examples of the directory trees for each dataset type.

### 1. Simulated Tactile Dataset

```
data_root/
├── sensor_0000/
│   ├── calibration/   # Calibration images
│   │   ├── 0000.png   # Background image
│   │   ├── 0001.png
│   │   └── ...
│   ├── samples/       # Tactile sample images
│   │   ├── 0000.png
│   │   ├── 0001.png
│   │   └── ...
│   ├── dmaps/         # (Optional) Depth maps
│   │   ├── 0000.npy
│   │   └── ...
│   └── norms/         # (Optional) Surface normals
│       ├── 0000.npy
│       └── ...
├── sensor_0001/
└── ...
```

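Given this layout, pairing each tactile sample with its surface-normal map is a matter of matching file stems across sibling directories. An illustrative sketch using only `pathlib` (the helper name is hypothetical, not from the SITR codebase):

```python
from pathlib import Path

def index_simulated(data_root):
    """Pair each tactile sample with its surface-normal map, when present.

    Returns a list of (sensor_name, sample_png, norm_npy_or_None) tuples,
    following the directory layout shown above.
    """
    pairs = []
    for sensor_dir in sorted(Path(data_root).glob("sensor_*")):
        norms = sensor_dir / "norms"
        for sample in sorted((sensor_dir / "samples").glob("*.png")):
            # Normals share the sample's zero-padded index, e.g. 0000.npy.
            norm = norms / (sample.stem + ".npy")
            pairs.append((sensor_dir.name, sample, norm if norm.exists() else None))
    return pairs
```

Since `dmaps/` and `norms/` are optional, the third tuple element is `None` for samples without a normal map.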
### 2. Real-World Classification Dataset

Each of the `train_set/` and `test_set/` directories follows this structure:

```
train_set/ (or test_set/)
├── sensor_0000/
│   ├── calibration/   # Calibration images
│   └── samples/       # Organized by class
│       ├── class_0000/
│       │   ├── 0000.png
│       │   └── ...
│       ├── class_0001/
│       │   ├── 0000.png
│       │   └── ...
│       └── ...
├── sensor_0001/
└── ...
```

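Because labels are encoded in the `class_XXXX` directory names, a labeled index for one sensor can be built directly from the tree. A sketch under that assumption (the function name and default sensor are illustrative):

```python
from pathlib import Path

def index_classification(split_dir, sensor="sensor_0000"):
    """List (image_path, class_index) pairs for one sensor in a split directory.

    Class indices are parsed from the class_XXXX directory names, so
    class_0003 yields label 3.
    """
    items = []
    samples = Path(split_dir) / sensor / "samples"
    for class_dir in sorted(samples.glob("class_*")):
        label = int(class_dir.name.split("_")[1])
        for img in sorted(class_dir.glob("*.png")):
            items.append((img, label))
    return items
```

Run it once on `train_set/` and once on `test_set/` to get the 80/20 split described above.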
### 3. Real-World Pose Estimation Dataset

Similarly, each of the `train_set/` and `test_set/` directories is structured as follows:

```
train_set/ (or test_set/)
├── sensor_0000/
│   ├── calibration/   # Calibration images
│   ├── samples/       # Tactile sample images
│   │   ├── 0000.png
│   │   ├── 0001.png
│   │   └── ...
│   └── locations/     # Pose/location data
│       ├── 0000.npy
│       ├── 0001.npy
│       └── ...
├── sensor_0001/
└── ...
```

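Here each `samples/NNNN.png` has a matching `locations/NNNN.npy`; assuming the `.npy` file holds the ground-truth (x, y, z) contact coordinates described above, the pairs can be read with NumPy. An illustrative sketch (function name and default sensor are placeholders):

```python
from pathlib import Path
import numpy as np

def load_pose_samples(split_dir, sensor="sensor_0000"):
    """Yield (image_path, xyz) pairs for one sensor in a split directory.

    Matches samples/NNNN.png with locations/NNNN.npy by file stem and
    loads the stored (x, y, z) contact coordinates.
    """
    sensor_dir = Path(split_dir) / sensor
    for img in sorted((sensor_dir / "samples").glob("*.png")):
        loc = sensor_dir / "locations" / (img.stem + ".npy")
        yield img, np.load(loc)
```

The generator keeps memory use low for the 24K-image dataset, since locations are loaded one sample at a time.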
---

## Citation

If you use this dataset in your research, please cite:

```bibtex
@misc{gupta2025sensorinvarianttactilerepresentation,
      title={Sensor-Invariant Tactile Representation},
      author={Harsh Gupta and Yuchen Mo and Shengmiao Jin and Wenzhen Yuan},
      year={2025},
      eprint={2502.19638},
      archivePrefix={arXiv},
      primaryClass={cs.RO},
      url={https://arxiv.org/abs/2502.19638},
}
```

---

## License

This dataset is licensed under the MIT License. See the LICENSE file for details.

If you have any questions or need further clarification, please feel free to reach out.