An implementation of GelSight Wedge.
Tests passed on Ubuntu 18.04 and Windows 10. 🚩
python>=3.7
opencv-python
opencv4.x (C++ Version)
pybind11
pyyaml
argparse
glob
numpy
matplotlib
scipy
scikit-image (skimage)
open3d
Visual Studio 2022
We connect the camera module to a Raspberry Pi Zero for image capture.
We used surface-mounted RGB LEDs with a 120-degree viewing angle as the light source. Three LED arrays (each containing 2 LEDs) are circularly distributed under the silicone.
- We 3D printed the mould (with resin) and laser-cut the acrylic sheet (placed at the bottom of the mould) to manufacture the transparent silicone base.
- We use Solaris (part A and part B, Shore hardness 15 A) and Slacker (used to increase softness) from Smooth-On® to produce the transparent elastomeric base. A ratio of 1:1:3 (part A : part B : Slacker) has proven ideal for an elastomeric base with the appropriate hardness. The mixture is then degassed and cured for 12 hours.
We painted markers on the top surface of the transparent base with Silc-Pig (black silicone colorant). The distance between neighboring markers is around 1 mm.
- We take a small amount of aluminum powder and spread it evenly over the black markers.
- We use aluminum powder, Psycho Paint (part A and part B), and Novocs Matte (silicone thinner) to produce the reflective membrane. A ratio of 1:5:5:30 (aluminum powder : part A : part B : Novocs) has proven ideal for making a suitable membrane. The mixture is degassed and then sprayed on top of the transparent base surface. The membrane is cured for 4 hours.
- `camid`: used for `cv2.VideoCapture(camid)`.
- `sample: from`: used for `{data_path}/sample_{sample_from}.jpg`.
- `sample: to`: used for `{data_path}/sample_{sample_to}.jpg`.
- `data_path`: used for `{data_path}`, the directory where captured images are saved.
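The parameters above presumably live in a YAML file (PyYAML is in the requirements). Below is a rough sketch of how they could be read and used; the file name `config.yaml` and the exact nesting are assumptions for illustration, not taken from the repository.

```python
import cv2
import yaml

# Assumed layout of config.yaml (file name and nesting are guesses, for illustration):
#   camid: 0
#   data_path: ./data
#   sample:
#     from: 0
#     to: 30
with open("config.yaml") as f:
    cfg = yaml.safe_load(f)

cap = cv2.VideoCapture(cfg["camid"])        # camid is passed to cv2.VideoCapture
data_path = cfg["data_path"]                # directory for ref.jpg / sample_xx.jpg
sample_from, sample_to = cfg["sample"]["from"], cfg["sample"]["to"]

ok, frame = cap.read()
if ok:
    cv2.imwrite(f"{data_path}/sample_{sample_from}.jpg", frame)
cap.release()
```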
`python pref_ref_and_sample.py -r -s`
- `-r` or `--ref`: for capturing `{data_path}/ref.jpg`. Click the left mouse button to take `ref.jpg`; you can click more than once until you are satisfied. Then press `q` to exit or continue.
- `-s`: for capturing `{data_path}/sample_xx.jpg`. Click the left mouse button to take `sample_xx.jpg`. Once `sample_to - sample_from` pictures have been captured, it terminates automatically, or you can manually press `q` to exit in advance or continue.
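A minimal sketch of the click-to-capture loop described above, using an OpenCV mouse callback and `q` to quit. This is not the actual `pref_ref_and_sample.py`; the window name and file naming are assumptions for illustration.

```python
import cv2

cap = cv2.VideoCapture(0)                 # replace 0 with your camid
state = {"frame": None, "count": 0}

def on_mouse(event, x, y, flags, param):
    # A left click saves the current frame; ref.jpg and sample_xx.jpg work the same way.
    if event == cv2.EVENT_LBUTTONDOWN and param["frame"] is not None:
        cv2.imwrite(f"sample_{param['count']:02d}.jpg", param["frame"])
        param["count"] += 1

cv2.namedWindow("capture")
cv2.setMouseCallback("capture", on_mouse, state)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    state["frame"] = frame
    cv2.imshow("capture", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to exit
        break

cap.release()
cv2.destroyAllWindows()
```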
Click once on the first keypoint, then once on the second keypoint; you will see an arrow linking the two keypoints together with their distance, e.g. distance = 103.07764.
In line 13 of `calibration.py`, set `self.Pixmm =` accordingly.
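Assuming `Pixmm` is the millimeter-per-pixel ratio (an assumption about the calibration code, not stated above), it can be computed from the known physical distance between the two clicked keypoints and the pixel distance reported by the tool:

```python
# Illustrative numbers only: suppose the two clicked keypoints are known to be
# 4 mm apart and the tool printed distance = 103.07764 (pixels).
known_distance_mm = 4.0
pixel_distance = 103.07764
Pixmm = known_distance_mm / pixel_distance   # about 0.0388 mm per pixel
# Then write this value into line 13 of calibration.py (self.Pixmm = ...).
```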
Get reconstruction result.
Get tracking result.
🔨Coming soon!🔨 Thin plate spline for inpainting of the markers?
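Thin-plate-spline inpainting is not implemented here yet; the sketch below only illustrates one possible approach, filling marker pixels with SciPy's thin-plate-spline RBF interpolator. The function name, the dark-pixel threshold, and the choice of `RBFInterpolator` are assumptions, not the planned implementation.

```python
import cv2
import numpy as np
from scipy.interpolate import RBFInterpolator

def inpaint_markers_tps(img, marker_mask, neighbors=32):
    """Fill marker pixels by thin-plate-spline interpolation of the surrounding pixels."""
    out = img.astype(np.float64).copy()
    known = np.argwhere(~marker_mask)        # (row, col) of non-marker pixels
    unknown = np.argwhere(marker_mask)       # (row, col) of marker pixels
    # A local neighborhood keeps each TPS system small; a full global solve would be too slow.
    interp = RBFInterpolator(known, out[~marker_mask],
                             neighbors=neighbors, kernel="thin_plate_spline")
    out[marker_mask] = interp(unknown)
    return np.clip(out, 0, 255).astype(np.uint8)

# Example usage: treat sufficiently dark pixels as markers (the threshold 60 is a guess).
img = cv2.imread("sample_00.jpg")
mask = img.mean(axis=2) < 60
clean = inpaint_markers_tps(img, mask)
```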
Use a model to map color to gradients.
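The referenced heightmap-reconstruction code maps pixel colors to surface gradients and then integrates the gradient field. The sketch below only covers the integration half, a Frankot-Chellappa-style FFT Poisson solve; `predict_gradients` is a hypothetical placeholder for the calibrated color-to-gradient model, not an API of this repository.

```python
import numpy as np

def integrate_gradients(gx, gy):
    """Integrate a gradient field (gx, gy) into a heightmap (Frankot-Chellappa)."""
    h, w = gx.shape
    u = np.fft.fftfreq(w) * 2 * np.pi       # spatial frequencies along x (columns)
    v = np.fft.fftfreq(h) * 2 * np.pi       # spatial frequencies along y (rows)
    U, V = np.meshgrid(u, v)
    GX, GY = np.fft.fft2(gx), np.fft.fft2(gy)
    denom = U**2 + V**2
    denom[0, 0] = 1.0                       # avoid division by zero at the DC term
    Z = (-1j * U * GX - 1j * V * GY) / denom
    Z[0, 0] = 0.0                           # height is only recovered up to a constant
    return np.real(np.fft.ifft2(Z))

# Hypothetical usage; predict_gradients stands in for your trained color-to-gradient model.
# gx, gy = predict_gradients(sample_img, ref_img)
# heightmap = integrate_gradients(gx, gy)
```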
We used https://github.com/siyuandong16/gelsight_heightmap_reconstruction for calibration and heightmap reconstruction, and https://github.com/GelSight/tracking for tracking.
https://arxiv.org/abs/2106.08851
https://github.com/siyuandong16/gelsight_heightmap_reconstruction
https://github.com/GelSight/tracking
https://tutorial.cytron.io/2020/12/29/raspberry-pi-zero-usb-webcam/