My Portfolio

This project uses deep neural networks to predict 3D contact geometry from monocular images captured by a GelSight tactile sensor. Vision-based tactile sensors, inspired by the biological sense of touch, record fine surface detail by imaging a deformable gel as it presses against an object. Our objective was to turn those raw images into local heightmaps and contact areas: we designed and trained a neural network that reconstructs 3D geometry directly from tactile data. The process involved creating a custom dataset, preprocessing the raw sensor images, and constructing a network with both a coarse and a fine prediction stage, trained with supervised learning to produce depth maps and contact predictions. This improves a robot's ability to interpret tactile data, and builds on vision-based tactile sensors such as GelSight and DIGIT to advance tactile depth prediction in robotics.
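To make the coarse-to-fine idea concrete, here is a minimal PyTorch sketch of the kind of architecture and training step described above. It is illustrative only, not the project's actual code: the class and function names (`CoarseFineTactileNet`, `training_step`), layer sizes, and the choice of L1 loss for heightmaps plus binary cross-entropy for contact masks are all assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CoarseFineTactileNet(nn.Module):
    """Two-stage sketch: a coarse stage regresses a low-resolution heightmap;
    a fine stage refines it from the input image plus the upsampled coarse map.
    All layer sizes are illustrative."""

    def __init__(self):
        super().__init__()
        # Coarse stage: downsample the tactile image, predict a rough heightmap.
        self.coarse = nn.Sequential(
            nn.Conv2d(3, 32, 5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(32, 64, 5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(64, 1, 3, padding=1),
        )
        # Fine stage: concatenate the image with the upsampled coarse heightmap
        # and predict a full-resolution heightmap plus a contact-mask logit.
        self.fine = nn.Sequential(
            nn.Conv2d(3 + 1, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 2, 3, padding=1),  # channel 0: height, channel 1: contact
        )

    def forward(self, img):
        coarse = self.coarse(img)                            # (B, 1, H/4, W/4)
        coarse_up = F.interpolate(coarse, size=img.shape[-2:],
                                  mode="bilinear", align_corners=False)
        out = self.fine(torch.cat([img, coarse_up], dim=1))  # (B, 2, H, W)
        height, contact_logit = out[:, :1], out[:, 1:]
        return coarse_up, height, contact_logit


def training_step(model, img, gt_height, gt_contact, optimizer):
    """One supervised step: L1 on both heightmap predictions, BCE on contact."""
    coarse_up, height, contact_logit = model(img)
    loss = (F.l1_loss(coarse_up, gt_height)
            + F.l1_loss(height, gt_height)
            + F.binary_cross_entropy_with_logits(contact_logit, gt_contact))
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

Supervising the coarse prediction in addition to the refined one is a common coarse-to-fine practice in depth estimation and tends to stabilize training; the actual losses and stage depths used in the project may differ.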