
eit_touch_sensor's Introduction

EIT Touch Sensor

This is my UROP (Undergraduate Research) project in the Bio-Inspired Robotics Lab at the University of Cambridge.

I investigate the use of electrical impedance tomography (EIT) as a method of determining touch inputs on robot skins (e-skins). Link to paper here. I focus on multi-touch applications in non-circular geometries, for which the traditional inverse-EIT algorithms are less accurate, and I instead use a machine learning approach.

End-goal

To develop a robot e-skin which can be pressed with up to six fingers and which determines the location(s) of the presses without using any sensors embedded within the skin.

Initial Setup

two_finger_training_short.mp4

Setup

A sensorised gelatin hydrogel is cast into a laser-cut rectangular frame. Electrodes in contact with the skin are connected to a microcontroller (ATtiny AVR) which measures impedances at high frequency.
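The issue threads below mention a 16-electrode adjacent-excitation measurement mode. As an illustration only (not the actual firmware in this repository), the standard adjacent drive/measure schedule for such a setup can be sketched as:

```python
def adjacent_patterns(n_electrodes: int = 16):
    """Generate (drive, measure) electrode pairs for the standard
    adjacent (neighbouring-electrode) EIT stimulation pattern.

    Current is driven through each adjacent pair in turn, and the
    voltage is measured across every other adjacent pair that does
    not share an electrode with the drive pair."""
    patterns = []
    for d in range(n_electrodes):
        drive = (d, (d + 1) % n_electrodes)
        for m in range(n_electrodes):
            measure = (m, (m + 1) % n_electrodes)
            # skip measurement pairs that touch a drive electrode
            if set(drive) & set(measure):
                continue
            patterns.append((drive, measure))
    return patterns

pats = adjacent_patterns(16)
print(len(pats))  # 16 drives x (16 - 3) measurements = 208 readings
```

Each full EIT frame therefore yields n(n-3) four-terminal voltage readings, which become one row of the training matrix.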

A set of plastic fingers is 3D printed and connected to an Arduino-driven servo motor via two gear-and-track pairs, providing relative transverse and vertical motion. This assembly is attached to a robot arm (Universal Robots UR5, 5 kg max payload).

The robot and microcontrollers are programmed (in Python and C++) to press the skin at a large number of random locations using two fingers at random separation distances and orientations; the EIT experiment is performed during each press. The UR5 is a linkage of several rigid arms, giving six degrees of freedom. The robot uses its own coordinate system, defined relative to its base, as follows:
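The random press parameters above (centre location, separation, orientation) can be sampled as in this minimal sketch; the skin dimensions and separation range are illustrative assumptions, not the real rig's values:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_two_finger_press(width=80.0, height=60.0,
                            sep_range=(10.0, 40.0)):
    """Sample one random two-finger press: a centre point inside the
    rectangular skin, a separation distance, and an orientation.
    Returns the two fingertip (x, y) positions in skin coordinates.
    All dimensions (mm) are illustrative assumptions."""
    cx, cy = rng.uniform(0, width), rng.uniform(0, height)
    sep = rng.uniform(*sep_range)          # finger separation
    theta = rng.uniform(0, np.pi)          # press orientation
    dx = 0.5 * sep * np.cos(theta)
    dy = 0.5 * sep * np.sin(theta)
    return (cx - dx, cy - dy), (cx + dx, cy + dy)
```

The two returned points would then be transformed into the robot's base coordinate system before commanding the press.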

Robot coordinates

The EIT readings (voltages relative to an untouched baseline) and the true positions (encoded as binary displacement maps) form the $(\mathbf{X}, \mathbf{y})$ dataset used to train a series of machine learning models: regressions and neural networks.
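One training pair can be assembled as in the following sketch: the input is the baseline-subtracted voltage vector, and the target is a binary map marking which grid cells were pressed. The grid resolution and skin dimensions here are illustrative assumptions:

```python
import numpy as np

def make_sample(v_pressed, v_baseline, press_points,
                grid=(16, 12), skin_size=(80.0, 60.0)):
    """Build one (x, y) training pair.

    x: voltage change relative to the untouched baseline.
    y: flattened binary displacement map with 1s in pressed cells.
    Grid and skin dimensions (mm) are illustrative, not the real values."""
    x = np.asarray(v_pressed) - np.asarray(v_baseline)
    y = np.zeros(grid, dtype=np.uint8)
    for px, py in press_points:
        i = min(int(px / skin_size[0] * grid[0]), grid[0] - 1)
        j = min(int(py / skin_size[1] * grid[1]), grid[1] - 1)
        y[i, j] = 1
    return x, y.ravel()
```

Subtracting the baseline makes the input robust to slow drift in the hydrogel's resting impedance, which is also the subject of the "baseline calibration" issue below.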

Once an optimally trained model is found, each new press on the skin triggers the EIT experiment; the model predicts the press position, which is then displayed on the computer screen.
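The train-then-predict flow can be sketched end to end with a simple linear regression on synthetic stand-in data (the measurement count and grid size are assumptions carried over from the sketches above, not the project's actual values):

```python
import numpy as np

# Illustrative only: fit a linear map from baseline-subtracted voltage
# readings X to flattened binary displacement maps Y, then predict the
# press map for a new reading. Data here is synthetic, not the dataset
# collected by the robot.
rng = np.random.default_rng(0)
n_meas, grid = 208, (16, 12)                # assumed sizes
X = rng.normal(size=(200, n_meas))          # 200 synthetic presses
Y = (rng.random((200, grid[0] * grid[1])) > 0.97).astype(float)

W, *_ = np.linalg.lstsq(X, Y, rcond=None)   # least-squares weights

new_reading = rng.normal(size=(1, n_meas))  # one fresh EIT frame
pred_map = (new_reading @ W).reshape(grid)
pressed = np.argwhere(pred_map > 0.5)       # thresholded press cells
```

A neural network would replace the single weight matrix with a learned nonlinear map, but the surrounding pipeline (measure, subtract baseline, predict, threshold, display) stays the same.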

An initial idea was to use COMSOL to simulate the EIT process and gather the training data automatically; however, the interplay between stress, conductivity and displacement in these soft composite materials was deemed too complex to simulate reliably, so the real robot was used instead.

As a demonstration, Braille (the tactile writing system used by blind readers) can be pressed into the skin to find the positions of the dots, which are then passed to another algorithm to identify the letter they represent, allowing the skin to read Braille.
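The final letter-identification step reduces to a lookup once the dots in a cell are localised. A minimal sketch, using the standard Braille dot numbering (1-3 down the left column, 4-6 down the right) and only a small subset of the alphabet; the repository's actual decoder may differ:

```python
# Standard Braille cells for a handful of letters, keyed by the set of
# raised dot numbers. Dots 1,2,3 run down the left column and 4,5,6
# down the right. Only a subset is shown for illustration.
BRAILLE = {
    frozenset({1}): "a",        frozenset({1, 2}): "b",
    frozenset({1, 4}): "c",     frozenset({1, 4, 5}): "d",
    frozenset({1, 5}): "e",     frozenset({1, 2, 4}): "f",
}

def decode_cell(dots):
    """Map the set of detected raised-dot numbers in one Braille cell
    to its letter, or '?' if the pattern is unknown."""
    return BRAILLE.get(frozenset(dots), "?")

print(decode_cell({1, 4}))  # -> "c"
```

In the demo, the predicted press positions are first snapped to the nearest of the six dot locations in the cell before this lookup.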

Supervised by and worked with: https://github.com/DSHardman/HydrogelEIT

eit_touch_sensor's People

Contributors

lorcan2440

Stargazers

piao peng

Forkers

irobot-chf

eit_touch_sensor's Issues

baseline calibration

Hello, I would like to ask how you do baseline calibration. Can you share your experience?

Data Collection Issues

Hello, sorry to bother you. I have been stuck on a problem for a long time. After reading the article you wrote, I realised it may be related to the article published by MIT, since I am studying the same problem, but my EIT equipment uses the earlier ESP32 version. After building the equipment, I used the 16-electrode adjacent-excitation measurement mode and found that all the measured data fluctuate around 0.4, with only small fluctuations, and I cannot work out where the problem lies. If you have time, could you point me in the right direction?
Thanks!
Part of the data is shown in the figure below:

EIT equipment issues

Sorry to bother you again. I don't know if you remember the first question I asked on GitHub, about the EIT_ESP32 version; I never solved that problem. I now use the same device as you, the EIT Teensy 4.0, but data collection goes wrong in the same way as with the previous version. Were you able to collect normal data the first time you built the equipment? If not, how did you solve it? Can you share your experience? Below is the data collected with the Teensy 4.0, which is obviously wrong.
Finally, I wish you all the best!
PengPiao

AD5930 signal

Hello, sorry to disturb you again. I recently measured the sinusoidal differential signal output of the AD5930 with an oscilloscope and found the output signal very noisy. How did you solve this problem?
