# Laser-stripe-detection-neural-network
Update on 2020/07/20: provided a model trained on our own dataset.
### TODO
- Support different backbones
- Multi-GPU training
| Backbone  | train/eval epoch | mIoU in val |
| --------- | ---------------- | ----------- |
| ResNet    | 300              | 74.31%      |
| MobileNet | 300              | 70.04%      |
| DRN       | 300              | 63.14%      |
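The mIoU reported above is the mean intersection-over-union averaged over classes. As a reference, here is a minimal sketch of how it can be computed from flat class-index predictions and labels (this is an illustration, not the repo's evaluation code):

```python
import numpy as np

def mean_iou(pred, label, num_classes):
    """Mean intersection-over-union from class-index arrays."""
    ious = []
    for c in range(num_classes):
        inter = np.logical_and(pred == c, label == c).sum()
        union = np.logical_or(pred == c, label == c).sum()
        if union > 0:  # skip classes absent from both pred and label
            ious.append(inter / union)
    return float(np.mean(ious))

# Toy example: 2 classes on a 2x4 image.
pred  = np.array([[0, 0, 1, 1],
                  [0, 1, 1, 1]])
label = np.array([[0, 0, 1, 1],
                  [0, 0, 1, 1]])
# class 0: inter=3, union=4 -> 0.75; class 1: inter=4, union=5 -> 0.80
print(mean_iou(pred, label, 2))  # 0.775
```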
### Paper
[A Robust Laser Stripe Extraction Method for Structured-Light Vision Sensing](https://www.mdpi.com/1424-8220/20/16/4544)
### Introduction
This is a PyTorch (1.4.0+cu100) implementation of the Laser-stripe-detection-neural-network (LSDNN), which eliminates the interference of reflective noise and haze noise and achieves highly robust extraction of the laser stripe region.

Results of segmentation and extraction under different kinds of noise. The left column is the input image (1920×1920 pixels). The middle column is the output of our network, where different colors denote different categories. The right column is the result of center extraction based on the segmentation; the laser stripe's center is marked in green.
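The center-extraction step described above can be sketched as a per-column centroid of the pixels the network labels as stripe. This is only an illustrative sketch (the class id and function name are assumptions, not the repo's actual code):

```python
import numpy as np

def stripe_centers(mask, stripe_class=1):
    """Per-column center of the laser stripe in a segmentation mask.

    For each image column, take the centroid (mean row index) of the
    pixels labelled as `stripe_class`; columns with no stripe pixels
    get NaN.
    """
    rows, cols = mask.shape
    centers = np.full(cols, np.nan)
    ys, xs = np.nonzero(mask == stripe_class)
    for x in range(cols):
        col_ys = ys[xs == x]
        if col_ys.size:
            centers[x] = col_ys.mean()
    return centers

mask = np.zeros((5, 3), dtype=int)
mask[1:4, 0] = 1   # stripe covers rows 1..3 in column 0 -> center 2.0
mask[2, 1] = 1     # single stripe pixel in column 1 -> center 2.0
print(stripe_centers(mask))  # column 2 has no stripe pixels -> NaN
```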
### The result of 3D reconstruction
### Dataset
The dataset we used was made by ourselves. You can download it from [baiduyun](https://pan.baidu.com/s/1FA93B6Gzby5eESHk-RnT_Q) (password: gq6u).
### More result images
You can download more result images from [baiduyun](https://pan.baidu.com/s/1TWJXz_bkGmUVgUi4qZUCIA) (password: 61uo).
### Installation
The code was tested with Anaconda and Python 3.6. After installing the Anaconda environment:

1. Clone the repo:

    ```bash
    git clone git@github.com:zhaocongyang/LSDNN.git
    cd LSDNN
    ```

2. Install dependencies:

    For the PyTorch dependency, see [pytorch.org](https://pytorch.org) for more details.

    For custom dependencies:

    ```bash
    pip install matplotlib pillow tensorboardX tqdm
    ```
### Training
Follow the steps below to train your model:

1. Configure your dataset path in `mypath.py`.

2. To train LSDNN on our dataset with ResNet as the backbone:

    ```bash
    sh train_voc.sh
    ```
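The exact contents of `mypath.py` are repo-specific; in similar DeepLab-style training code it is usually a small helper mapping dataset names to root directories. A hedged sketch (the dataset key and path below are placeholders, not the repo's actual values):

```python
# mypath.py -- illustrative sketch; dataset name and path are placeholders.
class Path(object):
    @staticmethod
    def db_root_dir(dataset):
        if dataset == 'laser':
            # Root folder of the downloaded laser-stripe dataset.
            return '/path/to/laser_dataset/'
        raise NotImplementedError('Dataset {} not available.'.format(dataset))
```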
### Testing
Follow the steps below to test your model:

1. Download the model trained by us from [baiduyun](https://pan.baidu.com/s/1h4X0UaKIeff4ZBC1oU1C2A) (password: r3k7).

2. Test the model:

    ```bash
    CUDA_VISIBLE_DEVICES=0 python test_center_gpu_sensors.py
    ```
### Demo with a single image
Follow the steps below to run a demo:

1. Configure your image path.

2. Run the demo:

    ```bash
    CUDA_VISIBLE_DEVICES=0 python single_demo.py
    ```
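The demo visualizes the network output by mapping class indices to colors (in the figures above, different colors denote different categories and the stripe center is drawn in green). A minimal sketch with an assumed two-class palette, not the repo's actual color map:

```python
import numpy as np

# Assumed palette: background black, laser stripe green; illustrative only.
PALETTE = np.array([[0, 0, 0],      # class 0: background
                    [0, 255, 0]],   # class 1: laser stripe
                   dtype=np.uint8)

def colorize(mask):
    """Map an (H, W) class-index mask to an (H, W, 3) RGB image."""
    return PALETTE[mask]

mask = np.array([[0, 1],
                 [1, 0]])
rgb = colorize(mask)
print(rgb.shape)   # (2, 2, 3)
print(rgb[0, 1])   # [  0 255   0]
```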