
Partial Scanning Transmission Electron Microscopy

DOI

This repository is for the paper "Partial Scanning Transmission Electron Microscopy with Deep Learning". Supplementary information is available here, and Bethany Connolly published a summary in Towards Data Science.

Our repository contains TensorFlow code for a multi-scale generative adversarial network that completes 512x512 electron micrographs from partial scans. For spiral scans selected by a binary mask with 1/17.9 px coverage and non-adversarial pre-training, it has a 3.8% root mean square intensity error.
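As a concrete illustration of these two figures, the sketch below computes a scan's coverage from its binary mask and the root mean square intensity error of a completion against its ground truth. It uses random arrays in place of real micrographs and standard metric definitions; it is not taken from the repository's code.

```python
import numpy as np

def rms_intensity_error(completion, truth):
    """Root mean square error between a completed scan and its ground truth.

    Both images are assumed to be normalized to the [0, 1] intensity range,
    so the error can be read as a fraction of the full intensity scale.
    """
    return float(np.sqrt(np.mean((completion - truth) ** 2)))

def coverage(mask):
    """Fraction of pixels selected by a binary scan mask."""
    return float(np.count_nonzero(mask)) / mask.size

# Illustrative 512x512 example: random arrays stand in for a real micrograph,
# and a random mask stands in for a spiral scan path.
rng = np.random.default_rng(0)
truth = rng.random((512, 512))
completion = np.clip(truth + rng.normal(scale=0.038, size=truth.shape), 0, 1)
mask = rng.random((512, 512)) < 1 / 17.9

print(f"RMS intensity error: {rms_intensity_error(completion, truth):.3f}")
print(f"Coverage: 1/{1 / coverage(mask):.1f} px")
```

With a real spiral mask, the same `coverage` function reproduces the 1/17.9 px figure quoted above.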

Examples show adversarial and non-adversarial completions of test set 512×512 1/20 coverage blurred spiral partial scans. Adversarial completions have realistic noise characteristics and colouration, whereas non-adversarial completions are blurry. The bottom row shows a failure case where detail is too fine for the generator to resolve. Enlarged 64×64 regions from the top left of each image are inset to ease comparison.

A set of directories for spiral scans selected with binary masks is in pstem. Coverages are listed in notes.txt files. Each directory contains source code, notes, and script variants used to calculate test set performances and create sheets of examples.

A set of directories for systematic error experiments is in systematic_errors.

Architecture

Our training configuration can be partitioned into six subnetworks: an inner generator, an outer generator, an inner generator trainer, and small, medium, and large scale discriminators. Only the generators are needed for inference.

The inner generator produces large-scale features from its inputs. A trainer network maps these features to half-size completions, and the outer generator recombines them with the input to generate full-size completions. Multiple discriminators assess multi-scale crops from input images and full-size completions.
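The multi-scale assessment can be pictured as taking square crops of several sizes from each image, one per discriminator. The sketch below extracts one random crop per scale; the crop sizes here are illustrative stand-ins, not the values used in the repository.

```python
import numpy as np

def multi_scale_crops(image, sizes=(70, 140, 280), seed=0):
    """Extract one random square crop per scale from a 2D image.

    The sizes stand in for the small, medium, and large scales assessed by
    the three discriminators; the actual crop sizes may differ.
    """
    rng = np.random.default_rng(seed)
    crops = []
    for s in sizes:
        y = rng.integers(0, image.shape[0] - s + 1)
        x = rng.integers(0, image.shape[1] - s + 1)
        crops.append(image[y:y + s, x:x + s])
    return crops

completion = np.zeros((512, 512), dtype=np.float32)
for crop in multi_scale_crops(completion):
    print(crop.shape)
```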

Example Usage

This short script is available as inference.py and gives an example of inference where the generator is loaded once to complete multiple scans:

from inference import Generator, get_example_scan, disp

#Use get_example_scan to select an example partial scan, ground truth pair from the project repository
#Try replacing this with your own (partial scan, ground truth) pair!
partial_scan, truth = get_example_scan() #Uses one of the examples from this repo

#Initialize generator so it's ready for repeated use
my_ckpt_dir = "path/to/model/checkpoint/" #Replace with path to your checkpoint
gen = Generator(ckpt_dir=my_ckpt_dir)

#Complete the scan
complete_scan = gen.infer(partial_scan)

#The generator can be reused multiple times once it has been initialised
# ... 

#Display results
disp(partial_scan) #Partial scan to be completed
disp(truth) #Ground truth
disp(complete_scan) #Scan completed by neural network

Download

Training and inference scripts can be downloaded or cloned from this repository:

git clone https://github.com/Jeffrey-Ede/partial-STEM.git
cd partial-STEM

Dependencies

This neural network was trained using TensorFlow and requires it along with other common Python libraries. Most of these come with modern Python distributions by default; any that are missing can be installed with pip or another package manager. We used Python 3.6.

  • tensorflow
  • numpy
  • opencv-python (imported as cv2)
  • scipy
  • six
  • Pillow (imported as PIL; provides Image)

The remaining imports (functools, itertools, collections, os, argparse, random, and time) are part of the Python standard library and need no installation.

Training

To continue training the neural network, either from scratch or to fine-tune it, you will need to adjust some of the variables at the top of train.py: specifically, the variables indicating the location of your datasets and the locations to save logs and checkpoints to. Note that there may be minor differences between the script and the paper due to active development.
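The variables to adjust might look like the following. The names and paths here are illustrative stand-ins, not the actual identifiers used in train.py:

```python
# Illustrative stand-ins for the path variables near the top of train.py;
# the actual variable names in the script may differ.
data_dir = "path/to/training/crops/"      # location of the 512x512 STEM crops
model_dir = "path/to/model/checkpoints/"  # where checkpoints are saved
log_dir = "path/to/tensorboard/logs/"     # where training logs are written
```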

The last saved checkpoint for a fully trained 1/20 px coverage system of neural networks is available here. A 1/40 px coverage model is also available. Both networks were trained on artificially noisy scans.

Training Data

A training dataset with 161,069 non-overlapping 512x512 crops from STEM images is available here.

Misc Scripts

Python scripts used to create some of the images in our paper are in the misc folder.

Contact

Jeffrey M. Ede: [email protected]
Richard Beanland: [email protected]

