Training, testing and inference with multi-input multi-output LSTM-based Recurrent Neural Networks for trajectory forecasting
To use the training, testing and inference scripts and utility tools, you need to create/clone a dedicated Anaconda environment containing the predefined packages (e.g. tensorflow-gpu). If you do not already have Anaconda installed on your machine, please follow the installation steps available at: https://www.digitalocean.com/community/tutorials/how-to-install-anaconda-on-ubuntu-18-04-quickstart
Once you have successfully installed Anaconda, create a new environment from the provided environment.yaml
file. Then, activate your newly created environment.
# Create a new environment based on `environment.yaml` file
conda env create -f environment.yaml
# Activate your recently created environment
conda activate <my_environment>
CLI for training path-prediction LSTM-based neural models.
train_lstm.py [-h] dataset output config
positional arguments:
dataset Path to input train-test dataset in .hdf5 format
output Path to the output directory where resulting models, graphs and history results are saved.
config Path to configuration file used for training.
optional arguments:
-h, --help show this help message and exit
python3 train_lstm.py path/to/training_lstm_dataset.hdf5 path/to/output/directory config_lstm.json
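The last argument is a JSON configuration file. The exact schema is defined by the project; the sketch below only illustrates the kind of hyperparameters such a file typically holds. All key names and values here are illustrative assumptions, not the actual schema read by train_lstm.py:

```json
{
  "batch_size": 64,
  "epochs": 100,
  "learning_rate": 0.001,
  "lstm_units": 128,
  "observation_length": 8,
  "prediction_horizon": 12
}
```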
Command line tool for making path predictions using trained models.
evaluate_lstm.py [-h] dataset output config
positional arguments:
dataset Path to input test dataset in .hdf5 format
output Path to the output directory where resulting graphs and prediction results are saved.
config Path to configuration file used for testing.
optional arguments:
-h, --help show this help message and exit
python3 evaluate_lstm.py path/to/test_lstm_dataset.hdf5 path/to/trained/model/output/directory config_lstm.json
Command line utility to create a training dataset in HDF5 format for the track recurrent neural network from CSV files created by the classifier + Kalman filter tracker.
dataset-creator [-h] [-a] input_directory output
positional arguments:
input_directory Path to a directory containing the CSV files from which tracks are extracted.
output Path of the output training dataset file with .hdf5 extension.
optional arguments:
-h, --help show this help message and exit
-a, --append Flag to indicate that an existing HDF5 file can be used and new datasets should be appended to it.
python3 dataset-creator path/to/raw/classifier/detection/files/ path/to/output_raw_dataset.hdf5
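Conceptually, the tool groups the tracker's per-detection CSV rows into per-track coordinate sequences. The minimal sketch below illustrates this grouping step only; the column names ('track_id', 'x', 'y') are assumptions, and the real files produced by the classifier + Kalman filter tracker may use a different layout:

```python
import csv
from collections import defaultdict

def extract_tracks(csv_path):
    """Group tracker detections into per-track (x, y) sequences.

    Assumes each CSV row carries 'track_id', 'x' and 'y' columns
    (hypothetical names); rows are appended in file order, so each
    track's sequence preserves its temporal ordering.
    """
    tracks = defaultdict(list)
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            tracks[row["track_id"]].append((float(row["x"]), float(row["y"])))
    return dict(tracks)
```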
Command line tool for generating train and test sets for LSTM-based models from an input raw dataset created using Raw Dataset Creator tool.
dataset_transformer.py [-h] raw_dataset out_dataset config
positional arguments:
raw_dataset Path to input raw dataset in .h5 format
out_dataset Path to output train/test dataset in .h5 format
config Path to configuration file used for dataset transformer tool.
optional arguments:
-h, --help show this help message and exit
python3 dataset_transformer.py path/to/raw_dataset.hdf5 path/to/output/train_test_dataset.hdf5 config_transformer.json
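The transformer is driven by a JSON configuration file. A sketch of what such a file might contain is shown below; every key name and value is a hypothetical example, not the actual schema consumed by dataset_transformer.py:

```json
{
  "train_test_split": 0.8,
  "sequence_length": 20,
  "normalize": true,
  "shuffle": true
}
```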
Command line tool for merging two HDF5 datasets already generated by dataset_transformer.py.
dataset_merger.py [-h] input_dataset_1 input_dataset_2 out_dataset
positional arguments:
input_dataset_1 Full path to first input dataset in .hdf5 format to merge
input_dataset_2 Full path to second input dataset in .hdf5 format to merge
out_dataset Path to output train/test dataset in .h5 format
optional arguments:
-h, --help show this help message and exit
python3 dataset_merger.py path/to/input_dataset_1.hdf5 path/to/input_dataset_2.hdf5 path/to/output/merged_dataset.hdf5
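The merge semantics can be sketched as concatenating the matching splits of the two inputs. The snippet below models each dataset as a plain dict mapping split names to lists of samples purely for illustration; the actual dataset_merger.py operates on HDF5 groups and datasets instead:

```python
def merge_datasets(ds1, ds2):
    """Concatenate matching splits (e.g. 'train_x', 'test_x') from two
    datasets, modelled here as dicts of lists.

    Both datasets must expose the same split names; the merged dataset
    contains the samples of ds1 followed by those of ds2 per split.
    """
    if set(ds1) != set(ds2):
        raise ValueError("datasets must contain the same splits to be merged")
    return {split: ds1[split] + ds2[split] for split in ds1}
```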
The keras_to_tensorflow.py tool is a CLI that converts a trained Keras model into a ready-for-inference TensorFlow protobuf (.pb) model.
- In the default behaviour, this tool freezes the nodes (converts all TF variables to TF constants) and saves the inference graph and weights into a binary protobuf (.pb) file. During freezing, TensorFlow also applies node pruning, which removes nodes with no contribution to the output tensor.
- This tool supports multiple-output networks and enables the user to rename the output tensors via the --output_nodes_prefix flag.
- If the --output_meta_ckpt flag is set, the checkpoint and metagraph files for TensorFlow are also exported; these can later be used with the tf.train.Saver class to continue training.
Keras models can be saved as a single file (.hdf5 or .h5) that stores both the architecture and the weights, using the model.save() function.
Such a model can then be converted to a TensorFlow model by calling this tool as follows:
python keras_to_tensorflow.py \
    --input_model="path/to/keras/model.h5" \
    --output_model="path/to/save/model.pb"
Keras models can also be saved in two separate files: a .hdf5 or .h5 file storing the weights, written with the model.save_weights() function, and a .json file storing the network architecture, written with the model.to_json() function.
In this case, the model can be converted as follows:
python keras_to_tensorflow.py \
    --input_model="path/to/keras/model.h5" \
    --input_model_json="path/to/keras/model.json" \
    --output_model="path/to/save/model.pb"
Try python keras_to_tensorflow.py --help to learn about other supported flags (quantize, output_nodes_prefix, save_graph_def).
Dependencies:
- keras
- tensorflow
- absl
- pathlib