For the Deep Learning Challenge (Module 21), I constructed a neural network model for the non-profit foundation Alphabet Soup. Specifically, I used the features of the provided dataset, charity_data.csv, to create a binary classifier that predicts whether applicants will be successful in their ventures if granted the requested funding. All programming was done in Python on the Google Colab platform.
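A minimal sketch of what such a binary classifier can look like in TensorFlow/Keras (the layer sizes of 80 and 30 units and the feature count of 43 are illustrative assumptions, not necessarily the values used in the notebooks):

```python
import tensorflow as tf

def build_model(n_features: int) -> tf.keras.Model:
    """Sketch of a binary classifier like the one described above.

    Hidden-layer widths are assumptions for illustration only.
    """
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(80, activation="relu", input_shape=(n_features,)),
        tf.keras.layers.Dense(30, activation="relu"),
        # Single sigmoid unit -> probability that the applicant succeeds
        tf.keras.layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam",
                  loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_model(43)  # 43 is a placeholder feature count
```

After training with `model.fit(...)`, the model can be saved to an HDF5 file with `model.save("AlphabetSoupCharity.h5")`, which is how the .h5 files listed below are produced.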
In this Deep-Learning-Challenge repository, you will find the following:
The initial neural network model construction program and its saved model (.h5) file:
AlphabetSoupCharity.ipynb
AlphabetSoupCharity.h5
Four optimization model programs and their respective saved model (.h5) files:
AlphabetSoupCharity_Optimization_1.ipynb
AlphabetSoupCharity_Optimization_1.h5
AlphabetSoupCharity_Optimization_2.ipynb
AlphabetSoupCharity_Optimization_2.h5
AlphabetSoupCharity_Optimization_3.ipynb
AlphabetSoupCharity_Optimization_3.h5
AlphabetSoupCharity_Optimization_4.ipynb
AlphabetSoupCharity_Optimization_4.h5
The original dataset used in model construction:
charity_data.csv
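For context, preprocessing a dataset like charity_data.csv for a binary classifier typically involves dropping identifier columns, one-hot encoding the categorical features, and splitting off the target. The sketch below uses a tiny stand-in DataFrame; the column names (EIN, APPLICATION_TYPE, IS_SUCCESSFUL) are assumptions for illustration and may not match the real file exactly:

```python
import pandas as pd
from sklearn.model_selection import train_test_split

# Hypothetical mini-frame standing in for charity_data.csv
df = pd.DataFrame({
    "EIN": [1, 2, 3, 4],                              # identifier column
    "APPLICATION_TYPE": ["T3", "T4", "T3", "T5"],     # categorical feature
    "IS_SUCCESSFUL": [1, 0, 1, 0],                    # binary target
})

df = df.drop(columns=["EIN"])                          # identifiers carry no signal
X = pd.get_dummies(df.drop(columns=["IS_SUCCESSFUL"]))  # one-hot encode categoricals
y = df["IS_SUCCESSFUL"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42
)
```

The resulting feature matrix `X` can then be fed to the neural network, with `y` as the label.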
The final Analysis Report, complete with a chart, bullet points, and even a touch of color and a few smiles sprinkled about:
Analysis_Report.docx
In conclusion, over multiple optimization runs the accuracy goal of 75% was not only met but surpassed, reaching a whopping 79%! The complete details of this neural network journey are provided in the Analysis Report... not quite your adrenaline-pumping thriller, but I hope you enjoy the read 😊.