This lesson summarizes the topics we'll be covering in section 41 and why they'll be important to you as a data scientist.
You will be able to:
- Understand and explain what is covered in this section
- Understand and explain why the section will help you to become a data scientist
In the previous section, you gained considerable insight into how neural networks work. In this section, you'll learn why deeper networks sometimes lead to better results, and we'll generalize what you learned before to get your matrix dimensions right in deep networks. You'll build deeper neural networks from scratch and learn how to build neural networks in Keras.
You'll learn that deep representations are really good at automating what used to be the tedious and time-consuming process of feature engineering. In this section, you'll see that you can actually build a smaller but deeper neural network with exponentially fewer hidden units that performs even better than a network with more hidden units. The reason for this is that learning happens in each layer, and adding more layers (even with fewer units) can lead to very powerful predictions! You'll learn about matrix notation for these deep networks and how to build such a network from scratch.
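To preview the matrix bookkeeping involved, here is a minimal sketch of a forward pass through a multi-layer network using NumPy. The layer sizes and initialization scheme below are illustrative assumptions, not the course's exact setup; the key point is that weight matrix `W[l]` has shape `(n[l], n[l-1])` so the dimensions chain together layer by layer.

```python
import numpy as np

np.random.seed(0)

# Hypothetical layer sizes: 4 input features, two hidden layers, 1 output unit
layer_sizes = [4, 5, 3, 1]

# Weight matrix W[l] has shape (n[l], n[l-1]); bias b[l] has shape (n[l], 1)
params = {}
for l in range(1, len(layer_sizes)):
    params[f"W{l}"] = np.random.randn(layer_sizes[l], layer_sizes[l - 1]) * 0.01
    params[f"b{l}"] = np.zeros((layer_sizes[l], 1))

def forward(X, params, n_layers):
    """Forward pass: ReLU on the hidden layers, sigmoid on the output layer."""
    A = X
    for l in range(1, n_layers):
        Z = params[f"W{l}"] @ A + params[f"b{l}"]
        A = np.maximum(0, Z) if l < n_layers - 1 else 1 / (1 + np.exp(-Z))
    return A

X = np.random.randn(4, 10)  # 10 examples, each with 4 features, stacked as columns
y_hat = forward(X, params, len(layer_sizes))
print(y_hat.shape)  # one prediction per example
```

Checking that each `W[l] @ A` product is dimensionally valid, as this sketch does, is exactly the skill this section drills.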
You'll be introduced to Keras, an open source neural network library in Python that makes building neural networks surprisingly easy. Before building your first neural network model in Keras, you'll learn about tensors and why they are important when building deep learning models.
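As a quick preview of what tensors are, using NumPy arrays as stand-ins (Keras itself is introduced later in the section): a tensor is just an n-dimensional array, and its rank is the number of axes it has. The shapes below are made-up examples.

```python
import numpy as np

scalar = np.array(7)                # rank-0 tensor: a single number
vector = np.array([1.0, 2.0, 3.0])  # rank-1 tensor: a vector
matrix = np.ones((2, 3))            # rank-2 tensor: a matrix
batch = np.zeros((32, 28, 28))      # rank-3 tensor, e.g. 32 grayscale 28x28 images

for t in (scalar, vector, matrix, batch):
    print(t.ndim, t.shape)
```

Deep learning frameworks like Keras pass data between layers as tensors, so knowing an array's rank and shape tells you what a model expects as input and produces as output.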
In this section, you'll extend your deep learning knowledge by learning about deeper neural networks. You'll also learn how to use Keras to build deep learning models!