
new_grad

Introduction

This is an optional bonus homework assignment for the course 11-485/11-685/11-785 Introduction to Deep Learning at Carnegie Mellon University.

Most modern machine learning and deep learning frameworks rely on a technique called "Automatic Differentiation" (or Autodiff for short) to compute gradients. In this homework assignment, we introduce a new Autodiff-based framework for computing these gradients, called new_grad, with a focus on the backbone of the framework - the Autograd Engine - without the complexity of a special "Tensor" class, or the need to perform a DFS over the computational graph during the backward pass.
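The idea of recording operations in forward order, then replaying the record in reverse (instead of searching the graph with DFS), can be sketched in plain numpy. This is a toy illustration only - none of the class or method names below come from the mytorch API:

```python
import numpy as np

class Operation:
    """One recorded step: inputs, output, and per-input gradient functions."""
    def __init__(self, inputs, output, grad_fns):
        self.inputs = inputs      # list of np.ndarray operands
        self.output = output      # np.ndarray result
        self.grad_fns = grad_fns  # grad_fns[i](upstream) -> grad w.r.t. inputs[i]

class Tape:
    """Toy tape-based autograd engine: ops are appended in forward order,
    so the backward pass is a simple reverse iteration over the list."""
    def __init__(self):
        self.ops = []
        self.grads = {}  # id(array) -> accumulated gradient

    def add(self, a, b):
        out = a + b
        self.ops.append(Operation([a, b], out, [lambda g: g, lambda g: g]))
        return out

    def mul(self, a, b):
        out = a * b
        self.ops.append(Operation([a, b], out,
                                  [lambda g, b=b: g * b, lambda g, a=a: g * a]))
        return out

    def backward(self, output):
        self.grads[id(output)] = np.ones_like(output)
        for op in reversed(self.ops):          # reverse replay, no DFS
            upstream = self.grads.get(id(op.output))
            if upstream is None:
                continue
            for x, grad_fn in zip(op.inputs, op.grad_fns):
                self.grads[id(x)] = self.grads.get(id(x), 0) + grad_fn(upstream)

tape = Tape()
x = np.array([2.0])
y = np.array([3.0])
z = tape.mul(tape.add(x, y), y)   # z = (x + y) * y
tape.backward(z)
print(tape.grads[id(x)])  # dz/dx = y       -> [3.]
print(tape.grads[id(y)])  # dz/dy = x + 2y  -> [8.]
```

Because the tape already lists operations in a valid topological order, reversing it gives a valid order for backpropagation for free.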

Learning Objectives

In this (optional) bonus homework assignment, you will build your own version of an automatic differentiation library, grounded in the Deep Learning concepts you learn in the course. What you will learn:

  1. What a computational graph is, and how mathematical operations are recorded on one.
  2. The benefits of thinking at the granularity of operations instead of layers.
  3. The simplicity of the chain rule of differentiation and backpropagation.
  4. Building a PyTorch-like API without a 'Tensor' class, working directly with numpy arrays instead.
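As a tiny illustration of point 3, the chain-rule gradient of a single ReLU unit can be checked against a finite-difference estimate. This is plain Python for intuition only, unrelated to the starter code:

```python
# f(w) = relu(w*x + b); by the chain rule, df/dw = relu'(z) * x with z = w*x + b.
def f(w, x, b):
    z = w * x + b
    return max(z, 0.0)

w, x, b = 0.5, 2.0, 0.1
z = w * x + b
analytic = (1.0 if z > 0 else 0.0) * x   # chain rule: df/dz * dz/dw

# Central finite difference as a numerical sanity check.
eps = 1e-6
numeric = (f(w + eps, x, b) - f(w - eps, x, b)) / (2 * eps)
print(analytic, round(numeric, 6))  # both ~= 2.0
```

The same gradient-check pattern is a standard way to debug any hand-written backward pass.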

Instructions

Though we provide some starter code, you will get to complete key components of the library including the main Autograd engine. For specific instructions, please refer to the writeup included in this repository. Students enrolled in the course: submit your solutions through Autolab.

A quick demo of the API

Once you have completed all the key components of the assignment, you will be able to build and train simple neural networks:

  1. Import mytorch and the nn module:
>>> from mytorch import autograd_engine
>>> import mytorch.nn as nn
  2. Declare the autograd object, layers, activations, and loss:
>>> autograd_engine = autograd_engine.Autograd()
>>> linear1 = nn.Linear(input_shape, output_shape, autograd_engine)
>>> activation1 = nn.ReLU(autograd_engine)
>>> loss = nn.SoftmaxCrossEntropy(autograd_engine)
  3. Calculate the loss, and kick off backprop:
>>> y_hat = activation1(linear1(x))
>>> loss_val = loss(y, y_hat)
>>> loss.backward()
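For intuition about what the backward pass starts from: the gradient of softmax cross-entropy with respect to the logits reduces to softmax(logits) - y. A plain-numpy sketch (the function names here are illustrative, not part of the mytorch API):

```python
import numpy as np

def softmax(logits):
    shifted = logits - logits.max(axis=1, keepdims=True)  # numerical stability
    e = np.exp(shifted)
    return e / e.sum(axis=1, keepdims=True)

logits = np.array([[2.0, 1.0, 0.1]])
y = np.array([[1.0, 0.0, 0.0]])            # one-hot target

probs = softmax(logits)
loss = -np.sum(y * np.log(probs))           # cross-entropy loss
dlogits = probs - y                         # gradient w.r.t. the logits
print(loss)
print(dlogits)
```

This closed-form gradient is the upstream value that then flows back through the ReLU and linear layers via the chain rule.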

Developed by: Kinori Rosnow, Anurag Katakkar, David Park, Chaoran Zhang, Shriti Priya, Shayeree Sarkar, and Bhiksha Raj.

