The 9th International

Learning Analytics & Knowledge Conference

Tempe, Arizona

March 4-8, 2019 #LAK19

Python Bootcamp for Learning Analytics Practitioners

Instructors: Alfred Essa, Lalitha Agnihotri

Teaching Assistants: Kim Pham, Eddie Lin

Note: Course materials, including the detailed syllabus and Jupyter Notebooks, will be available at the Course Wiki.

Summary: This hands-on tutorial will provide a rigorous introduction to Python for learning analytics practitioners. The intensive tutorial consists of five parts: a) basic and intermediate Python; b) statistics and visualization; c) machine learning; d) causal inference; and e) deep learning. The tutorial will be motivated throughout by educational datasets and examples. The aim of the tutorial is to provide a thorough introduction to computational and statistical methodologies in modern learning analytics.

1.1 Python. Python is the de facto language for scientific computing and one of the principal languages, along with R, for data science and machine learning. Along with foundational concepts such as data structures, functions, and iteration, we will cover intermediate concepts such as comprehensions, collections, generators, map/filter/reduce, and object orientation. Special emphasis will be given to coding in “idiomatic Python”.
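For instance, a minimal sketch (using a hypothetical dictionary of student scores and an illustrative passing threshold) contrasting loop-style code with these idiomatic constructs might look like this:

```python
# A minimal sketch with hypothetical data; the threshold of 70 is illustrative.
from functools import reduce

scores = {"alice": 82, "bob": 67, "carol": 91, "dave": 58}

# List comprehension: names of students who passed.
passed = [name for name, score in scores.items() if score >= 70]

# Generator expression: lazily computed absolute deviations from the mean.
mean = sum(scores.values()) / len(scores)
deviations = (abs(score - mean) for score in scores.values())

# map/filter/reduce: total points earned by passing students.
passing_scores = filter(lambda s: s >= 70, scores.values())
total_passing = reduce(lambda a, b: a + b, passing_scores, 0)

print(passed, mean, sum(deviations), total_passing)
```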

1.2 Data Analysis and Basic Statistics. In this section we will introduce the core Python libraries for data analysis and basic statistics: numpy, pandas, matplotlib, and seaborn. We will use the Jupyter Notebook environment for interactive data analysis, annotation, and collaboration. We will emphasize that exploratory data analysis is a foundational step for deriving insights from data; it also serves as a prelude to building formal models and simulations.
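As a brief illustration, the sketch below shows the kind of exploratory pass we will make in a notebook; the file name and column names are hypothetical placeholders:

```python
# A minimal exploratory-analysis sketch; the CSV path and columns are hypothetical.
import pandas as pd
import matplotlib.pyplot as plt
import seaborn as sns

df = pd.read_csv("student_activity.csv")  # e.g. columns: student_id, logins, final_grade

# Summary statistics and a quick check for missing values.
print(df.describe())
print(df.isna().sum())

# Distribution of final grades and their relationship to platform logins.
sns.histplot(df["final_grade"], bins=20)
plt.show()

sns.scatterplot(data=df, x="logins", y="final_grade")
plt.show()
```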

1.3 Machine Learning. In this section we will introduce participants to basic machine learning concepts and their application using the scikit-learn library. We will show how to predict continuous and categorical outcomes using, for example, linear and logistic regression. The demonstration will show how to create an entire prediction pipeline from scratch, from loading and pre-processing data to building and evaluating the model. We will also discuss what an educator might do with such a model.
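A condensed sketch of such a pipeline with scikit-learn might look like the following; the dataset, feature columns, and outcome column are hypothetical:

```python
# A minimal end-to-end prediction pipeline sketch; data and columns are hypothetical.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report

df = pd.read_csv("student_activity.csv")                      # hypothetical file
X = df[["logins", "forum_posts", "assignments_submitted"]]    # hypothetical features
y = df["passed"]                                              # hypothetical binary outcome

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

# Pre-processing and model wrapped in a single pipeline.
model = Pipeline([
    ("scale", StandardScaler()),
    ("clf", LogisticRegression(max_iter=1000)),
])

model.fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))
```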

1.4 Causal Inference. In this section of the tutorial we build on our statistical understanding of correlation to study causality. Randomized controlled trials (RCTs) are considered the gold standard in efficacy studies because they aim to establish the causal effect of interventions. But RCTs are often impractical to carry out and have other limitations. Causal inference from observational studies (OS) is another form of statistical analysis for evaluating intervention effects: the causal effect of an intervention on a particular outcome is estimated from observed data, without randomization in advance. In this tutorial, we will show how to design an OS that leverages the large amounts of data available through online learning platforms and student information systems to support causal claims about the effectiveness of interventions.
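One common technique for this kind of analysis is inverse-propensity weighting; the sketch below illustrates the idea with hypothetical column names and is not necessarily the exact method used in the tutorial:

```python
# A sketch of inverse-propensity weighting for an observational study.
# The dataset, columns, and weighting approach are illustrative assumptions.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

df = pd.read_csv("lms_logs.csv")                  # hypothetical observational dataset
covariates = ["prior_gpa", "logins", "age"]       # hypothetical confounders
t = df["used_tutoring"].to_numpy()                # hypothetical binary intervention flag
y_out = df["final_grade"].to_numpy()              # hypothetical outcome

# 1. Model the probability of receiving the intervention given covariates.
ps_model = LogisticRegression(max_iter=1000).fit(df[covariates], t)
propensity = ps_model.predict_proba(df[covariates])[:, 1]

# 2. Weight each student by the inverse probability of the treatment they received.
weights = np.where(t == 1, 1 / propensity, 1 / (1 - propensity))

# 3. Compare weighted outcome means to estimate the average treatment effect.
ate = (np.average(y_out[t == 1], weights=weights[t == 1])
       - np.average(y_out[t == 0], weights=weights[t == 0]))
print(f"Estimated average treatment effect: {ate:.2f}")
```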

1.5 Deep Learning. In this section we introduce how to build deep learning models. Deep learning is one of the fastest-growing areas of machine learning and is particularly well suited to very large datasets. We begin by building a toy deep learning model from scratch in Python, in order to understand the five foundational concepts of deep learning: neurons as the atomic computational unit of deep learning networks; neurons organized in stacked layers to achieve increasingly abstract data representations; forward propagation as the end-to-end computational process for generating predictions; loss and cost functions as the method for quantifying the error between prediction and ground truth; and back propagation as the computational process for systematically reducing the error by adjusting the network’s parameters. After developing a conceptual understanding of deep learning, we apply standard Python libraries such as Keras, PyTorch, and TensorFlow to build deep learning models.
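As a rough sketch of what such a toy model might look like, the NumPy-only network below walks through the forward pass, cost computation, and back propagation on synthetic data; the architecture, data, and hyperparameters are illustrative:

```python
# A toy one-hidden-layer network in plain NumPy, sketching the five concepts above:
# neurons, stacked layers, forward propagation, a cost function, and back propagation.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))                          # 200 examples, 3 input features
y = (X.sum(axis=1, keepdims=True) > 0).astype(float)   # toy binary target

# Parameters of a 3 -> 4 -> 1 network (two stacked layers of neurons).
W1, b1 = rng.normal(scale=0.5, size=(3, 4)), np.zeros((1, 4))
W2, b2 = rng.normal(scale=0.5, size=(4, 1)), np.zeros((1, 1))

sigmoid = lambda z: 1 / (1 + np.exp(-z))
lr = 0.5  # illustrative learning rate

for step in range(500):
    # Forward propagation: layer by layer to a prediction.
    h = sigmoid(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)

    # Cost: mean binary cross-entropy between prediction and ground truth.
    cost = -np.mean(y * np.log(p + 1e-9) + (1 - y) * np.log(1 - p + 1e-9))

    # Back propagation: gradients of the cost with respect to each parameter.
    dz2 = (p - y) / len(X)
    dW2, db2 = h.T @ dz2, dz2.sum(axis=0, keepdims=True)
    dh = dz2 @ W2.T
    dz1 = dh * h * (1 - h)
    dW1, db1 = X.T @ dz1, dz1.sum(axis=0, keepdims=True)

    # Gradient descent: adjust parameters to reduce the error.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(f"final cost: {cost:.3f}")
```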
