Introduction to Deep Learning (Part 2)

Speaker: Andrew Collier

Track: Data Science

Type: Tutorial

Room: Boundary Room

Time: Oct 09 (Wed), 13:30

Duration: 4:00

This is the second half of a two-session tutorial and follows on from Part 1.

Deep Learning is a vast and convoluted topic. It’s hard to know where to start. This workshop will help you take your first steps with Deep Learning.

The workshop will introduce you to the fundamental concepts behind Deep Learning and show you how to get started building models using Python and Keras. You'll learn some of the underlying maths (a PhD in Mathematics will not be required!) and work through several examples.
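
To give a flavour of what that looks like, here is a minimal sketch (not the tutorial's actual code) of the sort of model you'll build, assuming the Keras Sequential API shipped with TensorFlow; the input shape, layer sizes and optimiser are illustrative assumptions.

    from tensorflow import keras
    from tensorflow.keras import layers

    # A tiny fully-connected network: each Dense layer has weights and a bias,
    # followed by an activation function.
    model = keras.Sequential([
        keras.Input(shape=(4,)),                # four input features (assumed)
        layers.Dense(8, activation="relu"),
        layers.Dense(1, activation="sigmoid"),  # probability of the positive class
    ])

    # Binary cross-entropy loss; gradients are computed via the chain rule
    # (back-propagation) and applied by the optimiser.
    model.compile(optimizer="adam",
                  loss="binary_crossentropy",
                  metrics=["accuracy"])
    model.summary()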

You’ll walk away with an appreciation for what’s possible with Deep Learning and sufficient hands-on experience to start building your own models.

All material will be available as Jupyter Notebooks.

Contents

  • Introduction to Neural Networks
    • Weights and bias
    • Activation functions
    • Loss functions
    • Chain rule and back-propagation
    • Project — Simple binary classifier
    • Where neural networks fail: images
  • Deep Learning
    • An overview of TensorFlow
    • First steps with Keras
  • Convolutional Neural Networks
    • Convolution layers
    • Filters and padding
    • Pooling
    • Activation functions: sigmoid, ReLU and softmax
    • Project — Image classification (see the sketch after this list)
  • Recurrent Neural Networks
    • Back-propagation through time
    • Long Short-Term Memory (LSTM)
    • Project — Text prediction
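
To illustrate the image-classification project mentioned above, here is a minimal sketch of a small convolutional network in Keras; the 28×28 greyscale input shape, filter counts and ten output classes are assumptions for illustration and may not match the tutorial's dataset.

    from tensorflow import keras
    from tensorflow.keras import layers

    # Convolution layers learn filters; "same" padding preserves the spatial
    # size, and pooling layers then shrink it.
    model = keras.Sequential([
        keras.Input(shape=(28, 28, 1)),          # 28x28 greyscale images (assumed)
        layers.Conv2D(16, kernel_size=3, padding="same", activation="relu"),
        layers.MaxPooling2D(pool_size=2),
        layers.Conv2D(32, kernel_size=3, padding="same", activation="relu"),
        layers.MaxPooling2D(pool_size=2),
        layers.Flatten(),
        layers.Dense(10, activation="softmax"),  # ten classes (assumed)
    ])

    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])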

Target Audience

This workshop is aimed at people with little or no prior experience with Deep Learning. If you're already a Deep Learning ninja, then this is not for you!

Prerequisites

Familiarity with programming in Python. A basic understanding of Machine Learning concepts will be helpful but certainly not essential.

Setup

See Part 1 for the setup instructions.