Deep Learning Booklet

H2O.ai

Table of Contents

Deep Learning Booklet

  • What is H2O?
  • Introduction
    • Installation
    • Support
    • Deep Learning Overview
  • H2O’s Deep Learning Architecture
    • Summary of Features
    • Training Protocol
    • Regularization
    • Advanced Optimization
    • Loading Data
    • Additional Parameters
  • Use Case: MNIST Digit Classification
    • MNIST Overview
    • Performing a Trial Run
    • Web Interface
    • Grid Search for Model Comparison
    • Checkpoint Models
    • Achieving World Record Performance
  • Deep Autoencoders
    • Nonlinear Dimensionality Reduction
    • Use Case: Anomaly Detection
  • Appendix A: Complete Parameter List
  • Appendix B: References
Appendix B: References

H2O website

H2O documentation

H2O GitHub Repository

H2O Training

H2O Training Scripts and Data

Code for this Document

H2O support

H2O JIRA

YouTube Videos

Learning Deep Architectures for AI. Bengio, 2009.

Efficient BackProp. LeCun et al., 1998.

Maxout Networks. Goodfellow et al., 2013.

HOGWILD!: A Lock-Free Approach to Parallelizing Stochastic Gradient Descent. Niu et al., 2011.

Improving Neural Networks by Preventing Co-adaptation of Feature Detectors. Hinton et al., 2012.

On the Importance of Initialization and Momentum in Deep Learning. Sutskever et al., 2013.

ADADELTA: An Adaptive Learning Rate Method. Zeiler, 2012.

H2O GitHub repository for the H2O Deep Learning documentation

MNIST database

Reducing the Dimensionality of Data with Neural Networks. Hinton and Salakhutdinov, 2006.

The Definitive Performance Tuning Guide for H2O Deep Learning. Candel, 2015.
