AdaLearner: An adaptive distributed mobile learning system for neural networks

Published

Conference Paper

© 2017 IEEE. Neural networks occupy a critical place among machine learning algorithms because of their self-adaptiveness and state-of-the-art performance. Before the testing (inference) phase in practical use, a sophisticated training (learning) phase is required, calling for efficient training methods with higher accuracy and shorter convergence time. Many existing studies focus on training optimization on high-performance servers or computing clusters, e.g., GPU clusters. However, training neural networks on resource-constrained devices, e.g., mobile platforms, is an important research topic that has barely been touched. In this paper, we implement AdaLearner, an adaptive distributed mobile learning system for neural networks that trains a single network in parallel with heterogeneous mobile resources under the same local network. To exploit the potential of our system, we adapt the training phase to the resources of each mobile device and aggressively reduce the transmission overhead for better system scalability. On three representative neural network structures trained on two image classification datasets, AdaLearner speeds up the training phase significantly. For example, on LeNet, a 1.75-3.37X speedup is achieved when increasing the number of worker nodes from 2 to 8, thanks to the high execution parallelism and excellent scalability achieved.
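The adaptive idea described in the abstract, assigning each heterogeneous worker a share of the work that matches its capability, can be illustrated with a minimal sketch. This is a hypothetical example of proportional batch partitioning, not AdaLearner's actual algorithm; the function name `partition_batch` and the throughput inputs are assumptions for illustration.

```python
# Hypothetical sketch: split a training minibatch across heterogeneous
# workers in proportion to each device's measured throughput, so faster
# devices receive more samples per iteration. This illustrates the
# general idea of adapting work to device-wise resources; it is not
# the algorithm from the paper.

def partition_batch(batch_size, throughputs):
    """Assign each worker a batch share proportional to its measured
    samples/second, rounding while preserving the exact total."""
    total = sum(throughputs)
    shares = [batch_size * t / total for t in throughputs]
    sizes = [int(s) for s in shares]
    # Hand leftover samples to the workers with the largest remainders.
    remainder = batch_size - sum(sizes)
    order = sorted(range(len(shares)),
                   key=lambda i: shares[i] - sizes[i], reverse=True)
    for i in order[:remainder]:
        sizes[i] += 1
    return sizes

# Example: 4 workers with uneven throughput splitting a batch of 128.
print(partition_batch(128, [10.0, 20.0, 30.0, 40.0]))  # [13, 26, 38, 51]
```

With such a split, all workers finish an iteration at roughly the same time, which reduces straggler-induced idling in synchronous parallel training.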

Cited Authors

  • Mao, J; Qin, Z; Xu, Z; Nixon, KW; Chen, X; Li, H; Chen, Y

Published Date

  • December 13, 2017

Volume / Issue

  • 2017-November

Start / End Page

  • 291 - 296

International Standard Serial Number (ISSN)

  • 1092-3152

International Standard Book Number 13 (ISBN-13)

  • 9781538630938

Digital Object Identifier (DOI)

  • 10.1109/ICCAD.2017.8203791

Citation Source

  • Scopus