Jointly Optimizing Dataset Size and Local Updates in Heterogeneous Mobile Edge Learning

12 Jun 2020 · Umair Mohammad, Sameh Sorour, Mohamed Hefeida

This paper proposes to maximize the accuracy of a distributed machine learning (ML) model trained on learners connected via the resource-constrained wireless edge. We jointly optimize the number of local/global updates and the task size allocation to minimize the loss while accounting for the heterogeneous communication and computation capabilities of each learner. By leveraging existing bounds on the difference between the training loss at any given iteration and the theoretically optimal loss, we derive an expression for the objective function in terms of the number of local updates. The resulting convex program is solved to obtain the optimal number of local updates, which is then used to determine the total number of updates and the batch size for each learner. The merits of the proposed solution, which is heterogeneity-aware (HA), are exhibited by comparing its performance to the heterogeneity-unaware (HU) approach.
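The paper itself specifies the exact convex program; as a rough illustration of the idea, the following minimal Python sketch searches over the number of local updates τ, allocates the largest batch each learner can process within an assumed per-round time budget, and scores each τ with a hedged stand-in for a convergence bound. All constants (`T_BUDGET`, per-learner compute and communication times, `eta`, `beta`, `delta`) and the form of `bound_proxy` are assumptions for illustration, not the authors' formulation.

```python
# Illustrative sketch only, NOT the paper's exact optimization problem.
# Pick the number of local updates tau and per-learner batch sizes under a
# per-round wall-clock budget, then score each tau with a loss-bound proxy.

import numpy as np

T_BUDGET = 10.0                          # wall-clock budget per global round (s), assumed
compute = np.array([2e-4, 5e-4, 1e-3])   # seconds per sample per local update, per learner (assumed)
comm = np.array([0.5, 1.0, 2.0])         # communication time per global round (s), assumed
eta, beta, delta = 0.01, 1.0, 0.1        # step size, smoothness, gradient-divergence proxies (assumed)

def batch_sizes(tau):
    """Largest batch each learner can process in tau local updates
    within the round budget (a heterogeneity-aware allocation)."""
    avail = np.maximum(T_BUDGET - comm, 0.0)   # compute time left after communication
    return np.floor(avail / (tau * compute)).astype(int)

def bound_proxy(tau, d):
    """Toy stand-in for a loss-gap upper bound: shrinks with the data
    processed per round, grows with a local-model divergence term h(tau)
    of the form used in prior federated-learning analyses."""
    if d.sum() == 0:
        return np.inf
    h = (delta / beta) * ((eta * beta + 1.0) ** tau - 1.0) - eta * delta * tau
    return 1.0 / (eta * d.sum()) + h     # assumed objective form, illustration only

taus = np.arange(1, 51)
scores = [bound_proxy(t, batch_sizes(t)) for t in taus]
tau_star = taus[int(np.argmin(scores))]
print(f"tau* = {tau_star}, batch sizes = {batch_sizes(tau_star)}")
```

Note the trade-off the sketch captures: a larger τ amortizes communication over more local computation but forces smaller batches per learner and increases local-model divergence, so the optimal τ balances the two effects.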
