In this talk we discuss approaches for distributed machine learning (ML) in resource-constrained Internet of Things (IoT) networks. Federated Learning (FL) and Split Learning (SL) are popular approaches in such environments. First, we present Early Exit of Communication (EEoC), which adaptively splits ML inference in an IoT edge computing environment to meet latency and energy constraints. This layer-based (vertically partitioned) approach has been extended by Distributed Micro-Split Deep Learning in Heterogeneous Dynamic IoT (DISNET), which adds horizontal partitioning to better support flexible, distributed, and parallel execution of neural network models on heterogeneous IoT devices under dynamic conditions. We then turn to the training aspect by developing and evaluating Adaptive REsource-aware Split-learning (ARES), a scheme for efficient model training in IoT systems. Recent work proposes Dynamic FL (DFL) for heterogeneous IoT, which combines resource-aware SL and FL with similarity-based layer-wise model aggregation. This work has been performed jointly with Eric Samikwa; see cds.unibe.ch/research/publications/conference_and_journal_papers for the most recent publications.
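To make the splitting idea concrete, the following minimal Python sketch selects a split point for device-edge inference under latency and energy budgets. All names, cost estimates, and the exhaustive-search rule are illustrative assumptions for this talk summary; they are not the actual EEoC or DISNET algorithms.

```python
# Minimal sketch (assumed cost model, not the EEoC/DISNET algorithm):
# pick a split point for DNN inference between an IoT device and an edge
# server so that estimated latency and device energy stay within budgets.
from dataclasses import dataclass

@dataclass
class Layer:
    device_ms: float   # estimated compute latency on the IoT device (ms)
    edge_ms: float     # estimated compute latency on the edge server (ms)
    out_kb: float      # size of this layer's output activation (KB)
    device_mj: float   # estimated device energy for this layer (mJ)

def pick_split(layers, input_kb, bw_kb_per_s, tx_mj_per_kb,
               latency_budget_ms, energy_budget_mj):
    """Evaluate every split point s: layers[:s] run on the device, one
    activation (or the raw input if s == 0) is transmitted, and layers[s:]
    run on the edge. Return the feasible split with the lowest estimated
    end-to-end latency, or None if no split meets both budgets."""
    best = None
    for s in range(len(layers) + 1):
        tx_kb = input_kb if s == 0 else layers[s - 1].out_kb
        latency = (sum(l.device_ms for l in layers[:s])
                   + tx_kb / bw_kb_per_s * 1000   # transmission time in ms
                   + sum(l.edge_ms for l in layers[s:]))
        energy = sum(l.device_mj for l in layers[:s]) + tx_kb * tx_mj_per_kb
        if latency <= latency_budget_ms and energy <= energy_budget_mj:
            if best is None or latency < best[1]:
                best = (s, latency)
    return best

# Toy example: a three-layer model with shrinking activations.
model = [Layer(8.0, 1.0, 256.0, 12.0),
         Layer(12.0, 1.5, 64.0, 18.0),
         Layer(40.0, 0.5, 1.0, 30.0)]
print(pick_split(model, input_kb=512.0, bw_kb_per_s=500.0,
                 tx_mj_per_kb=0.05,
                 latency_budget_ms=300.0, energy_budget_mj=40.0))
# -> (2, 148.5): run two layers on the device, offload the rest.
```

The sketch only covers the vertical (layer-wise) dimension; DISNET's horizontal partitioning would additionally divide each layer's computation across multiple devices.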
Presented by
Prof. Dr. Torsten Braun
Communication and Distributed Systems Group, University of Bern