A Docker-based federated learning framework design and deployment for multi-modal data stream classification

Cited by: 0
Authors
Arijit Nandi
Fatos Xhafa
Rohit Kumar
Affiliations
[1] Universitat Politècnica de Catalunya, Department of CS
[2] Eurecat, Centre Tecnològic de Catalunya
Source
Computing | 2023, Volume 105
Keywords
Federated learning; High performance computing; Multi-modal data streaming; Docker-container; Real-time emotion classification; 68W15; 94A16; 68M20; 68T05; 68T07; 68P27;
DOI
Not available
Abstract
In the high-performance computing (HPC) domain, federated learning (FL) has gained immense popularity, especially in emotional and physical health analytics and experimental facilities. Federated learning is one of the most promising distributed machine learning frameworks because it preserves data privacy and security: clients share their locally trained models rather than their data. In federated learning, many clients train their machine learning/deep learning models locally before these models are aggregated into a global model at the global server. However, an FL framework is difficult to build and deploy across multiple distributed clients due to their heterogeneous nature. We developed Docker-enabled federated learning (DFL), which uses client-agnostic technologies such as Docker containers to simplify the deployment of FL frameworks for data stream processing on heterogeneous clients. In DFL, the clients and the global server are implemented in TensorFlow, and the lightweight Message Queuing Telemetry Transport (MQTT) protocol is used for communication between clients and the global server in the IoT environment. Furthermore, DFL's effectiveness, efficiency, and scalability are evaluated in a test-case scenario in which real-time emotion state classification is performed on distributed multi-modal physiological data streams under various practical configurations.
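The abstract only sketches the architecture at a high level. The snippet below is a minimal, illustrative Python sketch of one federated round over MQTT with TensorFlow/Keras, assuming the paho-mqtt library; the broker address, topic names, pickle-based payload format, toy model, and the fixed client count used to trigger FedAvg aggregation are all assumptions for illustration, not the authors' actual DFL implementation, and the Docker packaging of each client and the global server is omitted.

```python
# Minimal sketch of an MQTT-based federated learning round (illustrative only,
# not the paper's DFL code). Assumes TensorFlow/Keras and paho-mqtt 1.x;
# paho-mqtt 2.x additionally requires a CallbackAPIVersion argument to Client().
import pickle

import numpy as np
import paho.mqtt.client as mqtt
import tensorflow as tf

BROKER = "localhost"          # hypothetical MQTT broker address
UPDATE_TOPIC = "dfl/updates"  # clients publish local weights here (assumed topic name)
MODEL_TOPIC = "dfl/global"    # server broadcasts the aggregated model (assumed topic name)


def build_model():
    """Small Keras classifier standing in for each client's local model."""
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(32,)),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(4, activation="softmax"),  # e.g. 4 emotion classes
    ])
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
    return model


def client_round(x_local, y_local):
    """One local round: train on private data, publish only the model weights."""
    model = build_model()
    model.fit(x_local, y_local, epochs=1, verbose=0)
    client = mqtt.Client()
    client.connect(BROKER, 1883)
    client.publish(UPDATE_TOPIC, pickle.dumps(model.get_weights()))
    client.disconnect()


def serve(expected_clients=3):
    """Global server: collect client updates over MQTT, average them, broadcast."""
    updates = []

    def on_message(client, userdata, msg):
        updates.append(pickle.loads(msg.payload))
        if len(updates) == expected_clients:
            # FedAvg-style aggregation: element-wise mean of each layer across clients.
            global_weights = [np.mean(layers, axis=0) for layers in zip(*updates)]
            client.publish(MODEL_TOPIC, pickle.dumps(global_weights))
            client.disconnect()

    server = mqtt.Client()
    server.on_message = on_message
    server.connect(BROKER, 1883)
    server.subscribe(UPDATE_TOPIC)
    server.loop_forever()  # returns once disconnect() is called in on_message
```

In a containerized deployment along the lines the abstract describes, each `client_round` and the `serve` process would run in its own Docker container, with the MQTT broker reachable over the container network.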
Pages: 2195-2229
Number of pages: 34