RT Journal Article
T1 An open source framework based on Kafka-ML for Distributed DNN inference over the Cloud-to-Things continuum
A1 Torres, Daniel R.
A1 Martín-Fernández, Cristian
A1 Díaz, Manuel
A1 Rubio-Muñoz, Bartolomé
K1 Computing
K1 Artificial intelligence
AB The current dependency of Artificial Intelligence (AI) systems on Cloud computing implies higher transmission latency and bandwidth consumption, and it challenges the real-time monitoring of physical objects, e.g., in the Internet of Things (IoT). Edge systems bring computing closer to end devices and support time-sensitive applications. However, Edge systems struggle with state-of-the-art Deep Neural Networks (DNN) due to their limited computational resources. This paper proposes a technology framework that combines the Edge-Cloud architecture concept with the advantages of BranchyNet to support fault-tolerant and low-latency AI predictions. The implementation and evaluation of this framework allow assessing the benefits of running Distributed DNN (DDNN) over the Cloud-to-Things continuum. Compared to a Cloud-only deployment, the results obtained show a 45.34% improvement in response time. Furthermore, this proposal presents an extension of Kafka-ML that reduces the rigidity of managing and deploying DDNN over the Cloud-to-Things continuum.
PB Elsevier
YR 2021
FD 2021-06-16
LK https://hdl.handle.net/10630/22511
UL https://hdl.handle.net/10630/22511
LA eng
NO Journal of Systems Architecture, Volume 118, September 2021, 102214
DS RIUMA. Repositorio Institucional de la Universidad de Málaga
RD 20 Jan 2026