An open source framework based on Kafka-ML for Distributed DNN inference over the Cloud-to-Things continuum
| Metadata field | Value | Language |
| --- | --- | --- |
| dc.centro | E.T.S.I. Informática | es_ES |
| dc.contributor.author | Torres, Daniel R. | |
| dc.contributor.author | Martín-Fernández, Cristian | |
| dc.contributor.author | Díaz, Manuel | |
| dc.contributor.author | Rubio-Muñoz, Bartolomé | |
| dc.date.accessioned | 2021-07-01T11:25:34Z | |
| dc.date.available | 2021-07-01T11:25:34Z | |
| dc.date.created | 2021 | |
| dc.date.issued | 2021-06-16 | |
| dc.departamento | Lenguajes y Ciencias de la Computación | |
| dc.description.abstract | The current dependency of Artificial Intelligence (AI) systems on Cloud computing entails higher transmission latency and bandwidth consumption. Moreover, it hinders the real-time monitoring of physical objects, e.g., in the Internet of Things (IoT). Edge systems bring computing closer to end devices and support time-sensitive applications. However, Edge systems struggle to run state-of-the-art Deep Neural Networks (DNN) due to computational resource limitations. This paper proposes a technology framework that combines the Edge-Cloud architecture with the advantages of BranchyNet to support fault-tolerant and low-latency AI predictions. The implementation and evaluation of this framework make it possible to assess the benefits of running Distributed DNN (DDNN) in the Cloud-to-Things continuum. Compared to a Cloud-only deployment, the results show a 45.34% improvement in response time. Furthermore, this proposal presents an extension to Kafka-ML that reduces the rigidity of managing and deploying DDNN over the Cloud-to-Things continuum. | es_ES |
| dc.identifier.citation | Journal of Systems Architecture, Volume 118, September 2021, 102214 | es_ES |
| dc.identifier.doi | https://doi.org/10.1016/j.sysarc.2021.102214 | |
| dc.identifier.uri | https://hdl.handle.net/10630/22511 | |
| dc.language.iso | eng | es_ES |
| dc.publisher | Elsevier | es_ES |
| dc.rights.accessRights | open access | es_ES |
| dc.subject | Computación | es_ES |
| dc.subject | Inteligencia artificial | es_ES |
| dc.subject.other | Distributed deep neural networks | es_ES |
| dc.subject.other | Cloud computing | es_ES |
| dc.subject.other | Fog/edge computing | es_ES |
| dc.subject.other | Distributed processing | es_ES |
| dc.subject.other | Low-latency fault-tolerant framework | es_ES |
| dc.title | An open source framework based on Kafka-ML for Distributed DNN inference over the Cloud-to-Things continuum | es_ES |
| dc.type | journal article | es_ES |
| dc.type.hasVersion | VoR | es_ES |
| dspace.entity.type | Publication | |
| relation.isAuthorOfPublication | bf2870d3-5cc6-414d-8d71-60e242c18554 | |
| relation.isAuthorOfPublication | 5d31c256-428d-41f8-a525-6549736c3b2e | |
| relation.isAuthorOfPublication.latestForDiscovery | bf2870d3-5cc6-414d-8d71-60e242c18554 |
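The abstract above describes a DDNN built on BranchyNet-style early exits, where an Edge node answers confident predictions locally and forwards the rest to the Cloud. Below is a minimal, hypothetical sketch of such an early-exit model in TensorFlow/Keras (one of the frameworks Kafka-ML supports); the layer sizes, exit names, and confidence threshold are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch (an assumption, not the paper's code) of a BranchyNet-style
# model with one early exit. The idea: the Edge runs the trunk up to the
# early exit and only forwards low-confidence samples to the Cloud layers.
import numpy as np
import tensorflow as tf

def build_early_exit_model(num_classes: int = 10) -> tf.keras.Model:
    inputs = tf.keras.Input(shape=(28, 28, 1))

    # Shared trunk, intended to run on the Edge device.
    x = tf.keras.layers.Conv2D(32, 3, activation="relu")(inputs)
    x = tf.keras.layers.MaxPooling2D()(x)

    # Early (Edge) exit branch.
    edge = tf.keras.layers.Flatten()(x)
    edge_exit = tf.keras.layers.Dense(num_classes, activation="softmax",
                                      name="edge_exit")(edge)

    # Remaining layers, intended to run in the Cloud.
    y = tf.keras.layers.Conv2D(64, 3, activation="relu")(x)
    y = tf.keras.layers.GlobalAveragePooling2D()(y)
    cloud_exit = tf.keras.layers.Dense(num_classes, activation="softmax",
                                       name="cloud_exit")(y)

    return tf.keras.Model(inputs, [edge_exit, cloud_exit], name="branchy_sketch")

model = build_early_exit_model()

# Illustrative early-exit rule: accept the Edge prediction when its maximum
# softmax score exceeds a (hypothetical) confidence threshold; otherwise the
# sample would be forwarded, e.g. via a Kafka topic, to the Cloud-side exit.
THRESHOLD = 0.8
sample = np.random.rand(1, 28, 28, 1).astype("float32")
edge_probs, cloud_probs = model.predict(sample, verbose=0)
use_edge = float(edge_probs.max()) >= THRESHOLD
print("answered at the Edge" if use_edge else "forwarded to the Cloud")
```

In the deployment the abstract describes, the Edge trunk and the Cloud branch would presumably run as separate services with Kafka carrying the intermediate data between them; the sketch keeps everything in one process only to show the early-exit decision itself.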
Files
Original bundle
- Name: Paper_JSA_2021.pdf
- Size: 1.16 MB
- Format: Adobe Portable Document Format
- Description: Published article

