RT Journal Article
T1 Functions as a service for distributed deep neural network inference over the cloud-to-things continuum
A1 Bueno Calvente, Altair
A1 Rubio-Muñoz, Bartolomé
A1 Martín-Fernández, Cristian
A1 Díaz-Rodríguez, Manuel
K1 Computer networks
AB Serverless computing has been gaining popularity in recent years as an alternative to traditional Cloud computing. We explore the usability and potential development benefits of three popular open-source serverless platforms in the context of IoT: OpenFaaS, Fission, and OpenWhisk. To this end, we discuss our experience developing a serverless, low-latency Distributed Deep Neural Network (DDNN) application. Our findings indicate that these serverless platforms require significant resources to operate and are not ideal for constrained devices. In addition, we achieved a 55% improvement under load over Kafka-ML, a framework without dynamic scaling support, demonstrating the potential of serverless computing for low-latency applications.
PB Wiley
YR 2024
FD 2024-02-11
LK https://hdl.handle.net/10630/30436
UL https://hdl.handle.net/10630/30436
LA spa
NO Bueno A, Rubio B, Martín C, Díaz M. Functions as a service for distributed deep neural network inference over the cloud-to-things continuum. Softw: Pract Exper. 2024; 1-15. doi: 10.1002/spe.3318
NO Funding for open access charge: Universidad de Málaga / CBUA. Ministerio de Ciencia, Innovación y Universidades
DS RIUMA. Repositorio Institucional de la Universidad de Málaga
RD 20 Jan 2026