TY - GEN
T1 - Demo
T2 - 27th IEEE Symposium on Computers and Communications, ISCC 2022
AU - Symeonides, Moysis
AU - Trihinas, Demetris
AU - Georgiou, Joanna
AU - Kasioulis, Michalis
AU - Pallis, George
AU - Dikaiakos, Marios D.
AU - Toliopoulos, Theodoros
AU - Michailidou, Anna Valentini
AU - Gounaris, Anastasios
N1 - Publisher Copyright:
© 2022 IEEE.
PY - 2022
Y1 - 2022
N2 - With the proliferation of raw Internet of Things (IoT) data, Fog Computing is emerging as a computing paradigm for delay-sensitive streaming analytics, with operators deploying big data distributed engines on Fog resources [1]. Nevertheless, current (Cloud-based) distributed analytics solutions are unaware of the unique characteristics of Fog realms. For instance, task placement algorithms assume homogeneous underlying resources, ignoring the heterogeneity of Fog nodes and the non-uniform network connections, which results in sub-optimal processing performance. Moreover, data quality plays an important role, as corrupted data and network uncertainty may lead to less useful results. In turn, energy consumption can critically impact the overall cost and liveness of the underlying processing infrastructure. Specifically, scheduling tasks on nodes with energy-hungry profiles or on battery-powered devices may temporarily benefit performance, but it may increase the overall cost, and/or the battery-powered devices may not be available when needed. A Fog-enabled analytics stack must therefore allow users to optimize Fog-specific indicators or trade-offs among them; for instance, users may sacrifice a portion of execution performance to minimize energy consumption, or vice versa. Beyond the performance issues raised by Fog, state-of-the-art distributed processing engines offer only low-level procedural programming interfaces, leaving operators to face a steep learning curve to master them. Thus, query abstractions are crucial for minimizing deployment time, errors, and debugging effort.
AB - With the proliferation of raw Internet of Things (IoT) data, Fog Computing is emerging as a computing paradigm for delay-sensitive streaming analytics, with operators deploying big data distributed engines on Fog resources [1]. Nevertheless, current (Cloud-based) distributed analytics solutions are unaware of the unique characteristics of Fog realms. For instance, task placement algorithms assume homogeneous underlying resources, ignoring the heterogeneity of Fog nodes and the non-uniform network connections, which results in sub-optimal processing performance. Moreover, data quality plays an important role, as corrupted data and network uncertainty may lead to less useful results. In turn, energy consumption can critically impact the overall cost and liveness of the underlying processing infrastructure. Specifically, scheduling tasks on nodes with energy-hungry profiles or on battery-powered devices may temporarily benefit performance, but it may increase the overall cost, and/or the battery-powered devices may not be available when needed. A Fog-enabled analytics stack must therefore allow users to optimize Fog-specific indicators or trade-offs among them; for instance, users may sacrifice a portion of execution performance to minimize energy consumption, or vice versa. Beyond the performance issues raised by Fog, state-of-the-art distributed processing engines offer only low-level procedural programming interfaces, leaving operators to face a steep learning curve to master them. Thus, query abstractions are crucial for minimizing deployment time, errors, and debugging effort.
UR - http://www.scopus.com/inward/record.url?scp=85141151283&partnerID=8YFLogxK
U2 - 10.1109/ISCC55528.2022.9913026
DO - 10.1109/ISCC55528.2022.9913026
M3 - Conference contribution
AN - SCOPUS:85141151283
T3 - Proceedings - IEEE Symposium on Computers and Communications
BT - 2022 IEEE Symposium on Computers and Communications, ISCC 2022
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 30 June 2022 through 3 July 2022
ER -