The Computing Continuum: Beyond Cloud Data Centers
The advent of fog and edge computing has prompted predictions that they will supplant the traditional cloud for information processing and knowledge extraction in Internet of Things (IoT) systems. Although fog and edge computing undoubtedly have great potential, such predictions are oversimplified and misrepresent the relationship between cloud, fog, and edge computing. Rather, fog and edge computing have been introduced as an extension of cloud services towards the data sources, together forming the computing continuum. The computing continuum enables a new class of services that span distributed infrastructures and support a wide range of IoT applications. These applications have a broad spectrum of requirements that are difficult to meet with "distant" cloud data centers.

However, the computing continuum raises multiple challenges for the management, deployment, and orchestration of complex distributed applications, such as increased network heterogeneity, limited resource capacity of edge devices, fragmented storage management, high mobility of edge devices, and limited support for native monolithic applications. These challenges stem primarily from the complexity and large diversity of the devices, which are managed by different entities (cloud providers, universities, private institutions) and range from single-board computers such as the Raspberry Pi to powerful multi-processor servers.

In this talk, we will therefore discuss novel algorithms for low-latency, scalable, and sustainable computing over heterogeneous resources for information processing and reasoning, enabling transparent integration of IoT applications. We will tackle the heterogeneity of dynamically changing infrastructure topologies and present a novel concept for sustainable processing at scale.