Latency-Aware Placement of Microservices in the Cloud-to-Edge Continuum via Resource Scaling

Conference proceedings contribution
Publication date:
2025
Citation:
Latency-Aware Placement of Microservices in the Cloud-to-Edge Continuum via Resource Scaling / A. Bertoncini, A. Ceselli, C. Quadri (IEEE INTERNATIONAL CONFERENCE ON SMART COMPUTING). - In: 2025 IEEE International Conference on Smart Computing (SMARTCOMP). - [s.l.] : IEEE, 2025 Jun. - ISBN 979-8-3315-8647-8. - pp. 420-425 (International Conference on Smart Computing (SMARTCOMP), held in Cork in 2025) [10.1109/smartcomp65954.2025.00103].
Abstract:
Latency-sensitive applications, such as autonomous driving in smart cities and smart industries, require a networking and computing infrastructure that can support their operations. The cloud-to-edge continuum is a promising architecture for providing computational capability close to edge devices. However, deploying latency-sensitive applications in the continuum is challenging due to the heterogeneity and the geographical distribution of the computing nodes. In this paper, we address the deployment problem in a tele-operated autonomous driving scenario, formulating the orchestration task as a Virtual Network Function Placement Problem (VNFPP) with multi-tier performance levels, which enables vertical scaling of computational resources per microservice. Our MILP model, MORAL, minimizes node-centrality-based deployment costs while satisfying resource and end-to-end latency constraints. We tested our approach through extensive simulations on realistic network topologies and synthetic applications, showing that the proposed model improves deployment feasibility, latency compliance, and resource efficiency compared to single-performance-tier versions and baseline strategies. (An illustrative sketch of such a placement model follows the record below.)
IRIS type:
03 - Contribution in a volume
Keywords:
Service Orchestration; Cloud-to-Edge Continuum; Mathematical Optimization
Author list:
A. Bertoncini, A. Ceselli, C. Quadri
University of Milan authors:
BERTONCINI ALBERTO (author)
CESELLI ALBERTO (author)
QUADRI CHRISTIAN (author)
Link to the full record:
https://air.unimi.it/handle/2434/1175699
Link to the full text:
https://air.unimi.it/retrieve/handle/2434/1175699/3111258/2025_CAVIA_SmartSys.pdf
Book title:
2025 IEEE International Conference on Smart Computing (SMARTCOMP)
Project:
CAVIA: enabling the Cloud-to-Autonomous-Vehicles continuum for future Industrial Applications
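Illustrative sketch (not the MORAL model from the paper): the abstract describes the orchestration task as a MILP that places microservices on continuum nodes, selects a performance tier per microservice, minimizes a node-centrality-based cost, and enforces resource and end-to-end latency constraints. The fragment below, written with the PuLP library, is a minimal, hypothetical rendering of such a model; all sets and numbers (nodes, microservices, tiers, capacities, latencies, centrality costs, latency budget) are invented for illustration.

# Hypothetical, simplified latency-aware placement MILP (not the paper's MORAL model).
import pulp

nodes = ["edge1", "edge2", "cloud"]        # candidate hosting nodes
services = ["perception", "planning"]      # microservices of one application chain
tiers = ["low", "high"]                    # vertical-scaling performance tiers

cpu_cap = {"edge1": 8, "edge2": 8, "cloud": 64}                 # CPU cores per node
cpu_req = {("perception", "low"): 2, ("perception", "high"): 4,
           ("planning", "low"): 1, ("planning", "high"): 2}     # cores per service/tier
proc_lat = {("perception", "low"): 30, ("perception", "high"): 15,
            ("planning", "low"): 20, ("planning", "high"): 10}  # processing latency (ms)
net_lat = {"edge1": 5, "edge2": 7, "cloud": 40}                 # network latency to node (ms)
cost = {"edge1": 3, "edge2": 2, "cloud": 1}                     # node-centrality-based cost
E2E_BUDGET = 80                                                 # end-to-end latency budget (ms)

prob = pulp.LpProblem("placement", pulp.LpMinimize)

# x[s, n, t] = 1 iff microservice s runs on node n at performance tier t
x = {(s, n, t): pulp.LpVariable(f"x_{s}_{n}_{t}", cat=pulp.LpBinary)
     for s in services for n in nodes for t in tiers}

# Objective: minimize centrality-based deployment cost of the chosen nodes
prob += pulp.lpSum(cost[n] * x[s, n, t] for s in services for n in nodes for t in tiers)

# Each microservice is placed on exactly one node, at exactly one tier
for s in services:
    prob += pulp.lpSum(x[s, n, t] for n in nodes for t in tiers) == 1

# Node capacity: resource demand depends on the selected tier (vertical scaling)
for n in nodes:
    prob += pulp.lpSum(cpu_req[s, t] * x[s, n, t] for s in services for t in tiers) <= cpu_cap[n]

# End-to-end latency of the chain: network plus tier-dependent processing time
prob += pulp.lpSum((net_lat[n] + proc_lat[s, t]) * x[s, n, t]
                   for s in services for n in nodes for t in tiers) <= E2E_BUDGET

prob.solve(pulp.PULP_CBC_CMD(msg=False))
for (s, n, t), var in x.items():
    if var.value() and var.value() > 0.5:
        print(f"{s} -> {n} (tier {t})")

In this sketch, vertical scaling enters through the tier index: selecting the "high" tier trades extra CPU demand for lower processing latency, which is the degree of freedom a multi-tier formulation exploits when a single-tier placement would violate the latency budget.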
Research areas
Sectors:
Sector INFO-01/A - Computer Science