Machine learning over networks
Information and Communication Technologies (ICT) permeate our everyday life and are the driving force behind the development of the so-called Information Society. The 5G roadmap foresees a common communication infrastructure enabling the deployment of diverse innovative sectors, such as automated driving, Industry 4.0, smart grids, virtual or augmented reality, e-health, etc. This new reality, sometimes called the fourth industrial revolution, requires a new architecture able to meet advanced requirements, especially in terms of latency (below 5 ms), reliability, coverage (with density up to 100 devices/m²), and bandwidth. Next-generation networks will exhibit a deep interplay and synergy with machine learning (ML) tools. On the one hand, ML provides a powerful toolbox to handle the complexity of next-generation networks; on the other hand, these networks will enable the pervasive deployment of intelligent services, especially those requiring very low latency and high data rates, such as virtual reality or autonomous driving.
Within this context, our research activity focuses on the joint optimization of computation and communication resources within the so-called edge cloud, i.e., a distributed cloud whose servers are co-located with densely distributed radio access points. We started with a static optimization in the multicell scenario, proposing an effective successive convex optimization framework to deal with multicell interference [1]. More recently, we generalized the approach to the dynamic case using Lyapunov stochastic optimization [2], with the goal of finding the optimal balance between average energy consumption, delay, and accuracy.
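To make the idea concrete, the following is a minimal illustrative sketch of the generic Lyapunov drift-plus-penalty principle that underlies this class of dynamic resource allocation schemes; it is not the algorithm of [2]. All quantities (the arrival and fading models, the candidate power grid, and the trade-off parameter V) are assumptions chosen purely for illustration: a virtual queue tracks the untransmitted backlog (a proxy for delay), and at each slot the transmit power is chosen greedily to minimize V times the energy cost minus the queue-weighted service rate.

```python
import numpy as np

# Illustrative sketch of generic Lyapunov drift-plus-penalty control
# (textbook form, NOT the exact algorithm of [2]).
# A virtual queue Q tracks backlog (a proxy for delay); at each slot we pick
# the transmit power p minimizing V * power - Q * service_rate(p),
# where V trades average energy against average queue length/delay.

rng = np.random.default_rng(0)

V = 50.0                              # energy/delay trade-off parameter (assumed value)
T = 10_000                            # number of time slots
powers = np.linspace(0.0, 1.0, 51)    # hypothetical grid of feasible power levels
bandwidth = 1.0                       # normalized bandwidth

Q = 0.0                               # virtual queue (bits awaiting transmission)
energy, backlog = [], []

for t in range(T):
    arrivals = rng.poisson(2.0)                    # new bits this slot (toy arrival model)
    h = rng.exponential(1.0)                       # random channel gain (toy fading model)
    rates = bandwidth * np.log2(1.0 + h * powers)  # Shannon-like rate for each power
    objective = V * powers - Q * rates             # drift-plus-penalty objective
    p = powers[np.argmin(objective)]               # greedy per-slot choice
    served = bandwidth * np.log2(1.0 + h * p)
    Q = max(Q + arrivals - served, 0.0)            # virtual-queue update
    energy.append(p)
    backlog.append(Q)

print(f"avg power  : {np.mean(energy):.3f}")
print(f"avg backlog: {np.mean(backlog):.2f}")
```

In this generic form, increasing V drives the average power down at the price of a longer average backlog, and hence a larger delay, which is the kind of energy-delay trade-off explored in [2].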
[1] S. Sardellitti, G. Scutari, S. Barbarossa, "Joint Optimization of Radio and Computational Resources for Multicell Mobile-Edge Computing", IEEE Transactions on Signal and Information Processing over Networks, June 2015, pp. 89-103 (IEEE SPS Best Paper Award).
[2] M. Merluzzi, P. Di Lorenzo, S. Barbarossa, "Wireless Edge Machine Learning: Resource Allocation and Trade-offs", IEEE Access, March 2021, pp. 45377-45398.