To support the rapid growth of IoT and cyber-physical systems, as well as the advent of 6G, there is a need for communication and networking models that enable more efficient modes of machine-type communications. This calls for a departure from the assumptions of classical communication-theoretic problem formulations as well as from the traditional network layers. The new paradigm is referred to as goal- or task-oriented communication or, more broadly, is part of the emerging area of semantic communications. Over the past decade, a number of novel performance metrics have been proposed: measures of timeliness such as the Age of Information (AoI) and Query Age of Information (QAoI); metrics that capture goal-oriented, tracking, or control performance, such as Quality of Information (QoI), Value of Information (VoI), and Age of Incorrect Information (AoII); and more sophisticated end-to-end criteria, including distortion metrics (e.g., MSE), ML performance, and human perception of the reproduced data. In parallel, finite-blocklength information theory has been applied to the remote monitoring of stochastic processes and to real-time control. We invite original papers that contribute to the fundamentals and applications of semantic metrics, as well as to protocols that use them, in IoT and automation scenarios.
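As a concrete illustration of the timeliness metrics mentioned above, the (linear) Age of Information at a monitor at time t is t minus the generation timestamp of the freshest update received by time t. The sketch below is purely illustrative; the update timestamps are hypothetical and not taken from any system described here.

```python
# Illustrative sketch: linear Age of Information (AoI) at a monitor.
# AoI at time t is t minus the generation time of the freshest update
# received by time t. Timestamps below are hypothetical.

def age_of_information(t, updates):
    """updates: list of (generation_time, reception_time) pairs.
    Returns AoI at time t, assuming an initial update at time 0."""
    freshest = 0.0  # generation time of the freshest update received by t
    for gen, recv in updates:
        if recv <= t and gen > freshest:
            freshest = gen
    return t - freshest

# Example: updates generated at t=1 and t=4, received at t=2 and t=6.
updates = [(1.0, 2.0), (4.0, 6.0)]
print(age_of_information(5.0, updates))  # 5 - 1 = 4.0
print(age_of_information(7.0, updates))  # 7 - 4 = 3.0
```

Note how AoI grows linearly between receptions and drops only when a fresher update arrives; goal-oriented metrics such as AoII additionally weight this age by whether the monitor's estimate is actually incorrect.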
This 8th special issue will focus on exploring how new advances in information theory can impact future communication systems. Next-generation wireless networks will incorporate a large number of devices and dense, intelligent antenna arrays, and will operate at higher frequencies. New task-aware communication modalities, such as sensing, learning, and inference, will accelerate the shift from human-to-human to machine-to-machine communications. Accordingly, communication systems will be designed with capacity, latency, and accuracy in mind. Increasingly complex communication tasks will need to be carried out on devices with energy and hardware constraints, but will also be able to take advantage of in-network storage and computation.
This special issue will focus on information theoretic aspects of distributed coding and computing. As applications and platforms such as distributed learning, cloud storage and computing, content delivery networks, and distributed ledgers become increasingly popular, there is a pressing need to evaluate the fundamental limits of existing solutions and to develop efficient approaches for running them. This is particularly important given the growing list of constraints and requirements in terms of available resources, scalability, privacy, security, fault tolerance, speed, accuracy, and verifiability. In this context, information theory and coding can play a major role, providing tools and techniques to address these challenges. This special issue aims to attract contributions that characterize the fundamental limits of distributed information systems and develop efficient coding techniques to approach those limits while satisfying the essential constraints.
Deep learning methods have emerged as highly successful tools for solving inverse problems. They achieve state-of-the-art performance on tasks such as image denoising, inpainting, super-resolution, and compressive sensing. They are also starting to be used beyond imaging, including for inverse problems arising in communications and signal processing, and even on non-Euclidean data such as graphs. However, a wide range of important theoretical and practical questions remain unsolved or even completely open, including precise theoretical guarantees for signal recovery, robustness and out-of-distribution generalization, architecture design, and domain-specific applications and challenges. This special issue aims to advance cutting-edge research in this area, with an emphasis on its intersection with information theory.
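To make the inverse-problem setting concrete, below is a minimal sketch of a classical baseline that learned methods are typically compared against: ISTA (iterative soft thresholding) for compressive sensing, i.e., recovering a sparse x from underdetermined measurements y = Ax. All dimensions, the sparsity pattern, and the regularization weight are chosen purely for illustration; this is not a method proposed by the special issue.

```python
import numpy as np

# Illustrative sketch: the compressive sensing inverse problem y = A x
# with sparse x, solved by ISTA (iterative soft thresholding) on the
# LASSO objective  min_x 0.5 * ||A x - y||^2 + lam * ||x||_1.

def soft_threshold(z, tau):
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def ista(A, y, lam=0.05, n_iter=500):
    # Step size 1/L, with L an upper bound on the Lipschitz constant
    # of the gradient (largest eigenvalue of A^T A).
    L = np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)          # gradient of the data-fit term
        x = soft_threshold(x - grad / L, lam / L)
    return x

# Hypothetical instance: recover a 3-sparse signal in R^100
# from 40 random linear measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100)) / np.sqrt(40)
x_true = np.zeros(100)
x_true[[5, 37, 80]] = [1.0, -2.0, 1.5]
y = A @ x_true
x_hat = ista(A, y)
print(np.linalg.norm(x_hat - x_true))  # small reconstruction error
```

Learned approaches replace or augment pieces of this pipeline, for instance by unrolling the iterations into a trainable network or by substituting the soft-thresholding proximal step with a learned denoiser; the open questions listed above concern exactly when such substitutions retain recovery guarantees.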
Modern computing environments must store, communicate, and process data in unprecedented volumes. These data, arriving in new and evolving structures and formats, necessitate both lossless and lossy compression. Recent years have witnessed the emergence of new techniques, approaches, and modes of data compression. This special issue will focus on cutting-edge research in this space.
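The lossless/lossy distinction above can be stated in one line: a lossless scheme must be exactly invertible, while a lossy scheme trades fidelity for rate. A toy round trip with Python's standard-library zlib (the data string is hypothetical) illustrates the lossless case:

```python
import zlib

# Toy illustration: lossless compression must satisfy
# decompress(compress(x)) == x exactly, for every input x.
data = b"abracadabra " * 100  # highly redundant, so it compresses well
compressed = zlib.compress(data, level=9)
restored = zlib.decompress(compressed)

assert restored == data            # lossless: exact round trip
print(len(data), len(compressed))  # redundancy => far fewer bytes
```

A lossy scheme, by contrast, would be judged by a rate-distortion trade-off rather than by exact recovery, which is where distortion measures and perceptual criteria enter.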