Fall 2021 — Issue 1, Volume 1
The inaugural issue of IEEE BITS the Information Theory Magazine
Column article(s)
The Arts and Humanities Research Council (AHRC), part of UK Research and Innovation (UKRI), has supported independent academic research involving information processing tools since its founding in 2005. ...
The initial vision of cellular communications was to deliver ubiquitous voice communications to anyone, anywhere. In a simplified view, 1G delivered voice services to business customers, and only 2G brought them to consumers. This, in turn, created an appetite for cellular data, for which 3G was designed. However, it was BlackBerry that delivered business smartphones, and 4G ma...
Group testing is the technique of pooling diagnostic samples together in order to increase the efficiency of medical testing. Traditionally, works in group testing assume that infections are i.i.d. However, contagious diseases like COVID-19 are governed by community spread, and hence the infections are correlated. This survey presents an overview of recen...
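As a rough illustration of why pooling saves tests (a minimal sketch in Python, not taken from the survey; the 2% prevalence and pool size of 10 are arbitrary assumptions, and infections here are drawn i.i.d. rather than with community structure), consider two-stage Dorfman testing:

    import random

    def dorfman_tests(statuses, pool_size=10):
        # Two-stage Dorfman pooling: test each pool once, and retest its members
        # individually only if the pooled test comes back positive.
        tests = 0
        for i in range(0, len(statuses), pool_size):
            pool = statuses[i:i + pool_size]
            tests += 1              # one test for the pooled sample
            if any(pool):           # positive pool -> individual retests
                tests += len(pool)
        return tests

    random.seed(0)
    population = [random.random() < 0.02 for _ in range(10000)]  # assumed 2% i.i.d. prevalence
    print("individual tests:", len(population))
    print("Dorfman tests:   ", dorfman_tests(population))

The saving is largest when prevalence is low; correlated (community-driven) infections change this calculus, which is the focus of the survey.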
Privacy has become an emerging challenge in both information theory and computer science due to the massive (centralized) collection of user data. In this article, we give an overview of privacy-preserving mechanisms and metrics through the lens of information theory, and unify different privacy metrics, including f-divergences, Rényi divergences, and differential privacy (...
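As one concrete point of reference (a minimal sketch in Python, not drawn from the article; the counting query, sensitivity of 1, and epsilon = 0.5 are assumptions for illustration), the Laplace mechanism calibrates noise to a query's sensitivity to satisfy epsilon-differential privacy:

    import numpy as np

    def laplace_mechanism(true_value, sensitivity, epsilon, rng):
        # Adding Laplace noise with scale sensitivity/epsilon to a query with the
        # given L1 sensitivity satisfies epsilon-differential privacy.
        return true_value + rng.laplace(loc=0.0, scale=sensitivity / epsilon)

    rng = np.random.default_rng(0)
    exact_count = 1234               # e.g., a count over a private database
    noisy_count = laplace_mechanism(exact_count, sensitivity=1.0, epsilon=0.5, rng=rng)
    print(noisy_count)

Rényi differential privacy and f-divergence-based metrics can be read as different ways of bounding the divergence between the output distributions of such a mechanism on neighbouring databases.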
Medical ultrasound imaging is an active research field in digital signal processing. Following decades of development in the analogue domain, the introduction of high-power computation has led to increased activity and research in the fields of digital signal processing and, most recently, machine learning, for the sake of delivering higher quality im...
The entropy function plays a central role in information theory. Constraints on the entropy function in the form of inequalities, viz. entropy inequalities (often conditional on certain Markov conditions imposed by the problem under consideration), are indispensable tools for proving converse coding theorems. In this expository article, we give an overview o...
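To make the notion concrete, one of the most basic Shannon-type inequalities (a standard fact, not specific to this article's results) is the nonnegativity of conditional mutual information: for any random variables $X$, $Y$, $Z$,

\[
I(X;Y \mid Z) \;=\; H(X,Z) + H(Y,Z) - H(X,Y,Z) - H(Z) \;\ge\; 0,
\]

which is equivalent to submodularity of the entropy function and holds with equality precisely when $X \to Z \to Y$ forms a Markov chain.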
One of Claude Shannon’s best-remembered “toys” was his maze-solving machine, created by partitions on a rectangular grid. A mechanical mouse was started at one point in the maze with the task of finding cheese at another point. Relays under the board guided successive moves, each of which was taken in the first open counterclockwise direction from the previ...
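For readers curious about that move rule, here is a minimal sketch in Python of one reading of it (the grid, start position, and wall layout are invented for illustration; this is not a reconstruction of Shannon's relay logic):

    # Directions ordered so that index + 1 (mod 4) is one counterclockwise turn: E, N, W, S.
    DIRS = [(0, 1), (-1, 0), (0, -1), (1, 0)]

    def next_move(maze, pos, heading):
        # Scan counterclockwise, starting from the previous heading, and take the
        # first direction whose neighbouring cell is open ('.').
        r, c = pos
        for turn in range(4):
            d = (heading + turn) % 4
            dr, dc = DIRS[d]
            nr, nc = r + dr, c + dc
            if 0 <= nr < len(maze) and 0 <= nc < len(maze[0]) and maze[nr][nc] == '.':
                return (nr, nc), d
        return pos, heading  # boxed in: stay put

    maze = ["..#",
            ".#.",
            "..."]
    pos, heading = (0, 0), 0          # start in a corner, heading east
    for _ in range(6):
        pos, heading = next_move(maze, pos, heading)
        print(pos, DIRS[heading])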