Cache-Aided Wireless Networks

The latest viral video may traverse a network millions of times, placing a heavy load on it. One way to reduce this load is to prefetch popular content into local caches at network elements (e.g., small-cell base stations, wireless access points, or end-user devices) and to serve future requests for that content directly from the cache. Caching at the wireless network edge is particularly well suited to the delivery of video content, which has stringent delay and bandwidth requirements, and it is facilitated by storage capacity becoming ever cheaper.

The simplest form of caching is to store the most popular files at every edge cache. With edge caching, popular files can be delivered locally by edge devices, while other requests are served directly by the source (e.g., a macro base station), achieving what is referred to as a local caching gain. However, replicating the same content on many devices can result in an inefficient use of the aggregate cache capacity.
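
As a rough illustration of the local caching gain, the sketch below estimates the fraction of requests an edge cache can serve when it simply stores the most popular files; the Zipf popularity profile, library size, and cache size are illustrative assumptions rather than parameters from our work.

```python
import numpy as np

# Zipf popularity over the content library (a common but assumed traffic model).
def zipf_popularity(num_files, alpha=0.8):
    weights = 1.0 / np.arange(1, num_files + 1) ** alpha
    return weights / weights.sum()

# Fraction of requests served locally when every edge cache stores the
# `cache_size` most popular files.
def local_hit_rate(cache_size, num_files, alpha=0.8):
    popularity = zipf_popularity(num_files, alpha)
    return popularity[:cache_size].sum()

# Example: each cache holds the top 100 files of a 10,000-file library.
print(f"Fraction of requests served at the edge: {local_hit_rate(100, 10_000):.1%}")
```

Under such a skewed popularity profile even a small cache absorbs a sizeable share of the traffic, but every cache stores exactly the same files, which is the inefficient use of aggregate cache capacity noted above.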

Recent studies have shown that a global caching gain can be achieved in wireless networks by storing different portions of the content across several caches and then either capitalizing on the spatial reuse of device-to-device communications or exploiting the globally cached information to multicast coded messages that are simultaneously useful to many users.

Network caching consists of two phases: a caching phase, during which library content is stored in receiver caches and which takes place at network setup, and a delivery phase, during which the network is repeatedly used to satisfy the demands made by the receivers.
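
The toy sketch below walks through both phases, in the spirit of coded caching, for a network with two users and two files; the file contents, the half-and-half placement, and the XOR-based delivery are illustrative choices and not the specific schemes analyzed in our papers.

```python
def xor(x: bytes, y: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(x, y))

# --- Caching phase (at network setup): split each file and prefetch different halves. ---
A, B = b"AAAAAAAA", b"BBBBBBBB"            # toy 8-byte "files"
A1, A2 = A[:4], A[4:]
B1, B2 = B[:4], B[4:]
cache_user1 = {"A1": A1, "B1": B1}          # user 1 stores the first half of each file
cache_user2 = {"A2": A2, "B2": B2}          # user 2 stores the second half of each file

# --- Delivery phase: user 1 requests file A, user 2 requests file B. ---
coded_message = xor(A2, B1)                 # one multicast, useful to both users at once

# Each user XORs out the part it already has to recover its missing half.
assert xor(coded_message, cache_user1["B1"]) == A2   # user 1 completes file A
assert xor(coded_message, cache_user2["A2"]) == B1   # user 2 completes file B
```

A single coded multicast the size of half a file satisfies both users, whereas sending the two missing halves separately would take twice the rate; this is the kind of global caching gain referred to above.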

We study several important aspects of caching in wireless networks, such as:

Distortion-Memory Tradeoffs

We investigate the joint design of storage and transmission schemes for the efficient delivery of video-based services over next-generation heterogeneous wireless networks, by specifically considering the unique nature of video content and the wireless channel. We consider a scenario in which users store videos at different encoding rates (e.g. video layers in scalable video coding). The goal is to design caching and delivery schemes that, for a given broadcast rate, minimize the average squared error distortion experienced at user devices.
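
The sketch below conveys the flavor of this distortion-memory tradeoff: cached layers and broadcast layers both contribute rate to each video, and a greedy rule spends the two budgets where the popularity-weighted distortion drops fastest. The exponential rate-distortion curve, the greedy allocation, and all numbers are illustrative assumptions, not the schemes proposed in our work.

```python
import numpy as np

# Quadratic-Gaussian-style rate-distortion curve, D(R) = sigma^2 * 2^(-2R), used
# here as a stand-in for the distortion of a video received at total rate R.
def distortion(rate, sigma2=1.0):
    return sigma2 * 2.0 ** (-2.0 * rate)

def greedy_allocation(popularity, cache_budget, broadcast_budget, step=0.1):
    """Assign rate increments (first from the cache budget, then from the broadcast
    budget) to whichever video yields the largest drop in average distortion."""
    cached = np.zeros(len(popularity))     # rate of layers stored in the user cache
    sent = np.zeros(len(popularity))       # rate of layers sent over the broadcast link
    while cache_budget >= step or broadcast_budget >= step:
        total = cached + sent
        gains = popularity * (distortion(total) - distortion(total + step))
        best = int(np.argmax(gains))
        if cache_budget >= step:
            cached[best] += step
            cache_budget -= step
        else:
            sent[best] += step
            broadcast_budget -= step
    return cached, sent, float(np.sum(popularity * distortion(cached + sent)))

popularity = np.array([0.5, 0.3, 0.2])     # toy three-video popularity profile
cached, sent, avg_d = greedy_allocation(popularity, cache_budget=1.0, broadcast_budget=0.5)
print("cached rates   :", cached)
print("broadcast rates:", sent)
print("average distortion:", round(avg_d, 4))
```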

Correlation-Aware Distributed Caching and Coded Delivery

State-of-the-art caching schemes exploit exact content reuse across the network; because the content consumed by users is increasingly personalized, exact reuse is limited and so is the efficiency of these schemes.
Distributing correlated information across a network can lead to unnecessary transmission of redundant information, and exploiting these redundancies can further improve the efficiency of content delivery to users. We aim to understand the fundamental limits of caching correlated content at base stations or user devices.
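
The toy accounting below shows why such redundancy matters: when two requested files share most of their chunks, a correlation-aware delivery multicasts the shared part once instead of sending it to each user separately. The chunk-level file model is an illustrative assumption, not the coding scheme studied in our papers.

```python
# Model each file as a set of equal-size chunks; the split into shared and
# private chunks is assumed for illustration.
chunks_A = {"shared_1", "shared_2", "shared_3", "private_A"}
chunks_B = {"shared_1", "shared_2", "shared_3", "private_B"}

# Correlation-agnostic delivery: each requested file is sent in full to its user.
load_agnostic = len(chunks_A) + len(chunks_B)

# Correlation-aware delivery: the shared chunks are multicast once, and only the
# private refinements are sent individually.
shared = chunks_A & chunks_B
load_aware = len(shared) + len(chunks_A - shared) + len(chunks_B - shared)

print(f"correlation-agnostic load: {load_agnostic} chunks")   # 8 chunks
print(f"correlation-aware load:    {load_aware} chunks")      # 5 chunks
```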

Caching, Delay, and Throughput Scaling in Tiered Networks

Caching can help decrease network load and transmission delay by delivering files locally instead of having them traverse the entire network. An indirect benefit of caching is the reduction of interference in the network, which in turn can lead to higher total network throughput. How do network throughput and delay behave as density increases? How should caching be performed to maintain the throughput and delay gains as a network scales up in size? Scaling results are theoretical bounds on network performance that elucidate the tradeoffs among network parameters. We seek to understand how caching, delay, and throughput interact in highly dense networks, so as to guide improved network deployments.
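
As a rough illustration, the sketch below estimates the fraction of requests served by a nearby cache as the network densifies, assuming each node independently caches a few files drawn according to a Zipf popularity profile; the random-placement model and every parameter are illustrative assumptions rather than results from our work.

```python
import numpy as np

# Zipf popularity over the content library (an assumed traffic model).
def zipf_popularity(num_files, alpha=0.8):
    weights = 1.0 / np.arange(1, num_files + 1) ** alpha
    return weights / weights.sum()

def local_service_prob(caches_in_range, cache_size, num_files, alpha=0.8):
    """Probability that a request is served by some cache within communication
    range, when each cache independently stores `cache_size` files drawn
    according to popularity (a simple random-placement model)."""
    p = zipf_popularity(num_files, alpha)
    q = 1.0 - (1.0 - p) ** cache_size        # approx. prob. a given cache holds file i
    return float(np.sum(p * (1.0 - (1.0 - q) ** caches_in_range)))

# As density grows, more caches fall within range and a larger share of the
# traffic stays local instead of crossing the whole network.
for k in (1, 5, 25, 125):
    print(f"{k:4d} caches in range -> {local_service_prob(k, 10, 10_000):.1%} served locally")
```

In this model the locally served fraction grows with density, which hints at why caching can keep load, interference, and delay in check as the network scales; the precise scaling laws are what we study.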

People

• Faculty: Elza Erkip
• Postdoc: David Ramirez
• Students: Parisa Hassanzadeh, Çağlar Tunç

Collaborators

• Antonia Tulino, Jaime Llorca (Nokia, Bell Labs)

Publications

• P. Hassanzadeh, A. Tulino, J. Llorca, and E. Erkip, "Rate-Memory Trade-off for the Two-User Broadcast Caching Network with Correlated Sources," in Proc. IEEE International Symposium on Information Theory (ISIT), June 2017.
• P. Hassanzadeh, A. Tulino, J. Llorca, and E. Erkip, "Correlation-Aware Distributed Caching and Coded Delivery," in Proc. IEEE Information Theory Workshop (ITW), September 2016.
• P. Hassanzadeh, A. Tulino, J. Llorca, and E. Erkip, "Cache-Aided Coded Multicast for Correlated Sources," in Proc. IEEE International Symposium on Turbo Codes and Iterative Information Processing (ISTC), September 2016.
• P. Hassanzadeh, E. Erkip, J. Llorca, and A. Tulino, "Distortion-Memory Trade-offs in Cache-Aided Wireless Video Delivery," in Proc. IEEE Annual Allerton Conference on Communication, Control, and Computing, October 2015.