This chapter investigates the impact of caching in interference networks. First, we briefly review the basics of some classic interference networks and the corresponding interference management techniques. Then we review an interference network with caches equipped at all transmitters and receivers, termed the cache-aided interference network. The information-theoretic metric of normalized delivery time (NDT) is introduced to characterize system performance. The NDT of the cache-aided interference network is discussed for both single-antenna and multiple-antenna cases. It is shown that, with different cache sizes, the network topology can be opportunistically changed into different classic interference networks, which leverages the local caching gain, the coded multicasting gain, and the transmitter cooperation gain (via interference alignment and interference neutralization). Finally, the NDT results are extended to the partially connected interference network.
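As a rough illustration of the NDT metric mentioned above, the toy sketch below (an assumption-laden simplification, not the chapter's actual derivation; the model and function names are invented for illustration) treats the NDT as the residual traffic load divided by the per-user degrees of freedom (DoF), showing qualitatively how receiver caching and transmitter cooperation each cut delivery latency:

```python
def ndt(mu_r: float, dof_per_user: float) -> float:
    """Toy NDT model (assumption): a receiver caching a fraction mu_r
    of each file still needs the remaining (1 - mu_r), delivered at a
    per-user DoF of dof_per_user; NDT = residual load / DoF."""
    return (1.0 - mu_r) / dof_per_user

# No caching, orthogonal access among K = 3 users (per-user DoF 1/3):
baseline = ndt(mu_r=0.0, dof_per_user=1.0 / 3.0)  # 3.0
# Half-cached receivers plus interference alignment (per-user DoF 1/2):
improved = ndt(mu_r=0.5, dof_per_user=0.5)        # 1.0
```

Under this toy model, caching half of each file and aligning interference together reduce the normalized delivery time threefold.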
Edge caching has received much attention as an efficient technique to reduce delivery latency and network congestion during peak-traffic times by bringing data closer to end users. Existing works usually design caching algorithms separately from the physical layer. In this chapter, we analyze edge-caching wireless networks by taking the caching capability into account when designing the signal transmission. In particular, we investigate multi-layer caching, in which both the base station (BS) and the users are capable of storing content in their local caches, and analyze the performance of edge-caching wireless networks under two notable caching strategies: uncoded and coded caching. We first calculate the backhaul and access throughputs of the two caching strategies for arbitrary cache sizes; the required throughputs are derived as functions of the BS and user cache sizes. Closed-form expressions for the system energy efficiency (EE) corresponding to the two caching methods are then derived. Based on the derived formulas, the system EE is maximized via precoding vector design and optimization while satisfying a predefined user request rate. Finally, two optimization problems are formulated to minimize the content delivery time under the two caching strategies.
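To give a concrete feel for the uncoded/coded comparison, the sketch below contrasts the backhaul load of the two strategies as the user cache fraction grows. It is illustrative only: it uses the well-known coded-caching delivery rate of Maddah-Ali and Niesen (valid when the replication factor t is an integer), not necessarily the exact throughput expressions derived in this chapter:

```python
def uncoded_load(K: int, mu: float) -> float:
    # Uncoded caching: each of the K users separately fetches the
    # uncached fraction (1 - mu) of its requested file.
    return K * (1.0 - mu)

def coded_load(K: int, mu: float) -> float:
    # Coded caching (Maddah-Ali-Niesen form): one coded multicast
    # message serves t + 1 users at once, where t = K * mu.
    t = K * mu
    return K * (1.0 - mu) / (1.0 + t)

# K = 4 users, each caching half of the library (mu = 0.5):
print(uncoded_load(4, 0.5))  # 2.0 file units over the backhaul
print(coded_load(4, 0.5))    # 2/3 of a file unit: a 3x reduction
```

The gap widens with more users, which is why the coded strategy's backhaul throughput scales so much more favorably.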
Wireless edge caching is considered a promising technique to cope with the rapid increase in mobile traffic demand. Its fundamental idea is to offload data traffic to local cache memories by serving content requests from content pre-fetched at network edge nodes. Wireless edge caching consists of two main phases: content placement and content delivery. Since the strategies for these two phases depend heavily on which devices in the network are capable of caching, the characteristics and types of achievable caching gains vary with the location of the cached data. Data cached at the transmitter side can be used to reduce the backhaul traffic load and the latency, while data cached at the receiver side can be used to improve network resource efficiency and the end-users' quality of experience (QoE). This chapter introduces state-of-the-art wireless edge caching techniques for transmitters and receivers in ultra-dense networks and offers a design guideline for reaping the promising gains of wireless edge caching.
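The two phases above can be sketched with a minimal example. The sketch assumes Zipf-distributed content popularity and a most-popular-first placement, which are common modeling choices rather than this chapter's specific scheme: placement fills the edge cache offline, and delivery serves hits locally while misses fall back to the backhaul:

```python
def zipf_popularity(catalog_size: int, alpha: float) -> list[float]:
    # Request probability of the i-th most popular file (Zipf law).
    weights = [1.0 / (i + 1) ** alpha for i in range(catalog_size)]
    total = sum(weights)
    return [w / total for w in weights]

def placement(cache_size: int) -> set[int]:
    # Content placement phase: pre-fetch the cache_size most popular
    # files into the edge node's cache.
    return set(range(cache_size))

def offloading_ratio(pop: list[float], cached: set[int]) -> float:
    # Content delivery phase: the fraction of requests served as cache
    # hits at the edge instead of going over the backhaul.
    return sum(p for i, p in enumerate(pop) if i in cached)

pop = zipf_popularity(catalog_size=100, alpha=0.8)
hit = offloading_ratio(pop, placement(cache_size=10))
# With skewed popularity, caching 10% of the catalog offloads far
# more than 10% of the requests.
```

The skew of the popularity distribution is what makes even small edge caches worthwhile, a point the caching-gain discussion above relies on.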
Understand both uncoded and coded caching techniques in future wireless network design. Expert authors present new techniques that will help you to minimize backhaul load, reduce deployment cost, and improve security, energy efficiency and the quality of the user experience. Covering topics from high-level architectures to specific requirement-oriented caching design and analysis, including big-data-enabled caching, caching in cloud-assisted 5G networks, and security, this is an essential resource for academic researchers, postgraduate students and engineers working in wireless communications.