Journal of Xidian University ›› 2023, Vol. 50 ›› Issue (2): 11-22.doi: 10.19665/j.issn1001-2400.2023.02.002

• Information and Communications Engineering •

Study on coded caching with parallel transmission

LIN Xiao, LUO Song, LIU Nan

  1. National Mobile Communications Research Laboratory, Southeast University, Nanjing 211100, China
  • Received: 2022-04-20  Online: 2023-04-20  Published: 2023-05-12

Abstract:

Caching technology emerged to improve the efficiency of network transmission and achieve lower latency. Unlike traditional caching, coded caching creates multicast opportunities so that a single broadcast transmission from the server can simultaneously satisfy different users' demands, thereby obtaining a global caching gain. This paper considers a coded caching network with parallel transmission, in which the server can broadcast messages to all users while users can also send messages to each other. An uncoded-prefetching coded caching scheme is proposed that consists of three phases: the pre-caching phase, the allocation phase and the delivery phase, where the optimal delivery time is obtained by pre-allocating different workloads to the server and the users. It is shown that the proposed scheme with parallel transmission outperforms both server-multicast transmission alone and transmission within a D2D network alone. Moreover, after accounting for the capability gap between the two channels, the proposed scheme performs better than when the difference in channel transmission capability is ignored. Finally, the proposed caching and delivery scheme with parallel transmission under uncoded prefetching is proved to be optimal when the users' cache resources are sufficient and the server broadcast channel and the D2D transmission channel have the same capacity.
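The gain from parallel transmission described above can be illustrated with a small numerical sketch. The sketch below is not the paper's scheme: it assumes the classic Maddah-Ali-Niesen broadcast delivery load and the Ji-Caire-Molisch D2D delivery load as stand-ins for the two channels, and balances the workload split so both channels finish at the same time (the same pre-allocation idea the abstract describes). All function names and the equal-capacity default are illustrative assumptions.

```python
from fractions import Fraction

def server_load(N, K, M):
    # Maddah-Ali & Niesen uncoded-prefetching broadcast load:
    # R_s = K(1 - M/N) / (1 + KM/N)
    t = Fraction(K * M, N)
    return K * (1 - Fraction(M, N)) / (1 + t)

def d2d_load(N, K, M):
    # Ji-Caire-Molisch D2D coded caching load (needs K*M >= N):
    # R_d = (N/M)(1 - M/N)
    assert K * M >= N, "D2D delivery requires the caches to jointly hold the library"
    return Fraction(N, M) * (1 - Fraction(M, N))

def parallel_time(N, K, M, c_s=1, c_d=1):
    # Pre-allocate a fraction alpha of the work to the server channel
    # (capacity c_s) and 1 - alpha to the D2D channel (capacity c_d),
    # chosen so both finish simultaneously:
    #   alpha * T_s = (1 - alpha) * T_d  =>  T = T_s * T_d / (T_s + T_d)
    T_s = server_load(N, K, M) / c_s
    T_d = d2d_load(N, K, M) / c_d
    return T_s * T_d / (T_s + T_d)

# Example: N = 4 files, K = 4 users, cache size M = 2.
# Server alone needs time 2/3, D2D alone needs time 1,
# while the balanced parallel split needs only 2/5.
print(server_load(4, 4, 2), d2d_load(4, 4, 2), parallel_time(4, 4, 2))
```

Under these assumptions the parallel delivery time is always below both standalone times, which mirrors the comparison stated in the abstract; the actual scheme in the paper additionally optimizes the placement for the two-channel setting.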

Key words: coded caching, uncoded prefetching, parallel transmission

CLC Number: TN911
