Optimized Deep Federated Privacy-Preserving Learning for Hierarchical Caching in Fog Computing
DOI:
https://doi.org/10.37256/cm.6320256278

Keywords:
fog computing, internet-of-things, federated learning, privacy-preserving, hierarchical caching

Abstract
Fog Access Networks (F-ANs) have emerged as a promising solution to meet the growing demand for multimedia services by moving computational and storage functions closer to users at the network's edge. Within these networks, distributing edge caching across Fog Access Points (F-APs) can significantly reduce both network congestion and service delay by keeping frequently accessed content in local caches rather than relying on remote cloud storage. Because F-APs have limited caching resources and user content demand fluctuates over time and space, various cooperative caching techniques have been developed to decide which contents to prioritize and how to store them effectively. However, many of these techniques rely on central servers to collect and analyze data from Internet-of-Things (IoT) devices to predict content popularity, which raises serious privacy concerns. To tackle this challenge, we propose a Federated Learning-based Hierarchical (FLH) caching approach that keeps data within local environments and uses IoT devices to train a shared learning model that estimates content demand. FLH fosters horizontal cooperation among neighboring F-APs and vertical collaboration between the Base Band Unit (BBU) pool and the F-APs to cache content of varying popularity. Additionally, FLH applies differential privacy techniques to provide strong privacy protection. Experimental findings show that FLH significantly outperforms five major baseline models in cache hit ratio while ensuring the security of user data.
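The paper's FLH algorithm is not reproduced on this page, but the core pattern the abstract describes — each F-AP trains a content-demand model on private request logs, and a coordinator aggregates clipped, noise-perturbed updates — can be sketched as follows. This is a minimal illustration only: the linear demand model, the Gaussian-mechanism noise, and all data and parameter names are assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def local_update(weights, features, demand, lr=0.1, epochs=5):
    """One F-AP's local step: fit a linear content-demand model on
    private request logs (illustrative; the data never leaves the device)."""
    w = weights.copy()
    for _ in range(epochs):
        grad = features.T @ (features @ w - demand) / len(demand)
        w -= lr * grad
    return w

def dp_federated_round(global_w, clients, clip=1.0, sigma=0.01):
    """Aggregate local updates with clipping and Gaussian noise,
    a standard differential-privacy mechanism for federated learning."""
    noisy_updates = []
    for X, y in clients:
        delta = local_update(global_w, X, y) - global_w
        norm = max(np.linalg.norm(delta), 1e-12)
        delta = delta * min(1.0, clip / norm)              # bound sensitivity
        delta += rng.normal(0, sigma * clip, delta.shape)  # add DP noise
        noisy_updates.append(delta)
    return global_w + np.mean(noisy_updates, axis=0)

# Synthetic demo: three F-APs, each holding its own request data
d = 4
true_w = np.array([0.5, -0.2, 0.8, 0.1])  # hypothetical demand pattern
clients = []
for _ in range(3):
    X = rng.normal(size=(50, d))
    y = X @ true_w + rng.normal(0, 0.01, 50)
    clients.append((X, y))

w = np.zeros(d)
for _ in range(30):
    w = dp_federated_round(w, clients)
```

In this sketch the coordinator plays the role the abstract assigns to the BBU pool, seeing only noisy model updates rather than raw IoT data; the noise scale `sigma` trades prediction accuracy against the strength of the privacy guarantee.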
License
Copyright (c) 2025 Javad Mohammadzadeh, et al.

This work is licensed under a Creative Commons Attribution 4.0 International License.
