Deep learning has substantially improved the accuracy of Computed Tomography (CT) image classification in medical diagnosis. However, conventional centralized approaches require transferring large volumes of medical data to a central server, incurring high bandwidth consumption, high latency, and serious privacy risks, particularly in wireless healthcare settings. Federated Learning (FL) offers a promising alternative by enabling collaborative model training without sharing raw patient data. Yet traditional FL methods carry a high communication cost, as they frequently exchange large model updates, making them impractical in networks with tight bandwidth constraints. To address these challenges, this paper proposes a Communication-Efficient Federated Learning (CEFL) framework for distributed CT image classification. The proposed method combines gradient sparsification, model quantization, and adaptive communication scheduling to produce smaller and less frequent model updates. The framework is implemented as a multi-layer architecture comprising medical imaging, edge computing, wireless communication, and federated aggregation layers. Experiments are conducted on the LIDC-IDRI CT dataset under simulated bandwidth-constrained conditions. The results show that the proposed CEFL framework reduces communication overhead by 40-60% relative to traditional FL approaches such as FedAvg while maintaining a classification accuracy of about 90%. Latency is also considerably reduced, making the system suitable for real-time wireless healthcare. These findings demonstrate the value of communication-efficient designs in enabling scalable, privacy-preserving medical image analysis.
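The abstract does not specify how the compression pipeline is implemented; as a minimal sketch, client-side top-k gradient sparsification followed by uniform quantization, and the corresponding server-side reconstruction, could look like the following. The parameters (k=100, 8-bit codes) and all function names are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def sparsify_topk(grad, k):
    """Keep only the k largest-magnitude gradient entries (top-k sparsification)."""
    flat = grad.ravel()
    idx = np.argpartition(np.abs(flat), -k)[-k:]  # indices of the top-k magnitudes
    return idx, flat[idx]

def quantize_uniform(values, bits=8):
    """Uniformly quantize values to the given bit width; return int codes and a scale."""
    max_abs = float(np.max(np.abs(values))) or 1.0
    levels = 2 ** (bits - 1) - 1          # symmetric range, e.g. [-127, 127] for 8 bits
    scale = max_abs / levels
    codes = np.round(values / scale).astype(np.int8)
    return codes, scale

def compress_update(grad, k=100, bits=8):
    """Client side: sparsify then quantize a gradient before transmission."""
    idx, vals = sparsify_topk(grad, k)
    codes, scale = quantize_uniform(vals, bits)
    return idx, codes, scale              # much smaller than the dense float gradient

def decompress_update(idx, codes, scale, shape):
    """Server side: reconstruct a dense gradient from the compressed update."""
    flat = np.zeros(int(np.prod(shape)), dtype=np.float32)
    flat[idx] = codes.astype(np.float32) * scale
    return flat.reshape(shape)

# Example: a 1000-parameter gradient compressed to 100 int8 codes plus indices.
rng = np.random.default_rng(0)
g = rng.normal(size=(1000,)).astype(np.float32)
idx, codes, scale = compress_update(g, k=100, bits=8)
g_hat = decompress_update(idx, codes, scale, g.shape)
```

In a full FL round, each client would send only `(idx, codes, scale)` instead of the dense update, and the server would decompress before aggregation (e.g. FedAvg-style averaging); an adaptive scheduler would additionally decide which rounds each client participates in.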