6
Applications of Federated Learning in Computing Technologies

Sambit Kumar Mishra*, Kotipalli Sindhu, Mogaparthi Surya Teja, Vutukuri Akhil, Ravella Hari Krishna, Pakalapati Praveen and Tapas Kumar Mishra

SRM University, AP, India

Abstract

Federated learning is a technique that trains a model across multiple decentralized devices holding local data samples without exchanging those samples. The concept is also called collaborative learning. In federated learning, clients independently train deep neural network models on their local data, and these local models are combined into a global deep neural network model at the central server. This contrasts with traditional centralized learning, where all local datasets are uploaded to a single server, and with classical distributed learning, which assumes local data samples are identically distributed; in federated learning, raw data is never transmitted to the server. Because of its security and privacy benefits, it is widely used in applications such as IoT, cloud computing, edge computing, vehicular edge computing, and many more. This chapter describes implementation details for protecting the privacy of locally held data in federated learning. Since there will be trillions of edge devices, system efficiency and privacy must be taken into consideration when evaluating federated learning algorithms in computing technologies. The chapter covers the effectiveness, privacy, and usage of federated learning in several computing technologies, presenting different applications of federated learning, its privacy concerns, and its role in fields of computing such as IoT, edge, and cloud computing.

Keywords: Federated learning, computing technology, cloud computing, edge computing, IoT, neural network

6.1 Introduction

Google first presented the idea of federated learning in 2016; it is helpful in many applications that need to take care of privacy. Federated learning is used to collaboratively train a global machine learning model over distributed datasets while protecting users' data privacy. In this chapter, we examine different applications of federated learning in computing technology, divided into the following sections.

Section 6.1.1: Federated Learning in Cloud Computing

This section covers federated learning in cloud computing and its applications in fields such as cloud mobile edge computing and cloud edge computing [1, 2].

Section 6.1.2: Federated Learning in Edge Computing

This section deals with edge computing, federated learning in edge computing, and applications in fields such as vehicular edge computing and intelligent recommendation [1].

Section 6.1.3: Federated Learning in the Internet of Things (IoT)

This section covers IoT, federated learning in IoT, and applications such as communication-efficient federated learning for wireless edge devices and blockchain-based federated learning for privacy-protected data sharing [3, 4].

Section 6.1.4: Federated Learning in Medical Fields

This section presents applications in medical fields such as healthcare and data privacy in healthcare [5, 20].

Section 6.1.5: Federated Learning in Blockchain

This section deals with blockchain and its applications, such as blockchain-based federated learning [6, 7].

6.1.1 Federated Learning in Cloud Computing

The steady growth of intelligent devices requiring training has prompted moving AI computation from the server to edge nodes, leading to the edge computing paradigm. Federated learning is a promising distributed machine learning arrangement that uses local computing and local data to train the artificial intelligence (AI) model [8]. Owing to the limited capacity of edge computing nodes, placing popular applications on edge nodes yields significant improvements in user satisfaction and service success. Combining local data processing and federated learning can train a capable AI model that guarantees local data protection while using mobile clients' resources. However, the heterogeneity of local data on the devices, that is, non-independent and identically distributed (non-IID) data and imbalance of local data sizes, may hinder the use of federated learning in mobile edge computing (MEC) systems. In such circumstances, federated learning has gained attention for learning from data distributed across different clients while keeping the data in place.

Recently, the ever-expanding presence in our day-to-day life of smart devices such as wearables, cell phones, smart cards, and sensors has triggered the growth of many connected network devices generating huge amounts of heterogeneous data to be processed and interpreted. Owing to such large volumes of data with rapid growth patterns and the typically private nature of these data, sending all the data to a distant cloud becomes impractical, unnecessary, and loaded with privacy concerns [9, 21]. These factors have contributed to the rise of the flexible mobile edge computing (MEC) paradigm, which exploits the progress in the capacity and computation power of current devices to push training and processing procedures locally onto the devices themselves. The MEC approach involves the cooperation of edge nodes with the remote cloud to produce a processing framework able to support large-scale task preparation and handling. Within this setting, the efficient and effective treatment of big data brings out information and intrinsic features hidden in the datasets, valuable for many application areas.

Applications of federated learning in cloud computing are explained as follows.

6.1.1.1 Cloud-Mobile Edge Computing

Machine learning (ML) procedures account for a large part of big data processing. A multi-layer convolutional neural network is adopted to serve data validation in a robust mobile crowd-sensing problem, aiming to improve sensing reliability and decrease the overall latency [11].

Here, we have a scenario, shown in Figure 6.1, where the cloud is in a remote area that the network cannot reach [10]. We have a set of network elements (NE) and a set of edge nodes (ED). Once the requested model has been trained at an edge node, that node passes the model to the network element [11]. If the trained model is not present on an edge node, the request is forwarded to the nearest node; this step is repeated as many times as needed until the trained model is found and sent to the NEs, as sketched after Figure 6.1.


Figure 6.1 Cloud–MEC network architecture [2].
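The relay behavior described above can be sketched as a simple search over edge nodes. The node identifiers, distances, and model store below are hypothetical and only illustrate the idea of forwarding a request to the nearest node that already holds the trained model.

```python
# Hedged sketch: a network element asks an edge node for a trained model;
# if that node does not hold it, the request is forwarded to the nearest
# neighbouring edge node until a copy is found. All names are illustrative.
def fetch_model(node, model_id, nodes, visited=None):
    """nodes: dict mapping node id -> {'models': set, 'neighbors': [(dist, id), ...]}."""
    visited = visited or set()
    visited.add(node)
    if model_id in nodes[node]["models"]:
        return node                                   # trained model found here
    for _, nxt in sorted(nodes[node]["neighbors"]):   # try the nearest node first
        if nxt not in visited:
            found = fetch_model(nxt, model_id, nodes, visited)
            if found is not None:
                return found
    return None

edge_nodes = {
    "ED1": {"models": set(),       "neighbors": [(1, "ED2"), (3, "ED3")]},
    "ED2": {"models": {"traffic"}, "neighbors": [(1, "ED1")]},
    "ED3": {"models": set(),       "neighbors": [(3, "ED1")]},
}
print(fetch_model("ED1", "traffic", edge_nodes))      # -> ED2 serves the NE's request
```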

The machine learning algorithms used in federated learning are not restricted to neural networks but include other methods, such as random forests. A federated learning system comprises a parameter server and many edge nodes [12]. The parameter server is responsible for gathering the gradients uploaded by each edge node, refreshing the parameters of the model according to the optimization algorithm, and maintaining the global parameters. The edge nodes learn from their sensitive data independently and locally. After every cycle, nodes upload gradients to the parameter server, and the server aggregates them and refreshes the global parameters [12]. The edge nodes then fetch the refreshed global parameters from the parameter server, overwrite their old local parameters, and continue to the following cycle. During the entire learning process, nodes communicate only with the parameter server. Learning nodes cannot acquire any information about the remaining nodes, except for the global parameters that are maintained, which ensures the confidentiality of the private data. The federated learning framework for edge network computing, which builds on the conventional federated learning structure, is shown in Figure 6.2. This framework has two layers: the parameter server layer and the edge node layer. The self-adaptive threshold gradient compression module sits at the edge node layer, and the asynchronous federated learning module spans the two layers. A minimal sketch of this training loop appears after Figure 6.2.


Figure 6.2 Privacy-protected federated learning framework for edge network computing [1].
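To make the loop above concrete, here is a minimal sketch of a parameter-server round in Python. It assumes NumPy and substitutes a toy least-squares gradient for a real DNN; the names (`local_gradient`, `NUM_ROUNDS`, and so on) are illustrative and not taken from the cited framework.

```python
import numpy as np

NUM_NODES, NUM_ROUNDS, DIM, LR = 5, 10, 20, 0.1

def local_gradient(global_params, local_data):
    """Each edge node computes a gradient on its private local data;
    a toy least-squares gradient stands in for a real DNN here."""
    X, y = local_data
    return X.T @ (X @ global_params - y) / len(y)

# Private, possibly non-IID data never leaves the edge nodes.
rng = np.random.default_rng(0)
node_data = [(rng.normal(size=(50, DIM)), rng.normal(size=50))
             for _ in range(NUM_NODES)]

global_params = np.zeros(DIM)
for rnd in range(NUM_ROUNDS):
    # 1. Nodes pull the current global parameters and compute gradients locally.
    grads = [local_gradient(global_params, d) for d in node_data]
    # 2. The parameter server aggregates the uploaded gradients ...
    avg_grad = np.mean(grads, axis=0)
    # 3. ... refreshes the global parameters, which nodes fetch in the next cycle.
    global_params -= LR * avg_grad
```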

6.1.1.2 Cloud Edge Computing

Edge node application services reduce the amount of data that must be transferred, the resulting traffic, and the distance that data must travel, which lowers latency and decreases transmission costs. Computation offloading for real-time applications, such as facial recognition, showed considerable improvements, as demonstrated in early studies. Further research showed that using such machines to create cloudlets close to mobile clients, offering services usually found in the cloud, improves execution time when a portion of the tasks is offloaded to the edge node. On the other hand, offloading every task may cause slowdowns due to transfer times between devices and nodes, so an optimal configuration can be characterized depending on the workload [13].

Another use of this architecture is cloud gaming, where some parts of a game run in the cloud while the rendered video is streamed to lightweight clients running on devices such as cell phones and VR glasses; this kind of streaming is also known as pixel streaming. Other notable applications include connected vehicles, self-driving vehicles, smart cities, automated industry, and home automation frameworks. The pervasiveness of IoT devices and the increasing sophistication of cyber-attacks call for improving existing cyber-attack detection mechanisms. Recently, deep learning (DL) has been broadly successful in cyber-attack detection [13]. Combined with FL, cyber-attack detection models can be learned cooperatively while maintaining client privacy. Given the computation and storage capacity limitations of edge servers, some computationally intensive tasks of end devices must be offloaded to the remote cloud server for computation. Moreover, frequently requested files or services should be placed on edge servers for quicker retrieval, i.e., clients do not need to communicate with the remote cloud when they need to access these files or services again. Accordingly, an optimal caching and offloading scheme can be cooperatively learned and optimized with FL.

6.1.2 Federated Learning in Edge Computing

Federated learning is a technique similar to machine learning, and one of its main advantages is security and privacy, which is the principal reason for adopting it. The technique trains an algorithm across different decentralized devices containing local data without exchanging that local data with others [14, 15]. Edge computing is an open IT architecture that distributes the functions of mobile computing and Internet of Things (IoT) technologies and decentralizes processing power. In edge computing, devices process data themselves, on local computers, or on nearby servers, for example capturing traffic signals and computing locally with reduced latency, faster processing, and optimized bandwidth [14]. These analyses are done by the devices themselves without transmitting the data to data centers. In cloud computing, by contrast, captured traffic-signal images need to be transmitted to the data center, which may conflict with data privacy policies.

Federated learning is a model similar to machine learning in which the model is learned across several distributed nodes using locally stored data. It gives better data privacy and security because the locally stored data and the trained information are not sent to the central (principal) server. Emerging technologies such as self-driving cars, VR, and AR require new architectures for data processing [16]. For these cases cloud computing is unsuitable due to the variances in latency and bandwidth [17]. The new model of computation termed edge computing is suitable because of its low latency and high bandwidth. In remote-control scenarios, cloud computing is inferior to edge computing, because data processing needs to be done in a timely manner.

Applications of federated learning in edge computing are explained as follows.

6.1.2.1 Vehicular Edge Computing

Federated learning is a well-known machine learning approach in which clients are enabled to train local deep neural network (DNN) models using their local data, and these models are combined at the central server to form a global DNN model [14]. The aim of vehicular edge computing (VEC) is to exploit the communication and computation resources at the edge of vehicular networks. In VEC, federated learning has the capability to satisfy the growing needs of AI applications in intelligent connected vehicles (ICV). Image classification is one of the most important AI applications in VEC. We propose a selective aggregation scheme in which the "fine" local DNN models are selected and then sent to the central server by evaluating the local image quality and the computation capability. To execute the model selection, the central server needs to be aware of the image quality and computation capability of the vehicular clients, whose privacy is strictly maintained under the federated learning setting. To overcome this, a two-dimensional contract theory is used as a distributed framework to model the interplay between the central (principal) server and the vehicular clients [14]. The stated problem is then transformed into a tractable problem by sequentially relaxing and simplifying the constraints, and it is finally solved using a greedy algorithm. Using the MNIST and BelgiumTSC datasets, selective model aggregation outperforms the original Federated Averaging (FedAvg) method in terms of efficiency and accuracy. Compared to the baseline model, our approach achieves higher utility at the central server. A minimal sketch of such selective aggregation follows.
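The following is a hedged sketch of what selective aggregation could look like, assuming each vehicular client reports an image-quality score and a computation-capability score alongside its locally trained weights. The thresholds and field names are illustrative and do not reproduce the contract-theoretic design of [14].

```python
import numpy as np

def selective_aggregate(client_updates, quality_thr=0.6, compute_thr=0.5):
    """client_updates: list of dicts with 'weights', 'img_quality', 'compute_cap'.
    Only 'fine' local models pass the selection and are averaged."""
    selected = [u for u in client_updates
                if u["img_quality"] >= quality_thr and u["compute_cap"] >= compute_thr]
    if not selected:                      # fall back to averaging everyone if nobody qualifies
        selected = client_updates
    return np.mean([u["weights"] for u in selected], axis=0)

# Toy example with 8 vehicular clients reporting random scores.
rng = np.random.default_rng(1)
updates = [{"weights": rng.normal(size=10),
            "img_quality": rng.random(),
            "compute_cap": rng.random()} for _ in range(8)]
global_weights = selective_aggregate(updates)
```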

6.1.2.2 Intelligent Recommendation

Intelligent recommendation is a useful feature in mobile and desktop applications that predicts user choices so that users can easily access and use them. Compared with conventional AI approaches, edge federated learning can efficiently train adaptable models for recommendation tasks, because edge nodes are located in suitable regions and share similar obligations for performance and cost reasons. For emoji prediction, the evaluation shows that the model on every node has good overall performance, because the models are adapted to the type of language and typical patterns of a particular region [16]. It has also been shown that a browser-suggestion model trained with federated learning can help users quickly find the website they want by entering fewer characters. This work can be extended in edge federated learning systems to offer different users highly customized models by exploiting user similarities without violating user privacy and security [16].

6.1.3 Federated Learning in IoT (Internet of Things)

The core idea behind federated learning is decentralized learning, where the user information is never sent to the central server. Data is usually created by edge devices such as smartphones or IoT sensors. Federated learning is a solution that permits on-device machine learning without transferring the user's private data to a central cloud. For the Internet of Things (IoT), edge computing, together with cloud, fog, and dew computing, is essential because it provides aggregation and processing for sensor networks and individual devices tied to the physical world. IoT is a network of well-connected devices providing a wealth of data [3, 18]; however, it can also be a security nightmare. Federated learning helps achieve personalization as well as enhance the performance of devices in IoT applications, and it trains models over IoT, big data, and multimedia communication frameworks. Federated learning is a possible way to solve the problem of data islands, break data barriers, and protect data security and privacy, particularly in the context of IoT and big data [19]. Distributed IoT and big-data users need to collaboratively train a classification or regression model to obtain good prediction results without compromising privacy. Unlike privacy-preserving outsourced training, instead of submitting data to the centralized cloud server, users train on their data locally in place. The federation center is only responsible for aggregating the gradient information (or model parameters) uploaded by users and distributing the global training model.

Applications of Federated Learning in IoT are explained as follows.

6.1.3.1 Federated Learning for Wireless Edge Intelligence

The rapidly increasing number of IoT devices is generating vast quantities of data; however, public concern over data privacy means that users are apprehensive about sending their data to a central server. The authors modified the behavior of the edge infrastructure that software-defined networks provide to collate IoT data, so that federated learning can be performed without uploading data to the server. FedAvg is a federated learning algorithm that has been the subject of much study; nevertheless, it suffers from a large number of rounds to convergence with non-independent identically distributed (non-IID) client datasets and from high communication costs. They proposed adapting FedAvg to reduce the number of rounds to convergence, together with novel compression strategies, producing Communication-Efficient FedAvg (CE-FedAvg). They performed extensive experiments with client data and varying numbers of clients, client participation rates, and compression rates. CE-FedAvg can converge to a target accuracy in up to 6x fewer rounds than similarly compressed FedAvg while uploading up to 3x less data, and it is also robust to aggressive compression. Experiments on an edge-computing-like testbed using Raspberry Pi clients show that CE-FedAvg is able to reach a target accuracy in up to 1.7x less real time than FedAvg [3, 22, 23]. A sketch of the compression idea follows.
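As one illustration of how client updates can be compressed before upload, the sketch below applies top-k sparsification to an update vector. This shows the general compression idea rather than the exact scheme used in CE-FedAvg; all names are illustrative.

```python
import numpy as np

def sparsify(update, keep_ratio=0.1):
    """Top-k sparsification: keep only the largest-magnitude entries,
    so only (index, value) pairs need to be transmitted."""
    k = max(1, int(len(update) * keep_ratio))
    idx = np.argsort(np.abs(update))[-k:]
    return idx, update[idx]

def desparsify(idx, values, dim):
    """Server side: rebuild the sparse update in the full parameter space."""
    full = np.zeros(dim)
    full[idx] = values
    return full

update = np.random.default_rng(2).normal(size=1000)
idx, vals = sparsify(update)                       # client side: roughly 10x less traffic
restored = desparsify(idx, vals, len(update))      # server side: approximate update
```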

6.1.3.2 Federated Learning for Privacy Protected Information

The rapid increase in the volume of data generated by connected devices in industrial Internet of Things paradigms opens new possibilities for enhancing the quality of service of emerging applications through data sharing. However, security and privacy concerns are major obstacles for data providers to share their data over wireless networks, since leakage of private data can cause severe problems for the providers that go beyond financial loss. The authors first design a blockchain-empowered secure data sharing architecture for distributed multiple parties. They then formulate the data sharing problem as a machine learning (ML) problem by incorporating privacy-preserving federated learning [19]. Privacy is maintained by sharing the data model rather than revealing the actual data. Finally, federated learning is integrated into the consensus process of a permissioned blockchain, so that the computing work for consensus can also be used for federated training. Numerical results derived from real-world datasets show that the proposed data sharing scheme achieves good accuracy, high efficiency, and enhanced security.

6.1.4 Federated Learning in Medical Computing Field

Federated learning is an AI strategy that trains an algorithm across multiple servers while keeping the data samples in place without exchanging them. This approach stands in contrast to centralized machine learning, where the datasets reside on one server. Machine learning models can thus be built without sharing data [4], which helps with critical issues such as data privacy, data security, and data access rights. Different industries across the world use this approach. Several investigations note the capability of deep learning to identify complex patterns leading to diagnostic detection, given sufficiently large datasets; obtaining such datasets for training is a significant challenge in medicine, and they are rarely found within individual institutions. Furthermore, multi-institutional collaborations based on centrally shared patient data face security and other difficulties. Federated learning enables privacy-preserving multi-institutional collaborative model learning over all accessible data without sharing data among the organizations, by distributing the work to the data owners and aggregating their results. Clinical adoption of federated learning is expected to produce models trained on datasets of unprecedented size and thus to have a catalytic impact on personalized medicine.

AI has emerged as a promising approach for building precise statistical models from the clinical data gathered in huge volumes by modern healthcare systems. Existing clinical data is not fully exploited by ML because data ownership and privacy concerns limit access to it. Without access to sufficient data, however, ML will be kept from reaching its full potential and, ultimately, from moving from research into clinical practice. To properly consider the key factors contributing to this problem, it is worth investigating how federated learning may provide a solution for the future of digital health, along with the features and difficulties that must be addressed. For instance, training an AI-based tumor detector requires an enormous database covering the full range of possible anatomies, pathologies, and input data types. Data like this is difficult to acquire because health data is highly sensitive and its use is tightly regulated. Even if data anonymization could bypass these constraints, it is now well understood that removing metadata, for example the patient's name or date of birth, is frequently insufficient to preserve privacy; it is, for instance, possible to reconstruct a patient's face from computed tomography and magnetic resonance imaging data. Another reason why data sharing is not common in medical care is that collecting, curating, and maintaining a high-quality dataset takes considerable time, effort, and cost. Such datasets may therefore have significant business value, making it less likely that they will be openly shared. Instead, data collectors often retain fine-grained control over the data they have gathered.

Applications of federated learning in medical computing field are explained as follows.

6.1.4.1 Federated Learning in Medical Healthcare

Federated learning holds extraordinary promise for healthcare data analysis, for example, building a model for predicting hospital readmission risk from patient electronic health records (EHRs), or consumer patient-facing applications such as screening for atrial fibrillation with electrocardiograms captured by smart devices. The sensitive patient data can remain either within local organizations or with individual consumers, never leaving them during the federated learning process, which adequately protects patient privacy. This work aims to survey the design of federated learning, examine its overall performance and difficulties, and envision its applications in medical services [19]. EHRs have emerged as a critical source of real-world healthcare data used for a combination of important biomedical research and machine learning research.

6.1.4.2 Data Privacy in Healthcare

Nowadays, artificial intelligence faces many challenges. Two of the major ones are that data exists in isolated silos in most industries [5] and that data privacy and security need to be strengthened. To solve these problems, a secure federated learning framework has been proposed. Google introduced federated learning in 2016; the framework described here extends it to cover horizontal federated learning, vertical federated learning, and federated transfer learning [5]. A comprehensive survey of the applications of these federated learning frameworks is provided, and, in addition, a data network among different organizations is proposed to obtain an effective solution based on a federated mechanism that allows knowledge to be shared without compromising user privacy. A small sketch of the data partitions behind horizontal and vertical federated learning follows.
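To illustrate the difference between these settings, the toy partitioning below shows how a hypothetical patient table would be split in horizontal versus vertical federated learning; all column names and values are made up.

```python
import numpy as np

rng = np.random.default_rng(0)
columns = ["age", "blood_pressure", "glucose", "insurance_claims"]
table = rng.random((8, len(columns)))   # 8 patients x 4 features (never centralized in practice)

# Horizontal FL: parties hold different patients but the same feature space.
hospital_a = table[:4, :]               # patients 0-3, all features
hospital_b = table[4:, :]               # patients 4-7, all features

# Vertical FL: parties hold the same patients but different features.
hospital = table[:, :3]                 # clinical features for all patients
insurer = table[:, 3:]                  # financial features for the same patients

# Federated transfer learning covers the case where patients and features
# overlap only partially; in every setting, only model updates (not rows or
# columns of the table) cross party boundaries.
```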

6.1.5 Federated Learning in Blockchain

Blockchain is an open technology in which a blockchain is a growing list of records (blocks) that are linked using cryptography [7]. Every block contains a cryptographic hash of the previous block, a timestamp, and transaction data (represented as a Merkle tree). By design, a blockchain is resistant to modification of its data: once recorded, the information in any given block cannot be altered retroactively without altering all subsequent blocks. In each round of federated learning, a user's update is added to a buffer together with other information about the update process [21]. A committee consisting of several members responsible for verifying updates conducts the consensus. If more than half of the members confirm that an update meets all the criteria, it is added to the blockchain.

Committee consensus:

The following conditions must be verified before a block is added to the blockchain:

  1. Whether the update of local parameters is qualified for aggregation.
  2. Whether the generated hash of the block is valid and satisfies the difficulty criteria.
  3. Whether the previous hash stored in the block matches the hash of the latest block in the blockchain [7].

Committee consensus allows a local model update to be validated before it is further aggregated into the global model; a minimal sketch of these checks is given below. End-point data corruption, with varying degrees of noise and occlusion added, was employed to evaluate the adaptability and robustness of the scheme.
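The sketch below implements the three committee checks listed above, using SHA-256 as a stand-in hash and a simple leading-zeros difficulty rule. The block layout and helper names are assumptions for illustration, not the exact protocol of [7].

```python
import hashlib
import json

DIFFICULTY = 3  # required number of leading hex zeros in the block hash (illustrative)

def block_hash(block):
    """Deterministic hash over the block's content fields."""
    payload = json.dumps({k: block[k] for k in ("prev_hash", "update", "nonce")},
                         sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

def committee_member_verifies(block, chain, update_is_qualified):
    ok_update = update_is_qualified(block["update"])            # check 1: update qualified
    ok_hash = block_hash(block).startswith("0" * DIFFICULTY)    # check 2: hash meets difficulty
    ok_link = block["prev_hash"] == chain[-1]["hash"]           # check 3: links to latest block
    return ok_update and ok_hash and ok_link

def committee_accepts(block, chain, members, update_is_qualified):
    """The block is accepted if more than half of the committee confirms it."""
    votes = sum(committee_member_verifies(block, chain, update_is_qualified)
                for _ in members)
    return votes > len(members) / 2
```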

Applications of federated learning in Blockchain are as follows.

6.1.5.1 Blockchain-Based Federated Learning Against End-Point Adversarial Data

Researchers considered scenarios where an adversarial user in the scheme deliberately trains the local model with corrupted training data. The scheme consists of three users who train their local convolutional neural networks to classify binary images of handwritten digits from the MNIST dataset. The training set, with a total of 60,000 images, was divided randomly into three datasets, one serving as the local dataset of each user.

Salt and pepper noise: Adding random noise to the local data.

Occlusion: Using a randomly positioned circle with various diameters.
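The two corruptions can be sketched as follows for a grayscale image stored as a NumPy array in [0, 1]; the noise amount and circle radius are illustrative defaults, not the settings used in the reported experiments.

```python
import numpy as np

def salt_and_pepper(img, amount=0.05, rng=None):
    """Flip a random fraction of pixels to pure black (pepper) or white (salt)."""
    rng = np.random.default_rng() if rng is None else rng
    noisy = img.copy()
    mask = rng.random(img.shape) < amount
    noisy[mask] = rng.integers(0, 2, size=mask.sum())   # 0 = pepper, 1 = salt
    return noisy

def occlude(img, radius=5, rng=None):
    """Draw a randomly positioned filled circle over the image."""
    rng = np.random.default_rng() if rng is None else rng
    h, w = img.shape
    cy, cx = rng.integers(0, h), rng.integers(0, w)
    yy, xx = np.ogrid[:h, :w]
    occluded = img.copy()
    occluded[(yy - cy) ** 2 + (xx - cx) ** 2 <= radius ** 2] = 0.0
    return occluded

# Example: corrupt a random 28x28 "digit" with both noise and occlusion.
corrupted = occlude(salt_and_pepper(np.random.default_rng(1).random((28, 28))))
```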

6.2 Advantages of Federated Learning

Data privacy and data secrecy: the data remains distributed across many devices rather than being collected in one place, making it far harder for an attacker to compromise it all at once. Private data stays local; only model parameters are shared, and these parameters can be encoded before being exchanged between rounds of computation. Parameter aggregation happens at the central server to generate the global model, and the resulting models can be more personalized and better protected. Federated learning is used in developing self-driving cars, healthcare, IoT, cloud computing, edge computing, and many more domains [7]. A minimal sketch of one way shared parameters can be protected appears below.
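As one example of how shared parameters can be protected, the sketch below uses pairwise additive masking, where masks agreed between client pairs cancel out when the server sums the updates. This illustrates the general secure-aggregation idea and is not a protocol described in this chapter; all names and values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)
DIM, clients = 4, ["A", "B", "C"]
true_updates = {c: rng.normal(size=DIM) for c in clients}

# Each ordered pair (i before j) shares a random mask; i adds it, j subtracts it.
masks = {(i, j): rng.normal(size=DIM)
         for a, i in enumerate(clients) for j in clients[a + 1:]}

masked = {}
for c in clients:
    m = true_updates[c].copy()
    for (i, j), mask in masks.items():
        if c == i:
            m += mask
        elif c == j:
            m -= mask
    masked[c] = m                       # only this masked vector is uploaded

# The server's sum of masked updates equals the sum of true updates,
# yet no single masked update reveals the client's true parameters.
assert np.allclose(sum(masked.values()), sum(true_updates.values()))
```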

6.3 Conclusion

In this chapter, we have focused on various applications of federated learning in computing technologies. In each section, we have explained multiple applications of federated learning in various fields. Building on this background, models based on federated learning can be implemented to maintain privacy and secrecy. Future work can investigate how to make the most efficient use of federated learning.

References

  1. Lu, X., Liao, Y., Lio, P., Hui, P., Privacy-preserving asynchronous federated learning mechanism for edge network computing. IEEE Access, 8, 48970–48981, 2020.
  2. Fantacci, R. and Picano, B., Federated learning framework for mobile edge computing networks. CAAI Trans. Intell. Technol., 5, 1, 15–21, 2020.
  3. Nguyen, T.D., Marchal, S., Miettinen, M., Fereidooni, H., Asokan, N., Sadeghi, A.-R., DÏoT: A federated self-learning anomaly detection system for IoT, in: 39th IEEE International Conference on Distributed Computing Systems (ICDCS), 2019.
  4. Rieke, N., Hancox, J., Li, W., Milletari, F., Roth, H.R., Albarqouni, S., Cardoso, M.J., The future of digital health with federated learning. NPJ Digital Med., 3, 1, 1–7, 2020.
  5. Choudhury, O., Gkoulalas-Divanis, A., Salonidis, T., Sylla, I., Park, Y., Hsu, G., Das, A., Differential privacy-enabled federated learning for sensitive health data, arXiv preprint arXiv:1910.02578, 2019.
  6. Kim, H., Park, J., Bennis, M., Kim, S.-L., Blockchained on-device federated learning. IEEE Commun. Lett., 24, 6, 1279–1283, 2019.
  7. Sun, Y., Esaki, H., Ochiai, H., Blockchain-based federated learning against end-point adversarial data corruption, in: 19th IEEE International Conference on Machine Learning and Applications (ICMLA), pp. 729–734, 2020.
  8. He, K., Zhang, X., Ren, S., Sun, J., Delving deep into rectifiers: Surpassing human-level performance on ImageNet classification, in: IEEE International Conference on Computer Vision (ICCV), Dec. 2015.
  9. Mishra, S.K., Sahoo, B., Parida, P.P., Load balancing in cloud computing: A big picture. J. King Saud Univ.-Comput. Inf. Sci., 32, 2, 149–158, 2020.
  10. Mishra, S.K., Puthal, D., Sahoo, B., Jena, S.K., Obaidat, M.S., An adaptive task allocation technique for green cloud computing. J. Supercomput., 74, 1, 370–385, 2018.
  11. Wu, Q., He, K., Chen, X., Personalized federated learning for intelligent IoT applications: A cloud-edge based framework. IEEE Open J. Comput. Soc., 1, 34–44, 2020.
  12. Gour, L. and Waoo, A., Implementing fault resilient strategies in cloud computing via federated learning approach. J. Innov. Appl. Res., 4, 1, 2021.
  13. Liu, L., Zhang, J., Song, S.H., Letaief, K.B., Client-edge-cloud hierarchical federated learning, in: IEEE International Conference on Communications (ICC), 2020.
  14. Ye, D., Yu, R., Pan, M., Han, Z., Federated learning in vehicular edge computing: A selective model aggregation approach. IEEE Access, 8, 23920–23935, 2020.
  15. Mishra, S.K., Puthal, D., Sahoo, B., Sharma, S., Xue, Z., Zomaya, A.Y., Energy-efficient deployment of edge data centers for mobile clouds in sustainable IoT. IEEE Access, 6, 56587–56597, 2018.
  16. Vyas, J., Das, D., Das, S.K., Vehicular edge computing based driver recommendation system using federated learning, in: 2020 IEEE 17th International Conference on Mobile Ad Hoc and Sensor Systems (MASS), pp. 675–683, 2020.
  17. Mishra, S.K., Khan, M.A., Sahoo, S., Sahoo, B., Allocation of energy-efficient task in cloud using DVFS. Int. J. Comput. Sci. Eng., 18, 2, 154–163, 2019.
  18. Mishra, S.K., Sahoo, S., Sahoo, B., Jena, S.K., Energy-efficient service allocation techniques in cloud: A survey. IETE Tech. Rev., 37, 4, 339–352, 2020.
  19. Qiu, T., Wang, X., Chen, C., Atiquzzaman, M., Liu, L., TMED: A spider web-like transmission mechanism for emergency data in vehicular ad hoc networks. IEEE Trans. Veh. Technol., 67, 9, 8682–8694, Sep. 2018.
  20. Xu, J., Glicksberg, B.S., Su, C., Walker, P., Bian, J., Wang, F., Federated learning for healthcare informatics. J. Healthc. Inform. Res., 5, 1, 1–19, 2021.
  21. Pokhrel, S.R. and Choi, J., Federated learning with blockchain for autonomous vehicles: Analysis and design challenges. IEEE Trans. Commun., 68, 8, 4734–4746, 2020.
  22. Mishra, S.K., Puthal, D., Rodrigues, J.J., Sahoo, B., Dutkiewicz, E., Sustainable service allocation using a metaheuristic technique in a fog server for industrial applications. IEEE Trans. Industr. Inform., 14, 10, 4497–4506, 2018.
  23. Mishra, S.K., Puthal, D., Sahoo, B., Jayaraman, P.P., Jun, S., Zomaya, A.Y., Ranjan, R., Energy-efficient VM-placement in cloud data center. Sustain. Comput.: Inform. Syst., 20, 48–55, 2018.

Note

  1. *Corresponding author: [email protected]