
Federated machine learning

Last updated: Oct 26, 2022
NOT ON THE CURRENT EDITION
This blip is not on the current edition of the Radar. If it was on one of the last few editions, it is likely that it is still relevant. If the blip is older, it might no longer be relevant and our assessment might be different today. Unfortunately, we simply don't have the bandwidth to continuously review blips from previous editions of the Radar.
Oct 2022
Trial: Worth pursuing. It is important to understand how to build up this capability. Enterprises should try this technology on a project that can handle the risk.

We're now seeing client projects that use federated machine learning (ML). Traditionally, ML model training has required data to be placed in a centralized location where the relevant training algorithm can be run. From a privacy point of view, this is problematic, especially when the training data contains sensitive or personally identifiable information; users might be reluctant to share data or local data protection legislation may prevent us from moving data to a central location. Federated ML is a decentralized technique for training on a large and diverse set of data that allows the data to remain remote — for example, on a user's device. Network bandwidth and the computational limitations of devices still present significant technical challenges, but we like the way federated ML leaves users in control of their own personal information.
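As a concrete illustration of the technique, here is a minimal, framework-agnostic sketch of the client-side step: a simple linear model is updated against data that never leaves the device, and only the resulting parameters (plus a sample count) are sent back. The function and variable names are illustrative assumptions, not any particular library's API.

```python
import numpy as np

def local_update(global_weights, local_X, local_y, lr=0.01, epochs=5):
    """Train a simple linear model on data that stays on the device.

    Illustrative sketch: local_X and local_y never leave the device;
    only the updated parameters and a sample count are returned.
    """
    w = global_weights.copy()
    for _ in range(epochs):
        preds = local_X @ w                                    # forward pass on local data
        grad = local_X.T @ (preds - local_y) / len(local_y)    # gradient of the MSE loss
        w -= lr * grad                                         # local gradient step
    return w, len(local_y)
```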

Nov 2019
Assess: Worth exploring with the goal of understanding how it will affect your enterprise.

Model training generally requires collecting data from its source and transporting it to a centralized location where the training algorithm runs. This becomes particularly problematic when the training data contains personally identifiable information. We're encouraged by the emergence of federated learning as a privacy-preserving method for training on a large, diverse set of data relating to individuals. Federated learning techniques allow the data to remain on users' devices, under their control, while still contributing to an aggregate corpus of training data. In one such technique, each user device updates a model independently; the model parameters, rather than the data itself, are then combined into a centralized view. Network bandwidth and device computational limitations present significant technical challenges, but we like the way federated learning leaves users in control of their own personal information.
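The parameter-combining step described above is often implemented as federated averaging. Below is a minimal sketch, under the assumption that each device returns its locally updated weights together with the number of examples it trained on; the server then forms a weighted average without ever seeing the raw data. The function name and weighting scheme are illustrative, not a specific library's API.

```python
import numpy as np

def federated_average(client_updates):
    """Combine client results into a new global model.

    client_updates: list of (weights, num_examples) tuples returned by devices.
    """
    total_examples = sum(n for _, n in client_updates)
    new_global = np.zeros_like(client_updates[0][0])
    for weights, n in client_updates:
        new_global += (n / total_examples) * weights   # weight each device by its local data size
    return new_global

# One training round (illustrative): broadcast the global model, collect local
# updates computed on-device, then average the returned parameters.
# global_weights = federated_average(
#     [local_update(global_weights, X_i, y_i) for (X_i, y_i) in device_datasets]
# )
```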

Published: Nov 20, 2019
