
What’s in the TensorFlow Federated (TFF) box?

Author(s): Bala Priya


Based on Krzysztof Ostrowski’s talk at OpenMined Privacy Conference 2020


Krzysztof Ostrowski is a Research Scientist at Google, where he heads the TensorFlow Federated development team. This blog post is inspired by his talk at the OpenMined Privacy Conference.

Outline

- What is TensorFlow Federated?
- What’s inside the TensorFlow Federated library?
- Design Goals for TFF
- Understanding the characteristics of Federated Computations (FC)
- TFF’s Federated Computation API
- TFF’s Canned and Simulation APIs
- Canned TFF Executors
- Collaboration opportunities for Open Source contributors

What is TensorFlow Federated?

TensorFlow Federated (TFF) is a new development framework for federated computations: computations on data that is born decentralized and stays decentralized. TFF provides a common framework for federated computations in both research and production, and is an open-source project within the TensorFlow ecosystem.

What’s inside the TensorFlow Federated library?

Design Goals for TFF

The TFF library is designed to provide an easy path from research to production. In TFF, code is traced and serialized into a portable representation that is independent of both language and platform.
The design is therefore motivated by the dictum:

Write code once, and deploy it anywhere;
Your code shouldn’t have to change!

Easy path from research to production (Image Source)

Understanding the characteristics of Federated Computations (FC)

Federated computations combine local processing on client devices with collective operations that aggregate results across devices. The running example discussed in the talk is a network of temperature sensors: the server broadcasts a threshold to the client devices, each device checks locally whether its own reading exceeds that threshold, and a federated mean aggregates the per-device results into the fraction of sensors over the threshold, so the raw readings never leave the devices.

Illustration of federated broadcast and local processing on client devices (Image Source)

TFF’s Federated Computation API

The program flow is expressed in a Pythonic way for readability. As stated in an earlier section, all code is traced and serialized at definition time into a platform-independent representation. Here is the code snippet showing the on-device processing and the collective operations for the temperature sensor example discussed above.

# On-device processing: flag each reading that exceeds the threshold
@tff.tf_computation(tf.float32, tf.float32)
def exceeds_threshold_fn(reading, threshold):
  return tf.cast(reading > threshold, tf.float32)

@tff.federated_computation(
    tff.FederatedType(tf.float32, tff.CLIENTS),  # readings live on the client devices
    tff.FederatedType(tf.float32, tff.SERVER))   # the threshold lives on the server
def get_fraction_over_threshold(readings, threshold):
  return tff.federated_mean(            # collective aggregation on the server
      tff.federated_map(                # run the TF logic on every device
          exceeds_threshold_fn,
          [readings, tff.federated_broadcast(threshold)]))  # collective communication
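
Because federated computations behave like ordinary Python callables under TFF’s default local simulation runtime, a quick sanity check can be run with plain Python values. The readings and threshold below are made-up numbers, purely for illustration:

print(get_fraction_over_threshold.type_signature)
# roughly: ({float32}@CLIENTS,float32@SERVER -> float32@SERVER)

print(get_fraction_over_threshold([68.0, 72.5, 75.0, 69.5], 70.0))
# 0.5 -- two of the four simulated client readings exceed the threshold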

TFF’s Canned and Simulation APIs

TFF ships “canned” implementations of common federated learning algorithms, such as Federated Averaging, together with simulation APIs for experimenting with them. A Keras model can be plugged in with only a few lines of code, as the snippet below shows.

# Just plug in your Keras model
train = tff.learning.build_federated_averaging_process(...)
state = train.initialize()
for _ in range(5):
  train_data = ...  # pick a random subset of clients
  state, metrics = train.next(state, train_data)
  print(metrics.loss)
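
For context, here is a minimal sketch of the model plumbing that the snippet above elides; the toy architecture, input_spec, and optimizer are illustrative assumptions rather than details from the talk:

import tensorflow as tf
import tensorflow_federated as tff

def model_fn():
  # Toy Keras model; the layer sizes and input_spec are assumptions for illustration
  keras_model = tf.keras.Sequential([
      tf.keras.layers.Dense(10, activation='softmax', input_shape=(784,)),
  ])
  return tff.learning.from_keras_model(
      keras_model,
      input_spec=(tf.TensorSpec(shape=[None, 784], dtype=tf.float32),
                  tf.TensorSpec(shape=[None, 1], dtype=tf.int32)),
      loss=tf.keras.losses.SparseCategoricalCrossentropy())

train = tff.learning.build_federated_averaging_process(
    model_fn,
    client_optimizer_fn=lambda: tf.keras.optimizers.SGD(learning_rate=0.02))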

Canned TFF Executors

Canned TFF Executors provide support for the most common scenarios. By default, a single-machine, multi-threaded execution stack is spawned in the background, and canned executors are also available for remote and distributed execution. TFF executor building blocks are reusable, stackable modules, each adding an individual capability, and they are designed specifically for extensibility and customizability.
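
As a rough sketch of what switching runtimes looks like, the snippet below assumes a TFF release (circa 0.17) that exposes tff.backends.native.set_local_execution_context and set_remote_execution_context; these symbol names have moved between releases, and the worker addresses are placeholders:

import grpc
import tensorflow_federated as tff

# Default: single-machine, multi-threaded simulation (assumed API name)
tff.backends.native.set_local_execution_context()

# Distributed execution: point the same computations at remote TFF workers
# (assumed API name; the worker addresses are placeholders)
channels = [grpc.insecure_channel('worker-0.example.com:8000'),
            grpc.insecure_channel('worker-1.example.com:8000')]
tff.backends.native.set_remote_execution_context(channels)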

Collaboration opportunities for Open Source contributors

There are opportunities to contribute to the federated algorithms suite, simulation infrastructure and more flexible runtime integrations.

References

[1] https://www.tensorflow.org/federated

[2] Krzysztof Ostrowski, TensorFlow Federated, OpenMined Privacy Conference, 2020

[3] Here’s the link to the blog that I wrote for OpenMined.



