fl-text-models

In this project we pretrain federated text models for next word prediction. See our white paper for details on related work, experiment designs, and results, or check out the video below for an overview of our research.

Video

Data

The main dataset used for these experiments is hosted by Kaggle and made available through the tff.simulation.datasets module in the TensorFlow Federated API. Stack Overflow owns the data and has released it under the CC BY-SA 3.0 license.
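Concretely, next word prediction turns each Stack Overflow post into (context, next word) training pairs. The sketch below is a minimal, framework-free illustration of that framing; the repository's actual preprocessing (vocabulary construction, padding, batching) is handled through TensorFlow Federated utilities and may differ in detail.

```python
def next_word_examples(text):
    """Split a post into (context tokens, next word) training pairs."""
    tokens = text.lower().split()
    return [(tokens[:i], tokens[i]) for i in range(1, len(tokens))]

# Each prefix of the post becomes the context for predicting the next token.
pairs = next_word_examples("How do I merge two dictionaries")
for context, target in pairs:
    print(" ".join(context), "->", target)
```

A model trained on such pairs learns to score candidate next words given the context, which is the task measured by the accuracy metrics in our experiments.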

Environment

Experiments were conducted in a Python 3.7 conda environment with the packages listed in requirements.txt, on both GPU and CPU VMs running Ubuntu 16.04.

Running Experiments

To conduct experiments with our code:

  • Clone the repository and replicate our conda environment.
  • Configure the params.json file to set a simulated client data sampling strategy, pretraining approach, and federated model architecture.
  • Execute federated_nwp.py to train a federated text model on Stack Overflow for next word prediction according to the desired parameters. This script applies our methods described in final_research_report/README.md and is based on work from the research section of the TensorFlow Federated API.
  • Model weights, train and validation statistics, plots, and client sample metadata are automatically stored in the experiment_runs directory. Run experiment_runs/training_progress.py to summarize model performance during or after training.
  • See the notebooks directory for additional analysis, experiments, and examples of loading, testing, and comparing trained models.
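For orientation, a hypothetical params.json along the lines described above might look like the following. The key names and values here are illustrative only; consult the params.json in the repository for the actual schema.

```json
{
  "client_sampling": "uniform",
  "clients_per_round": 10,
  "num_rounds": 500,
  "pretraining": "centralized",
  "model": {
    "embedding_dim": 100,
    "rnn_units": 256
  }
}
```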
Example Setup and Execution
git clone https://github.com/federated-learning-experiments/fl-text-models.git
cd fl-text-models
conda create --verbose --yes --name tff python=3.7
conda activate tff
pip install -r requirements.txt
python federated_nwp.py

Due to the matplotlib dependency, you may need to apply the fix recommended here if Python is not recognized as a framework (a macOS-specific error).

References

This project draws mainly from the research cited in our white paper, and other sources are referenced throughout this repository, particularly alongside code snippets. Special thanks to Keith Rush and Peter Kairouz from Google for their guidance throughout the course of this project.

Contact

About

Federated learning with text DNNs for DATA 591 at University of Washington.
