Some FL experiment (learning) parameters not propagated from config file #38
Comments
Hi @AbeleMM, thank you for the report; indeed, this should not be the case. I have created a branch `38-loading-configuration-parameters`, which you can pull/use to resolve the issue. Note, however, that there may still be some issues, as I am busy writing a test suite for configuration object parsing.
Thanks for looking into it!
@AbeleMM It should be fully resolved now. In addition, losses are now properly parsed, and a typo that broke instantiation has been fixed. I have also added an (admittedly somewhat hacky) test case for both data_parallel and federated learning experiments. Note that the Jinja templates require some changes as well.
Got it. Thanks for the update!
Bug Report
Current Behavior
The value of some learning parameters (e.g., clients per round and epochs) provided in the config of an experiment is seemingly not propagated correctly to the orchestrator (and, subsequently, the federator). It appears that the default values from `fltk/util/learning_config.py`'s `FedLearningConfig` (e.g. `clients_per_round: int = 2` and `epochs: int = 1`) are always used instead. The issue might also affect other parameters, although I have not experimented with all of them.
Input Code
Given `configs/federated_tasks/example_arrival_config.json`, run:
helm install flearner charts/orchestrator --namespace test -f charts/fltk-values-abel.yaml --set-file orchestrator.experiment=./configs/federated_tasks/example_arrival_config.json,orchestrator.configuration=./configs/example_cloud_experiment.json
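One plausible cause of defaults winning is a loader that only applies keys whose names exactly match the dataclass fields. The following minimal sketch uses hypothetical names (a simplified stand-in for `FedLearningConfig`, not FLTK's actual parser) to show how a key-convention mismatch silently falls back to the default:

```python
from dataclasses import dataclass, fields

# Hypothetical, simplified stand-in for FedLearningConfig
# (fltk/util/learning_config.py); not FLTK's actual loader.
@dataclass
class FedLearningConfig:
    clients_per_round: int = 2
    epochs: int = 1

def from_dict(raw: dict) -> FedLearningConfig:
    # Keys that do not exactly match a field name are silently dropped,
    # so the dataclass default is used instead of the configured value.
    known = {f.name for f in fields(FedLearningConfig)}
    return FedLearningConfig(**{k: v for k, v in raw.items() if k in known})

# A config using a different key convention loses clients_per_round:
cfg = from_dict({"clientsPerRound": 10, "epochs": 5})
print(cfg.clients_per_round)  # 2 (default), not the configured 10
print(cfg.epochs)             # 5
```

Whether the real bug is a key mismatch or something else in the parsing path, the observable symptom is the same: the dataclass defaults survive initialization.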
Expected behavior/code
The values of the given config should be correctly reflected within the `config_dict` of `fltk/core/distributed/orchestrator.py` and `self.config` of `fltk/core/federator.py` after their initialization.
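For what it is worth, that expectation could be checked with a small helper along these lines; the helper name and key list are hypothetical, and the real `config_dict` may nest these values differently:

```python
import json

# Illustrative check (hypothetical helper name): after the orchestrator is
# initialised, its parsed configuration should mirror the experiment file.
def assert_config_propagated(config_dict: dict, experiment_path: str) -> None:
    with open(experiment_path) as f:
        expected = json.load(f)
    for key in ("clients_per_round", "epochs"):
        # Flag any value that silently fell back to a default.
        if key in expected and config_dict.get(key) != expected[key]:
            raise AssertionError(
                f"{key}: got {config_dict.get(key)!r}, expected {expected[key]!r}"
            )
```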