Adding new parties and retraining #52
Hi, thanks for using the IBM Federated Learning library!
Yes, that's possible. You can enter
Parties can drop out at any time during training; as long as the quorum is met, the training process will keep going. New parties can also join between rounds.
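The quorum rule described above can be sketched as follows. This is a hypothetical illustration, not the real IBM FL implementation; the parameter name `perc_quorum` and the function `quorum_met` are assumptions for the sake of the example.

```python
# Hypothetical sketch of a quorum check: a training round proceeds as long
# as the fraction of parties that replied meets the configured quorum.

def quorum_met(num_replies: int, num_registered: int, perc_quorum: float) -> bool:
    """Return True if enough parties responded for the round to proceed."""
    return num_registered > 0 and num_replies / num_registered >= perc_quorum

# A round continues even if some parties drop, as long as the quorum holds:
print(quorum_met(7, 10, 0.6))  # 7 of 10 parties replied, 60% quorum -> True
print(quorum_met(5, 10, 0.6))  # only 50% replied -> False
```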
This answer really depends on how the party loads its datasets. If the data handler looks for and loads the local dataset each time the IBM FL local training module uses it, the updated data will be picked up in the next training phase. Let us know if you have further questions.
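The point about the data handler can be sketched as follows. This mimics the *shape* of a data handler that re-reads its local dataset on every call, so data added between training phases is picked up automatically; the class and file names are invented for illustration and do not come from the IBM FL API.

```python
import csv
import os
import tempfile

class ReloadingDataHandler:
    """Sketch of a handler that does NOT cache the dataset at construction."""

    def __init__(self, data_path):
        self.data_path = data_path  # only the path is stored, not the rows

    def get_data(self):
        # Re-read from disk on every call instead of caching in __init__,
        # so rows added between training phases are visible.
        with open(self.data_path, newline="") as f:
            return [row for row in csv.reader(f)]

# Usage: rows written after the handler is created still show up.
path = os.path.join(tempfile.mkdtemp(), "train.csv")
with open(path, "w", newline="") as f:
    csv.writer(f).writerows([["1", "a"]])
handler = ReloadingDataHandler(path)
print(len(handler.get_data()))  # 1 row so far
with open(path, "a", newline="") as f:
    csv.writer(f).writerows([["2", "b"]])
print(len(handler.get_data()))  # 2 rows: new data visible without a new handler
```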
Hi, I see you are using
The logic for IBM FL is that the aggregator will wait until
Hi again! I upgraded to version 1.0.3 and the problem persists. Reading the previous responses, I think I did not explain myself well. What I am trying to say is that I start with 10 parties, register all of them, and then make the aggregator start training. The training finishes successfully. Now, for instance, I make party 7 disconnect,
Hi, can you share the
Hi! I was wondering what IBMFL can do after a training process. When aggregator.start_training() finishes, is it possible to call start_training() again so that the model leverages what was learned before? Between one start_training() call and the next, is it possible to add new parties and remove others? And lastly, between start_training() calls, is it possible to change the data of one party? I mean, maybe that party gets more data during the training phase and wants to run the next training phases with that data.
Regards and thanks in advance.
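The workflow being asked about, train, let parties join or leave, then train again from the previous global model, can be sketched purely schematically. None of the names below (`MockAggregator`, its `parties` set) come from the IBM FL API; only `start_training()` is mentioned in this thread, and its real signature may differ.

```python
# Purely schematic mock of repeated training phases with party churn.
# The global model state is represented here by a running round counter.

class MockAggregator:
    def __init__(self, parties):
        self.parties = set(parties)
        self.global_rounds = 0  # stands in for the persisted global model

    def start_training(self, rounds=3):
        # Each call resumes from the current global state rather than
        # starting from scratch.
        self.global_rounds += rounds
        return self.global_rounds

agg = MockAggregator("party%d" % i for i in range(10))
agg.start_training(rounds=3)          # first training phase
agg.parties.discard("party7")         # a party leaves between phases...
agg.parties.add("party10")            # ...and a new one registers
total = agg.start_training(rounds=3)  # second phase continues from phase 1
print(total)  # 6
```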