#938 Document IgnoreOutputs exception
Pull request opened by
fg91
When trying to control which of the distributed workers of a PyTorch task returns its outputs to subsequent tasks in the workflow (in case the outputs differ between workers), I couldn't find the information in the documentation and asked on Slack.
In this PR I document the solution I was pointed to on Slack in the mnist examples for distributed pytorch, tf, and mpi, since that is where I first looked for an answer.
I'm not thrilled to propose repeating the same sentence three times, but I cannot find a single overarching documentation page for distributed training, and as a user of the respective frameworks the framework-specific examples are where I would look first.
Do you have a better suggestion for where to document this?
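For reference, the pattern being documented looks roughly like this: in a distributed training task, every worker except rank 0 raises flytekit's `IgnoreOutputs` exception, so only rank 0's return value is handed to downstream tasks. The sketch below simulates that behavior standalone; the local `IgnoreOutputs` class is a stand-in for the real flytekit exception, and the training function, rank loop, and artifact path are all hypothetical illustrations, not code from the PR.

```python
# Minimal sketch of the IgnoreOutputs pattern for distributed tasks.
# In real flytekit code you would import the exception from flytekit
# (the exact import path may vary by flytekit version) instead of
# defining this stand-in.

class IgnoreOutputs(Exception):
    """Stand-in for flytekit's IgnoreOutputs exception."""


def train(rank: int, world_size: int) -> str:
    """Hypothetical per-worker training step in a distributed task."""
    # ... training runs on every worker ...
    model_path = f"/tmp/model-rank{rank}.pt"  # illustrative artifact path
    if rank != 0:
        # Workers other than rank 0 discard their outputs; only rank 0's
        # return value flows to subsequent tasks in the workflow.
        raise IgnoreOutputs(f"rank {rank} skips reporting outputs")
    return model_path


def run_all_ranks(world_size: int) -> list:
    """Simulate which ranks end up reporting outputs."""
    reported = []
    for rank in range(world_size):
        try:
            reported.append(train(rank, world_size))
        except IgnoreOutputs:
            pass  # this worker's outputs are intentionally ignored
    return reported
```

Running the simulation with four ranks shows that only rank 0's output survives, which is the behavior the documentation change explains.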
flyteorg/flytesnacks