# ask-the-community
Is there a way to allow tasks to fail? I have the following setup:
```python
def base_workflow(config: Config):
    for i in range(1, 11):
        task_a = a(i=i)
        task_b = b(i=i)
        task_c = c(i=i)
        task_a >> task_b >> task_c
```
This all starts nicely 10x in parallel (as desired), but as soon as one of the tasks fails (let's say task_a fails for i=3), they all abort. Is there a Flyte-native way to allow i=3 to fail but let the others complete?
For a workflow you can set a failure policy, to allow remaining tasks to continue.
That sounds great; unfortunately `@dynamic` doesn't have that field, but maybe I can find a way to nest a workflow or something to use it.
Setting it at the top level should propagate downward, so if you wrap your dynamic in a workflow, that should work out.
Works! Thanks a lot @Maarten de Jong 🙌
We also encountered this; it would be nice to add that field to the `@dynamic` decorator as well...
Maybe you know this as well @Maarten de Jong @Felix Ruess, since you're also working with dynamic tasks/workflows: Is there a way to add a prefix/label/name to the tasks, instead of the rather cryptic auto-generated node IDs? That would make it possible to see at first glance in the UI which item a failure occurred on (i.e. which i it failed on?), without having to go into the Inputs.
@Geert I have not investigated this, but you may be able to use something like
```python
a(i=i).with_overrides(node_name=f"a: i={i}")
```
This is demonstrated in the docs about subworkflows (which I realize is a different use case than yours, but it may still work for you).
Thanks @Thomas Blom! Interesting: it seems to pick up the names while the tasks haven't run yet; once they are running or have completed, they switch back to a random identifier:
For now this at least gives us some (temporary) visibility, much appreciated 🙏