Hi is it possible to have a `task` fail but still ...
# ask-the-community
n
Hi, is it possible to have a `task` fail but still continue the `workflow`? Our use case is a `dynamic` workflow that spawns N `tasks`, and it's OK if some small percentage fail. We're currently using a try/except to return an empty result (we could likely also use an `Optional`), but then the user loses visibility, since the failed `task` no longer clearly shows as failed in the console.
j
Are your dynamic tasks terminal nodes in your workflow, or do you need their outputs fed into downstream tasks?
• If they're not terminal, I don't think there is any way to do that; you could try to use `map_task` instead of the dynamic task, since `map_task` has a success-ratio parameter.
• If they are terminal, you can set the workflow failure policy so that the rest of the dynamic tasks run to completion before the workflow is marked as failed.
k
Another option, though experimental, is eager mode
j
Hmm, so with eager mode you can catch the failure and respond accordingly, right? Maybe with a try/except? Would that work?
k
Yup
Again experimental right now
Also map task is right for this use case
f
There is also `@workflow(failure_policy=WorkflowFailurePolicy.FAIL_AFTER_EXECUTABLE_NODES_COMPLETE)`, but I have only ever used it with subworkflows that I wanted to be able to fail while others continue.
n
OK, thank you. We've still been avoiding map_tasks, largely because of the one-input-argument limitation, but with `ArrayNode` map_tasks and `partial`-function support for `map_task`, I think we need to start switching from `dynamic` to `map_task`. I believe both `partial` and `ArrayNode` are still experimental features. Would you recommend one over the other? It seems that `ArrayNode` is the more long-term solution, but it's also the newest, only released in v1.9.
k
Partial is not experimental
Array node is
But we have folks using it
Try dropping in
I think map task will solve your problem
And it's better to know about issues in ArrayNode now; we are going to GA it soon
n
OK thanks! Will try dropping in ArrayNode map_task and see how it performs
Will provide feedback soon