# ask-the-community
v
I have a dynamic task that has a few chained tasks (using the >> operator). When running it, I get the following error: `RuntimeExecutionError: max number of system retry attempts [51/50] exhausted. Last known status message: [system] unable to read futures file, maybe corrupted, caused by: [system] Failed to read futures protobuf file., caused by: path/to/futures.pb: [LIMIT_EXCEEDED] limit exceeded. 22mb > 10mb.` Any thoughts on what might be going on here? Thanks!
d
Hey @Visak, Flyte has some preset limits on blobstore file sizes to mitigate performance issues. The default value is 10mb, and the indication is that you're hitting it here. The `futures.pb` file is just the dynamically compiled workflow from the dynamic task. 22mb is extremely large - can you say a little more about what the underlying workflow from the dynamic looks like?
If you want to update the configuration, the value is located at `storage.maxDownloadMBs`; an example is here.
v
Thanks for the info @Dan Rammer (hamersaw). Here is a little more context on what I am trying to do. In my use case, I need to call the same task multiple times sequentially inside a dynamic task. To do this I chain tasks like this; when the input size is large I run into this issue. Do you have a suggestion for how I could do this differently?
from typing import List

from flytekit import dynamic, task


@task
def double_nums(nums: List[int]) -> List[int]:
    return [item * 2 for item in nums]


@dynamic
def process_tasks(inputs: List[List[int]]):
    seq_tasks = []
    for i in range(len(inputs)):
        doubles = double_nums(nums=inputs[i])
        seq_tasks.append(doubles)

    # chain each node to the next so they run one after another
    for i in range(len(inputs) - 1):
        seq_tasks[i] >> seq_tasks[i + 1]
It works for smaller inputs and the behavior is exactly what I want: the task `double_nums` gets executed sequentially instead of all at once (which happens if I don't use the >> operator).
d
Are the sequential tasks actually dependent on each other, or do you just want to run one at a time?
v
Just run one at a time
d
So rather than using a dynamic task here, you could use a map task - these offer a number of efficiencies when evaluating tasks where each instance executes on a single element of a list. How large is the list? It must be 10k+ right? You may have to update the map task configuration to allow for these larger runs, but it will scale further than dynamics.
You'll have to set concurrency on the map task to 1 so that items execute one at a time.
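A minimal sketch of that approach, assuming flytekit's `map_task` with its `concurrency` argument (the workflow name and return type here are illustrative, not from the thread). Unlike the dynamic version, no per-item sub-workflow is compiled, so there is no `futures.pb` to hit the size limit:

from typing import List

from flytekit import map_task, task, workflow


@task
def double_nums(nums: List[int]) -> List[int]:
    return [item * 2 for item in nums]


@workflow
def process_all(inputs: List[List[int]]) -> List[List[int]]:
    # concurrency=1 asks the map task to run only one mapped instance at a time
    return map_task(double_nums, concurrency=1)(nums=inputs)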
v
I see, thanks! This is helpful! Quick question: I should be able to call another map_task inside a map_task, right? Just wondering if that would be possible.
s
I don't think that's possible. Would like @Dan Rammer (hamersaw) to confirm.
d
Correct, currently map tasks can only spawn k8s-based tasks (a Python function with the `@task` decorator, PodPlugin, etc). The ArrayNode proposal would add support for mapping over other entities.
v
Thank you both, really helpful info!