# flyte-support
c
Hi community, according to the release notes (I think Flyte 1.13) a memory leak in flyte-binary was fixed. However, we still see Flyte's memory keep increasing until it is killed by k8s:
a
this should have been fixed in 1.14.5 https://github.com/flyteorg/flytekit/releases/tag/v1.14.5
two different changes: in 1.13 it had to do with mitigating the cardinality of prom metrics; in 1.14 it's more about the impact of async
f
David, the async impact is in flytekit - they're pointing to flyte-binary
I think this has to be the cache or something
a
@curved-petabyte-84246 so does this happen all the time, or is it triggered by a particular execution?
c
I'll pull a longer history
but it's not related to a specific workflow
@average-finland-92144 it might be related to a certain type of workflow. We did start a large-scale run yesterday and the memory started increasing. That execution is very long running (20 hours) and contains lots of dynamic workflows
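For illustration, a minimal flytekit sketch of the kind of dynamic fan-out being described; the task names, chunk counts, and structure here are hypothetical, not the actual workflow:

```python
import typing

from flytekit import dynamic, task, workflow


@task
def process_chunk(chunk_id: int) -> int:
    # Placeholder for the real per-chunk work.
    return chunk_id * 2


@dynamic
def fan_out(num_chunks: int) -> typing.List[int]:
    # Each iteration adds a node to the dynamically compiled sub-workflow,
    # so the serialized spec ("futures" metadata) grows with num_chunks.
    return [process_chunk(chunk_id=i) for i in range(num_chunks)]


@workflow
def large_scale_run(num_chunks: int = 10_000) -> typing.List[int]:
    return fan_out(num_chunks=num_chunks)
```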
a
so looking at this error
```
Failed to cache Dynamic workflow [[CACHE_WRITE_FAILED] Failed to Cache the metadata, caused by: The entry size is larger than 1/1024 of cache size]
```
it's essentially that: a failure to write to the cache due to the object's size, which also means this is not directly connected to the OOM. The change in 1.13 aimed to mitigate, or at least decrease, the rate at which memory usage grows in relation to the number of executions. Could you share/describe the structure of the dynamic workflow you're running?
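As a rough illustration of that 1/1024 constraint (the cache size below is a made-up example, not Flyte's actual configured value):

```python
# Hypothetical numbers: the real metadata cache size isn't shown in this thread.
cache_size_bytes = 64 * 1024 * 1024         # e.g. a 64 MiB metadata cache
max_entry_bytes = cache_size_bytes // 1024  # the 1/1024 rule => 64 KiB per entry

# A dynamic workflow that fans out to many nodes produces a large compiled spec;
# once its serialized size exceeds max_entry_bytes, the write fails with the
# CACHE_WRITE_FAILED error shown above.
print(f"max cacheable entry for a {cache_size_bytes >> 20} MiB cache: {max_entry_bytes // 1024} KiB")
```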