curved-petabyte-84246
01/29/2025, 9:24 AM
average-finland-92144
01/29/2025, 2:55 PM
average-finland-92144
01/29/2025, 2:56 PM
freezing-airport-6809
freezing-airport-6809
average-finland-92144
01/29/2025, 3:12 PM
curved-petabyte-84246
01/29/2025, 3:13 PM
curved-petabyte-84246
01/29/2025, 3:13 PM
curved-petabyte-84246
01/29/2025, 3:23 PM
average-finland-92144
01/29/2025, 5:18 PM
Failed to cache Dynamic workflow [[CACHE_WRITE_FAILED] Failed to Cache the metadata, caused by: The entry size is larger than 1/1024 of cache size]
It's essentially that: a failure to write to the cache because of the object's size, which also means this is not directly connected to the OOM.
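For context, the limit quoted in the error behaves roughly like this (a minimal sketch; the 1/1024 ratio comes from the error message above, and the cache size used here is a hypothetical example, not your actual configuration):

```python
# Sketch of the entry-size constraint implied by the error above:
# a single entry may not exceed 1/1024 of the total cache size.

def max_entry_bytes(cache_size_bytes: int) -> int:
    """Largest entry the cache will accept: 1/1024 of the total size."""
    return cache_size_bytes // 1024

def can_cache(entry_bytes: int, cache_size_bytes: int) -> bool:
    """True if a serialized entry of this size fits under the limit."""
    return entry_bytes <= max_entry_bytes(cache_size_bytes)

# Hypothetical example: a 64 MiB cache only accepts entries up to 64 KiB,
# so a larger serialized dynamic-workflow spec would fail to cache
# (the CACHE_WRITE_FAILED case above), without crashing the process.
cache_size = 64 * 1024 * 1024
print(max_entry_bytes(cache_size))        # 65536
print(can_cache(100 * 1024, cache_size))  # False
```

So a very large compiled dynamic workflow can trip this limit even when the cache itself has plenty of free space overall.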
The change in 1.13 was intended to mitigate, or at least slow, the rate at which memory usage grows with the number of executions.
Could you share/describe the structure of the dynamic workflow you're running?