clean-glass-36808
12/18/2024, 6:35 PM
The execution is stuck in a Queued state on the UI. Looking at the flytepropeller logs, all I see is the following repeated for over a day. One thing I noticed is that I never see an "acquired cache reservation" log for this execution, but I do see it for others.
> 2024-12-16 19:42:32.587 {"json":{"exec_id":"f1886719e6ccc600f000","ns":"metrics-development","routine":"worker-72"},"level":"info","msg":"Processing Workflow.","ts":"2024-12-17T03:42:32Z"}
> 2024-12-16 19:42:32.588 {"json":{"exec_id":"f1886719e6ccc600f000","ns":"metrics-development","res_ver":"190011148","routine":"worker-72","wf":"redacteddbt_streaming_sync"},"level":"info","msg":"Handling Workflow [f1886719e6ccc600f000], id: [project:\"metrics\" domain:\"development\" name:\"f1886719e6ccc600f000\"], p [Running]","ts":"2024-12-17T03:42:32Z"}
> 2024-12-16 19:42:32.591 {"json":{"exec_id":"f1886719e6ccc600f000","node":"n0","ns":"metrics-development","res_ver":"190011148","routine":"worker-72","wf":"redacteddbt_streaming_sync"},"level":"info","msg":"Catalog CacheMiss: Artifact not found in Catalog. Executing Task.","ts":"2024-12-17T03:42:32Z"}
> 2024-12-16 19:42:32.604 {"json":{"exec_id":"f1886719e6ccc600f000","ns":"metrics-development","res_ver":"190011148","routine":"worker-72","wf":"redacteddbt_streaming_sync"},"level":"info","msg":"Handling Workflow [f1886719e6ccc600f000] Done","ts":"2024-12-17T03:42:32Z"}
> 2024-12-16 19:42:32.612 {"json":{"exec_id":"f1886719e6ccc600f000","ns":"metrics-development","routine":"worker-72"},"level":"info","msg":"Will not fast follow, Reason: Wf terminated? false, Version matched? true","ts":"2024-12-17T03:42:32Z"}
> 2024-12-16 19:42:32.612 {"json":{"exec_id":"f1886719e6ccc600f000","ns":"metrics-development","routine":"worker-72"},"level":"info","msg":"Streak ended at [0]/Max: [8]","ts":"2024-12-17T03:42:32Z"}
> 2024-12-16 19:42:32.612 {"json":{"exec_id":"f1886719e6ccc600f000","ns":"metrics-development","routine":"worker-72"},"level":"info","msg":"Completed processing workflow.","ts":"2024-12-17T03:42:32Z"}
> 2024-12-16 19:42:32.612 {"json":{"exec_id":"f1886719e6ccc600f000","ns":"metrics-development","routine":"worker-72"},"level":"info","msg":"Successfully synced 'metrics-development/f1886719e6ccc600f000'","ts":"2024-12-17T03:42:32Z"}
> 2024-12-16 19:42:33.584 {"json":{},"level":"info","msg":"==> Enqueueing workflow [metrics-development/f1886719e6ccc600f000]","ts":"2024-12-17T03:42:33Z"}
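For context, here is a rough sketch of the reservation handshake that normally precedes an "acquired cache reservation" log line. The store, method names, and log text below are illustrative stand-ins, not the actual flytepropeller/datacatalog API: while another execution owns the reservation for the same cache key, the current execution keeps getting re-enqueued without ever logging the acquisition.

package main

import (
	"fmt"
	"time"
)

// inMemoryStore is a stand-in for the datacatalog reservation service:
// the first caller to claim a key owns it until it is released or expires.
type inMemoryStore struct{ owners map[string]string }

func (s *inMemoryStore) GetOrExtend(cacheKey, ownerID string, _ time.Duration) (string, error) {
	if current, ok := s.owners[cacheKey]; ok {
		return current, nil // someone else already holds the reservation
	}
	s.owners[cacheKey] = ownerID
	return ownerID, nil
}

// tryAcquire models one evaluation round: only the reservation owner logs
// "acquired cache reservation"; every other execution stays queued and retries later.
func tryAcquire(s *inMemoryStore, cacheKey, execID string) bool {
	owner, _ := s.GetOrExtend(cacheKey, execID, 10*time.Second)
	if owner != execID {
		fmt.Printf("reservation for %s held by %s; execution %s keeps waiting\n", cacheKey, owner, execID)
		return false
	}
	fmt.Printf("acquired cache reservation for %s (owner %s)\n", cacheKey, execID)
	return true
}

func main() {
	store := &inMemoryStore{owners: map[string]string{}}
	tryAcquire(store, "dbt_streaming_sync-key", "older-exec")            // acquires and logs
	tryAcquire(store, "dbt_streaming_sync-key", "f1886719e6ccc600f000") // loops forever if the owner never releases
}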
clean-glass-36808
12/18/2024, 6:57 PM
return catalog.Key{
Identifier: *taskTemplate.Id, //nolint:protogetter
CacheVersion: taskTemplate.GetMetadata().GetDiscoveryVersion(),
CacheIgnoreInputVars: taskTemplate.GetMetadata().GetCacheIgnoreInputVars(),
TypedInterface: *taskTemplate.GetInterface(),
InputReader: nCtx.InputReader(),
}, nil
and this is a task running on a launchplan/cron, so I'm pretty sure the cache key will be consistent across runs... which means some other execution must be holding onto the reservation.
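To illustrate that reasoning (this is not the real catalog hashing code; the function and field names are made up for the sketch): the key is derived purely from the task identifier, the cache/discovery version, and the input values, so a cron-triggered launch plan that passes identical inputs produces an identical key on every run and contends for the same reservation.

package main

import (
	"crypto/sha256"
	"encoding/hex"
	"fmt"
	"sort"
	"strings"
)

// cacheKey hashes the task identifier, cache version, and a canonical
// rendering of the inputs; identical inputs on every cron tick yield the same key.
func cacheKey(taskID, cacheVersion string, inputs map[string]string) string {
	names := make([]string, 0, len(inputs))
	for k := range inputs {
		names = append(names, k)
	}
	sort.Strings(names) // canonical ordering so map iteration order cannot change the hash

	var b strings.Builder
	b.WriteString(taskID)
	b.WriteString("|" + cacheVersion)
	for _, k := range names {
		b.WriteString("|" + k + "=" + inputs[k])
	}
	sum := sha256.Sum256([]byte(b.String()))
	return hex.EncodeToString(sum[:])
}

func main() {
	run1 := cacheKey("metrics:development:dbt_streaming_sync", "1.0", map[string]string{"date": "2024-12-16"})
	run2 := cacheKey("metrics:development:dbt_streaming_sync", "1.0", map[string]string{"date": "2024-12-16"})
	fmt.Println(run1 == run2) // true: both runs target the same catalog entry and reservation
}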
clean-glass-36808
12/18/2024, 7:00 PM
> Workflow[metrics:.....dbt_streaming_sync] failed. RuntimeExecutionError: max number of system retry attempts [77/30] exhausted. Last known status message: Workflow[] failed. ErrorRecordingError: failed to publish event, caused by: EventSinkError: Error sending event, caused by [rpc error: code = Unknown desc = failed database operation with server login has been failing, try again later (server_login_retry)]
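A minimal sketch, assuming a simple counter-against-a-cap model rather than propeller's actual retry bookkeeping, of how a system-retry budget like the [77/30] above gets exhausted: every evaluation round that fails for a system reason (here, the event sink rejecting events because of the database login failures) increments a counter until it crosses the configured maximum and the workflow is failed outright.

package main

import "fmt"

const maxSystemRetries = 30 // the configured cap that appears as [.../30] in the error text

// recordSystemFailure bumps the system-retry counter after a failed round
// and aborts the workflow once the counter passes the cap.
func recordSystemFailure(attempts int) (int, error) {
	attempts++
	if attempts > maxSystemRetries {
		return attempts, fmt.Errorf("max number of system retry attempts [%d/%d] exhausted", attempts, maxSystemRetries)
	}
	return attempts, nil
}

func main() {
	attempts := 0
	for {
		// in this scenario every round fails because the event cannot be published
		var err error
		if attempts, err = recordSystemFailure(attempts); err != nil {
			fmt.Println(err) // fires once the counter passes the cap with this simple model
			break
		}
	}
}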
clean-glass-36808
12/18/2024, 7:00 PM

flat-area-42876
12/18/2024, 10:48 PM

clean-glass-36808
12/18/2024, 11:12 PM
clientSecret to null in the helm chart.

clean-glass-36808
12/18/2024, 11:12 PM

flat-area-42876
12/18/2024, 11:47 PM