strong-quill-48244
02/20/2025, 1:11 PM

silly-toddler-37820
02/20/2025, 4:48 PM
You can use the --copy flag in the Pyflyte CLI (docs). However, I don't think you actually need to register everything at the same time you run it - you should be able to create a new execution of an existing workflow (and the same for each subworkflow you want to execute) using FlyteRemote (docs). Note that if you want to run FlyteRemote inside a task (like in the case of your higher-level Flyte workflow), you need to make sure the task is authenticated with Flyte.
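Roughly, launching an already-registered workflow from inside a task could look like the sketch below; the project, domain, workflow name, and inputs are just placeholders:
```
from flytekit.configuration import Config
from flytekit.remote import FlyteRemote

# Config.auto() picks up whatever config/credentials the task environment has,
# which is the "make sure the task is authenticated" part. Project, domain,
# and workflow name below are placeholders.
remote = FlyteRemote(
    Config.auto(),
    default_project="flytesnacks",
    default_domain="development",
)

# Fetch an already-registered (sub)workflow and kick off a new execution of it.
subwf = remote.fetch_workflow(name="subwf1.subwf1_wf.subwf1", version="v1")
execution = remote.execute(subwf, inputs={"x": 3}, wait=False)
```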
In terms of observability, agreed this is somewhat limited since Flyte doesn't have a notion that the workflows are related to each other. You can definitely use FlyteRemote to fetch the execution status of each of your subworkflows, but there is no meta-DAG in the UI.
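Checking on those executions could look something like this, reusing the remote handle from the sketch above:
```
# Refresh the state of an execution you launched earlier (you can also look
# one up by name with remote.fetch_execution).
execution = remote.sync_execution(execution)
print(execution.is_done, execution.closure.phase)

# Or block until the subworkflow finishes and read its outputs.
execution = remote.wait(execution)
print(execution.outputs)
```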
For your image management, I would look into ImageSpec, which lets you define your requirements in code and specify images at the task level. So you have lots of flexibility there.
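A minimal ImageSpec sketch, with a made-up registry and package list:
```
from flytekit import ImageSpec, task

# Image requirements defined in code; registry and packages here are made up.
subwf1_image = ImageSpec(
    name="subwf1",
    registry="ghcr.io/my-org",
    packages=["pandas==2.2.2"],
    python_version="3.11",
)

# Only this task runs on the subwf1 image; other tasks can use other images.
@task(container_image=subwf1_image)
def heavy_task(n: int) -> int:
    import pandas as pd
    return int(pd.Series(range(n)).sum())
```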
If you'd be open to looking at Union, we have built a more native way for workflows to trigger each other, called Reactive Workflows (launch blog and docs).

strong-quill-48244
02/20/2025, 5:45 PM
I'll look into --copy and whether that could be useful 🙂 As mentioned, FlyteRemote is our initial thought as well. It's just a shame that we will lose out on observability and have to write a lot of custom logic to orchestrate the workflows.
To me, this feels like exactly the kind of thing an orchestration tool should enable: building and running workflows separately and being able to chain them together in a neat way.
I'll have a look at what you have built, looks cool!

silly-toddler-37820
02/20/2025, 5:52 PM

strong-quill-48244
02/20/2025, 6:58 PM

silly-toddler-37820
02/20/2025, 7:02 PM

strong-quill-48244
02/20/2025, 7:29 PM
/workflows/
  orchestration/
    orchestrator_wf.py
    Makefile
    Dockerfile
  subwf1/
    subwf1_wf.py
    Makefile
    Dockerfile
  subwf2/
    subwf2_wf.py
    Makefile
    Dockerfile
In orchestrator_wf.py we would import the workflows from subwf1_wf.py and subwf2_wf.py, and thus we would have the full DAG in the UI.
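Roughly, orchestrator_wf.py would then just import and call them; the module paths, workflow names, and signatures below are placeholders:
```
from flytekit import workflow

# Hypothetical imports matching the layout above; the subworkflow names and
# signatures are placeholders for whatever subwf1_wf.py / subwf2_wf.py expose.
from subwf1.subwf1_wf import subwf1
from subwf2.subwf2_wf import subwf2

@workflow
def orchestrator_wf(raw_path: str) -> str:
    # Because these are @workflow objects imported into this module, they are
    # registered as subworkflows of the orchestrator and the UI shows the full DAG.
    intermediate = subwf1(input_path=raw_path)
    return subwf2(input_path=intermediate)
```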
In orchestration/Makefile we would:
1. Build three images based on the Dockerfiles in orchestration, subwf1, and subwf2.
2. Package and register them in a way where each workflow uses its own image when launched.
This is how I would ideally build this, but I'm not sure if we can register in that manner.
Also, this would then enable development of the individual subwfs without worrying about the main wf.
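For step 2, one way I imagine the task-level image wiring could work is flytekit's named-image templating, assuming the images built in step 1 get passed to pyflyte register via repeated --image name=uri flags; everything below is a sketch, not something we've verified:
```
from flytekit import task

# In subwf1_wf.py: tasks reference the "subwf1" image by name; the actual URI
# would be supplied at registration time, e.g.
#   pyflyte register --image subwf1=<uri from Makefile> --image subwf2=<uri> ...
@task(container_image="{{.image.subwf1.fqn}}:{{.image.subwf1.version}}")
def subwf1_task(x: int) -> int:
    return x * 2

# In subwf2_wf.py: same pattern with its own image name.
@task(container_image="{{.image.subwf2.fqn}}:{{.image.subwf2.version}}")
def subwf2_task(x: int) -> int:
    return x + 1
```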