# flyte-support
b
Hey all, anyone here used the neptune.ai plugin? In my workflow all my tasks should report to the same run_id, but I don't understand how I can configure that other than passing the id manually as an argument to the init_neptune_run decorator
g
cc @flaky-parrot-42438
f
Do you want to associate a whole workflow execution with a single run in neptune?
b
exactly
f
Do you want to set a custom_run_id? (According to https://docs.neptune.ai/api/run/ )
b
custom_run_id, as far as I understand, is similar to the regular run_id; it just allows free text. What I'm looking for is a common run_id that can be shared between tasks without explicitly passing it as an argument (because then the init_neptune_run decorator would be redundant and would create dummy runs)
f
I'm trying to see how one would do it manually, and then we can figure out the abstraction to make it easier.
b
manually it would mean that one creates a neptune run, and places that id in all of the tasks' init_neptune_run decorators as a with_id argument
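Sketched in plain Python (no flytekit or neptune imports; create_run, train_task, and eval_task are hypothetical stand-ins for the plugin's decorated tasks), the manual pattern looks like this: one place creates the run, and every task receives its id as an ordinary argument — which is exactly the boilerplate being discussed:

```python
import uuid

def create_run() -> str:
    # Hypothetical stand-in for creating a Neptune run up front and
    # recording its id; the real id would come from Neptune itself.
    return f"NEP-{uuid.uuid4().hex[:8]}"

def train_task(run_id: str) -> str:
    # Stand-in for a task whose init_neptune_run decorator would be
    # given with_id=run_id; the shared id must be threaded in by hand.
    return run_id

def eval_task(run_id: str) -> str:
    return run_id

shared_id = create_run()
# Both tasks report against the same run only because the caller
# passed the same id to each of them explicitly.
assert train_task(shared_id) == eval_task(shared_id)
```

The sketch makes the pain point visible: every task signature grows a run_id parameter solely for plumbing.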
f
Hmm, what happens when you use with_id but the run is not created yet?
b
I think it would raise an error
ideally the workflow would create the id, then push the run_id to a context that can be accessed by the tasks
Moreover, for example, I would like map tasks to report to the same neptune run but place the results in different directories. For that I had to pass a "neptune subdirectory" parameter to each task
A good option could be to use the execution id as the custom_run_id (I believe there is a task-accessible execution_id)
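One way to make that idea concrete without passing anything between tasks: every task derives the same custom_run_id deterministically from the execution id it can read locally. A stdlib-only sketch (the execution id string is hypothetical, and the 36-character cap is an assumed Neptune limit on custom_run_id — check the Neptune docs before relying on it):

```python
import hashlib

def custom_run_id_from_execution(execution_id: str, max_len: int = 36) -> str:
    """Derive a stable custom_run_id from a Flyte execution id.

    Every task in the same execution computes the same value, so no
    argument-passing is needed. Long ids are hashed so the result stays
    within the assumed custom_run_id length limit.
    """
    if len(execution_id) <= max_len:
        return execution_id
    digest = hashlib.sha256(execution_id.encode()).hexdigest()
    return digest[:max_len]

# Hypothetical execution id, as each task (or map-task shard) would see it:
print(custom_run_id_from_execution("f8a3b2c1d4e5f6a7"))  # short ids pass through unchanged
```

Because the derivation is a pure function of the execution id, map-task shards independently arrive at the same run, and a per-shard subdirectory can still be used to keep their results apart within it.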
f
ideally the workflow will create the id
A normal workflow does not actually run any Python code, so it cannot generate the id. So we'll need to generate it in a task and then pass it around somehow. Can you open an issue on this?
b
Sure thing, could you send me a link? Workflows can run some code as far as I know; isn't that called the flyte DSL or something? 🙂
f
https://github.com/flyteorg/flyte/issues Flyte runs your code during registration time. Once it reaches the cluster, the workflow is statically compiled to a DAG and it does not run any Python code. The workaround is a dynamic workflow, which is a mixture of tasks and workflows.
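A plain-Python analogy of the compile-vs-run split described above (no flytekit; the decorator here is a toy, not Flyte's): the "workflow" body executes once at registration just to wire nodes into a DAG, so it only ever sees placeholder promises — any id it tried to generate would be baked in at compile time, not created per execution:

```python
# Each decorated call records a node in a DAG instead of running the body.
dag = []

def task(fn):
    def node(*args):
        dag.append(fn.__name__)            # registration time: record the call
        return f"promise:{fn.__name__}"    # placeholder, not a real value
    return node

@task
def generate_run_id():
    ...  # body would run later, on the cluster, not here

@task
def train(run_id):
    ...

# "Workflow" code: runs once at registration, sees promises, not values.
rid = generate_run_id()
train(rid)

print(dag)   # ['generate_run_id', 'train']
print(rid)   # 'promise:generate_run_id' -- no real id exists yet
```

This is why the id has to come from a task (or, with a dynamic workflow, from code that genuinely runs at execution time).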
b
I see. Thanks! Will write that up soon