worried-lighter-79998
09/26/2022, 3:03 PM
`flytectl update launchplan ... --version $GITSHA --activate >>--archive-old<<`
This seems like a basic use case, but I have not found any resources on it. Please let me know if I overlooked anything.
freezing-airport-6809
worried-lighter-79998
09/26/2022, 4:19 PM
worried-lighter-79998
09/26/2022, 4:22 PM
freezing-airport-6809
thankful-minister-83577
`pyflyte package/register` time) and detecting and loading different schedules. I feel like in the past, though, this hasn't been a problem, because typically users don't run any schedules in a domain other than prod, and in those cases the activate command is simply omitted.
* I notice that one of the fixed inputs is the S3 location. Just wanted to point out that the Flyte pointer types (FlyteSchema, StructuredDataset, and files/folders) already write to the perhaps poorly named "raw output data prefix" location. This setting is provided by Propeller to tasks at run time and is settable both at the launch plan level and as project/domain-specific overrides. If no overrides are present, Propeller sets a default.
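As a sketch of the project/domain override mentioned above: `flytectl update workflow-execution-config` accepts an attribute file that can set the raw output data prefix. The exact key names and the bucket URI below are assumptions based on my recollection of the flytectl docs, so double-check against your flytectl version:

```yaml
# attrs.yaml -- hypothetical example; applied with:
#   flytectl update workflow-execution-config --attrFile attrs.yaml
project: flytesnacks
domain: development
raw_output_data_config:
  output_location_prefix: s3://my-bucket/my-prefix
```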
Regarding this question - is there a reason why you want to archive prior launch plans? You shouldn’t have to. For a given launch plan project/domain/name combination, only one version can be active at any time. Activating a new version will automatically deactivate the other versions. The archive process is our soft-delete, a tombstone marker if you will. This is fine, but it shouldn’t be necessary to do. Usually they’re not a distraction for our users because the drop-down list only shows the launch plans relevant to a specific version of a workflow, and users typically update the workflow version at the same time as the launch plan version.
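The "only one active version at a time" semantics described above can be sketched as a small model. This is an illustration only, not Flyte source code; the class and method names are hypothetical:

```python
# Hypothetical model of launch plan activation semantics: for a given
# (project, domain, name), activating one version deactivates the rest,
# so no explicit "archive old" step is needed.

class LaunchPlanRegistry:
    def __init__(self):
        # (project, domain, name) -> {version: state}
        self._plans = {}

    def register(self, project, domain, name, version):
        # New versions start out inactive.
        self._plans.setdefault((project, domain, name), {})[version] = "INACTIVE"

    def activate(self, project, domain, name, version):
        versions = self._plans[(project, domain, name)]
        for v in versions:
            # Activating one version implicitly deactivates all others.
            versions[v] = "ACTIVE" if v == version else "INACTIVE"

    def active_version(self, project, domain, name):
        versions = self._plans[(project, domain, name)]
        return next((v for v, s in versions.items() if s == "ACTIVE"), None)


reg = LaunchPlanRegistry()
reg.register("proj", "prod", "daily_lp", "v1")
reg.register("proj", "prod", "daily_lp", "v2")
reg.activate("proj", "prod", "daily_lp", "v1")
reg.activate("proj", "prod", "daily_lp", "v2")  # v1 deactivated automatically
print(reg.active_version("proj", "prod", "daily_lp"))  # -> v2
```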
There's also a related but somewhat orthogonal issue here about altering behavior based on env vars, but we feel that this compromises the reproducibility of Flyte.