# announcements
p
Hi, I need to run some tasks on a different execution host, namely IBM LSF. I am able to run the Docker container there, so I can start the flyte-built container, inputs are consumed from S3, and pyflyte-execute runs the task there. Currently I'm not extending the Flyte backend; instead I'm using a Python decorator for these tasks, so there is a node on Flyte that waits for the execution on the LSF side to finish. My question is: is there a way to let the serialized output from the container that ran on LSF become the outputs of the node that runs on Flyte (the decorated task), or will I have to deserialize on that Flyte node somehow? I would like to not consume any extra memory on the Flyte side.
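For context, the decorator currently does roughly this (a minimal sketch; `run_on_lsf`, the queue name, and the image handling are stand-ins for our internal wrapper, not Flyte APIs):
```python
import functools
import subprocess
import time

def run_on_lsf(image: str, queue: str = "normal"):
    """Hypothetical decorator: submit the task's container to LSF via bsub
    and block the Flyte-side node until the remote job finishes."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            # Submit the flyte-built image; pyflyte-execute inside it pulls
            # inputs from S3 and runs the real task on the LSF host.
            out = subprocess.check_output(
                ["bsub", "-q", queue, "docker", "run", image], text=True
            )
            # bsub prints 'Job <123> is submitted to queue <normal>.'
            job_id = out.split("<")[1].split(">")[0]
            # Poll bjobs until the job leaves PEND/RUN.
            while True:
                stat = subprocess.check_output(
                    ["bjobs", "-noheader", "-o", "stat", job_id], text=True
                ).strip()
                if stat not in ("PEND", "RUN"):
                    break
                time.sleep(30)
            # Today the node still has to materialize the result in memory
            # to return it as its own output -- this is what I want to avoid.
            return fn(*args, **kwargs)
        return wrapper
    return decorator
```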
k
Cc @Sujith Samuel, who did something similar
pyflyte-execute should automatically upload the outputs to S3
p
Hi, yes, thanks for cc'ing Samuel, we're on the same team
p
Yes, pyflyte uploads the output, but the problem I'm having is that the execution looks like this in pseudocode (example): task1 -> task2 (decorated) -> LSF -> task2 (decorated) finishes -> task3.
So the question is: can the outputs from the LSF-side container be used as the inputs to task3, without being loaded in task2?
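In flytekit terms the chain is roughly this (a sketch; the task bodies are placeholders):
```python
from flytekit import task, workflow

@task
def task1() -> str:
    return "value produced upstream"

@task  # in practice this one is wrapped by our LSF decorator
def task2(x: str) -> str:
    # The real work happens in the container on LSF; today task2 would
    # have to load the LSF result into memory just to return it here.
    return x

@task
def task3(y: str) -> str:
    return y

@workflow
def wf() -> str:
    return task3(y=task2(x=task1()))
```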
k
Aah, yes they can be. Are they in a consumable form for Flyte? They have to produce the protobuf
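To be concrete, by "the protobuf" I mean the LiteralMap that pyflyte-execute writes as outputs.pb under the task's output prefix. A quick way to inspect what the LSF container produced (a sketch, assuming flyteidl is installed and outputs.pb has been copied locally):
```python
from flyteidl.core import literals_pb2

# On success, pyflyte-execute writes the task outputs as a serialized
# LiteralMap named outputs.pb under its --output-prefix.
lm = literals_pb2.LiteralMap()
with open("outputs.pb", "rb") as f:
    lm.ParseFromString(f.read())

# Keys are the task's output variable names (flytekit defaults to o0, o1, ...).
for name, literal in lm.literals.items():
    print(name, literal)
```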
I think I can help over a chat if you promise to add it to the docs 😍
But today I am out - unless someone else can help
Also, can you run pyflyte-execute on the remote?
p
Hi, well this sounds very promising then! Sure, I can add something to the docs if we can get it working. Yes, I'm running pyflyte-execute on the remote and that part is working fine. Let's chat on Monday then if you are available; thank you very much for responding
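For reference, the remote side just invokes the stock entrypoint, roughly like this (a sketch; flag names follow recent flytekit versions and may differ in older ones, and all URIs and the module name here are made up):
```python
import subprocess

# pyflyte-execute is flytekit's container entrypoint: it downloads the
# inputs LiteralMap, runs the task, and uploads outputs.pb on success.
subprocess.run(
    [
        "pyflyte-execute",
        "--inputs", "s3://my-bucket/metadata/inputs.pb",      # made-up URI
        "--output-prefix", "s3://my-bucket/metadata/output",  # made-up URI
        "--raw-output-data-prefix", "s3://my-bucket/raw",     # made-up URI
        "--resolver", "flytekit.core.python_auto_container.default_task_resolver",
        "--",
        "task-module", "my_pkg.tasks",  # hypothetical module
        "task-name", "task2",
    ],
    check=True,
)
```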
👍 1