hello Flyte community! Question concerning the Databricks integration:
If your Databricks task specifies an existing cluster, should that cluster already have the dependencies your workflow needs installed? Even if you specify a custom Docker image to run your workflow with?
freezing-airport-6809
10/10/2023, 2:16 PM
Cc @glamorous-carpet-83516 ?
glamorous-carpet-83516
10/10/2023, 9:25 PM
Yes — your existing Spark cluster should already have the packages your workflow needs installed.
If you specify an existing cluster, you don't need to pass an image. The image is only used when creating a new cluster.
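As a concrete sketch: the `databricks_conf` you pass to the task config follows the Databricks Jobs run-submit payload format, so for an existing cluster it would look roughly like this (cluster id is a placeholder):

```json
{
  "run_name": "flyte-databricks-task",
  "existing_cluster_id": "0123-456789-example1"
}
```

whereas only a `new_cluster` spec causes a cluster to be spun up, and that's the path where your custom image comes into play:

```json
{
  "run_name": "flyte-databricks-task",
  "new_cluster": {
    "spark_version": "13.3.x-scala2.12",
    "node_type_id": "i3.xlarge",
    "num_workers": 2
  }
}
```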