# ask-the-community
g
hello Flyte community! Question concerning the Databricks integration: If your Databricks task specifies an existing cluster, should it already have the dependencies your workflow needs installed? even if you specify a custom docker image to run your workflow with?
k
Cc @Kevin Su ?
k
Your existing Spark cluster should already have the packages your workflow needs installed. If you specify an existing cluster, you don't need to pass an image; the image is only used when creating a new cluster.
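For illustration, here is a minimal sketch of the two run configurations being contrasted, using field names from the Databricks Jobs API (`existing_cluster_id` and `new_cluster`); the exact keys your plugin version accepts may differ:

```json
{
  "run_with_existing_cluster": {
    "existing_cluster_id": "1234-567890-abcde123"
  },
  "run_with_new_cluster": {
    "new_cluster": {
      "spark_version": "13.3.x-scala2.12",
      "node_type_id": "i3.xlarge",
      "num_workers": 2
    }
  }
}
```

With `existing_cluster_id`, the cluster's pre-installed libraries are what your job sees, so your dependencies must already be installed there; with `new_cluster`, the custom image you pass is used to bring up the cluster, so it carries the dependencies instead.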
g
Thanks, that's clear.