# flytekit
## #2103 [BUG] Failed to run spark job when using flytekit remote

Issue created by pingsutw

**Describe the bug**

I am able to run Spark jobs when I register the workflow with flytectl and trigger it from the Flyte console. However, I get the error messages below when I use the flytekit remote client to execute the same workflow.
```
E0126 09:21:12.800044       1 workers.go:102] error syncing 'flytesnacks-development/f4fbbb51f24e6436695b': failed at
Node[n0]. RuntimeExecutionError: failed during plugin execution, caused by: failed to execute handle for plugin [spark]:
[BadTaskSpecification] invalid TaskSpecification [fields:{key:"sparkConf" value:{struct_value:{fields:
{key:"spark.driver.cores" value:{string_value:"1"}} fields:{key:"spark.driver.memory" value:{string_value:"1000M"}} fields:
{key:"spark.executor.cores" value:{string_value:"1"}} fields:{key:"spark.executor.instances" value:{string_value:"2"}} fields:
{key:"spark.executor.memory" value:{string_value:"1000M"}}}}}]., caused by: either MainApplicationFile or MainClass must
be set
```
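For context, the failing node wraps a Spark task whose `sparkConf` matches the values in the log. Below is a minimal sketch of that kind of task and workflow; it assumes the flytekitplugins-spark plugin and uses a hypothetical task name (`my_spark_task`), since the actual body of `spark_dataframe.workflows.example` is not included in this issue.

```python
# Minimal sketch (assumed, not the reporter's actual module) of a Spark task
# whose spark_conf mirrors the values shown in the error above.
from flytekit import task, workflow
from flytekitplugins.spark import Spark


@task(
    task_config=Spark(
        spark_conf={
            "spark.driver.cores": "1",
            "spark.driver.memory": "1000M",
            "spark.executor.cores": "1",
            "spark.executor.instances": "2",
            "spark.executor.memory": "1000M",
        }
    )
)
def my_spark_task() -> int:
    # Executes inside the Spark driver launched for this task on the cluster.
    return 1


@workflow
def my_smart_schema() -> int:
    return my_spark_task()
```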
**Expected behavior**

Remote clients should be able to execute the Spark workflow.

**Additional context to reproduce**
```python
# Note: the import path for Image/ImageConfig depends on the flytekit release;
# recent releases expose them from flytekit.configuration.
from flytekit.configuration import Image, ImageConfig
from flytekit.remote import FlyteRemote

import spark_dataframe.workflows.example

remote = FlyteRemote(
    flyte_admin_url="localhost:30081",
    insecure=True,
    default_project="flytesnacks",
    default_domain="development",
    image_config=ImageConfig(
        default_image=Image(name="default", fqn="spark", tag="v1")
    ),
)
remote.execute(
    spark_dataframe.workflows.example.my_smart_schema,
    inputs={},
    version=version,  # version string used for registration (defined elsewhere)
    wait=False,
)
```
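Since registration through flytectl works, one possible workaround (a sketch, not a verified fix) is to fetch the workflow that flytectl already registered and execute that entity, instead of letting FlyteRemote re-register it. The version string `"v1"` below is an assumption and must match whatever version was used at registration time.

```python
# Workaround sketch: execute the already-registered version rather than
# registering again through FlyteRemote.
flyte_wf = remote.fetch_workflow(
    project="flytesnacks",
    domain="development",
    name="spark_dataframe.workflows.example.my_smart_schema",
    version="v1",  # assumed; must match the flytectl-registered version
)
execution = remote.execute(flyte_wf, inputs={}, wait=False)
```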
**Screenshots**

[screenshot]

Are you sure this issue hasn't been raised already? ☑︎ Yes
Have you read the Code of Conduct? ☑︎ Yes

flyteorg/flyte