# ask-the-community
f
Another issue this causes is that when I run a workflow that has Spark tasks, I have to supply --service-account spark, e.g.
pyflyte run --remote --service-account spark --image spark_image ...
But this will cause other Flyte tasks to have permission issues. How can we provide the --service-account spark param in the code for the Spark task to avoid this? Will the following work?
@task(
    container_image=".../flyte-pyspark:latest",
    task_config=Spark(
        spark_conf={...},
        service_account="spark",
    ),
)
k
IIRC, we don’t support passing the SA to a specific task. Mind creating an issue for it?
f
@Kevin Su, I would like to do that. Since this is the first time for me to create an issue, would you mind telling me where and how?
k
[flyte-core]
f
@Kevin Su @Yee @Samhita Alla, what does setting service account to spark do?
s
The spark service account will have the required perms to create executor pods.
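For context, those permissions typically come from a Kubernetes Role bound to the spark service account, since the Spark driver pod has to create and manage executor pods. A minimal sketch; the namespace and object names are illustrative, not from this thread:

```yaml
# Illustrative RBAC for the spark service account.
# Namespace and names are assumptions, not from this thread.
apiVersion: v1
kind: ServiceAccount
metadata:
  name: spark
  namespace: flyte
---
apiVersion: rbac.authorization.k8s.io/v1
kind: Role
metadata:
  name: spark-role
  namespace: flyte
rules:
  # The driver needs to create/delete executor pods and their services.
  - apiGroups: [""]
    resources: ["pods", "services", "configmaps"]
    verbs: ["*"]
---
apiVersion: rbac.authorization.k8s.io/v1
kind: RoleBinding
metadata:
  name: spark-role-binding
  namespace: flyte
subjects:
  - kind: ServiceAccount
    name: spark
    namespace: flyte
roleRef:
  kind: Role
  name: spark-role
  apiGroup: rbac.authorization.k8s.io
```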
y
this will also be doable once the work to add a pod template to the task decorator is done.
let me see if i can dig up the issue for that
f
@Yee @Samhita Alla, thanks. What would be the reasons the tasks get denied access to AWS resources when the spark SA is specified as opposed to the default SA, and what are the possible solutions?
y
k8s service accounts in eks are linked to iam roles.
those iam roles determine which aws resources you have access to.
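On EKS that link is IRSA (IAM Roles for Service Accounts): the service account carries an annotation pointing at an IAM role, and that role's policies determine what AWS access the pods get. So a spark SA without the right role annotation will be denied resources the default SA's role allowed. A sketch; the account ID and role name are placeholders:

```yaml
apiVersion: v1
kind: ServiceAccount
metadata:
  name: spark
  namespace: flyte
  annotations:
    # Placeholder ARN: point this at an IAM role whose policies grant
    # the S3 (or other AWS) access your Spark tasks need.
    eks.amazonaws.com/role-arn: arn:aws:iam::123456789012:role/spark-task-role
```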
f
Thanks