# ask-the-community

Frank Shen

01/17/2023, 7:51 PM
Another issue this causes is when I run the workflow that has spark tasks, I have to supply --service-account spark e.g.
pyflyte run --remote --service-account spark --image spark_image ...
But this causes other Flyte tasks to have permission issues. How can we set the spark service account in code, on just the Spark task, to avoid this? Will the following work?
@task(
    container_image=".../flyte-pyspark:latest",
    task_config=Spark(
        spark_conf={...},
        service_account="spark",
    ),
)

Kevin Su

01/17/2023, 9:07 PM
IIRC, we don’t support passing the SA to a specific task. Mind creating an issue for it?

Frank Shen

01/17/2023, 9:38 PM
@Kevin Su, I would like to do that. Since this is the first time I'm creating an issue, would you mind telling me where and how?

Kevin Su

01/17/2023, 9:43 PM
[flyte-core]

Frank Shen

01/17/2023, 10:13 PM
@Kevin Su @Yee @Samhita Alla, what does setting the service account to spark do?

Samhita Alla

01/19/2023, 7:05 AM
The spark service account will have the required perms to create executor pods.
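For context, that grant amounts to roughly the following RBAC. This is only a sketch using the kubernetes Python client; the namespace, role names, and resource list are placeholders, not the exact objects a Flyte Spark setup creates:
# Sketch only: give a "spark" service account permission to manage executor
# pods in its namespace. Namespace and object names are illustrative placeholders.
from kubernetes import client, config

config.load_kube_config()
rbac = client.RbacAuthorizationV1Api()
ns = "flytesnacks-development"  # placeholder project-domain namespace

rbac.create_namespaced_role(ns, {
    "apiVersion": "rbac.authorization.k8s.io/v1",
    "kind": "Role",
    "metadata": {"name": "spark-role", "namespace": ns},
    "rules": [{
        "apiGroups": [""],
        "resources": ["pods", "services", "configmaps"],
        "verbs": ["create", "get", "list", "watch", "delete"],
    }],
})
rbac.create_namespaced_role_binding(ns, {
    "apiVersion": "rbac.authorization.k8s.io/v1",
    "kind": "RoleBinding",
    "metadata": {"name": "spark-role-binding", "namespace": ns},
    "roleRef": {"apiGroup": "rbac.authorization.k8s.io", "kind": "Role", "name": "spark-role"},
    "subjects": [{"kind": "ServiceAccount", "name": "spark", "namespace": ns}],
})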

Yee

01/19/2023, 5:45 PM
this will also be doable once the work to add a pod template to the task decorator is done.
let me see if i can dig up the issue for that
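In the meantime, here is a rough sketch of what that could look like once it lands, assuming a pod_template argument on @task that takes a pod spec with a service_account_name. The image, spark conf, and task name below are placeholders, and whether the Spark plugin applies the template to driver/executor pods is a separate question:
# Sketch only: per-task service account via a pod template on the task
# decorator, assuming pod_template support on @task lands as discussed.
from flytekit import task, PodTemplate
from flytekitplugins.spark import Spark
from kubernetes.client import V1PodSpec

@task(
    container_image=".../flyte-pyspark:latest",  # same placeholder image as above
    task_config=Spark(
        spark_conf={"spark.driver.memory": "1000M"},  # placeholder conf
    ),
    pod_template=PodTemplate(
        # Run just this task under the "spark" SA instead of passing
        # --service-account spark to the whole execution.
        pod_spec=V1PodSpec(containers=[], service_account_name="spark"),
    ),
)
def my_spark_task() -> int:
    return 1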

Frank Shen

01/19/2023, 6:09 PM
@Yee @Samhita Alla, thanks. Why would tasks be denied access to AWS resources when the spark SA is specified instead of the default SA, and what are the possible solutions?

Yee

01/19/2023, 6:10 PM
k8s service accounts in eks are linked to iam roles.
those iam roles determine which aws resources you have access to.
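so one way to see what's going on is to compare the IRSA role annotations on the two SAs. A sketch; the namespace, SA names, and the resulting ARNs are whatever your EKS setup uses:
# Sketch only: check which IAM role each service account is mapped to via
# the EKS IRSA annotation. Namespace and SA names are placeholders.
from kubernetes import client, config

config.load_kube_config()
core = client.CoreV1Api()

for name in ("default", "spark"):
    sa = core.read_namespaced_service_account(name, "flytesnacks-development")
    role_arn = (sa.metadata.annotations or {}).get("eks.amazonaws.com/role-arn")
    print(f"{name}: {role_arn}")

# If the role behind "spark" lacks IAM policies (e.g. S3 access) that the role
# behind "default" has, tasks running under the spark SA get access denied.
# Attaching the same policies to the spark SA's role is one possible fix.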

Frank Shen

01/19/2023, 6:17 PM
Thanks