# flyte-support
s
Another issue this causes: when I run a workflow that has Spark tasks, I have to supply --service-account spark, e.g.
pyflyte run --remote --service-account spark --image spark_image ...
But this will cause other flyte tasks to have permission issues. How can we provide the --service-account spark param in the code for the spark task to avoid this? Will the following work?
@task(
    container_image=".../flyte-pyspark:latest",
    task_config=Spark(
        spark_conf={...
        },
        service_account="spark",
    ),
)
g
IIRC, we don’t support passing the SA to a specific task. Mind creating an issue for it?
s
@glamorous-carpet-83516, I would like to do that. Since this is the first time for me to create an issue, would you mind telling me where and how?
g
[flyte-core]
s
@glamorous-carpet-83516 created https://github.com/flyteorg/flyte/issues/3241
@glamorous-carpet-83516 @thankful-minister-83577 @tall-lock-23197, what does setting service account to spark do?
t
The spark service account will have the required perms to create executor pods.
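For context, a sketch of what those permissions typically look like: a Role granting the spark service account the pod operations a Spark driver needs to launch and manage executor pods, bound to the SA with a RoleBinding. All names and the namespace below are illustrative, not taken from this thread:

```yaml
# Illustrative sketch only: Role granting the "spark" ServiceAccount the
# pod permissions a Spark driver needs to create and manage executor pods.
apiVersion: rbac.authorization.k8s.io/v1
kind: Role
metadata:
  name: spark-role          # hypothetical name
  namespace: flyte-workers  # hypothetical namespace
rules:
  - apiGroups: [""]
    resources: ["pods", "services", "configmaps"]
    verbs: ["create", "get", "list", "watch", "delete"]
---
# Bind the Role to the spark ServiceAccount in the same namespace.
apiVersion: rbac.authorization.k8s.io/v1
kind: RoleBinding
metadata:
  name: spark-role-binding
  namespace: flyte-workers
subjects:
  - kind: ServiceAccount
    name: spark
    namespace: flyte-workers
roleRef:
  kind: Role
  name: spark-role
  apiGroup: rbac.authorization.k8s.io
```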
t
this will also be doable once the work to add a pod template to the task decorator is done.
let me see if i can dig up the issue for that
s
@thankful-minister-83577 @tall-lock-23197, thanks. Why would tasks be denied access to AWS resources when the spark SA is specified instead of the default SA, and what are the possible solutions?
t
k8s service accounts in eks are linked to iam roles.
those iam roles determine which aws resources you have access to.
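Concretely, that link is usually made with IAM Roles for Service Accounts (IRSA): the ServiceAccount is annotated with the IAM role its pods should assume, so switching from the default SA to the spark SA switches which IAM role (and thus which AWS permissions) the task pod gets. The namespace and role ARN below are illustrative:

```yaml
# Illustrative sketch only: with IRSA, pods running under this
# ServiceAccount assume the annotated IAM role. If the spark SA's role
# lacks policies the default SA's role has, tasks running as spark will
# be denied access to those AWS resources.
apiVersion: v1
kind: ServiceAccount
metadata:
  name: spark
  namespace: flyte-workers  # hypothetical namespace
  annotations:
    eks.amazonaws.com/role-arn: arn:aws:iam::123456789012:role/spark-task-role  # hypothetical ARN
```

A common fix is to attach the same AWS policies to the spark SA's IAM role as the default SA's role, or annotate both service accounts with the same role.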
s
Thanks