# ask-the-community
j
Hi everyone, I was wondering about the following: is it possible to change ExecutionParameters at execution time? I'm asking because I create a new Spark session inside my task, but the Parquet-to-Spark decoder for structured datasets still points to the old session that shipped with the task. To work around this, I'd like to set `current_context().spark_session` to the new session I created. Could it really be as simple as updating the context and pushing the new one with FlyteContextManager?
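Something like the sketch below is what I have in mind. It's untested; the `user_space_params` property, `builder().add_attr("SPARK_SESSION", ...)`, and `execution_state.with_params(...)` calls are my assumptions about flytekit's context API (based on what the Spark plugin seems to do) and may differ between releases; `swap_spark_session` and the app name are just placeholders.

```python
from pyspark.sql import SparkSession
from flytekit.core.context_manager import FlyteContextManager


def swap_spark_session() -> None:
    # Hypothetical helper, not part of flytekit.
    new_sess = SparkSession.builder.appName("my-new-session").getOrCreate()

    ctx = FlyteContextManager.current_context()

    # Assumption: current_context().spark_session resolves the "SPARK_SESSION"
    # attr on ExecutionParameters, so rebuild the params with the new session.
    new_params = (
        ctx.user_space_params.builder()
        .add_attr("SPARK_SESSION", new_sess)
        .build()
    )

    # Push a new context whose execution state carries the updated params.
    with FlyteContextManager.with_context(
        ctx.new_builder().with_execution_state(
            ctx.execution_state.with_params(user_space_params=new_params)
        )
    ):
        # Within this block, current_context().spark_session should return
        # new_sess, so the structured dataset Parquet-to-Spark decoder would
        # (hopefully) pick it up.
        ...
```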
s
> but the Parquet-to-Spark decoder for structured datasets still points to the old session that shipped with the task.
Could you provide more details? Why are there two Spark sessions in the first place?
> Could it really be as simple as updating the context and pushing the new one with FlyteContextManager?
@Kevin Su, does this work?