Ketan (kumare3)
NikeNano
08/19/2022, 8:48 AM
Sandra Youssef
Sandra Youssef
Sandra Youssef
Ketan (kumare3)
karthikraj
08/23/2022, 2:29 AM
Oliver Nguyen
08/23/2022, 2:49 AM
Ketan (kumare3)
Sandra Youssef
Subhro
08/23/2022, 6:22 PM
karthikraj
08/23/2022, 9:15 PM
[1/1] currentAttempt done. Last Error: USER::Pod failed. No message received from kubernetes.
[av6c5trbtctv6nkknd9m-n0-0] terminated with exit code (1). Reason [Error]. Message:
tar: development/fastca4192cff23d37d3fb985a60de82c186.tar.gz: Cannot open: Not a directory
tar: Error is not recoverable: exiting now
Traceback (most recent call last):
  File "/usr/local/bin/pyflyte-fast-execute", line 8, in <module>
    sys.exit(fast_execute_task_cmd())
  File "/usr/local/lib/python3.7/site-packages/click/core.py", line 1130, in __call__
    return self.main(*args, **kwargs)
  File "/usr/local/lib/python3.7/site-packages/click/core.py", line 1055, in main
    rv = self.invoke(ctx)
  File "/usr/local/lib/python3.7/site-packages/click/core.py", line 1404, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/usr/local/lib/python3.7/site-packages/click/core.py", line 760, in invoke
    return __callback(*args, **kwargs)
  File "/usr/local/lib/python3.7/site-packages/flytekit/bin/entrypoint.py", line 496, in fast_execute_task_cmd
    _download_distribution(additional_distribution, dest_dir)
  File "/usr/local/lib/python3.7/site-packages/flytekit/tools/fast_registration.py", line 117, in download_distribution
    result.check_returncode()
  File "/usr/local/lib/python3.7/subprocess.py", line 444, in check_returncode
    self.stderr)
subprocess.CalledProcessError: Command '['tar', '-xvf', 'development/fastca4192cff23d37d3fb985a60de82c186.tar.gz', '-C', 'development']' returned non-zero exit status 2.
karthikraj
08/24/2022, 12:05 AM
flyte]$ tree workflows/
workflows/
├── hello_world.py
├── __init__.py
└── __pycache__
    ├── hello_world.cpython-37.pyc
    └── __init__.cpython-37.pyc

1 directory, 5 files
Serializing using pyflyte, which creates a tar.gz file:
flyte]$ pyflyte register workflows/hello_world.py -d development -p cloudops-max-flyte-demo
Output given as None, using a temporary directory at /tmp/tmplbj3dq0o instead
{"asctime": "2022-08-23 23:56:57,913", "name": "flytekit.remote", "levelname": "WARNING", "message": "Uploading /tmp/tmplbj3dq0o/fastd2800a4ca5d77817f85af258a1ca4eff.tar.gz to https://<my_bucket>.s3.amazonaws.com/io/cloudops-max-flyte-demo/development/MKBVMY42N6USXMYOHFXIKC2DAI%3D%3D%3D%3D%3D%3D/fastd2800a4ca5d77817f85af258a1ca4eff.tar.gz?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=XXXXXXXXXXXXXXXXXXXXXXXXX&X-Amz-SignedHeaders=content-md5%3Bhost&X-Amz-Signature=XXXXXXXXXXXXXXXXXXXXX native url s3://<my_bucket>/io/cloudops-max-flyte-demo/development/MKBVMY42N6USXMYOHFXIKC2DAI======/fastd2800a4ca5d77817f85af258a1ca4eff.tar.gz"}
Loading packages ['workflows.hello_world'] under source root /home/kkanagar/cloudops-max-flyte-demo/flyte
Successfully serialized 3 flyte objects
Registering using flytectl:
flyte]$ flytectl register files /tmp/tmplbj3dq0o/fastd2800a4ca5d77817f85af258a1ca4eff.tar.gz -d development -p cloudops-max-flyte-demo --archive
{"json":{},"level":"error","msg":"failed to initialize token source provider. Err: failed to fetch auth metadata. Error: rpc error: code = Unimplemented desc = unknown service flyteidl.service.AuthMetadataService","ts":"2022-08-23T23:57:19Z"}
{"json":{},"level":"warning","msg":"Starting an unauthenticated client because: can't create authenticated channel without a TokenSourceProvider","ts":"2022-08-23T23:57:19Z"}
Error: input package have some invalid files. try to run pyflyte package again [/tmp/register3231572267/workflows/__init__.py /tmp/register3231572267/workflows/hello_world.py]
{"json":{},"level":"error","msg":"input package have some invalid files. try to run pyflyte package again [/tmp/register3231572267/workflows/__init__.py /tmp/register3231572267/workflows/hello_world.py]","ts":"2022-08-23T23:57:19Z"}
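Editor's note: from the outputs above, the archive handed to flytectl is the fast-registration tarball that `pyflyte register` uploads, which contains raw .py files rather than serialized protos, so flytectl rejects it as "invalid files". A sketch of the two flows that should line up, reusing the project/domain names from the thread (flag spellings per the flytekit/flytectl of that era; verify against your installed versions):

```shell
# Option 1 (one step): pyflyte register serializes, uploads, and registers
# on its own -- no flytectl call is needed afterwards.
pyflyte register workflows/ -p cloudops-max-flyte-demo -d development

# Option 2 (two steps): pyflyte package writes a flyte-package.tgz of
# serialized protos, which is the archive flytectl register expects.
pyflyte --pkgs workflows package -f
flytectl register files flyte-package.tgz \
  --project cloudops-max-flyte-demo --domain development --archive
```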
Sujith Samuel
08/24/2022, 7:47 AM
Sujith Samuel
08/24/2022, 7:48 AM
Nada Saiyed
08/24/2022, 4:13 PM
Slackbot
08/25/2022, 4:53 AM
SeungTaeKim
08/25/2022, 9:20 AM
Sandra Youssef
Nada Saiyed
08/25/2022, 8:11 PM
Dict as input. One of the values in that Dict is returned from a previous task, and Flyte is not able to bind the value. I get this AssertionError:
AssertionError: this should be a Dictionary type and it is not: <class 'dict'> vs union_type {
  variants {
    map_value_type {
      simple: STRING
    }
    structure {
      tag: "Typed Dict"
    }
  }
  variants {
    simple: NONE
    structure {
      tag: "none"
    }
  }
}
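Editor's note: the union in that error is what an Optional[Dict[str, str]] annotation looks like on the wire — one variant is the typed dict, the other is None. A minimal sketch outside Flyte (plain typing introspection, nothing Flyte-specific) showing the two variants; the fix reported in similar threads is to give the upstream task's output the same Dict[str, str] annotation so the promise's type matches one of the declared variants:

```python
from typing import Dict, Optional, Union, get_args, get_origin

# Optional[Dict[str, str]] is Union[Dict[str, str], None] -- the same two
# variants the Flyte error prints: a typed dict and a "none" variant.
ParamType = Optional[Dict[str, str]]

assert get_origin(ParamType) is Union
assert get_args(ParamType) == (Dict[str, str], type(None))
```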
Katrina P
08/25/2022, 8:35 PM
Samhita Alla
Pontus Wistbacka
08/27/2022, 4:13 PM
Rahul Mehta
08/29/2022, 2:38 AM
Error: rpc error: code = InvalidArgument desc = size of name exceeded length 20 : test-named-workflow-2
This appears to be hard-coded here: https://github.com/flyteorg/flyteadmin/blob/7b48c856e3a1e12c57f1abbf535bf6c29fa59d74/pkg/manager/impl/validation/execution_validator.go#L19. Would folks be open to bumping that to 64 (to match the max length for an Argo workflow's name)?
Samhita Alla
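Editor's note: until that limit changes, one client-side workaround is to derive a short deterministic execution name from the long one before submitting. A sketch (the helper name is hypothetical, not part of any Flyte API):

```python
import hashlib

# flyteadmin (at the linked commit) rejects execution names longer than 20
# characters, so shorten long names deterministically before submitting.
def short_exec_name(name: str, limit: int = 20) -> str:
    if len(name) <= limit:
        return name
    # Keep a recognizable prefix plus a short hash for uniqueness.
    digest = hashlib.sha1(name.encode()).hexdigest()[:6]
    return f"{name[:limit - 7]}-{digest}"

print(short_exec_name("test-named-workflow-2"))  # exactly 20 characters
```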
Sathish kumar Venkatesan
08/30/2022, 6:39 AM
Sathish kumar Venkatesan
08/30/2022, 6:39 AM
Mehtab Mehdi
08/30/2022, 9:58 AM
Sandra Youssef
Fabio Grätz
08/30/2022, 4:24 PM
StructuredDataset to pass a Spark DataFrame around between tasks. I'm following this guide and get this error:
{
  "asctime": "2022-08-30 16:15:09,048",
  "name": "flytekit",
  "levelname": "ERROR",
  "message": "Failed to convert return value for var o0 with error <class 'ValueError'>: Failed to find a handler for <class 'pyspark.sql.dataframe.DataFrame'>, protocol gs, fmt parquet"
}
I added this to my spark config but it doesn’t solve the problem:
spark-config-default:
  - "spark.jars.packages": "com.google.cloud.bigdataoss:gcs-connector:hadoop3-2.2.2"
  - "spark.hadoop.fs.AbstractFileSystem.gs.impl": "com.google.cloud.hadoop.fs.gcs.GoogleHadoopFS"
  - "spark.hadoop.fs.gs.impl": "com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystem"
  - "spark.hadoop.google.cloud.auth.service.account.enable": "true"
Would be great to get some pointers in case somebody has seen this error before, thanks!
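Editor's note: in similar reports, this ValueError comes from flytekit's StructuredDataset handler registry rather than from Spark itself — the pyspark DataFrame encoder is registered by the Spark plugin, so that plugin (plus GCS filesystem support) has to be present in the image that runs the task. A guess at the first thing to check, assuming a pip-installed task image:

```shell
# Assumption: the task image lacks the plugin that registers the
# pyspark.sql.DataFrame StructuredDataset encoder/decoder, and/or the
# fsspec GCS backend needed for the gs:// protocol.
pip install flytekitplugins-spark gcsfs
```

If both are installed and the error persists, the installed flytekit version may simply not register the Spark handler for the gs protocol; upgrading flytekit and the plugin together would be the next thing to try.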