freezing-tailor-85994
08/01/2025, 5:53 PM
bfrench@LM-BFRENCH:~/Documents/Code/monorepo$ pyflyte register example/flyteify.py -p 'ml-example' -d 'staging' -v 'bf-2025-08-01'
Running pyflyte register from /Users/bfrench/Documents/Code/monorepo with images ImageConfig(default_image=Image(name='default', fqn='cr.flyte.org/flyteorg/flytekit', tag='py3.10-1.16.3', digest=None), images=[Image(name='default', fqn='cr.flyte.org/flyteorg/flytekit', tag='py3.10-1.16.3', digest=None)]) and image destination folder /root on 1 package(s) ('/Users/bfrench/Documents/Code/monorepo/example/flyteify.py',)
Registering against flyte.COMPANY.net
Detected Root /Users/bfrench/Documents/Code/monorepo/example, using this to create deployable package...
Loading packages ['flyteify'] under source root /Users/bfrench/Documents/Code/monorepo/example
No output path provided, using a temporary directory at /var/folders/8n/83r7h0kx0xbgnddnmbggtz_m0000gp/T/tmp7_mk_77l instead
AttributeError: 'str' object has no attribute 'labels'
This is code that otherwise runs perfectly normally, but Flyte is failing with an error that provides almost no detail. Has anyone seen this before?
freezing-tailor-85994
08/01/2025, 5:54 PM
@workflow
def do_ray(config_loc: str = 'asdf'):
    train_model_test(config_loc)
Task:
@task(
    task_config=ray_config
)
def train_model_test(config_loc: str = 'asdf'):
    s3_client = boto3.client("s3")
    parsed_s3 = urlparse(config_loc, allow_fragments=False)
    config_bucket = parsed_s3.netloc
    config_key = parsed_s3.path[1:]
    s3_client.download_file(config_bucket, config_key, "config_python.yaml")
    run_config = parse_config(config_path="config_python.yaml")
    print(run_config)
    ray_main(run_config, disable_ray=False)
freezing-tailor-85994
08/01/2025, 5:55 PM
Can't share ray_main for NDA reasons, but assume it's a standard model training script that looks a lot like the Ray examples.
freezing-tailor-85994
08/01/2025, 5:55 PM
ray_config = RayJobConfig(
    head_node_config=HeadNodeConfig(pod_template=cpu_pod_template),
    worker_node_config=[WorkerNodeConfig(group_name="raygroup", replicas=1, pod_template=gpu_container_image)],
    runtime_env={"pip": []},
    enable_autoscaling=True,
    shutdown_after_job_finishes=True,
    ttl_seconds_after_finished=3600,
)
Ray config
freezing-tailor-85994
08/01/2025, 5:55 PM
gpu_pod_template = PodTemplate(
    annotations={"karpenter.sh/do-not-disrupt": "true"},
    pod_spec=V1PodSpec(
        containers=[
            V1Container(
                name="primary",
                image=gpu_container_image,
                resources=V1ResourceRequirements(
                    requests={'cpu': "90", 'mem': "750Gi", 'ephemeral_storage': '600Gi', 'vpc.amazonaws.com/efa': '2'},
                    limits={'cpu': "90", 'mem': "750Gi", 'ephemeral_storage': '600Gi', 'vpc.amazonaws.com/efa': '2'},
                ),
            ),
        ],
        node_selector={'karpenter.sh/nodepool': 'gpu-nodepool'},
    ),
)
numerous-actor-35946
08/02/2025, 12:03 AM
models/task.py
Your ray_config might be sending a string instead of a task object, maybe.
That's just a surface guess though; without seeing the entire code it's hard to tell.
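If that guess is right, the likely culprit is pod_template=gpu_container_image in the WorkerNodeConfig above: the plugin expects a PodTemplate object there and presumably reads .labels off it during registration, which a plain image string doesn't have. Below is a minimal sketch of the suspected fix, reusing the cpu_pod_template, gpu_pod_template, and gpu_container_image names from the snippets above; the import path is the standard flytekitplugins-ray one, so verify it against your installed version.

# Suspected fix (sketch): pass the PodTemplate object, not the image string.
# cpu_pod_template, gpu_pod_template, and gpu_container_image are the names
# defined in the snippets above; the image itself stays on the V1Container
# inside gpu_pod_template.
from flytekitplugins.ray import HeadNodeConfig, RayJobConfig, WorkerNodeConfig

ray_config = RayJobConfig(
    head_node_config=HeadNodeConfig(pod_template=cpu_pod_template),
    worker_node_config=[
        WorkerNodeConfig(
            group_name="raygroup",
            replicas=1,
            pod_template=gpu_pod_template,  # was gpu_container_image (a str)
        )
    ],
    runtime_env={"pip": []},
    enable_autoscaling=True,
    shutdown_after_job_finishes=True,
    ttl_seconds_after_finished=3600,
)

If the error persists after that change, checking whether cpu_pod_template is also a string rather than a PodTemplate would be the next thing to rule out.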