# flyte-github
## #3300 [Docs] If workflows can save pipeline artifact to s3 instead of tempdir

Issue created by esadler-hbo

**Description**

Data scientists developing in cloud notebooks lose temporary files when they shut down their notebooks. In addition, some companies, such as Databricks, create file system abstractions that break the OOS paradigm most software follows. It is possible to use `remote_path` for `StructuredDataset`s, but being able to run local executions directly against S3 would mean writing less code (a sketch of the current workaround follows the checklist below). This might not make sense if the workflow artifacts are hard to interpret out of context. This is a pretty low-priority request.

Are you sure this issue hasn't been raised already? ☑︎ Yes

Have you read the Code of Conduct? ☑︎ Yes

flyteorg/flyte
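For reference, a minimal sketch of the workaround mentioned above: pinning a `StructuredDataset` task output to an explicit S3 URI so that a local execution materializes the artifact in the bucket rather than a tempdir. The bucket path, task, and workflow names are hypothetical, local S3 access still depends on having credentials and an s3fs-capable environment, and the exact constructor arguments may vary across flytekit versions.

```python
import pandas as pd
from flytekit import task, workflow, StructuredDataset


@task
def make_dataset() -> StructuredDataset:
    """Build a small dataframe and pin its storage location explicitly."""
    df = pd.DataFrame({"user_id": [1, 2, 3], "score": [0.1, 0.5, 0.9]})
    # Without an explicit uri, a purely local execution writes the artifact to a
    # tempdir that disappears with the notebook. The bucket path is hypothetical.
    return StructuredDataset(dataframe=df, uri="s3://my-team-bucket/artifacts/scores")


@workflow
def scores_wf() -> StructuredDataset:
    return make_dataset()


if __name__ == "__main__":
    # A plain local run; the output should land at the uri given above instead of /tmp,
    # provided the environment has credentials for the bucket.
    scores_wf()
```

The request in this issue is essentially to avoid having to repeat that per-output `uri` plumbing by letting local executions target S3 globally.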