# flyte-support
Is there any tutorial/code on how to deploy an LLM (Llama-2 chat) on the Flyte framework?
The closest I can find is this tutorial on fine-tuning Code Llama on the Flyte codebase: https://github.com/unionai-oss/llm-fine-tuning/tree/main/flyte_llama. Is that helpful?
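
If you just need a starting point for running Llama-2 chat inference as a Flyte workflow, a rough, untested sketch might look like the one below. It assumes flytekit, transformers, accelerate, and torch are installed in the task image, that you have access to the gated meta-llama/Llama-2-7b-chat-hf weights via a Hugging Face token, and that your Flyte cluster has a GPU node pool; the model ID and resource requests are placeholders you would swap for your own setup.

```python
# Minimal sketch: Llama-2 chat inference as a Flyte task (assumptions noted above).
from flytekit import Resources, task, workflow


@task(requests=Resources(gpu="1", mem="24Gi", cpu="4"))
def generate(prompt: str) -> str:
    # Import heavy dependencies inside the task so only the task container needs them.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "meta-llama/Llama-2-7b-chat-hf"  # assumed model; gated on Hugging Face
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype=torch.float16,
        device_map="auto",  # requires accelerate; places the model on the GPU
    )

    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=256)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)


@workflow
def llama2_chat_wf(prompt: str = "Hello, Llama!") -> str:
    return generate(prompt=prompt)
```

For an actual serving setup (rather than batch inference inside a workflow), you'd typically pair something like this with a dedicated inference server and use Flyte for the fine-tuning/evaluation pipeline, which is closer to what the repo above does.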