# ml-and-mlops-questions
w
Has anyone used Flyte for hyperparameter tuning? Curious if anyone has thoughts on good libraries to use. Katib and Ray Tune seem like the popular options from a quick Google search. I also see that there's a new flytekit-optuna plugin.
f
Cc @astonishing-eve-54331, who has an interesting solution for this
a
Thanks Ketan! Yes, I added the Optuna plugin for Flyte; it uses eager workflows under the hood. Thanks to the flexibility of eager workflows, it can reach feature parity with Optuna while scaling to up to a hundred concurrent trials.
No DBs or external experiment tracking needed: it's pure Python, with caching and type checking built in.
w
Thanks guys! This is super awesome! We'll give this a try 🙂
a
Nice! We'll get this fixed up in the docs; it's a little too hidden away right now. I just ran this as a sort of "hello world" and it works as expected:
import math

import flytekit as fl
from flytekitplugins.optuna import Optimizer, suggest

# Container image with the plugin installed.
image = fl.ImageSpec(packages=["flytekitplugins-optuna"])

# Each trial runs this task once with parameters suggested by Optuna.
@fl.task(container_image=image)
async def objective(x: float, y: int, z: int, power: int) -> float:
    return math.log(((x - 5) ** 2) + ((y + 4) ** 4) + ((3 * z - 3) ** 2)) ** power

# The eager workflow drives the study, launching up to `concurrency`
# trials at a time until `n_trials` have completed.
@fl.eager(container_image=image)
async def train(concurrency: int, n_trials: int) -> float:
    optimizer = Optimizer(objective=objective, concurrency=concurrency, n_trials=n_trials)

    # suggest.* declares the search space; fixed values (power) pass through as-is.
    await optimizer(
        x=suggest.float(low=-10, high=10),
        y=suggest.integer(low=-10, high=10),
        z=suggest.category([-5, 0, 3, 6, 9]),
        power=2,
    )

    # optimizer.study is a plain optuna.Study.
    print(optimizer.study.best_value)
    return optimizer.study.best_value
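To kick it off on a cluster, something like this should work (a sketch only: it assumes the snippet above is saved as tune.py, that your local config points at a live Flyte cluster, and it passes the workflow inputs as flags):

# Hypothetical file name; --concurrency and --n_trials map to train's parameters.
pyflyte run --remote tune.py train --concurrency 5 --n_trials 20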
Of course, it'll support whatever complexity you need (dynamic suggestions, multi-objective optimization, custom pruners, etc.). It is all just a thin wrapper around Optuna, so you can access the underlying optuna.Study via Optimizer.study (see the sketch below). An interesting property that naturally emerged from eager is that it is also fault tolerant: if one trial fails because of an OOM, the Optimizer should learn to avoid recommending other parameter combinations that would lead to an OOM.