astonishing-eve-54331
03/12/2025, 1:45 PM

```python
import math
import flytekit as fl
from flytekitplugins.optuna import Optimizer, suggest

image = fl.ImageSpec(packages=["flytekitplugins.optuna"])


# The objective is an ordinary Flyte task; each trial runs as its own task execution.
@fl.task(container_image=image)
async def objective(x: float, y: int, z: int, power: int) -> float:
    return math.log((((x - 5) ** 2) + (y + 4) ** 4 + (3 * z - 3) ** 2)) ** power


@fl.eager(container_image=image)
async def train(concurrency: int, n_trials: int) -> float:
    optimizer = Optimizer(objective=objective, concurrency=concurrency, n_trials=n_trials)

    await optimizer(
        x=suggest.float(low=-10, high=10),
        y=suggest.integer(low=-10, high=10),
        z=suggest.category([-5, 0, 3, 6, 9]),
        power=2,  # fixed value, not part of the search space
    )

    print(optimizer.study.best_value)
    return optimizer.study.best_value
```
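For a quick local sanity check before running anything on a cluster, eager tasks are async functions, so something like this should work (the asyncio entrypoint is my assumption from the flytekit eager docs; remotely you'd launch train like any other task):

```python
import asyncio

# Hypothetical local run: executes the whole search in-process.
if __name__ == "__main__":
    print(asyncio.run(train(concurrency=2, n_trials=10)))
```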
Of course, it'll support whatever complexity you need (dynamic suggestions, multi-objective, custom pruners, etc.).
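For instance, a conditional search space might look like the sketch below. This assumes the Optimizer also accepts a plain async callable that receives an optuna.Trial (rather than a Flyte task), which is my reading of "dynamic suggestions" here; verify the exact signature against the plugin docs:

```python
import optuna

# Hypothetical trial-driven objective: `y` is only searched when the
# sampled `x` is positive; `objective` is the @fl.task defined above.
async def spec(trial: optuna.Trial) -> float:
    x = trial.suggest_float("x", -10, 10)
    y = trial.suggest_int("y", -10, 10) if x > 0 else 0
    return await objective(x=x, y=y, z=3, power=2)


@fl.eager(container_image=image)
async def train_conditional(concurrency: int, n_trials: int) -> float:
    optimizer = Optimizer(objective=spec, concurrency=concurrency, n_trials=n_trials)
    await optimizer()  # no suggestions passed; `spec` drives the sampling
    return optimizer.study.best_value
```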
It is all just a wrapper around Optuna, so you can access the underlying optuna.Study via Optimizer.study.
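So once the search finishes, anything the standard Optuna Study API exposes is available, for example:

```python
study = optimizer.study   # a plain optuna.Study
print(study.best_value)   # best objective value found
print(study.best_params)  # winning values for x, y, and z
print(len(study.trials))  # every trial, including failed ones
```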
An interesting feature that naturally emerged from eager is that it is also fault tolerant; i.e., if one trial fails because of an OOM, the Optimizer should learn to avoid recommending other parameter combinations that would lead to an OOM.