# announcements
@Sören Brunk have you started working on the KerasCheckpoint wrapper? cc @Niels Bantilan
The checkpointing I've done was not for Keras but for the Hugging Face Trainer for PyTorch transformer models.
I'm not sure whether it's worth creating a dedicated plugin for this, or if we should just add it to the docs. It's essentially not much more than a callback for saving checkpoints:
```python
import logging
import flytekit
from flytekit.core.checkpointer import Checkpoint
from transformers import TrainerCallback, TrainerControl, TrainerState, TrainingArguments

logger = logging.getLogger(__name__)

class SaveCheckpointCallback(TrainerCallback):
    def on_save(self, args: TrainingArguments, state: TrainerState, control: TrainerControl, **kwargs):
        # Upload the Trainer's output directory to Flyte's checkpoint store
        cp: Checkpoint = flytekit.current_context().checkpoint
        logger.info("Saving checkpoint")
        cp.save(args.output_dir)
```
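For context, hooking the callback into the Trainer looks roughly like this (sketch only; `model`, `_training_args`, and `train_dataset` are placeholders for whatever the surrounding task builds, not from my actual code):
```python
from transformers import Trainer

# Hypothetical wiring; `model`, `_training_args`, and `train_dataset`
# are assumed to be constructed elsewhere in the task.
trainer = Trainer(
    model=model,
    args=_training_args,
    train_dataset=train_dataset,
    callbacks=[SaveCheckpointCallback()],  # uploads a checkpoint on every Trainer save
)
```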
And then some code for restoring the checkpoint:
```python
# `trainer`, `_training_args`, and `enable_checkpointing` are defined
# earlier in the surrounding task code.
from transformers.trainer_utils import get_last_checkpoint

if enable_checkpointing:
    logger.info("Checkpointing enabled. Trying to restore previous checkpoint")
    cp: Checkpoint = flytekit.current_context().checkpoint
    checkpoint_path = cp.restore(_training_args.output_dir)
    last_checkpoint_path = get_last_checkpoint(checkpoint_path)
    logger.info(
        "Restored checkpoint" if last_checkpoint_path else "No checkpoint found")
else:
    last_checkpoint_path = None

trainer.train(resume_from_checkpoint=last_checkpoint_path)
```
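Worth noting that this only pays off if the task actually gets retried after an interruption. A minimal sketch of how I'd picture the task wrapper (the decorator parameters here are illustrative, not from my code):
```python
from flytekit import task

# Sketch only: on an interruption and retry, flytekit exposes the previous
# attempt's checkpoint, so cp.restore() above picks up whatever
# SaveCheckpointCallback saved.
@task(retries=3, interruptible=True)
def train_model(enable_checkpointing: bool = True) -> None:
    # ... build `_training_args` and the Trainer (with SaveCheckpointCallback),
    # then run the restore-and-train snippet above ...
    pass
```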