
Deprecated sg_logger_params #1531

Closed
wants to merge 5 commits into from

Conversation

shaydeci (Contributor):

No description provided.

"sg_logger": {"dagshub_sg_logger": {
    "save_tensorboard_remote": True,
    "save_logs_remote": True,
}}
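For context, the new format nests the logger name together with its params, replacing the older pair of `sg_logger` (a string) and `sg_logger_params` (a separate dict). A minimal sketch of how such a config could be resolved; `resolve_sg_logger` is a hypothetical helper, not the project's actual API:

```python
def resolve_sg_logger(cfg: dict):
    # Hypothetical resolver: accepts both the new nested format and the
    # old string-plus-params format, returning (logger_name, params).
    sg_logger = cfg.get("sg_logger")
    if isinstance(sg_logger, dict):
        # New format: {"<logger_name>": {<params>}} -- exactly one entry.
        if len(sg_logger) != 1:
            raise ValueError("expected exactly one logger entry under 'sg_logger'")
        name, params = next(iter(sg_logger.items()))
        return name, params
    # Old format: plain string name, params under "sg_logger_params".
    return sg_logger, cfg.get("sg_logger_params", {})


name, params = resolve_sg_logger(
    {"sg_logger": {"dagshub_sg_logger": {"save_tensorboard_remote": True,
                                         "save_logs_remote": True}}}
)
```

The same helper would return `("dagshub_sg_logger", {...})` for the old format too, which is what makes a soft deprecation (warn but still accept) possible.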
Contributor:

Should we also introduce logger names consistent with the class names and deprecate the old snake_case names? E.g. "DagsHubSGLogger" instead of "dagshub_sg_logger"?

Contributor Author:

I was thinking about that too, actually.
Should I? Since we don't want to break anything (not that this actually breaks anything, since this is a new format), would we also deprecate the old way?
Or should we just leave both options?

Contributor:

I think it would be great: add the class names, and deprecate the old names.
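The deprecation path suggested above could look something like the following sketch: an alias table that maps both naming styles to the class-style name, warning when the old snake_case form is used. The alias table and function name are illustrative assumptions, not the project's real registry:

```python
import warnings

# Hypothetical alias table (illustrative names): class-style names map to
# themselves; old snake_case names are still accepted but are deprecated.
LOGGER_ALIASES = {
    "DagsHubSGLogger": "DagsHubSGLogger",
    "WandBSGLogger": "WandBSGLogger",
    "dagshub_sg_logger": "DagsHubSGLogger",
    "wandb_sg_logger": "WandBSGLogger",
}


def canonical_logger_name(name: str) -> str:
    # Resolve a user-supplied logger name to its canonical class-style name,
    # emitting a DeprecationWarning for the old snake_case spellings.
    resolved = LOGGER_ALIASES[name]
    if name != resolved:
        warnings.warn(
            f"Logger name '{name}' is deprecated; use '{resolved}' instead.",
            DeprecationWarning,
        )
    return resolved
```

This keeps both options working for now while nudging users toward the new names, which matches the "add the class names, deprecate the old names" proposal.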

save_logs_remote: False   # upload log files to s3
monitor_system: True      # monitor system statistics (CPU usage, GPU, ...) and write them to tensorboard
sg_logger:
  base_sg_logger:
Contributor:

How would one "unset" the base_sg_logger once it is already in defaults?

I feel this may be a problem if someone would like to use a non-default logger:

defaults:
  training_hyperparams: default_train_params

training_hyperparams:
  sg_logger:
    wandb_sg_logger:
      ...

In this case the resolved config would include:

training_hyperparams:
  sg_logger:
    base_sg_logger:
      ...
    wandb_sg_logger:
      ...

And that would lead to an exception in the factory.

A second use case that would not work is enabling W&B via a Makefile (to keep the YAML files intact):

WANDB_PARAMS = training_hyperparams.sg_logger=wandb_sg_logger \
	+training_hyperparams.sg_logger_params.api_server=https://wandb.research.deci.ai \
	+training_hyperparams.sg_logger_params.entity=super-gradients \
	training_hyperparams.sg_logger_params.launch_tensorboard=false \
	training_hyperparams.sg_logger_params.monitor_system=true \
	+training_hyperparams.sg_logger_params.project_name=PoseEstimation \
	training_hyperparams.sg_logger_params.save_checkpoints_remote=true \
	training_hyperparams.sg_logger_params.save_logs_remote=true \
	training_hyperparams.sg_logger_params.save_tensorboard_remote=false \
	training_hyperparams.sg_logger_params.tb_files_user_prompt=false

coco2017_yolo_nas_s:
	python src/super_gradients/train_from_recipe.py --config-name=coco2017_yolo_nas_s $(WANDB_PARAMS)
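The collision described in the first use case can be reproduced with a plain recursive dict merge, which approximates how Hydra/OmegaConf compose the defaults config with the user's override (nested dicts are merged key by key rather than replaced wholesale). This is a stdlib sketch, not Hydra itself, and the param values are illustrative:

```python
def deep_merge(base: dict, override: dict) -> dict:
    # Recursively merge `override` into `base`: nested dicts are merged
    # key-by-key, scalars are overwritten. Returns a new dict.
    merged = dict(base)
    for key, value in override.items():
        if isinstance(value, dict) and isinstance(merged.get(key), dict):
            merged[key] = deep_merge(merged[key], value)
        else:
            merged[key] = value
    return merged


# What the defaults would contribute once base_sg_logger lives there:
defaults = {"sg_logger": {"base_sg_logger": {"save_logs_remote": False}}}
# What a user asking for a non-default logger would write:
user = {"sg_logger": {"wandb_sg_logger": {"project_name": "PoseEstimation"}}}

resolved = deep_merge(defaults, user)
# Both logger entries survive the merge, so a factory expecting exactly
# one entry under "sg_logger" would raise.
```

This illustrates why the user's `wandb_sg_logger` does not replace `base_sg_logger` but is merged alongside it, producing the two-logger config shown above.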

Contributor Author:

Strong point right there...
I will think of something.

@shaydeci shaydeci closed this Oct 22, 2023