I was wondering if someone could help me understand some weird behaviour I am observing. I have a trainer configured by Hydra as follows (NB: names in the YAML below may not match the tree above, as it was generated by ChatGPT to give the rough structure rather than the full tree):
```yaml
# inference.yaml
defaults:
  - trainer/trainer_args@trainer_args: inference
  - trainer/model_config@_here_: model
  - trainer/preprocessor@preprocessor: default
  - trainer/data_collator@data_collator: seq2seq
  - trainer/callbacks@callbacks: temp
  - project/toolbox@data_preparation.text2text_conversion.intent_fmt.config.toolbox: default
  - data_preparation: online_multiline
  - trainer/constraints@constraints: model
  - _self_

# set to abs path to checkpoint dir (ending in
# `checkpoint-*`) of the checkpoint to be decoded
model_name_or_path: ???
output_dir: ${model_name_or_path}
# nb: this is not used if `predicted_history_inference = true`, for now.
# path to the test.parquet file (for next action prediction)
test_file: ???
checkpoint: ???
config:
  pretrained_model_name_or_path: ${model_name_or_path}

# this config group can be used to override any argument
# to the huggingface `Trainer`
trainer_args:
  output_dir: ${output_dir}
  test_file: ${make_absolute:${test_file}}
  hyps_dir: ${root:hyps}
  metrics_dir: ${root:metrics}
  predicted_history_inference: false

split: 'test'
# inference data version. Used to load
# the dialogue history files that are used
# for transcript generation
version: ???
data_preparation:
  data_path: ${root:data/finetuning/sgd/${version}/${split}}
  dataset_builder:
    split: ${split}

log_prefix: "predict_${checkpoint}"
hydra:
  run:
    dir: ${output_dir}
  job_logging:
    handlers:
      file:
        filename: ${log_prefix}_${trainer_args.data_variant}_${hydra.job.name}.log
  # allows us to compose the project interpreter
  # toolbox config in `defaults`
  searchpath:
    - file://${root:src/project/configs}
```
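With this config, the mandatory `???` values have to be supplied at launch. A hypothetical invocation (the entry-point module name and the concrete values are made up for illustration):

```shell
python -m project.inference \
  model_name_or_path=/abs/path/to/checkpoint-1000 \
  test_file=data/test.parquet \
  checkpoint=1000 \
  version=v1
```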
However, if I pass `constraints.model.pretrained_model_name_or_path=google/flant5` on the CLI, I get:

```
hydra.errors.ConfigCompositionException: Could not override 'constraints.model.pretrained_model_name_or_path'.
To append to your config use +constraints.model.pretrained_model_name_or_path=google/flant5
```
This is unexpected. If I do as told, I get:

```
Either remove + prefix: 'constraints.model.pretrained_model_name_or_path=google/flant5'
Or add a second + to add or override 'constraints.model.pretrained_model_name_or_path': '++constraints.model.pretrained_model_name_or_path=google/flant5'
```

Adding the second `+` is not what we want either.
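For context, Hydra's command-line override grammar distinguishes three forms; the sketch below summarises them, using the key from this discussion as the example:

```
key=value     # override a key that must already exist in the composed config
+key=value    # add a key that must NOT already exist
++key=value   # add the key if it is missing, override it if it is present
```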
I think this is happening because `pretrained_model_name_or_path` is effectively being set twice: once by my attempted override of `constraints.model.pretrained_model_name_or_path`, and once via the `pretrained_model_name_or_path` of the actual model, which the `constraints` config group references via interpolation.
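The contents of the constraints config are not shown above; a minimal sketch of what such an interpolation-based `trainer/constraints/model.yaml` might look like (the key name is taken from the error messages; the file contents themselves are an assumption):

```yaml
# trainer/constraints/model.yaml (hypothetical reconstruction)
model:
  # interpolates the model's identifier rather than duplicating it,
  # so the value always tracks the top-level `config` group
  pretrained_model_name_or_path: ${config.pretrained_model_name_or_path}
```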
My ideal solution would be to compose `model`, `tokenizer` and `preprocessor` as defaults and be able to override them as appropriate, but I could not figure out how to specify the defaults list correctly inside `constraints/model.yaml`. My best attempt was:
```yaml
defaults:
  - finetuning/trainer/model@model: model
```

but this led to:
```
hydra.errors.MissingConfigException: In 'trainer/constraints/model': Could not find 'trainer/constraints/finetuning/trainer/model/model'

Config search path:
    provider=hydra, path=pkg://hydra.conf
    provider=main, path=file:///Users/alexandrucoca/work/dev/?/src/?/configs/finetuning
    provider=hydra.searchpath in main, path=file:///Users/alexandrucoca/work/dev/?/src/?/configs
    provider=schema, path=structured://
```
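As the error suggests, Hydra resolves a non-absolute config group in a defaults list relative to the group of the containing config (`trainer/constraints` here), which is why the lookup becomes `trainer/constraints/finetuning/trainer/model/model`. A leading `/` makes the group path absolute, resolved against the search path roots instead. A sketch of what that might look like, assuming the group path from the attempt above is the intended one:

```yaml
# trainer/constraints/model.yaml
defaults:
  # leading / = absolute group path, resolved against the config
  # search path roots rather than against trainer/constraints
  - /finetuning/trainer/model@model: model
```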
It seems like the specified config is only looked up relative to the folder where the referencing config lives? Is there a way to get around this @Jasha10 ?