
Trying to export a bark model, that is a custom or unsupported architecture, but no custom export configuration was passed as custom_export_configs. #846

Open
RibalBaghdadi opened this issue Jul 26, 2024 · 0 comments


RibalBaghdadi commented Jul 26, 2024

from transformers import AutoProcessor, AutoModel
from optimum.intel import OVModelForCausalLM

model_id = "suno/bark-small"
processor = AutoProcessor.from_pretrained("suno/bark-small")
model = OVModelForCausalLM.from_pretrained(model_id, export=True)

Framework not specified. Using pt to export the model.

ValueError Traceback (most recent call last)
in <cell line: 1>()
----> 1 model = OVModelForCausalLM.from_pretrained(model_id, export=True)

2 frames
/usr/local/lib/python3.10/dist-packages/optimum/exporters/openvino/main.py in main_export(model_name_or_path, output, task, device, framework, cache_dir, trust_remote_code, pad_token_id, subfolder, revision, force_download, local_files_only, use_auth_token, token, model_kwargs, custom_export_configs, fn_get_submodels, compression_option, compression_ratio, ov_config, stateful, convert_tokenizer, library_name, **kwargs_shapes)
    228         custom_architecture = True
    229         if custom_export_configs is None:
--> 230             raise ValueError(
    231                 f"Trying to export a {model_type} model, that is a custom or unsupported architecture, but no custom export configuration was passed as custom_export_configs. Please refer to https://huggingface.co/docs/optimum/main/en/exporters/onnx/usage_guides/export_a_model#custom-export-of-transformers-models for an example on how to export custom models. Please open an issue at https://github.com/huggingface/optimum-intel/issues if you would like the model type {model_type} to be supported natively in the OpenVINO export."
    232             )

ValueError: Trying to export a bark model, that is a custom or unsupported architecture, but no custom export configuration was passed as custom_export_configs. Please refer to https://huggingface.co/docs/optimum/main/en/exporters/onnx/usage_guides/export_a_model#custom-export-of-transformers-models for an example on how to export custom models. Please open an issue at https://github.com/huggingface/optimum-intel/issues if you would like the model type bark to be supported natively in the OpenVINO export.
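For context, the guard that produces this error (lines 228–232 of main.py in the traceback above) can be sketched as follows. This is an illustrative, self-contained reconstruction of the check, not the actual optimum-intel source: the supported-types set and the check_export function name are hypothetical, and the real exporter looks up supported architectures dynamically. It shows why the export fails: "bark" is not among the natively supported model types, and no custom_export_configs was supplied.

```python
# Illustrative sketch of the validation in optimum/exporters/openvino/main.py.
# SUPPORTED_MODEL_TYPES is a hypothetical stand-in for the exporter's registry.
SUPPORTED_MODEL_TYPES = {"llama", "gpt2", "mistral"}  # "bark" is absent

def check_export(model_type: str, custom_export_configs=None) -> bool:
    """Mirror the check that raised the ValueError in the traceback."""
    if model_type not in SUPPORTED_MODEL_TYPES:
        # Custom/unsupported architecture: an explicit export config is required.
        if custom_export_configs is None:
            raise ValueError(
                f"Trying to export a {model_type} model, that is a custom or "
                "unsupported architecture, but no custom export configuration "
                "was passed as custom_export_configs."
            )
    return True

# Reproduces the failure mode from the issue:
try:
    check_export("bark")
except ValueError as e:
    print(type(e).__name__)  # ValueError
```

As the error message suggests, the two ways forward are passing a custom export configuration (see the linked usage guide) or requesting native OpenVINO support for the bark model type in optimum-intel. Note also that OVModelForCausalLM targets decoder-only language models, which may not match Bark's text-to-audio architecture even with a custom config.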
