**In `DM/modules/text.py`, line 20:**
`TOKENIZER = torch.hub.load('huggingface/pytorch-transformers', 'tokenizer', 'bert-base-cased')`
**When I tried to download the model, the output reported a missing package, as follows:**
```
Using cache found in C:\Users\patri/.cache\torch\hub\huggingface_pytorch-transformers_main
Traceback (most recent call last):
  File "C:\Users\patri/.cache\torch\hub\huggingface_pytorch-transformers_main\src\transformers\utils\versions.py", line 102, in require_version
    got_ver = importlib.metadata.version(pkg)
  File "C:\Users\patri\anaconda3\envs\radar1\lib\importlib\metadata\__init__.py", line 946, in version
    return distribution(distribution_name).version
  File "C:\Users\patri\anaconda3\envs\radar1\lib\importlib\metadata\__init__.py", line 919, in distribution
    return Distribution.from_name(distribution_name)
  File "C:\Users\patri\anaconda3\envs\radar1\lib\importlib\metadata\__init__.py", line 518, in from_name
    raise PackageNotFoundError(name)
importlib.metadata.PackageNotFoundError: No package metadata was found for regex

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\patri\Desktop\exp3\data\test.py", line 5, in <module>
    model = torch.hub.load('huggingface/pytorch-transformers', 'tokenizer', 'bert-base-cased')
  File "C:\Users\patri\anaconda3\envs\radar1\lib\site-packages\torch\hub.py", line 404, in load
    model = _load_local(repo_or_dir, model, *args, **kwargs)
  File "C:\Users\patri\anaconda3\envs\radar1\lib\site-packages\torch\hub.py", line 430, in _load_local
    hub_module = _import_module(MODULE_HUBCONF, hubconf_path)
  File "C:\Users\patri\anaconda3\envs\radar1\lib\site-packages\torch\hub.py", line 76, in _import_module
    spec.loader.exec_module(module)
  File "<frozen importlib._bootstrap_external>", line 883, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "C:\Users\patri/.cache\torch\hub\huggingface_pytorch-transformers_main\hubconf.py", line 23, in <module>
    from transformers import (
  File "C:\Users\patri/.cache\torch\hub\huggingface_pytorch-transformers_main\src\transformers\__init__.py", line 26, in <module>
    from . import dependency_versions_check
  File "C:\Users\patri/.cache\torch\hub\huggingface_pytorch-transformers_main\src\transformers\dependency_versions_check.py", line 57, in <module>
    require_version_core(deps[pkg])
  File "C:\Users\patri/.cache\torch\hub\huggingface_pytorch-transformers_main\src\transformers\utils\versions.py", line 117, in require_version_core
    return require_version(requirement, hint)
  File "C:\Users\patri/.cache\torch\hub\huggingface_pytorch-transformers_main\src\transformers\utils\versions.py", line 104, in require_version
    raise importlib.metadata.PackageNotFoundError(
importlib.metadata.PackageNotFoundError: No package metadata was found for The 'regex!=2019.12.17' distribution was not found and is required by this application.
Try: `pip install transformers -U` or `pip install -e '.[dev]'` if you're working with git main
```
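The traceback points to a missing `regex` distribution in the `radar1` environment rather than a missing model: transformers' dependency check fails before the tokenizer download even starts. Below is a minimal workaround sketch, assuming that installing `regex` into the same interpreter is enough to satisfy the check (the package and model names are taken from the traceback above, not confirmed by the repository):

```python
# Workaround sketch (assumption: installing the missing "regex" distribution
# into this environment satisfies transformers' dependency check).
import importlib.metadata
import subprocess
import sys

try:
    # Same lookup that transformers performs in utils/versions.py
    importlib.metadata.version("regex")
except importlib.metadata.PackageNotFoundError:
    # Install into the interpreter that will run torch.hub.load
    subprocess.check_call([sys.executable, "-m", "pip", "install", "regex"])

import torch

# Retry the original call from DM/modules/text.py once the dependency is present
TOKENIZER = torch.hub.load("huggingface/pytorch-transformers", "tokenizer", "bert-base-cased")
```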