runtime error

Exit code: 1. Reason:
    response = _request_wrapper(
  File "/usr/local/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 304, in _request_wrapper
    hf_raise_for_status(response)
  File "/usr/local/lib/python3.10/site-packages/huggingface_hub/utils/_http.py", line 458, in hf_raise_for_status
    raise _format(RepositoryNotFoundError, message, response) from e
huggingface_hub.errors.RepositoryNotFoundError: 401 Client Error. (Request ID: Root=1-67ece8d1-1d27c9011fb5629d3adbd06a;ed0c4238-36b5-46cc-960f-97048c9546e3)

Repository Not Found for url: https://huggingface.co/meta-llama/Llama-3.2-3b-base/resolve/main/tokenizer_config.json.
Please make sure you specified the correct `repo_id` and `repo_type`.
If you are trying to access a private or gated repo, make sure you are authenticated.
Invalid username or password.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/user/app/app.py", line 7, in <module>
    tokenizer = AutoTokenizer.from_pretrained(model_name)
  File "/usr/local/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py", line 910, in from_pretrained
    tokenizer_config = get_tokenizer_config(pretrained_model_name_or_path, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py", line 742, in get_tokenizer_config
    resolved_config_file = cached_file(
  File "/usr/local/lib/python3.10/site-packages/transformers/utils/hub.py", line 266, in cached_file
    file = cached_files(path_or_repo_id=path_or_repo_id, filenames=[filename], **kwargs)
  File "/usr/local/lib/python3.10/site-packages/transformers/utils/hub.py", line 456, in cached_files
    raise EnvironmentError(
OSError: meta-llama/Llama-3.2-3b-base is not a local folder and is not a valid model identifier listed on 'https://huggingface.co/models'
If this is a private repository, make sure to pass a token having permission to this repo either by logging in with `huggingface-cli login` or by passing `token=<your_token>`
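The failure happens in app.py at `AutoTokenizer.from_pretrained(model_name)`: the request is unauthenticated (401), and `meta-llama/Llama-3.2-3b-base` is not a repo id the Hub recognizes, so the download of tokenizer_config.json is rejected. A minimal sketch of a fix, assuming the intended model is the gated `meta-llama/Llama-3.2-3B` base repo (the "-base" suffix is not part of the id) and that an access token with permission to it is exposed through an HF_TOKEN environment variable (e.g. a Space secret):

import os

from transformers import AutoTokenizer

# Assumptions (not from the original logs): the intended repo id is
# "meta-llama/Llama-3.2-3B", and a Hub access token with permission to that
# gated repo is available in the HF_TOKEN environment variable.
model_name = "meta-llama/Llama-3.2-3B"
hf_token = os.environ.get("HF_TOKEN")

# Passing token= authenticates the request for tokenizer_config.json and the
# other tokenizer files, avoiding the 401 / RepositoryNotFoundError above.
tokenizer = AutoTokenizer.from_pretrained(model_name, token=hf_token)
print(tokenizer("Hello, world!")["input_ids"])

On a Space, store the token as a repository secret rather than hardcoding it, and make sure the account that issued the token has accepted the Llama license on the model page; otherwise a gated Meta repo will still return 401 even with the correct repo id.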
