Hello teacher, sorry to bother you. When I switch models during a conversation, I get the error below. Could you please take a look? Thanks!

[2025-07-23 20:07:59,581.581] _internal.py -> _log line:97 [ERROR]: Error on request:
Traceback (most recent call last):
  File "/opt/anaconda3/envs/llmops/lib/python3.10/site-packages/werkzeug/serving.py", line 370, in run_wsgi
    execute(self.server.app)
  File "/opt/anaconda3/envs/llmops/lib/python3.10/site-packages/werkzeug/serving.py", line 333, in execute
    for data in application_iter:
  File "/opt/anaconda3/envs/llmops/lib/python3.10/site-packages/werkzeug/wsgi.py", line 256, in __next__
    return self._next()
  File "/opt/anaconda3/envs/llmops/lib/python3.10/site-packages/werkzeug/wrappers/response.py", line 32, in _iter_encoded
    for item in iterable:
  File "/opt/anaconda3/envs/llmops/lib/python3.10/site-packages/flask/helpers.py", line 125, in generator
    yield from gen
  File "/Users/zhangnan/PycharmProjects/llmops-api/pkg/response/response.py", line 87, in generate
    yield from response
  File "/Users/zhangnan/PycharmProjects/llmops-api/internal/service/app_service.py", line 524, in debug_chat
    history = token_buffer_memory.get_history_prompt_messages(
  File "/Users/zhangnan/PycharmProjects/llmops-api/internal/core/memory/token_buffer_memory.py", line 56, in get_history_prompt_messages
    return trim_messages(
  File "/opt/anaconda3/envs/llmops/lib/python3.10/site-packages/langchain_core/messages/utils.py", line 345, in wrapped
    return func(messages, **kwargs)
  File "/opt/anaconda3/envs/llmops/lib/python3.10/site-packages/langchain_core/messages/utils.py", line 832, in trim_messages
    return _last_max_tokens(
  File "/opt/anaconda3/envs/llmops/lib/python3.10/site-packages/langchain_core/messages/utils.py", line 947, in _last_max_tokens
    reversed_ = _first_max_tokens(
  File "/opt/anaconda3/envs/llmops/lib/python3.10/site-packages/langchain_core/messages/utils.py", line 862, in _first_max_tokens
    if token_counter(messages[:-i] if i else messages) <= max_tokens:
  File "/opt/anaconda3/envs/llmops/lib/python3.10/site-packages/langchain_core/language_models/base.py", line 377, in get_num_tokens_from_messages
    return sum([self.get_num_tokens(get_buffer_string([m])) for m in messages])
  File "/opt/anaconda3/envs/llmops/lib/python3.10/site-packages/langchain_core/language_models/base.py", line 377, in <listcomp>
    return sum([self.get_num_tokens(get_buffer_string([m])) for m in messages])
  File "/opt/anaconda3/envs/llmops/lib/python3.10/site-packages/langchain_core/language_models/base.py", line 364, in get_num_tokens
    return len(self.get_token_ids(text))
  File "/opt/anaconda3/envs/llmops/lib/python3.10/site-packages/langchain_core/language_models/base.py", line 351, in get_token_ids
    return _get_token_ids_default_method(text)
  File "/opt/anaconda3/envs/llmops/lib/python3.10/site-packages/langchain_core/language_models/base.py", line 80, in _get_token_ids_default_method
    tokenizer = get_tokenizer()
  File "/opt/anaconda3/envs/llmops/lib/python3.10/site-packages/langchain_core/language_models/base.py", line 74, in get_tokenizer
    return GPT2TokenizerFast.from_pretrained("gpt2")
  File "/opt/anaconda3/envs/llmops/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 2020, in from_pretrained
    raise EnvironmentError(
OSError: Can't load tokenizer for 'gpt2'. If you were trying to load it from 'https://huggingface.co/models', make sure you don't have a local directory with the same name. Otherwise, make sure 'gpt2' is the correct path to a directory containing all relevant files for a GPT2TokenizerFast tokenizer.
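For reference, the traceback shows that the newly selected model cannot count tokens itself, so LangChain's `trim_messages` falls back to downloading the `gpt2` tokenizer from Hugging Face, which fails when that download is blocked. One possible workaround (my own sketch based on the traceback, not necessarily the course's official fix: `approximate_token_counter` and the 4-characters-per-token ratio are assumptions) is to pass an explicit `token_counter` so the fallback is never triggered:

```python
# Hypothetical workaround: supply a custom token_counter to trim_messages so
# LangChain never calls GPT2TokenizerFast.from_pretrained("gpt2"), which needs
# network access to huggingface.co.

def approximate_token_counter(messages) -> int:
    """Rough token estimate: about 1 token per 4 characters of content.

    Works on LangChain message objects (via .content) or plain strings,
    so it has no dependency on any downloaded tokenizer.
    """
    return sum(
        len(str(getattr(m, "content", m))) // 4 + 1
        for m in messages
    )

# Then, in token_buffer_memory.py, pass the counter explicitly (sketch):
#
#     from langchain_core.messages import trim_messages
#
#     return trim_messages(
#         messages,
#         max_tokens=self.max_token_limit,
#         strategy="last",
#         token_counter=approximate_token_counter,  # no gpt2 download needed
#     )
```

Alternatively, if the real issue is only network access to huggingface.co, setting a mirror endpoint (e.g. `export HF_ENDPOINT=https://hf-mirror.com`) or pre-downloading the `gpt2` tokenizer to the local Hugging Face cache may also resolve the `OSError`.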