GLM-4-9B-Chat WebDemo deployment error: ValueError: too many values to unpack
2024/7/16 21:02:42
This post documents the GLM-4-9B-Chat WebDemo deployment error ValueError: too many values to unpack and how to fix it; hopefully it is a useful reference for anyone hitting the same problem.
While following the GLM-4-9B-Chat WebDemo deployment guide from the self-llm project (the open-source LLM cookbook), I ran into the following error:
ValueError: too many values to unpack (expected 2)
Traceback:
File "/root/miniconda3/lib/python3.10/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 552, in _run_script
    exec(code, module.__dict__)
File "/root/autodl-tmp/ChatBot.py", line 51, in <module>
    generated_ids = model.generate(model_inputs.input_ids, max_new_tokens=512)
File "/root/miniconda3/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
File "/root/miniconda3/lib/python3.10/site-packages/transformers/generation/utils.py", line 1914, in generate
    result = self._sample(
File "/root/miniconda3/lib/python3.10/site-packages/transformers/generation/utils.py", line 2651, in _sample
    outputs = self(
File "/root/miniconda3/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1518, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
File "/root/miniconda3/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1527, in _call_impl
    return forward_call(*args, **kwargs)
File "/root/.cache/huggingface/modules/transformers_modules/glm-4-9b-chat/modeling_chatglm.py", line 1005, in forward
    transformer_outputs = self.transformer(
File "/root/miniconda3/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1518, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
File "/root/miniconda3/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1527, in _call_impl
    return forward_call(*args, **kwargs)
File "/root/.cache/huggingface/modules/transformers_modules/glm-4-9b-chat/modeling_chatglm.py", line 901, in forward
    hidden_states, presents, all_hidden_states, all_self_attentions = self.encoder(
File "/root/miniconda3/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1518, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
File "/root/miniconda3/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1527, in _call_impl
    return forward_call(*args, **kwargs)
File "/root/.cache/huggingface/modules/transformers_modules/glm-4-9b-chat/modeling_chatglm.py", line 726, in forward
    layer_ret = layer(
File "/root/miniconda3/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1518, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
File "/root/miniconda3/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1527, in _call_impl
    return forward_call(*args, **kwargs)
File "/root/.cache/huggingface/modules/transformers_modules/glm-4-9b-chat/modeling_chatglm.py", line 629, in forward
    attention_output, kv_cache = self.self_attention(
File "/root/miniconda3/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1518, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
File "/root/miniconda3/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1527, in _call_impl
    return forward_call(*args, **kwargs)
File "/root/.cache/huggingface/modules/transformers_modules/glm-4-9b-chat/modeling_chatglm.py", line 494, in forward
    cache_k, cache_v = kv_cache
After some digging, the cause appears to be the latest transformers release: it hands the key/value cache to the model's bundled modeling_chatglm.py in a different shape than the (key, value) pair that code expects, so the unpack at cache_k, cache_v = kv_cache on line 494 fails.
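For context, the ValueError itself is ordinary Python behaviour: unpacking a sequence into fewer targets than it has elements raises exactly this message. A toy reproduction, unrelated to the model code and with values chosen purely for illustration:

kv_cache = ("key", "value", "extra")   # three elements where the caller expects two
cache_k, cache_v = kv_cache            # ValueError: too many values to unpack (expected 2)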
Reinstalling a pinned, older version of the transformers package and restarting the WebDemo resolves the problem:
pip install transformers==4.40.2
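To confirm the downgrade took effect before relaunching the WebDemo, the installed version can be checked from Python (a quick sanity check, not part of the original guide):

import transformers
print(transformers.__version__)   # expected output: 4.40.2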
Notes:
1. In the guide's model-download snippet, the import statements on the first line are run together and will not execute as written; split them onto separate lines yourself. The corrected code:
import torch
from modelscope import snapshot_download, AutoModel, AutoTokenizer
import os
model_dir = snapshot_download('ZhipuAI/glm-4-9b-chat', cache_dir='/root/autodl-tmp', revision='master')
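Assuming the snippet above is saved as download.py under /root/autodl-tmp (the filename is chosen here for illustration, not mandated by the guide), run it once to fetch the weights:

python /root/autodl-tmp/download.py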
2. Mind the model path in ChatBot.py: change it to the absolute path of the downloaded weights.
mode_name_or_path = '/root/autodl-tmp/ZhipuAI/glm-4-9b-chat'
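For reference, a minimal sketch of how ChatBot.py loads the model from that absolute path. The variable name mode_name_or_path follows the guide; torch_dtype, device placement and trust_remote_code are assumptions based on common GLM-4-9B-Chat usage rather than lines copied from the guide:

import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

mode_name_or_path = '/root/autodl-tmp/ZhipuAI/glm-4-9b-chat'
tokenizer = AutoTokenizer.from_pretrained(mode_name_or_path, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    mode_name_or_path,
    torch_dtype=torch.bfloat16,   # assumption: half precision so the 9B model fits on a single GPU
    trust_remote_code=True,       # required so the bundled modeling_chatglm.py is used
).eval().cuda()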
The full deployment guide is here:
https://github.com/datawhalechina/self-llm/blob/master/GLM-4/03-GLM-4-9B-Chat%20WebDemo.md
That's it for this note on the GLM-4-9B-Chat WebDemo deployment error ValueError: too many values to unpack; hopefully it helps anyone who runs into the same issue.